Hyperparameters Example in Simple Python

We’ll show a minimal example with two adjustable hyperparameters:

  • Learning rate
  • Number of epochs

import random

# === Dataset: simple y = 2 * x
inputs = list(range(10))
targets = [2 * i for i in inputs]

# === Model: one weight (w)
w = random.uniform(0, 1)  # Random init

# === Hyperparameters
learning_rate = 0.01
epochs = 20

print(f"Initial Weight: {w:.4f}")
print("Training Started...\n")

# === Training Loop
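# One epoch is one full pass over all (x, y) pairs; note that the weight
# is updated after every individual sample (stochastic gradient descent).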
for epoch in range(epochs):
    total_loss = 0
    for x, y_true in zip(inputs, targets):
        y_pred = w * x
        error = y_pred - y_true
        loss = error ** 2

        # Gradient of the loss w.r.t. w:
        # loss = (w*x - y_true)^2 = error^2, so dL/dw = 2 * error * x
        gradient = 2 * error * x

        # Update weight using learning rate
        w -= learning_rate * gradient
        total_loss += loss

    print(f"Epoch {epoch+1}: Weight = {w:.4f}, Total Loss = {total_loss:.4f}")

print("\nTraining Completed.")
print(f"Final Weight: {w:.4f}")

Step-by-Step Explanation

  • We initialize a weight w randomly.
  • Inputs and targets represent a simple function: y = 2x.
  • The learning rate and the number of epochs are the hyperparameters; the weight w is the parameter being learned. (A small learning-rate sweep is sketched after this list.)
  • Gradient descent is used to adjust the weight; because the update happens after every individual sample, this is stochastic gradient descent (SGD).
  • The weight is nudged after each sample, and the total loss printed per epoch should shrink toward zero as w approaches 2.
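
To see why these two knobs matter, here is a minimal sketch of a learning-rate sweep. It wraps the training loop above in a helper function; the name train, the fixed seed, and the candidate rates are illustrative choices, not part of the original example.

def train(learning_rate, epochs, seed=0):
    random.seed(seed)  # Fix the seed so runs differ only in their hyperparameters
    w = random.uniform(0, 1)
    total_loss = 0.0
    for _ in range(epochs):
        total_loss = 0.0  # Track the loss of the most recent epoch
        for x, y_true in zip(inputs, targets):
            error = w * x - y_true
            total_loss += error ** 2
            w -= learning_rate * 2 * error * x  # Same SGD update as above
    return w, total_loss

# Illustrative candidate rates; 0.05 is deliberately too large for this data
for lr in (0.001, 0.01, 0.05):
    w_final, last_loss = train(lr, epochs=20)
    print(f"lr={lr}: final weight = {w_final:.4f}, last-epoch loss = {last_loss:.4f}")

With this data, 0.01 converges quickly toward a weight of 2.0, 0.001 converges more slowly, and 0.05 overshoots on the larger x values, so the loss grows instead of shrinking: a learning rate that is too large can make training diverge.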

Hyperparameters in Neural Networks – Summary