Error Reduction in Prediction Results with Simple Python

Simple Python Code (No Libraries)


# Learning a single weight with gradient descent
weight = 0.5          # initial guess
learning_rate = 0.01

# Dummy training data: input x = 2, target y = 4
x = 2
y = 4

for epoch in range(10):
    # Prediction
    y_pred = weight * x

    # Loss (MSE)
    loss = (y - y_pred) ** 2

    # Gradient of the loss w.r.t. the weight: d/dw (y - w*x)^2 = -2x(y - y_pred)
    grad = -2 * x * (y - y_pred)

    # Weight update: step against the gradient
    weight = weight - learning_rate * grad

    print(f"Epoch {epoch+1}: Weight={weight:.4f}, Loss={loss:.4f}")

Running this, we’ll notice:

  • Loss decreases with each epoch
  • Weight adjusts toward the optimal value (here the ideal weight is 2, since the target y = 4 is exactly 2 * x; a quick convergence check follows this list)
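
As a quick check on convergence, running the same update for more epochs drives the weight essentially all the way to 2. A sketch reusing the constants above:

# Run the same update for longer to confirm convergence (illustrative)
weight = 0.5
learning_rate = 0.01
x, y = 2, 4
for _ in range(200):
    grad = -2 * x * (y - weight * x)
    weight -= learning_rate * grad
print(f"Weight after 200 epochs: {weight:.4f}")  # ~2.0000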

A Graphical Chart

Here’s what the visual chart shows (a sketch to reproduce it follows the list):

  • Left graph shows how the weight gradually adjusts over each epoch, moving toward the optimal value (2 in this case).
  • Right graph shows the loss steadily decreasing, indicating that the model is learning and its predictions are improving.
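
The chart image isn't embedded here, but a sketch along these lines would reproduce it, assuming matplotlib is available (it's an extra dependency, not part of the no-libraries lesson itself):

# Recreate the two-panel chart with matplotlib (illustrative sketch)
import matplotlib.pyplot as plt

weight, learning_rate = 0.5, 0.01
x, y = 2, 4
weights, losses = [], []

for epoch in range(10):
    y_pred = weight * x
    loss = (y - y_pred) ** 2
    grad = -2 * x * (y - y_pred)
    weight -= learning_rate * grad
    weights.append(weight)
    losses.append(loss)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(range(1, 11), weights, marker="o")
ax1.axhline(2, linestyle="--", color="gray")  # optimal weight
ax1.set(title="Weight per Epoch", xlabel="Epoch", ylabel="Weight")
ax2.plot(range(1, 11), losses, marker="o", color="tab:red")
ax2.set(title="Loss per Epoch", xlabel="Epoch", ylabel="Loss (MSE)")
plt.tight_layout()
plt.show()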

Next – Backpropagation in Neural Networks