Summary – Prediction Error in Neural Networks
- Error = Actual - Predicted: the difference between the target value and the network's output.
- Neural networks use this error to adjust weights through backpropagation.
- Loss functions (like mean squared error, MSE) turn individual errors into a single number that summarizes how far off the model is.
- Minimizing error is the goal of training.
- Persistently high error means the model is failing to learn patterns in the data.
- Backpropagation computes gradients, which measure how much each weight contributes to the error.
- Gradients guide how much to adjust each weight and bias.
- Repeating this process lets the model learn the underlying patterns and reduce the error over time, as sketched in the example below.
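
A minimal NumPy sketch of this loop, using a single linear neuron and a toy dataset (y = 2x + 1) as illustrative assumptions rather than a full network: it computes the error, the MSE loss, the gradients, and the weight/bias updates, repeated over many epochs.

```python
import numpy as np

# Toy dataset (assumed for illustration): learn y = 2x + 1
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(100, 1))
y = 2.0 * x + 1.0

# A single linear "neuron": prediction = w * x + b
w = rng.normal(scale=0.1)
b = 0.0
learning_rate = 0.1

for epoch in range(200):
    predicted = w * x + b          # forward pass
    error = y - predicted          # Error = Actual - Predicted
    loss = np.mean(error ** 2)     # MSE: mean of the squared errors

    # Gradients of the MSE loss with respect to w and b:
    # dL/dw = -2 * mean(error * x),  dL/db = -2 * mean(error)
    grad_w = -2.0 * np.mean(error * x)
    grad_b = -2.0 * np.mean(error)

    # Update step: move each parameter against its gradient
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

    if epoch % 50 == 0:
        print(f"epoch {epoch:3d}  loss {loss:.5f}  w {w:.3f}  b {b:.3f}")

# After training, w is close to 2 and b close to 1, and the loss is near zero.
```

In a multi-layer network, backpropagation applies the chain rule to obtain these same per-parameter gradients layer by layer, but the update rule shown here is the same idea.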