Basic Math Concepts – Overfitting vs Underfitting Impact in Neural Networks

Underfitting happens when:

  • Model is too simple → not enough parameters
  • Example: Linear model trying to fit curved data
  • Prediction:

    ŷ = w ⋅ x + b

    A straight line can’t capture the real pattern if the actual function is, say, quadratic or more complex (see the sketch below).
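
To make this concrete, here is a minimal NumPy sketch; the quadratic ground truth y = x² + noise and every specific number in it are assumptions chosen for illustration:

    import numpy as np

    # Assumed toy data: the true function is quadratic, y = x² + noise.
    rng = np.random.default_rng(0)
    x = np.linspace(-3, 3, 50)
    y = x**2 + rng.normal(scale=0.5, size=x.shape)

    # Underfit: a linear model ŷ = w ⋅ x + b (np.polyfit, degree 1).
    w, b = np.polyfit(x, y, deg=1)
    y_hat = w * x + b

    # The line has no way to bend, so the training error stays large;
    # extra data won't fix underfitting, only more capacity will.
    print(f"linear-fit training MSE: {np.mean((y - y_hat) ** 2):.2f}")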

Overfitting happens when:

  • Too many parameters → model fits even the noise
  • Example: High-degree polynomial model
  • Prediction:

    ŷ = w₀ + w₁x + w₂x² + ⋯ + wₙxⁿ

    With a high degree n, the curve captures tiny fluctuations in the training data which do not generalize (see the sketch below).
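
A matching sketch for the overfit case (again purely illustrative assumptions; the polynomial degree is deliberately close to the number of training points so the model can thread through the noise):

    import numpy as np

    # Assumed toy data: quadratic truth, but only a handful of noisy points.
    rng = np.random.default_rng(0)
    x_train = np.linspace(-1, 1, 15)
    y_train = x_train**2 + rng.normal(scale=0.1, size=15)
    x_val = np.linspace(-1, 1, 200)
    y_val = x_val**2 + rng.normal(scale=0.1, size=200)

    # Overfit: a degree-12 polynomial has nearly one coefficient per point.
    coeffs = np.polyfit(x_train, y_train, deg=12)

    train_mse = np.mean((y_train - np.polyval(coeffs, x_train)) ** 2)
    val_mse = np.mean((y_val - np.polyval(coeffs, x_val)) ** 2)
    print(f"train MSE: {train_mse:.4f}")  # tiny: the noise got memorized
    print(f"val MSE:   {val_mse:.4f}")    # larger: memorized noise doesn't transfer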

Generalization Gap:

Generalization Gap = Validation Loss − Training Loss

A large positive gap (validation noticeably worse than training) suggests overfitting.
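
In code this is a one-liner; the loss values below are made-up numbers for illustration:

    def generalization_gap(val_loss: float, train_loss: float) -> float:
        """Validation loss minus training loss; a large positive value hints at overfitting."""
        return val_loss - train_loss

    # Hypothetical loss values for two models:
    print(f"{generalization_gap(val_loss=0.32, train_loss=0.30):.2f}")  # 0.02 → small gap, fine
    print(f"{generalization_gap(val_loss=0.60, train_loss=0.05):.2f}")  # 0.55 → large gap, likely overfit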

Quick Visual (ASCII Chart)

  Error
    │ V                               V
    │ T V                            V
    │  T  V                         V
    │   T   V                     V
    │    T    V                 V
    │     T     V             V
    │      T      V V     V V
    │       T T       V V
    │          T T T
    │               T T T T T T T T T
    └─────────────────────────────────→ Model Complexity
      underfitting   |   ideal   |   overfitting

  T = training error    V = validation error

Here’s how training error and validation error behave as model complexity increases:

  • On the left side: both errors are high → underfitting.
  • In the middle: both errors are low → ideal fit.
  • On the right side: training error is low, but validation error increases → overfitting.
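
To reproduce the shape of that curve numerically, here is a sketch that sweeps polynomial degree as a stand-in for model complexity (the data setup repeats the assumed toy examples above):

    import numpy as np

    # Assumed toy setup: quadratic truth; degree plays the role of complexity.
    rng = np.random.default_rng(0)
    x_train = np.linspace(-1, 1, 15)
    y_train = x_train**2 + rng.normal(scale=0.1, size=15)
    x_val = np.linspace(-1, 1, 200)
    y_val = x_val**2 + rng.normal(scale=0.1, size=200)

    for deg in range(13):
        c = np.polyfit(x_train, y_train, deg)
        train_mse = np.mean((y_train - np.polyval(c, x_train)) ** 2)
        val_mse = np.mean((y_val - np.polyval(c, x_val)) ** 2)
        print(f"degree {deg:2d}   train MSE {train_mse:.4f}   val MSE {val_mse:.4f}")

    # Expected pattern: both errors high at degree 0–1 (underfitting), both low
    # near degree 2 (ideal), then train keeps falling while validation creeps
    # back up at high degrees (overfitting).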

Next – Mean Squared Error Usage in Neural Network Use Cases