Basic Math Concepts – Why Gradient Descent Matters in Neural Networks

To really understand Gradient Descent, we should know:

1. Linear Equation Basics:

  • Understand y = mx + c (in neural networks we write it as y = wx + b, where w is the weight and b is the bias)

2. Mean Squared Error (Loss Function):

  • MSE = average of (predicted – actual)² over all samples
  • Tells us how “bad” the predictions are: lower is better.
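The MSE formula above can be sketched in a few lines of plain Python (the sample numbers are made up for illustration):

```python
def mse(predicted, actual):
    """Mean squared error: average of (predicted - actual) squared."""
    n = len(predicted)
    return sum((p - a) ** 2 for p, a in zip(predicted, actual)) / n

# Errors are -0.5 and 1.0; squared: 0.25 and 1.0; mean = 0.625
print(mse([2.5, 0.0], [3.0, -1.0]))  # 0.625
```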

3. Derivatives / Slopes:

  • The derivative measures how a small change in the weight changes the loss
  • We don’t need to do calculus by hand; knowing that the derivative is the slope of the loss curve is enough.
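The “derivative = slope” idea can be checked numerically without any calculus: nudge the weight by a tiny amount and see how the loss changes. A minimal sketch (the single data point x = 2, y = 10 and the nudge size h are illustrative assumptions):

```python
def loss(w, x=2.0, y_true=10.0):
    """Squared error for one point with the model y = w * x (bias omitted)."""
    return (w * x - y_true) ** 2

w = 3.0
h = 1e-6
# Slope of the loss at w = 3, estimated as rise over run
numeric_slope = (loss(w + h) - loss(w)) / h
# Calculus gives 2 * x * (w*x - y) = 2 * 2 * (6 - 10) = -16; the estimate agrees
print(numeric_slope)  # close to -16
```

The slope is negative, which tells us that increasing w from 3 would reduce the loss.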

4. Minimization:

  • We want to reduce the loss, so we move the weight in the direction of the negative slope (downhill)

5. Learning Rate:

  • A small number that controls the step size: how far we move along the negative slope on each update. Too large and we overshoot the minimum; too small and training is slow.
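Putting all five ideas together, here is a minimal gradient-descent loop for y = wx + b in plain Python. The toy data, learning rate, and iteration count are illustrative assumptions; the data comes from y = 2x + 1, so the loop should recover w ≈ 2 and b ≈ 1:

```python
# Toy data generated from y = 2x + 1
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

w, b = 0.0, 0.0        # start from arbitrary weight and bias
learning_rate = 0.05   # small step size

for _ in range(2000):
    n = len(xs)
    # Derivatives of MSE with respect to w and b:
    # dL/dw = (2/n) * sum(x * (w*x + b - y)),  dL/db = (2/n) * sum(w*x + b - y)
    dw = (2 / n) * sum(x * (w * x + b - y) for x, y in zip(xs, ys))
    db = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
    # Follow the negative slope, scaled by the learning rate
    w -= learning_rate * dw
    b -= learning_rate * db

print(round(w, 3), round(b, 3))  # close to 2.0 and 1.0
```

Each iteration computes the slope of the loss with respect to w and b, then steps both parameters a little way downhill; repeating this drives the MSE toward its minimum.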

Summary Visual (Text Style)

Linear equation (y = wx + b) → prediction → MSE loss → derivative (slope of the loss) → step along the negative slope, scaled by the learning rate → repeat until the loss stops shrinking