Gradient Descent Concept Relevancy in Neural Networks

1. What is Gradient Descent in a Neural Network?

Story Time: Climbing Down the Hill

Imagine we’re standing on a foggy mountain at night. Our goal? Reach the lowest point in the valley because that’s where the treasure is hidden.

But we can’t see far. So, what do we do? We can feel around with our feet and take small steps downhill, making sure each step is lower than the last.

Every time we take a step:

  • We’re checking how steep the slope is at our current position (this is the gradient).
  • We move a little in the direction that reduces our height (this is the descent).

Eventually, we reach the lowest point—the place where taking another step doesn’t take us any lower. That’s our optimal solution.
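The hill-climbing story above can be sketched in a few lines of Python. This is an illustrative example, not part of the original text: we descend the simple "valley" f(x) = x², whose slope at any point x is 2x, and the starting point, learning rate, and step count are all arbitrary choices.

```python
def gradient_descent(start, learning_rate=0.1, steps=50):
    """Walk downhill on f(x) = x**2 one small step at a time."""
    x = start
    for _ in range(steps):
        grad = 2 * x               # how steep the slope is at our position
        x -= learning_rate * grad  # step a little in the downhill direction
    return x

x_min = gradient_descent(start=5.0)
print(x_min)  # very close to 0.0, the bottom of the "valley"
```

Each iteration mirrors the story: measure the slope where we stand (the gradient), then move a little in the direction that lowers us (the descent).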

In neural networks:

  • We want to minimize the error (difference between prediction and reality).
  • Gradient Descent helps us adjust the weights and biases so our network learns and improves.
  • It’s like saying: “Which direction should I move the weights so the prediction gets better?”
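To make the weight-and-bias adjustment concrete, here is a hedged sketch of gradient descent training a tiny one-parameter-pair model. The data, learning rate, and epoch count are illustrative assumptions, not from the text; the model tries to learn y = 2x + 1 by repeatedly asking, "Which direction should I move the weights so the prediction gets better?"

```python
# Training data for the "reality" y = 2*x + 1 that we want to match.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]

w, b = 0.0, 0.0  # start with an arbitrary weight and bias
lr = 0.01        # learning rate: the size of each downhill step

for _ in range(2000):
    # Gradients of the mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    # Move the parameters in the direction that reduces the error.
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # w approaches 2.0, b approaches 1.0
```

The same update rule—parameter minus learning rate times gradient—is what a neural network applies to every weight and bias, just across many layers at once.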

2. Gradient Descent Example with Simple Python