Activation Function Relevance in Neural Networks
1. What is an Activation Function? (In Simple Words)
In a neural network, each neuron calculates a number (a weighted sum), but before passing that number to the next layer, it applies a special function to it. This function is called the activation function.
Think of it as a gatekeeper or decision-maker inside each neuron. It decides:
“Should I activate and pass this signal forward — or not?”
“Should I make the signal stronger, weaker — or squish it into a smaller range?”
Why Do We Need Activation Functions?
Without an activation function:
- A neural network would just be computing one big linear equation, no matter how many layers we add (see the code sketch after this list).
- It wouldn’t be able to learn complex patterns like recognizing faces, translating languages, or predicting the stock market.
With activation functions:
- The network can learn non-linear things — like “If the temperature is high and the humidity is low, then it’s probably a desert.”
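To make the point about linearity concrete, here is a minimal NumPy sketch (the weights and the input are made-up numbers used only for illustration): two stacked layers with no activation behave exactly like a single layer, while putting a ReLU between them breaks that collapse.
```python
import numpy as np

# Two "layers" with no activation function: the result is still one linear map.
W1 = np.array([[1.0, -2.0], [0.5, 3.0]])  # made-up weights for layer 1
W2 = np.array([[2.0, 1.0], [-1.0, 0.5]])  # made-up weights for layer 2
x = np.array([1.0, 2.0])                  # made-up input

two_layers_no_activation = W2 @ (W1 @ x)  # layer 1, then layer 2, no activation
single_equivalent_layer = (W2 @ W1) @ x   # one layer with the combined weight matrix
print(np.allclose(two_layers_no_activation, single_equivalent_layer))  # True: the extra layer added nothing

# Insert a ReLU between the layers and the collapse no longer happens in general.
relu = lambda z: np.maximum(0.0, z)
two_layers_with_relu = W2 @ relu(W1 @ x)
print(two_layers_with_relu, single_equivalent_layer)  # generally different results
```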
Real-Life Analogy
Imagine we’re trying to decide whether to go jogging based on two things:
- The temperature
- Whether it’s raining
We create a formula like:
score = (0.5 × temperature) + (-1.0 × rain_value)
where rain_value = 1 if it’s raining, and 0 if it’s not.
Now, instead of blindly using that score, we add a rule like:
“If the score is positive, go jogging. If it’s negative, stay home.”
This rule is like an activation function.
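As a tiny sketch, the jogging rule above can be written in a few lines of Python; the weights 0.5 and -1.0 come from the formula, and the function name is purely illustrative.
```python
def should_go_jogging(temperature, is_raining):
    # Weighted sum from the formula: (0.5 * temperature) + (-1.0 * rain_value)
    rain_value = 1 if is_raining else 0
    score = (0.5 * temperature) + (-1.0 * rain_value)
    # The "activation" rule: positive score -> go jogging, otherwise stay home
    return score > 0

print(should_go_jogging(temperature=20, is_raining=True))  # True  (score = 9.0)
print(should_go_jogging(temperature=1, is_raining=True))   # False (score = -0.5)
```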
Common Activation Functions (Made Simple)
| Function | What It Does (Simply) | Output Range |
|---|---|---|
| ReLU | Keeps positive values, turns negatives into 0 | 0 to ∞ |
| Sigmoid | Squashes any number into the range 0 to 1 (like a soft yes/no) | 0 to 1 |
| Tanh | Like sigmoid, but squashes into the range -1 to 1 (a yes/no with strength) | -1 to 1 |
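For reference, here is a small self-contained sketch of these three functions in plain Python, using only the standard math module:
```python
import math

def relu(z):
    # Keeps positive values, turns negatives into 0 (range: 0 to infinity)
    return max(0.0, z)

def sigmoid(z):
    # Squashes any number into the range 0 to 1
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    # Squashes any number into the range -1 to 1
    return math.tanh(z)

for z in (-2.0, 0.0, 2.5):
    print(z, relu(z), round(sigmoid(z), 3), round(tanh(z), 3))
```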
How It Works Inside a Neural Network (Step-by-Step)
1. Input Layer: we give the network its inputs (like age, salary, etc.).
2. Weighted Sum: each neuron computes inputs × weights + bias.
3. Activation Function: this sum is passed through a function (like ReLU or Sigmoid).
4. Output: the activated value goes to the next layer.
5. Repeat until the final output (a compact sketch of these steps follows below).
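Putting the steps together, below is a minimal sketch of one forward pass through a tiny two-layer network; the weights, biases, and inputs are made-up numbers chosen only to show the flow of weighted sum, then activation, then the next layer.
```python
import math

def dense_layer(inputs, weights, biases, activation):
    # One layer: for each neuron, take the weighted sum of inputs plus a bias,
    # then pass that sum through the activation function.
    outputs = []
    for neuron_weights, bias in zip(weights, biases):
        z = sum(w * x for w, x in zip(neuron_weights, inputs)) + bias
        outputs.append(activation(z))
    return outputs

relu = lambda z: max(0.0, z)
sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))

inputs = [35, 0.7]  # made-up inputs, e.g. age and a scaled salary
hidden = dense_layer(inputs, weights=[[0.1, -1.0], [0.02, 0.5]], biases=[0.0, -0.3], activation=relu)
output = dense_layer(hidden, weights=[[0.4, 0.6]], biases=[0.1], activation=sigmoid)
print(hidden, output)  # activated hidden values, then the final output
```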
2. Example in Real Numbers
Let’s say a neuron calculates a sum:
z = 2.5
If we use ReLU activation:
output = max(0, 2.5) = 2.5
If we use Sigmoid activation:
output = 1 / (1 + e^(-2.5)) ≈ 0.92
So different activation functions transform the signal differently.
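Those two numbers are easy to double-check in Python (2.5 is the example sum from above):
```python
import math

z = 2.5
relu_output = max(0, z)                  # ReLU: 2.5
sigmoid_output = 1 / (1 + math.exp(-z))  # Sigmoid: about 0.924
print(relu_output, round(sigmoid_output, 3))
```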
Why Are They So Important?
Because without them, our network is just a stack of basic math.
With them, our network can:
- Make decisions
- Understand patterns
- Learn from data!
Activation Function Relevance in Neural Networks – Activation Function Example with Simple Python