Basic Math Concepts – Relevance of Hidden Layers in a Neural Network
1. Real-World Setup Recap (with math symbols)
Let’s say we want to judge a cake using a neural network.
We have:
- Inputs:
- x₁: Sweetness
- x₂: Moistness
1. Input Layer → Hidden Layer 1 (Basic Transformation)
Let’s say Hidden Layer 1 has 2 neurons (just like 2 mini-judges).
Each neuron takes both inputs, applies weights (w), adds a bias (b), and then passes the signal through an activation function f.
Neuron 1:
z₁ = x₁·w₁₁ + x₂·w₂₁ + b₁
a₁ = f(z₁)
Neuron 2:
z₂ = x₁·w₁₂ + x₂·w₂₂ + b₂
a₂ = f(z₂)
- a₁, a₂ are the outputs from Hidden Layer 1
- These are like “judgment summaries” from expert judges based on combinations of sweetness and moistness.
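Each hidden neuron follows the same pattern, so it can be sketched as one small Python function (the name `neuron` and the functional-style `f` argument are mine, not from the text):

```python
def neuron(x1, x2, w1, w2, b, f):
    """One hidden neuron: weighted sum of inputs plus bias,
    passed through an activation function f."""
    z = x1 * w1 + x2 * w2 + b
    return f(z)
```

Both Hidden Layer 1 neurons are just calls to this function with different weights.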
Let’s plug in some real numbers for better understanding:
- Inputs: x₁ = 7 (Sweetness), x₂ = 6 (Moistness)
- Weights:
- Neuron 1: w₁₁ = 0.5, w₂₁ = 0.4, b₁ = 1
- Neuron 2: w₁₂ = -0.3, w₂₂ = 0.8, b₂ = 0.5
- Activation function: Let’s use ReLU (Rectified Linear Unit):
f(z) = max(0, z)
Now calculate:
z₁ = 7·0.5 + 6·0.4 + 1 = 3.5 + 2.4 + 1 = 6.9
a₁ = max(0, 6.9) = 6.9
z₂ = 7·(-0.3) + 6·0.8 + 0.5 = -2.1 + 4.8 + 0.5 = 3.2
a₂ = max(0, 3.2) = 3.2
So now, Hidden Layer 1 outputs:
[a₁, a₂] = [6.9, 3.2]
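The Hidden Layer 1 arithmetic above can be checked in plain Python (a minimal sketch; the helper name `relu` is mine):

```python
def relu(z):
    """ReLU activation: f(z) = max(0, z)."""
    return max(0.0, z)

x1, x2 = 7, 6  # sweetness, moistness

# Neuron 1: w11 = 0.5, w21 = 0.4, b1 = 1
a1 = relu(x1 * 0.5 + x2 * 0.4 + 1)    # ≈ 6.9

# Neuron 2: w12 = -0.3, w22 = 0.8, b2 = 0.5
a2 = relu(x1 * -0.3 + x2 * 0.8 + 0.5)  # ≈ 3.2
```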
2. Hidden Layer 1 → Hidden Layer 2 (Deeper Thinking)
Now let’s say Hidden Layer 2 has 1 neuron (a high-level judge who evaluates previous judgments).
z₃ = a₁·w₃₁ + a₂·w₃₂ + b₃
a₃ = f(z₃)
Let’s use:
- w₃₁ = 0.6, w₃₂ = 0.9, b₃ = -1
- Inputs: a₁ = 6.9, a₂ = 3.2
Calculate:
z₃ = 6.9·0.6 + 3.2·0.9 - 1 = 4.14 + 2.88 - 1 = 6.02
a₃ = max(0, 6.02) = 6.02
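Continuing the sketch, Hidden Layer 2 takes the Layer 1 outputs [6.9, 3.2] as its inputs:

```python
a1, a2 = 6.9, 3.2  # outputs from Hidden Layer 1

# High-level judge: w31 = 0.6, w32 = 0.9, b3 = -1
z3 = a1 * 0.6 + a2 * 0.9 - 1
a3 = max(0.0, z3)  # ReLU; ≈ 6.02
```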
3. Final Output (Cake Score)
The Output Layer is a single neuron that gives the final score:
score = a₃·w₄ + b₄
Assume:
- w₄ = 1.2, b₄ = 0.3
score = 6.02 · 1.2 + 0.3 = 7.524
Final Cake Score = 7.524 out of 10
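Putting all three steps together, the full forward pass fits in one short function (a sketch using the weights from this walkthrough; the name `score_cake` is mine):

```python
def relu(z):
    """ReLU activation: f(z) = max(0, z)."""
    return max(0.0, z)

def score_cake(x1, x2):
    """Forward pass: inputs -> Hidden Layer 1 -> Hidden Layer 2 -> score."""
    # Hidden Layer 1: two "mini-judges"
    a1 = relu(x1 * 0.5 + x2 * 0.4 + 1)
    a2 = relu(x1 * -0.3 + x2 * 0.8 + 0.5)
    # Hidden Layer 2: one "high-level judge"
    a3 = relu(a1 * 0.6 + a2 * 0.9 - 1)
    # Output layer: linear, no activation
    return a3 * 1.2 + 0.3

print(score_cake(7, 6))  # ≈ 7.524
```

Calling `score_cake(7, 6)` reproduces the hand calculation above.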
4. So How Does “Ensemble Effect” Happen?
- Layer 1 mixes inputs differently (different weights) → basic interpretations.
- Layer 2 combines Layer 1 outputs → higher abstraction.
- Output Layer converts that into one final signal.
Each layer adds complexity, filtering, and intelligence; together the layers ensemble their judgments step by step, like a decision pipeline.
Relevance of Hidden Layers in a Neural Network – Hidden Layers Example with Simple Python