Basic Math Concepts – Standardization in Neural Networks

Concept – Why It’s Needed
Mean – To center the feature at zero
Standard Deviation – To scale the feature to unit variance
Sigmoid – To squash outputs in neural nets
Derivative – For backpropagation (chain rule)
Basic Linear Algebra – For weight updates and dot products
Squared Error – For loss calculation
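The first two concepts combine into standardization itself: subtract the mean to center a feature, then divide by the standard deviation to give it unit variance. A minimal sketch in pure Python (the function name `standardize` is my own choice, not from the notes):

```python
from statistics import fmean, pstdev

def standardize(values):
    """Center a feature with its mean, scale by its standard deviation."""
    mu = fmean(values)      # mean: centers the feature at 0
    sigma = pstdev(values)  # population std: scales to unit variance
    return [(v - mu) / sigma for v in values]

feature = [2.0, 4.0, 6.0, 8.0]
z = standardize(feature)
# the standardized feature has mean ~0 and standard deviation ~1
```

In a real network this is applied per feature (per column of the input matrix), using statistics computed on the training set only.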

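The remaining concepts meet in a single training step: a dot product produces the neuron's input, the sigmoid squashes it, squared error measures the loss, and the chain rule gives the derivative used for the weight update. A toy single-neuron sketch under those assumptions (names like `train_step` and the learning rate `lr` are illustrative, not from the notes):

```python
import math

def sigmoid(x):
    # squashes any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # derivative of the sigmoid, needed by the chain rule
    s = sigmoid(x)
    return s * (1.0 - s)

def train_step(w, x, y, lr=0.1):
    """One gradient-descent step for a single sigmoid neuron."""
    z = sum(wi * xi for wi, xi in zip(w, x))  # dot product (linear algebra)
    y_hat = sigmoid(z)
    loss = (y_hat - y) ** 2                   # squared error
    # chain rule: dL/dw_i = 2*(y_hat - y) * sigmoid'(z) * x_i
    grad = [2.0 * (y_hat - y) * sigmoid_prime(z) * xi for xi in x]
    new_w = [wi - lr * gi for wi, gi in zip(w, grad)]  # weight update
    return new_w, loss
```

Repeating `train_step` on the same example drives the loss down, which is the whole point of computing the derivative.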
Standardization in Neural Networks – Summary