Basic Math Concepts for Neural Networks (Primary Concepts)
1. Algebra
- Understanding variables, equations, and expressions
- Working with linear combinations: y = w1*x1 + w2*x2 + b
- Solving simple equations and interpreting coefficients
Why it matters: Neurons compute weighted sums of inputs using algebra.
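A minimal Python sketch of the weighted sum above; the weight, input, and bias values are arbitrary illustrative choices, not values from any real network.

```python
# A single neuron's pre-activation output: y = w1*x1 + w2*x2 + b
# All numbers below are made up for illustration.
w1, w2 = 0.4, -0.2   # weights
x1, x2 = 1.0, 3.0    # inputs
b = 0.5              # bias

y = w1 * x1 + w2 * x2 + b
print(y)  # 0.4*1.0 + (-0.2)*3.0 + 0.5, approximately 0.3
```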
2. Functions
- Concept of input → output mapping
- Familiarity with common functions (e.g., linear, sigmoid, tanh, ReLU)
- Graphing simple functions
Why it matters: Activation functions decide whether, and how strongly, a neuron's signal passes on to the next layer.
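A minimal sketch of the three activations named above using only the standard library; the sample inputs are arbitrary.

```python
import math

# Three common activation functions, each mapping one input to one output.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))  # squashes z into (0, 1)

def tanh(z):
    return math.tanh(z)                # squashes z into (-1, 1)

def relu(z):
    return max(0.0, z)                 # passes positive z, zeroes out negative z

for z in (-2.0, 0.0, 2.0):
    print(z, sigmoid(z), tanh(z), relu(z))
```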
3. Basic Calculus (Conceptual Level)
- What a derivative means: the rate of change of the output with respect to an input
- Gradient: the direction of steepest increase (its negative points "downhill")
- Slope and optimization intuition (e.g., finding minima)
Why it matters: Backpropagation computes the gradient of the loss with respect to each weight and nudges the weights downhill to reduce the error.
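A minimal gradient-descent sketch on a one-variable function; the function, starting point, and learning rate are illustrative assumptions, not a prescription.

```python
# Gradient descent on f(w) = (w - 3)^2, whose minimum is at w = 3.
# The derivative f'(w) = 2*(w - 3) is the slope; stepping against it moves downhill.
def f(w):
    return (w - 3.0) ** 2

def df(w):
    return 2.0 * (w - 3.0)  # derivative: change in f per unit change in w

w = 0.0    # arbitrary starting guess
lr = 0.1   # learning rate (step size), chosen for illustration
for _ in range(50):
    w -= lr * df(w)  # move opposite the slope, toward the minimum
print(w)  # very close to 3.0
```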
4. Probability & Statistics (Very Basic)
- Mean, variance, and normalization
- Concept of error/loss (e.g., Mean Squared Error)
- Basic intuition of overfitting/underfitting
Why it matters: Training a neural network means minimizing a loss, which is a statistical summary of its errors over the data.
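A short sketch of these statistics on toy numbers; both the data and the "predictions" are made up for illustration.

```python
# Mean, variance, z-score normalization, and Mean Squared Error on toy data.
data = [2.0, 4.0, 6.0, 8.0]

mean = sum(data) / len(data)                                # 5.0
variance = sum((x - mean) ** 2 for x in data) / len(data)   # 5.0
std = variance ** 0.5

normalized = [(x - mean) / std for x in data]  # zero mean, unit variance

predictions = [2.5, 3.5, 6.5, 7.5]  # hypothetical model outputs
targets = data
mse = sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)
print(mean, variance, normalized, mse)  # mse is 0.25
```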
5. Matrix Operations (Optional but Helpful)
- Vectors and matrices
- Dot product and transpose
- Element-wise operations
Why it matters: Forward and backward propagation are implemented as matrix operations, which process a whole layer of neurons (and a whole batch of examples) at once.
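A minimal NumPy sketch of a dense layer's forward pass as one matrix multiplication; the shapes, random values, and the ReLU choice are illustrative assumptions.

```python
import numpy as np

# Forward pass of one dense layer for a batch of inputs,
# expressed as a single matrix multiplication.
rng = np.random.default_rng(0)

X = rng.normal(size=(4, 3))   # batch of 4 examples, 3 features each
W = rng.normal(size=(3, 2))   # weight matrix: 3 inputs -> 2 neurons
b = np.zeros(2)               # one bias per neuron

Z = X @ W + b                 # (4, 3) @ (3, 2) -> (4, 2): all neurons, all examples at once
A = np.maximum(Z, 0.0)        # element-wise ReLU activation
print(A.shape)                # (4, 2)
```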