Tensor, Weight & Neural Network Example in Simple Python
1. A tiny neural network training loop from scratch (no libraries), in which we:
- Take loan applicant data as input
- Multiply it by the weights
- Compare the result with the expected label
- Compute the error
- Adjust the weights to reduce the error (i.e., learn)
Step-by-Step Learning Simulation (1-Neuron Model)
Use Case:
Predict whether an applicant will repay a loan (1) or default (0).
Data (Tensor-like)
```python
# Each row = [Age, Income, Loan Amount, Credit Score]
# Labels: 1 = repay, 0 = default
dataset = [
    ([25, 50000, 8000, 700], 1),
    ([40, 30000, 15000, 600], 0),
    ([30, 60000, 7000, 750], 1),
    ([50, 25000, 20000, 580], 0)
]
```
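To make the "tensor-like" framing concrete, here is a small sketch (the variable names are illustrative, not part of the original example) that splits the rows into a 4x4 feature matrix, a rank-2 tensor, and a label vector, a rank-1 tensor:

```python
# Stacking the feature rows gives a 4x4 matrix (a rank-2 tensor);
# the labels form a length-4 vector (a rank-1 tensor).
feature_matrix = [features for features, _ in dataset]
labels = [label for _, label in dataset]

print(feature_matrix[0])  # [25, 50000, 8000, 700]
print(labels)             # [1, 0, 1, 0]
```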
Simulating the Learning Loop
```python
import math

# Initial weights (4 features)
weights = [0.01, 0.01, 0.01, 0.01]  # Small starting values
learning_rate = 0.00001             # Small step size for learning

# Activation function - sigmoid
def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Derivative of sigmoid (for weight update)
def sigmoid_derivative(output):
    return output * (1 - output)

# Training loop
for epoch in range(1000):
    total_error = 0
    for features, label in dataset:
        # Step 1: Weighted sum
        z = sum(f * w for f, w in zip(features, weights))
        # Step 2: Activation
        output = sigmoid(z)
        # Step 3: Error
        error = label - output
        total_error += error ** 2
        # Step 4: Update weights
        for i in range(len(weights)):
            gradient = error * sigmoid_derivative(output) * features[i]
            weights[i] += learning_rate * gradient

    if epoch % 100 == 0:
        print(f"Epoch {epoch}: Total Error = {total_error:.4f}")

# Final weights
print("\nFinal weights:", weights)

# Test prediction
test_input = [35, 45000, 10000, 720]
z = sum(f * w for f, w in zip(test_input, weights))
output = sigmoid(z)
print("Predicted repayment probability for test input:", round(output, 4))
```
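Before looking at the output, it helps to compute one weighted sum by hand. At the initial weights every feature is multiplied by 0.01, so the raw feature magnitudes dominate:

```python
# Weighted sum for the first row at the initial weights:
# z = 25*0.01 + 50000*0.01 + 8000*0.01 + 700*0.01 = 587.25
row = [25, 50000, 8000, 700]
z = sum(f * w for f, w in zip(row, [0.01, 0.01, 0.01, 0.01]))
print(z)  # 587.25
```

A z this large pushes the sigmoid hard against 1.0, which explains the output below.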
Output:
```
Epoch 0: Total Error = 2.0000
Epoch 100: Total Error = 2.0000
...
Epoch 800: Total Error = 2.0000
Epoch 900: Total Error = 2.0000

Final weights: [0.01, 0.01, 0.01, 0.01]
Predicted repayment probability for test input: 1.0
```
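Note that the total error never moves and the weights never change. With raw inputs, z is in the hundreds (587.25 for the first row, as computed above), so sigmoid(z) saturates at 1.0, its derivative output * (1 - output) is effectively 0, and every gradient vanishes. A minimal sketch of one common remedy, min-max scaling each feature into [0, 1], is shown below; the helper names are illustrative, not part of the original example:

```python
# A sketch: min-max scaling keeps z small, so the sigmoid stays in its
# responsive range and the gradients stay non-zero.
dataset = [
    ([25, 50000, 8000, 700], 1),
    ([40, 30000, 15000, 600], 0),
    ([30, 60000, 7000, 750], 1),
    ([50, 25000, 20000, 580], 0)
]

# Column-wise minima and maxima across the four features
columns = list(zip(*(features for features, _ in dataset)))
mins = [min(col) for col in columns]
maxs = [max(col) for col in columns]

def scale(features):
    # (x - min) / (max - min) maps each feature into [0, 1]
    return [(f - lo) / (hi - lo) for f, lo, hi in zip(features, mins, maxs)]

scaled_dataset = [(scale(features), label) for features, label in dataset]
print(scaled_dataset[0][0])  # [0.0, 0.714..., 0.0769..., 0.7058...]
```

With scaled inputs and a larger learning rate (say 0.1 instead of 0.00001), the same training loop's total error does fall across the epochs, which is the kind of progress charted in part 2 below.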
What You Just Simulated
| Component | Real-World Meaning |
|---|---|
| Features | Age, income, etc. (the input tensor) |
| Weights | Learn which features matter most |
| Error | How far our prediction was from the actual result |
| Sigmoid | Squashes the result to a value between 0 and 1 |
| Learning Loop | Slowly improves predictions by nudging the weights |
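To make the sigmoid row concrete, here is a quick check (plain Python, matching the function above) of how it squashes any real number into (0, 1):

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

for z in [-5, -1, 0, 1, 5]:
    print(f"sigmoid({z:+d}) = {sigmoid(z):.4f}")
# sigmoid(-5) = 0.0067, sigmoid(+0) = 0.5000, sigmoid(+5) = 0.9933
```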
2. A markdown-based chart showing how training progresses during our neural network learning loop.
We simulate the training error shrinking, the weights adjusting, and the prediction accuracy improving across selected epochs. (This is the behavior the loop shows once the saturation issue noted above is addressed; a sketch that prints such a table appears after the interpretation notes.)
Training Progress (Markdown Table)
How to Interpret:
- Total Error: falls over the epochs, indicating the network is predicting more accurately.
- Weights: adjust according to input importance (e.g., Income and Credit Score become more influential).
- Prediction (Test Input): the model grows more confident that someone with [35, 45000, 10000, 720] will repay (0.89 ≈ 89%).
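Such a table can be produced straight from the loop itself. Below is a sketch, not the author's original run: it reuses the min-max scaling idea from earlier and assumes a learning rate of 0.1 suited to [0, 1]-scaled inputs, so the exact figures depend on those choices and will not necessarily match the 0.89 quoted above:

```python
import math

# Prints training progress as markdown table rows, one per 100 epochs.
dataset = [
    ([25, 50000, 8000, 700], 1),
    ([40, 30000, 15000, 600], 0),
    ([30, 60000, 7000, 750], 1),
    ([50, 25000, 20000, 580], 0)
]
columns = list(zip(*(features for features, _ in dataset)))
mins, maxs = [min(c) for c in columns], [max(c) for c in columns]

def scale(features):
    return [(f - lo) / (hi - lo) for f, lo, hi in zip(features, mins, maxs)]

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

data = [(scale(features), label) for features, label in dataset]
weights = [0.01, 0.01, 0.01, 0.01]
learning_rate = 0.1  # assumption: a larger step suits scaled inputs

print("| Epoch | Total Error | Prediction (Test Input) |")
print("|---|---|---|")
for epoch in range(1000):
    total_error = 0
    for features, label in data:
        output = sigmoid(sum(f * w for f, w in zip(features, weights)))
        error = label - output
        total_error += error ** 2
        for i in range(len(weights)):
            weights[i] += learning_rate * error * output * (1 - output) * features[i]
    if epoch % 100 == 0:
        test = scale([35, 45000, 10000, 720])
        prediction = sigmoid(sum(f * w for f, w in zip(test, weights)))
        print(f"| {epoch} | {total_error:.4f} | {prediction:.4f} |")
```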
Next: Advanced Neural Network Concepts (just a heads-up for now).