Tensors, Weights, and Neural Networks
1. Imagine This: A Smart Loan Approval Assistant
A bank wants to build a Smart Loan Approval System using AI. People apply with:
- Age
- Income
- Loan Amount
- Credit Score
The bank wants to predict: Will this person repay the loan or default? To do this, they use a Neural Network trained on thousands of past applicants’ data.
Where Do Tensors Come In?
Think of Tensors as structured containers to hold data:
- A tensor is like a fancy, multidimensional spreadsheet.
- For one applicant:
[Age=35, Income=45000, Loan=10000, CreditScore=750]
- This is a 1D tensor (vector).
- When you stack 1000 applicants’ data together, you get a 2D tensor (a matrix) of shape [1000, 4]
In programming:
applicants = [
[35, 45000, 10000, 750],
[28, 32000, 5000, 680],
…
] # This is a tensor (2D list)
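To make the shape idea concrete without any libraries, you can count the rows and columns of the nested list yourself. A minimal sketch, using two made-up applicants:

```python
# A 2D tensor (matrix) of applicant features: [Age, Income, Loan, CreditScore]
applicants = [
    [35, 45000, 10000, 750],
    [28, 32000, 5000, 680],
]

rows = len(applicants)       # number of applicants
cols = len(applicants[0])    # number of features per applicant
print("Shape:", [rows, cols])  # with 1000 applicants this would be [1000, 4]
```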
Where Do Weights Come In?
Imagine the Neural Network is like a decision-making machine.
For each input feature (like Age or Income), the network assigns a weight that determines how important that feature is in predicting repayment.
For example:
- Maybe Credit Score is given a high weight: 0.9
- Income might get: 0.6
- Loan amount: -0.7 (high loan might negatively impact outcome)
These weights adjust themselves during training using something called backpropagation.
Simple simulation:
weights = [0.1, 0.6, -0.7, 0.9]
inputs = [35, 45000, 10000, 750]
weighted_sum = sum(i * w for i, w in zip(inputs, weights))
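A single weight update can be sketched with plain gradient descent. This is a simplified illustration of the idea behind backpropagation, not the full algorithm; the scaled inputs, target label, and learning rate below are all assumed values:

```python
import math

# Inputs scaled to comparable ranges (assumed scaling for illustration)
inputs = [0.35, 0.45, 0.10, 0.75]   # Age, Income, Loan, CreditScore (scaled)
weights = [0.1, 0.6, -0.7, 0.9]
target = 1.0          # 1 = repaid the loan (assumed label)
learning_rate = 0.1   # assumed hyperparameter

# Forward pass: weighted sum, then sigmoid activation
z = sum(i * w for i, w in zip(inputs, weights))
prediction = 1 / (1 + math.exp(-z))

# Backward pass: gradient of squared error with respect to each weight
error = prediction - target
for j in range(len(weights)):
    gradient = error * prediction * (1 - prediction) * inputs[j]
    weights[j] -= learning_rate * gradient  # nudge the weight to reduce error
```

Because the prediction starts below the target here, every weight gets nudged upward; over many such updates the network "learns" which features matter.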
How Neural Network Computing Happens
Neural network computing = data flowing through layers of neurons, like:
- Input Layer: Takes tensors (your input data)
- Hidden Layers: Do internal math using weights and activation functions
- Output Layer: Gives final prediction: e.g., [0.85] = 85% chance of repayment
All of this is matrix math under the hood — tensors multiplied by weights, then adjusted based on prediction correctness.
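The layer-by-layer flow above can be sketched as plain-Python matrix math. The two-neuron hidden layer and all weight values here are made up for illustration:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# Input layer: one applicant's scaled features (assumed scaling)
x = [0.35, 0.45, 0.10, 0.75]

# Hidden layer: 2 neurons, each with 4 weights (made-up values)
W_hidden = [
    [0.2, 0.4, -0.5, 0.3],
    [-0.1, 0.3, -0.2, 0.6],
]
hidden = [sigmoid(sum(xi * wi for xi, wi in zip(x, row))) for row in W_hidden]

# Output layer: 1 neuron combining the 2 hidden activations
W_out = [0.7, 0.5]
output = sigmoid(sum(h * w for h, w in zip(hidden, W_out)))
print("Predicted repayment probability:", round(output, 2))
```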
Real-Life Flow:
| Step | Role of Tensor/Weight/NN |
|---|---|
| Applicant data stored | Tensors |
| Initial importance scores | Weights |
| Compute weighted sum | Multiply tensors by weights |
| Adjust weights after error | Backpropagation in the Neural Network |
| Predict repayment | Output from NN computing |
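The five steps in the table can be strung together into a miniature training loop. Everything here (the scaled data, labels, and learning rate) is made up for illustration:

```python
import math

# Step 1: applicant data stored as a 2D tensor (scaled, made-up values)
data = [
    [0.35, 0.45, 0.10, 0.75],
    [0.28, 0.32, 0.05, 0.68],
]
labels = [1.0, 0.0]  # 1 = repaid, 0 = defaulted (assumed)

# Step 2: initial importance scores (weights)
weights = [0.1, 0.6, -0.7, 0.9]
lr = 0.5  # assumed learning rate

for epoch in range(100):
    for x, y in zip(data, labels):
        # Step 3: compute weighted sum, squash it into a probability
        z = sum(xi * wi for xi, wi in zip(x, weights))
        pred = 1 / (1 + math.exp(-z))
        # Step 4: adjust weights after error (gradient descent)
        err = pred - y
        for j in range(len(weights)):
            weights[j] -= lr * err * pred * (1 - pred) * x[j]

# Step 5: predict repayment for the first applicant
z = sum(xi * wi for xi, wi in zip(data[0], weights))
print("P(repay):", round(1 / (1 + math.exp(-z)), 2))
```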
Python-Style Summary (No Libraries):
# Input tensor (1 person)
input_tensor = [35, 45000, 10000, 750]

# Initial weights (guessed)
weights = [0.1, 0.6, -0.7, 0.9]

# Weighted sum
z = sum(i * w for i, w in zip(input_tensor, weights))

# Simple activation (sigmoid-like)
output = 1 / (1 + (2.718 ** (-z)))
print("Predicted repayment probability:", output)
Output:
Predicted repayment probability: 1.0
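The probability comes out as exactly 1.0 because the raw inputs are on wildly different scales (an income of 45000 dwarfs everything else), so the sigmoid saturates. Dividing each feature by a rough maximum gives a less extreme output; the divisors below are arbitrary choices for illustration:

```python
import math

raw = [35, 45000, 10000, 750]
scales = [100, 100000, 100000, 1000]  # arbitrary per-feature divisors
scaled = [v / s for v, s in zip(raw, scales)]  # [0.35, 0.45, 0.1, 0.75]

weights = [0.1, 0.6, -0.7, 0.9]
z = sum(i * w for i, w in zip(scaled, weights))
prob = 1 / (1 + math.exp(-z))
print("Predicted repayment probability:", round(prob, 2))
```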
Final Thoughts
- Tensor stores and transmits the data.
- Weight decides how strongly each input influences the prediction.
- Neural Network computing is the full machinery—applying weights, combining results, learning from errors, and improving predictions.
This combo is what powers systems like:
- Loan approval models
- Credit fraud detection
- Recommendation engines
- Voice recognition