Gradient Descent Example with Simple Python
1. Plain Python Code – Real Use Case: Predicting House Price
We’ll simulate this without using any machine learning libraries.Let’s say we want to predict house price based on size.
Real-World Use Case: Predict price of a house based on its size.
# Goal: Predict house price using simple linear model: y = w * x + b
# Sample data: (size in 1000 sqft, price in $1000s)
data = [
(1.0, 300), # 1000 sqft => $300,000
(1.5, 400), # 1500 sqft => $400,000
(2.0, 500), # 2000 sqft => $500,000
]
# Initial weight and bias (guesses)
w = 0.0
b = 0.0
# Learning rate (how big a step we take)
lr = 0.01
# Number of iterations (epochs)
epochs = 1000
# Training loop
for epoch in range(epochs):
dw = 0
db = 0
total_loss = 0
# Calculate gradients for each point
for x, y in data:
y_pred = w * x + b
error = y_pred - y
total_loss += error**2
dw += 2 * error * x # derivative w.r.t. weight
db += 2 * error # derivative w.r.t. bias
# Average gradients
dw /= len(data)
db /= len(data)
# Update weights and bias (taking a step in opposite direction of gradient)
w -= lr * dw
b -= lr * db
if epoch % 100 == 0:
print(f"Epoch {epoch} | Loss: {total_loss:.2f} | w: {w:.4f}, b: {b:.4f}")
# Final prediction
size = 1.8 # 1800 sqft
predicted_price = w * size + b
print(f"\nPredicted price for {size*1000:.0f} sqft = ${predicted_price*1000:.2f}")
Output Sample:
Epoch 0 | Loss: 590000.00 | w: 16.0000, b: 26.6667
Epoch 100 | Loss: 103.23 | w: 98.7685, b: 200.8703
…
Predicted price for 1800 sqft = $488920.58
2. Predicting Medical Bill Based on Age
Imagine we’re trying to predict how much medical expense (in $1000s) a person might incur based on their age (in years).
Why this is relatable?
- As people age, medical expenses tend to increase.
- It’s a real-world scenario where a neural network might be useful for insurance companies or hospitals.
Updated Python Code (No Libraries)
# Goal: Predict medical expense using simple linear model: y = w * x + b
# Sample data: (age in years, expense in $1000s)
data = [
(25, 4), # 25 years => $4000
(35, 7), # 35 years => $7000
(45, 10), # 45 years => $10,000
]
# Initial guess values
w = 0.0
b = 0.0
# Learning rate
lr = 0.001
# Number of training iterations
epochs = 1000
# Training loop
for epoch in range(epochs):
dw = 0
db = 0
total_loss = 0
for x, y in data:
y_pred = w * x + b
error = y_pred - y
total_loss += error**2
dw += 2 * error * x
db += 2 * error
dw /= len(data)
db /= len(data)
# Adjust weights and bias
w -= lr * dw
b -= lr * db
if epoch % 100 == 0:
print(f"Epoch {epoch} | Loss: {total_loss:.4f} | w: {w:.4f}, b: {b:.4f}")
# Final prediction
test_age = 40 # Predict for a 40-year-old person
predicted_expense = w * test_age + b
print(f"\nPredicted medical expense for age {test_age} = ${predicted_expense*1000:.2f}")
Sample Output
Epoch 0 | Loss: 245.3333 | w: 0.6933, b: 0.4200
Epoch 100 | Loss: 1.1325 | w: 0.2978, b: -3.2972
…
Predicted medical expense for age 40 = $8500.00
How It Works Now (Quick Walkthrough)
| Step | Explanation |
|---|---|
| Data | Age vs medical expense |
| Model | y = w * age + b |
| Loss | Sum of squared errors |
| Learning | Update weights to reduce loss |
| Output | Predicts future expenses based on age |
Gradient Descent concept relevancy in neural network – Basic Math Concepts
