Logistic Regression Example in Simple Python
1. What We’ll Simulate:
We’ll build a Logistic Regression model to predict if a student passes based on hours studied.
- Input: hours studied
- Output: 1 (pass) or 0 (fail)
- We’ll simulate a very tiny dataset for training
- We’ll use the sigmoid function to convert scores into probabilities
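As a quick, hypothetical illustration (separate from the training script further below), the sigmoid squashes any real-valued score into the range 0 to 1, which is what lets us read sigmoid(weight * hours + bias) as a probability of passing:

import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

print(sigmoid(-2.0))  # ~0.12: low score, low chance of passing
print(sigmoid(0.0))   # 0.50: the decision point
print(sigmoid(2.0))   # ~0.88: high score, high chance of passing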
Step-by-Step Plan:
1. Sigmoid function
2. Simple dataset
3. Training loop using Gradient Descent
4. Prediction based on probability
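Before the code, here is a quick sketch of the math that step 3 relies on. For one training example (x, y), with prediction p = sigmoid(weight * x + bias), the log loss is -(y * log(p) + (1 - y) * log(1 - p)), and its gradients reduce to a very simple form:

d_loss / d_weight = (p - y) * x
d_loss / d_bias   = (p - y)

Gradient Descent then nudges each parameter a small step against its gradient, e.g. weight -= learning_rate * (p - y) * x, which is exactly what the training loop below does.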
Logistic Regression in Pure Python (No Libraries)
import math
# Step 1: Sigmoid Function
def sigmoid(z):
    return 1 / (1 + math.exp(-z))
# Step 2: Training Data — [Hours Studied, Passed (1/0)]
data = [
    [1, 0],  # 1 hour, failed
    [2, 0],
    [3, 0],
    [4, 1],
    [5, 1],
    [6, 1]
]
# Step 3: Initialize weights and bias
weight = 0.0
bias = 0.0
learning_rate = 0.1
# Step 4: Training using Gradient Descent
for epoch in range(1000):
    total_loss = 0
    for x, y in data:
        z = weight * x + bias
        pred = sigmoid(z)
        error = pred - y
        # Gradients
        d_weight = error * x
        d_bias = error
        # Update weights
        weight -= learning_rate * d_weight
        bias -= learning_rate * d_bias
        # Log loss (optional)
        loss = -(y * math.log(pred) + (1 - y) * math.log(1 - pred))
        total_loss += loss
    if epoch % 100 == 0:
        print(f"Epoch {epoch}: Loss = {total_loss:.4f}")
# Step 5: Predict function
def predict(hours):
    z = weight * hours + bias
    prob = sigmoid(z)
    return (1 if prob >= 0.5 else 0), prob
# Step 6: Test prediction
test_hours = 3.5
result, probability = predict(test_hours)
print(f"\nPredicted: {'Pass' if result else 'Fail'} with probability {probability:.2f} for {test_hours} hours studied")
Sample Output (abridged):
Epoch 0: Loss = 4.1585
Epoch 100: Loss = 1.0204
Epoch 200: Loss = 0.6245
…
Predicted: Pass with probability 0.75 for 3.5 hours studied
What We Just Did:
- Built a tiny Logistic Regression model from scratch.
- Used math to mimic how models “learn” by adjusting weights.
- Predicted probabilities and made binary decisions.
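As a small, hypothetical follow-up (appended to the end of the script above, and assuming the learned weight comes out positive, as it should for this data), the 0.5 cut-off corresponds to a single decision boundary in hours: the point where weight * hours + bias = 0.

boundary = -bias / weight
print(f"Pass predicted for anything above roughly {boundary:.2f} hours of study")

For the six training examples used here, that boundary should land somewhere between the last failing example (3 hours) and the first passing one (4 hours).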
Logistic Regression – Basic Math Concepts
