L1 Regularization Example in Simple Python

1. This is a super basic one-layer neural net example (a single linear neuron, so effectively linear regression), using no external libraries:

import random

# Generate some fake data
X = list(range(-10, 11))
y = [2 * xi + 1 for xi in X]  # true relation y = 2x + 1

# Initialize weight and bias
w = random.uniform(-1, 1)
b = random.uniform(-1, 1)

# Training with L1 Regularization
lr = 0.01  # learning rate
lambda_l1 = 0.1  # L1 regularization strength

for epoch in range(100):
    total_loss = 0
    dw, db = 0, 0

    for xi, yi in zip(X, y):
        y_pred = w * xi + b
        error = y_pred - yi
        total_loss += error ** 2  # data (MSE) term only; the printed loss excludes the L1 penalty

        # Gradient for MSE
        dw += 2 * error * xi
        db += 2 * error

    # Add the L1 subgradient, lambda_l1 * sign(w), to the weight gradient only
    # (the bias is conventionally left unregularized)
    dw += lambda_l1 * (1 if w > 0 else -1 if w < 0 else 0)

    # Update weights and bias
    w -= lr * dw / len(X)
    b -= lr * db / len(X)

    # Print loss every 10 epochs
    if epoch % 10 == 0:
        print(f"Epoch {epoch}: Loss = {total_loss / len(X):.4f}, Weight = {w:.4f}, Bias = {b:.4f}")

Output:

Epoch 0: Loss = 134.6831, Weight = 1.4956, Bias = -0.8370
Epoch 10: Loss = 2.3457, Weight = 1.9999, Bias = -0.5009
Epoch 20: Loss = 1.5660, Weight = 1.9999, Bias = -0.2264
Epoch 30: Loss = 1.0455, Weight = 1.9999, Bias = -0.0020
Epoch 40: Loss = 0.6980, Weight = 1.9999, Bias = 0.1813
Epoch 50: Loss = 0.4660, Weight = 1.9999, Bias = 0.3310
Epoch 60: Loss = 0.3111, Weight = 1.9999, Bias = 0.4534
Epoch 70: Loss = 0.2077, Weight = 1.9999, Bias = 0.5534
Epoch 80: Loss = 0.1386, Weight = 1.9999, Bias = 0.6351
Epoch 90: Loss = 0.0926, Weight = 1.9999, Bias = 0.7018
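
Reading the output: the weight settles just below the true slope of 2 (1.9999) because the MSE gradient pulls it toward 2 while the L1 penalty pulls it slightly toward zero; the unregularized bias is still climbing toward the true intercept of 1 when training stops at 100 epochs.

The shrinkage effect of L1 is easier to see when the model has an input it should ignore. Below is a minimal sketch along the same lines (the names X1, X2, w1, w2, and the sign helper are illustrative additions, and here the penalty is applied after averaging the data gradient rather than folded into the sum): with a stronger lambda_l1, the weight on the irrelevant input is driven very close to zero while the useful weight stays near its true value.

import random

random.seed(0)

# x1 drives the target; x2 is noise the model should learn to ignore
X1 = list(range(-10, 11))
X2 = [random.uniform(-10, 10) for _ in X1]
y = [3 * a + 1 for a in X1]  # true relation y = 3*x1 + 1; x2 is irrelevant

w1 = random.uniform(-1, 1)
w2 = random.uniform(-1, 1)
b = random.uniform(-1, 1)

lr = 0.01
lambda_l1 = 0.5  # stronger penalty so the shrinkage is visible
n = len(X1)

def sign(v):
    # subgradient of |v|, taken as 0 at v == 0
    return 1 if v > 0 else -1 if v < 0 else 0

for epoch in range(300):
    dw1, dw2, db = 0.0, 0.0, 0.0
    for a, c, yi in zip(X1, X2, y):
        error = (w1 * a + w2 * c + b) - yi
        dw1 += 2 * error * a
        dw2 += 2 * error * c
        db += 2 * error
    # average the data gradient, then add the L1 subgradient per weight
    w1 -= lr * (dw1 / n + lambda_l1 * sign(w1))
    w2 -= lr * (dw2 / n + lambda_l1 * sign(w2))
    b -= lr * (db / n)

print(f"w1 = {w1:.4f}, w2 = {w2:.4f}, b = {b:.4f}")
# Expect w1 near 3, w2 very close to 0, and b approaching 1

Note that plain subgradient descent makes w2 hover near zero rather than landing on it exactly; optimizers that produce exact zeros typically use a proximal update (soft-thresholding) instead.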

L1 Regularization in Neural Network – Basic Math Concepts