Binary Cross Entropy with Simple Python

Binary Cross Entropy Simulation (No External Libraries)
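For a true label y ∈ {0, 1} and a predicted probability p, the binary cross-entropy of a single example is -[y · ln(p) + (1 − y) · ln(1 − p)]. The function below computes exactly this, clamping p away from 0 and 1 so the logarithm stays finite.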

import math

# Binary cross-entropy loss for a single (label, predicted probability) pair
def cross_entropy_loss(y_true, y_pred):
    # Clamp predictions to avoid log(0)
    epsilon = 1e-15
    y_pred = max(min(y_pred, 1 - epsilon), epsilon)
    return - (y_true * math.log(y_pred) + (1 - y_true) * math.log(1 - y_pred))

# Test cases: (actual, predicted)
data = [
    (1, 0.9),  # good prediction
    (1, 0.1),  # bad prediction
    (0, 0.1),  # good prediction
    (0, 0.9),  # bad prediction
]

# Display loss for each prediction
for actual, predicted in data:
    loss = cross_entropy_loss(actual, predicted)
    print(f"Actual: {actual}, Predicted: {predicted:.2f} → Loss: {loss:.4f}")

Output Example

Actual: 1, Predicted: 0.90 → Loss: 0.1054
Actual: 1, Predicted: 0.10 → Loss: 2.3026
Actual: 0, Predicted: 0.10 → Loss: 0.1054
Actual: 0, Predicted: 0.90 → Loss: 2.3026
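These values follow directly from the formula: when the actual label is 1 the second term vanishes and the loss reduces to -ln(y_pred), so -ln(0.9) ≈ 0.1054 and -ln(0.1) ≈ 2.3026; when the label is 0 only -ln(1 - y_pred) remains, which gives the same pair of values for the mirrored predictions.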

What This Shows

  • Smaller loss when the prediction is closer to the actual label.
  • Larger loss when the prediction is confident but wrong.
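During training, the per-example losses are usually averaged over a batch to give a single scalar objective. The sketch below shows that aggregation using the cross_entropy_loss function and data list defined above; the helper name mean_cross_entropy is illustrative, not from any library.

# Average the per-example binary cross-entropy over a batch of (label, prediction) pairs
def mean_cross_entropy(batch):
    total = sum(cross_entropy_loss(y_true, y_pred) for y_true, y_pred in batch)
    return total / len(batch)

print(f"Mean loss over the batch: {mean_cross_entropy(data):.4f}")

With the four test cases above, this prints a mean loss of roughly 1.2040, dominated by the two confident-but-wrong predictions.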

Binary Cross Entropy Relevance in Neural Networks – Basic Math Concepts