Tensors & Neural Networks: A Simple Python Example
Mini Neural Network Simulation in Pure Python
```python
import random
import math

# --- Activation Function: Sigmoid & its derivative ---
def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def sigmoid_derivative(x):
    return x * (1 - x)  # using output of sigmoid(x) here
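# Why x * (1 - x)? If a = sigmoid(z), then d(sigmoid)/dz = a * (1 - a),
# so passing the activation itself avoids recomputing the sigmoid.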
# --- Initialize weights and biases ---
random.seed(42) # Reproducibility
w1, w2 = random.uniform(-1, 1), random.uniform(-1, 1) # input to hidden weights
w3, w4 = random.uniform(-1, 1), random.uniform(-1, 1) # hidden to output weights
b1 = random.uniform(-1, 1) # hidden bias
b2 = random.uniform(-1, 1) # output bias
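# Small random starting values keep the sigmoid inputs near the steep
# part of the curve, where gradients are largest.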
# --- Sample training data: [ear_length, tail_length], label (1=dog, 0=cat) ---
dataset = [
    ([0.9, 0.8], 1),  # dog
    ([0.1, 0.2], 0),  # cat
    ([0.8, 0.7], 1),  # dog
    ([0.2, 0.3], 0)   # cat
]
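# Features are already scaled to [0, 1], so the weighted sums stay in a
# range where the sigmoid responds strongly.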
# --- Training the mini neural network ---
learning_rate = 0.5
epochs = 1000
for epoch in range(epochs):
    total_loss = 0
    for features, label in dataset:
        x1, x2 = features
        # --- Forward Pass ---
        h_input = x1 * w1 + x2 * w2 + b1
        h_output = sigmoid(h_input)
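        # Note: the single hidden activation feeds the output through both
        # w3 and w4, so the pair acts as one effective weight (w3 + w4).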
        final_input = h_output * w3 + h_output * w4 + b2
        output = sigmoid(final_input)
        # --- Calculate Error (Loss) ---
        error = label - output
        total_loss += error ** 2
        # --- Backpropagation ---
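        # With loss (label - output)**2, the update direction at the output
        # node is error * sigmoid'(final_input); sigmoid_derivative expects
        # the activation, which is why we pass `output`, not final_input.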
        d_output = error * sigmoid_derivative(output)
        d_w3 = d_output * h_output
        d_w4 = d_output * h_output
        d_b2 = d_output
        d_h = d_output * (w3 + w4)
        d_h_input = d_h * sigmoid_derivative(h_output)
        d_w1 = d_h_input * x1
        d_w2 = d_h_input * x2
        d_b1 = d_h_input
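        # Weights change after every sample (online / stochastic gradient
        # descent) rather than once per epoch.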
        # --- Update weights and biases ---
        w1 += learning_rate * d_w1
        w2 += learning_rate * d_w2
        w3 += learning_rate * d_w3
        w4 += learning_rate * d_w4
        b1 += learning_rate * d_b1
        b2 += learning_rate * d_b2
    if epoch % 100 == 0:
        print(f"Epoch {epoch}, Loss: {total_loss:.4f}")
# --- Final Testing ---
print("\nFinal outputs after training:")
for features, label in dataset:
    x1, x2 = features
    h_output = sigmoid(x1 * w1 + x2 * w2 + b1)
    output = sigmoid(h_output * w3 + h_output * w4 + b2)
    print(f"Input: {features}, Predicted: {round(output, 3)}, Actual: {label}")
```
Output Insight
We’ll see something like this:
```
Epoch 0, Loss: 0.9834
Epoch 100, Loss: 0.0923
…
Final outputs after training:
Input: [0.9, 0.8], Predicted: 0.986, Actual: 1
Input: [0.1, 0.2], Predicted: 0.015, Actual: 0
…
```
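Since the trained weights are plain Python floats, the same forward pass can score an animal the network never saw. A minimal sketch, assuming the script above has just run so `sigmoid`, `w1`..`w4`, `b1`, and `b2` are still in scope (the sample point itself is made up):

```python
# Classify a new, unseen animal with the trained weights.
new_features = [0.75, 0.65]  # hypothetical sample: fairly long ears and tail
x1, x2 = new_features
h = sigmoid(x1 * w1 + x2 * w2 + b1)
prob = sigmoid(h * w3 + h * w4 + b2)
print(f"P(dog) = {prob:.3f} -> {'dog' if prob > 0.5 else 'cat'}")
```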
What We Just Built
- Tess the Tensor: [0.9, 0.8], a small tensor (here just a Python list) holding animal traits.
- Neura the Network: A neural network with one hidden unit, running a forward pass and learning from its errors.
- Teacher AI: The backpropagation logic adjusting weights and biases to reduce error, checked numerically in the sketch below.
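One way to trust the hand-derived backprop is a finite-difference check: nudge a weight, measure how the loss moves, and compare with the analytic value. A minimal sketch, assuming the trained `w1`..`w4`, `b1`, `b2`, `sigmoid`, and `dataset` from the script above are in scope; `loss_for` is an illustrative helper, not part of the original:

```python
def loss_for(w1_val, features, label):
    # Hypothetical helper: squared error for one sample with w1 overridden.
    x1, x2 = features
    h = sigmoid(x1 * w1_val + x2 * w2 + b1)
    out = sigmoid(h * w3 + h * w4 + b2)
    return (label - out) ** 2

features, label = dataset[0]
eps = 1e-5  # small nudge for the central difference
numeric = (loss_for(w1 + eps, features, label)
           - loss_for(w1 - eps, features, label)) / (2 * eps)

# Analytic gradient of the loss: the script's update direction d_w1 already
# absorbs the minus sign (and drops the constant 2), so dL/dw1 = -2 * d_w1.
x1, x2 = features
h = sigmoid(x1 * w1 + x2 * w2 + b1)
out = sigmoid(h * w3 + h * w4 + b2)
d_w1 = (label - out) * out * (1 - out) * (w3 + w4) * h * (1 - h) * x1
print(f"numeric dL/dw1: {numeric:.8f}  analytic: {-2 * d_w1:.8f}")
```

The two numbers should agree to several decimal places, confirming that the `+=` updates in the training loop follow the negative loss gradient, i.e. plain gradient descent with the constant factor folded into the learning rate.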
