Neural Network Example in Simple Python

1. Goal:

Predict a student’s final grade (0 to 1 scale) based on:

  • Test Score (out of 100)
  • Homework Score (out of 100)

Miss Nira will train her smart notebook to learn how to combine these scores into a final-grade prediction (e.g., 0.95 means an A+ student, 0.30 means weak performance).

Step-by-step Simulation in Python

Let’s simulate a tiny neural network with 1 neuron:


# Simple Neural Network Simulation – 1 Neuron

# Step 1: Inputs from students (Test Score, Homework Score)
data = [
    {"input": [90, 95], "target": 0.95},  # High performer
    {"input": [70, 65], "target": 0.70},  # Average student
    {"input": [50, 40], "target": 0.40},  # Struggling student
]

# Step 2: Initialize the weights (importance given to test & homework)
weights = [0.5, 0.5]  # Equal importance initially
bias = 0.0

# Step 3: Activation function (like a brain neuron deciding whether to fire)
def sigmoid(x):
    return 1 / (1 + (2.71828 ** -x))  # 2.71828 approximates Euler's number e

# Step 4: Derivative of the sigmoid (used for adjusting weights)
# Note: x here is the sigmoid output, so x * (1 - x) equals sigmoid'(z)
def sigmoid_derivative(x):
    return x * (1 - x)

# Step 5: Training the notebook
learning_rate = 0.01

for epoch in range(1000):
    for record in data:
        # Normalize scores to the 0–1 range so the sigmoid does not saturate
        x1, x2 = record["input"][0] / 100, record["input"][1] / 100
        target = record["target"]

        # 1. Weighted sum
        z = x1 * weights[0] + x2 * weights[1] + bias

        # 2. Activation
        prediction = sigmoid(z)

        # 3. Error
        error = target - prediction

        # 4. Gradient calculation
        d_pred = error * sigmoid_derivative(prediction)

        # 5. Update weights & bias
        weights[0] += learning_rate * d_pred * x1
        weights[1] += learning_rate * d_pred * x2
        bias += learning_rate * d_pred

# Step 6: Try a prediction!
test_score = 85
homework_score = 80
z = (test_score / 100) * weights[0] + (homework_score / 100) * weights[1] + bias  # same normalization as training
final_grade = sigmoid(z)

print("Predicted grade (0 to 1 scale):", round(final_grade, 2))


What Just Happened (Story Mode Recap)

  • Miss Nira’s notebook guessed a grade.
  • She said “Wrong!” and gave the actual grade.
  • The notebook adjusted its weights (importance of test/homework).
  • After many tries (1000), it learned the pattern.
  • Now it can predict grades for new students!

Real neural networks simply have many neurons arranged in layers, built on the same principles (see the sketch after this list):

  • Weighted inputs
  • Activation (e.g., sigmoid)
  • Feedback and learning
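
Every neuron, wherever it sits in a network, performs the same computation. Here is a minimal, self-contained sketch (the helper name neuron is just for illustration):

import math

def neuron(inputs, weights, bias):
    # 1. Weighted sum of the inputs
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # 2. Activation (sigmoid)
    return 1 / (1 + math.exp(-z))

# Example: two inputs with equal weights
print(round(neuron([0.9, 0.95], [0.5, 0.5], 0.0), 2))  # ≈ 0.72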

2. From a single neuron to a multi-layer neural network

The Smart Notebook Evolves – Hidden Layer Edition

Miss Nira noticed that 1 neuron was a bit too simple. It couldn’t capture some subtle patterns, like:

  • Some students do great in homework but average in tests.
  • Some consistently improve even if their scores aren’t the highest.

So she built a smarter version of her notebook — this time with:

  • Input Layer (Test Score, Homework Score)
  • Hidden Layer (2 neurons to “think” deeper)
  • Output Layer (1 neuron to predict final grade)

Python Simulation: A 2–2–1 Neural Network (2 Inputs → 2 Hidden Neurons → 1 Output)


# Multi-layer Neural Network (2-input → 2-hidden → 1-output)
# Pure Python – standard library only

import math
import random

# Activation functions
def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def sigmoid_derivative(x):
    return x * (1 - x)

# Training data: [Test Score, Homework Score] → Final Grade
data = [
    {"input": [90, 95], "target": 0.95},
    {"input": [70, 65], "target": 0.70},
    {"input": [50, 40], "target": 0.40},
]

# Initialize weights randomly

# Layer 1: Input to Hidden (2 neurons in hidden layer)
weights_input_hidden = [
    [random.uniform(-1, 1), random.uniform(-1, 1)],  # for input 1 (test)
    [random.uniform(-1, 1), random.uniform(-1, 1)]   # for input 2 (homework)
]
bias_hidden = [random.uniform(-1, 1), random.uniform(-1, 1)]

# Layer 2: Hidden to Output
weights_hidden_output = [random.uniform(-1, 1), random.uniform(-1, 1)]
bias_output = random.uniform(-1, 1)

# Training loop
learning_rate = 0.01

for epoch in range(1000):
    for record in data:
        # Normalize scores to the 0–1 range so the sigmoid does not saturate
        x = [v / 100 for v in record["input"]]
        target = record["target"]

        # ---- Forward Pass ----
        # Hidden layer input
        hidden_input = [
            x[0] * weights_input_hidden[0][i] + x[1] * weights_input_hidden[1][i] + bias_hidden[i]
            for i in range(2)
        ]

        # Hidden layer output
        hidden_output = [sigmoid(val) for val in hidden_input]

        # Output layer input
        output_input = sum([hidden_output[i] * weights_hidden_output[i] for i in range(2)]) + bias_output
        output = sigmoid(output_input)

        # ---- Backward Pass ----
        # Output error
        error = target - output
        d_output = error * sigmoid_derivative(output)

        # Hidden layer error
        d_hidden = [
            d_output * weights_hidden_output[i] * sigmoid_derivative(hidden_output[i])
            for i in range(2)
        ]

        # ---- Update Weights ----
        # Hidden to Output
        for i in range(2):
            weights_hidden_output[i] += learning_rate * d_output * hidden_output[i]
        bias_output += learning_rate * d_output

        # Input to Hidden
        for i in range(2):  # each input feature
            for j in range(2):  # each hidden neuron
                weights_input_hidden[i][j] += learning_rate * d_hidden[j] * x[i]
        for j in range(2):
            bias_hidden[j] += learning_rate * d_hidden[j]

# ---- Predict for New Student ----
def predict(test_score, homework_score):
    x = [test_score / 100, homework_score / 100]  # same normalization as training
    hidden_input = [
        x[0] * weights_input_hidden[0][i] + x[1] * weights_input_hidden[1][i] + bias_hidden[i]
        for i in range(2)
    ]
    hidden_output = [sigmoid(val) for val in hidden_input]
    output_input = sum([hidden_output[i] * weights_hidden_output[i] for i in range(2)]) + bias_output
    return sigmoid(output_input)

# Example Prediction
test_score = 85
homework_score = 80
final_grade = predict(test_score, homework_score)
print("Predicted Grade:", round(final_grade, 2))

Story Mode Explanation (Recap)

Miss Nira’s upgraded notebook:

  • Used two mini-judges (hidden neurons) to analyze student qualities.
  • One neuron might be sensitive to “consistency” (high scores in both).
  • Another might be sensitive to “improvement trend.”
  • These “mini-judges” passed their decisions to a final judge (output neuron), who gave the final grade prediction.

Over time, by comparing its predictions with actual results, the notebook:

  • Tuned all its weights,
  • Balanced the influence of each neuron,
  • And became smart enough to grade students even in new unseen cases.

That’s a basic multi-layer neural network! Stack more hidden layers and you get a deep neural network.

And this simulation is the foundation of:

  • Image recognition (like face detection)
  • Natural language processing (like ChatGPT!)
  • And even self-driving car decision systems

3. Neural Network Flow (Text-based Diagram)

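The 2–2–1 network from Section 2, as a text diagram:

 [Test Score]      [Homework Score]       Input Layer
      \  \            /  /
       \  \          /  /                 (each input feeds both hidden
        \  \        /  /                   neurons, and every connection
         \  \      /  /                    carries its own weight)
        [H1]      [H2]                    Hidden Layer (sigmoid)
           \      /
            \    /
          [Output]                        Output Layer (sigmoid)
              |
     Predicted Grade (0–1)
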
Layer-by-Layer Breakdown:

Input Layer

  • Test Score
  • Homework Score

Hidden Layer (2 neurons)

  • H1 (Hidden Neuron 1): Might focus on strong academic performance overall
  • H2 (Hidden Neuron 2): Might focus on consistency or improvement patterns

Each neuron calculates a weighted sum, applies an activation function (sigmoid), and passes the result forward.
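
For example, one hidden neuron with hypothetical weights 0.4 and 0.6, bias 0.1, and normalized inputs 0.85 (test) and 0.80 (homework) would compute:

z = 0.4 * 0.85 + 0.6 * 0.80 + 0.1 = 0.92
sigmoid(0.92) ≈ 0.72   # this output is passed on to the output layer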

Output Layer

  • Takes outputs from H1 and H2
  • Applies its own weights and a sigmoid activation to produce the predicted grade

Prediction Example:

If:

  • Test Score = 85
  • Homework Score = 80

Then after passing through all these layers and activations, the final output might be:

Predicted Grade = 0.91 → “A”

Neural Network (Primary Concepts) – Basic Math Concepts