Encoding Example with Simple Python
Story Analogy: “The Weighted Voting Machine”
Imagine we have a robot — let’s call him Neo.
Neo wants to learn how to judge the sentiment of a word:
- “great” (should be positive),
- “okay” (should be neutral),
- “bad” (should be negative).
But Neo doesn’t understand language, so we give him three light switches, one for each sentiment.
Each word feeds power into the switches through weights: the more power (numerical weight) a switch receives, the brighter it glows.
Neo simply picks the sentiment of the brightest switch, with no need for complicated probability math.
This is what we simulate using encoding + basic scoring.
Python Code (Without Softmax, No Libraries)
# Step 1: Encode input words as one-hot vectors
def encode_input(word):
    if word == "great":
        return [1, 0, 0]
    elif word == "okay":
        return [0, 1, 0]
    elif word == "bad":
        return [0, 0, 1]
    return [0, 0, 0]  # unknown word: no switch receives power
# Step 2: Encode output sentiments as one-hot vectors
def encode_output(sentiment):
    if sentiment == "positive":
        return [1, 0, 0]
    elif sentiment == "neutral":
        return [0, 1, 0]
    elif sentiment == "negative":
        return [0, 0, 1]
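As a side note, the two if/elif chains above can be collapsed into one small helper that builds a one-hot vector from any ordered list. This is just a sketch of an alternative; the names `one_hot`, `VOCAB`, and `LABELS` are ours, not part of the original example.

```python
# A compact alternative to the if/elif chains: generate one-hot vectors
# from an ordered list. The names one_hot, VOCAB, and LABELS are our own.
VOCAB = ["great", "okay", "bad"]
LABELS = ["positive", "neutral", "negative"]

def one_hot(item, items):
    # Place a 1 at the item's position, 0 everywhere else.
    return [1 if item == x else 0 for x in items]

print(one_hot("okay", VOCAB))       # → [0, 1, 0]
print(one_hot("negative", LABELS))  # → [0, 0, 1]
```

An unknown item simply produces an all-zero vector, which matches the "no switch lit" intuition from the story.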
# Step 3: Training data
X = [encode_input("great"), encode_input("okay"), encode_input("bad")]
Y = [encode_output("positive"), encode_output("neutral"), encode_output("negative")]
# Step 4: Initialize weights manually (3 inputs x 3 outputs)
# Row = input word, column = sentiment (positive, neutral, negative)
weights = [
    [0.2, 0.1, -0.3],   # row for "great"
    [0.0, 0.5, -0.1],   # row for "okay"
    [-0.4, -0.2, 0.6],  # row for "bad"
]
learning_rate = 0.1
# Step 5: Predict output (no softmax, just raw scores)
def predict(input_vector):
    scores = [0, 0, 0]
    for i in range(3):          # output class
        for j in range(3):      # input feature
            scores[i] += input_vector[j] * weights[j][i]
    return scores
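Because each input is one-hot, the double loop in `predict` effectively selects a single row of the weight matrix. A quick sanity check, repeating the weights and `predict` from above so the snippet runs on its own:

```python
# With a one-hot input, predict() just reads off one row of the weights.
weights = [
    [0.2, 0.1, -0.3],   # "great"
    [0.0, 0.5, -0.1],   # "okay"
    [-0.4, -0.2, 0.6],  # "bad"
]

def predict(input_vector):
    scores = [0, 0, 0]
    for i in range(3):          # output class
        for j in range(3):      # input feature
            scores[i] += input_vector[j] * weights[j][i]
    return scores

print(predict([1, 0, 0]))  # → [0.2, 0.1, -0.3], the row for "great"
```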
# Step 6: Train with a simple score-difference rule
def train(X, Y, epochs=300):
    global weights
    for epoch in range(epochs):
        total_error = 0
        for i in range(len(X)):
            input_vec = X[i]
            target = Y[i]
            predicted_scores = predict(input_vec)
            # Convert raw scores to a one-hot guess: 1 for the max score, 0 elsewhere
            max_index = predicted_scores.index(max(predicted_scores))
            predicted_output = [1 if j == max_index else 0 for j in range(3)]
            # Update weights with the simple difference rule
            for j in range(3):          # output class
                error = target[j] - predicted_output[j]
                total_error += abs(error)
                for k in range(3):      # input feature
                    weights[k][j] += learning_rate * error * input_vec[k]
        if epoch % 50 == 0:
            print(f"Epoch {epoch} - Total Error: {total_error}")
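To see what a single weight update does, suppose (hypothetically; the starting weights above already classify everything correctly) that "great" were misclassified as "negative". The error is +1 for the positive column and -1 for the negative column, so the weight row for "great" shifts toward positive and away from negative:

```python
# Hand-worked single update for a hypothetical misclassification of "great".
learning_rate = 0.1
input_vec = [1, 0, 0]         # one-hot for "great"
target = [1, 0, 0]            # true label: positive
predicted_output = [0, 0, 1]  # wrong guess: negative

row = [0.2, 0.1, -0.3]        # weight row for "great"
for j in range(3):
    error = target[j] - predicted_output[j]   # +1, 0, -1 per column
    row[j] += learning_rate * error * input_vec[0]

print([round(w, 2) for w in row])  # → [0.3, 0.1, -0.4]
```

The correct class gains 0.1 of "power" and the wrongly chosen class loses 0.1, exactly the brightness adjustment from the story.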
# Step 7: Train the model
train(X, Y)
# Step 8: Test the model
test_words = ["great", "okay", "bad"]
sentiments = ["positive", "neutral", "negative"]
for word in test_words:
    encoded = encode_input(word)
    result_scores = predict(encoded)
    predicted_label = sentiments[result_scores.index(max(result_scores))]
    print(f"Input: {word} → Prediction: {predicted_label}")
Next – One-Hot Encoding in a Neural Network
