Encoding in Neural Networks
Part 1: Story-like Analogy – “The Language of the Robots”
Imagine a world where robots process information, but they don’t understand human languages or categories like we do.
A robot trainer's task is to teach a robot to understand customer reviews – whether a review is "positive", "neutral", or "negative".
Now here’s the twist:
- To the robot, words like “positive” mean nothing.
- It only understands numbers.
- So you need to translate or “encode” that word into a number.
You decide to do the following:
| Sentiment | Code |
|---|---|
| positive | [1, 0, 0] |
| neutral | [0, 1, 0] |
| negative | [0, 0, 1] |
This is called One-Hot Encoding — it’s like giving the robot a unique signal for each category.
Now the robot can work with the data — this is the first step before feeding into the neural network.
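The table above can be expressed directly as a small Python dictionary – a minimal sketch of the mapping the robot trainer set up:

```python
# One-hot codes for each sentiment, exactly as in the table above.
one_hot_codes = {
    "positive": [1, 0, 0],
    "neutral":  [0, 1, 0],
    "negative": [0, 0, 1],
}

# Look up the numeric "signal" for a label.
print(one_hot_codes["neutral"])  # [0, 1, 0]
```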
Part 2: Implementation Idea – Encoding in Neural Networks
In neural networks, categorical or text data must be encoded into a numerical format. Common approaches include:
- One-hot encoding
- Label encoding
- Embeddings (dense learned vectors, commonly used for text in NLP)
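The last two methods can be sketched in plain Python. This is a minimal illustration, not a library implementation: label encoding assigns each category an integer index, and an embedding then maps that index to a dense vector (here filled with random stand-in values, where a real network would learn them during training):

```python
import random

# Label encoding: map each category name to a single integer index.
labels = ["positive", "neutral", "negative", "positive"]
categories = sorted(set(labels))            # ['negative', 'neutral', 'positive']
to_index = {c: i for i, c in enumerate(categories)}
encoded = [to_index[label] for label in labels]
print(encoded)  # [2, 1, 0, 2]

# Embedding sketch: each index looks up a dense vector. A neural network
# would learn these values; random numbers stand in for them here.
random.seed(0)
embedding_dim = 4
embedding_table = [[random.uniform(-1, 1) for _ in range(embedding_dim)]
                   for _ in categories]
vectors = [embedding_table[i] for i in encoded]
```

Unlike one-hot vectors, embeddings stay compact even with thousands of categories, which is why they dominate in NLP.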
Part 3: Encoding Example in Simple Python
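Here is a minimal end-to-end sketch in plain Python: raw review labels are turned into one-hot vectors ready to feed a neural network. The tiny dataset is made up purely for illustration:

```python
# Hypothetical mini-dataset of (review text, sentiment label) pairs.
reviews = [
    ("Great product, loved it!", "positive"),
    ("It was okay, nothing special.", "neutral"),
    ("Terrible, broke after a day.", "negative"),
]

categories = ["positive", "neutral", "negative"]

def one_hot(label, categories):
    """Return a vector with a 1 at the label's position and 0 elsewhere."""
    vec = [0] * len(categories)
    vec[categories.index(label)] = 1
    return vec

# Encoded targets for the network, matching the table in Part 1.
targets = [one_hot(label, categories) for _, label in reviews]
print(targets)  # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```

In a real project this step is usually handled by a library (e.g. scikit-learn's `OneHotEncoder` or `pandas.get_dummies`), but the underlying idea is exactly this lookup.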