Knowledge Distillation in Neural Networks

Vishal Rajput · Published in AIGuys · Apr 26, 2022 · 5 min read


Knowledge distillation

Have you ever wondered why we need teachers to learn things? Why can’t we learn everything on our own once we know how to read, write, and comprehend? The problem lies in knowing what to learn and how much to learn. A teacher has years of experience and can guide us to focus only on what matters, distilling the knowledge down to the important bits and pieces. The same idea applies to neural networks: in knowledge distillation, a large, well-trained teacher model guides a smaller student model toward the patterns that matter most.
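
To make the analogy concrete, here is a minimal sketch of the classic distillation loss from Hinton et al. (2015), written in PyTorch. The temperature T, the mixing weight alpha, and the function name are illustrative choices for this sketch, not code from the article.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft term: KL divergence between the temperature-softened teacher
    # distribution and the student's softened log-probabilities.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale, as recommended in the original paper
    # Hard term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # The student learns from both the teacher's soft targets and the labels.
    return alpha * soft + (1.0 - alpha) * hard
```

Raising T softens both distributions, exposing the teacher's relative confidence across the wrong classes as well as the right one, which is exactly the kind of guidance a student would otherwise miss.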

