Knowledge Distillation: Simplified

Take a peek into the world of teacher-student networks

Prakhar Ganesh
Towards Data Science
4 min read · Aug 12, 2019


What is Knowledge Distillation?

Neural models have been remarkably successful in recent years, even on extremely complex problems. However, these models are huge, with millions (and sometimes billions) of parameters, and therefore cannot be deployed on edge devices.

Knowledge distillation refers to the idea of model compression by teaching a smaller network, step by step, exactly what to do…
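To make that idea a bit more concrete, here is a minimal sketch of the classic soft-target distillation loss in PyTorch (an assumed framework; the temperature `T` and weight `alpha` are illustrative hyperparameters, not values from this article). The student is trained to match the teacher's softened output distribution while still fitting the true labels.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend the usual cross-entropy on hard labels with a KL term that pushes
    the student's softened predictions toward the teacher's softened predictions."""
    # Soften both distributions with temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    soft_student = F.log_softmax(student_logits / T, dim=-1)

    # KL divergence between the softened distributions (scaled by T^2, as is common).
    kd_term = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)

    # Standard supervised loss on the hard labels.
    ce_term = F.cross_entropy(student_logits, labels)

    return alpha * kd_term + (1.0 - alpha) * ce_term
```

In practice, the teacher's logits are computed in a no-grad forward pass, and only the student's parameters are updated with this loss.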
