
KL Divergence Python Example

Cory Maklin
Towards Data Science
5 min read · Aug 20, 2019


As you progress in your career as a data scientist, you will inevitably come across the Kullback–Leibler (KL) divergence. We can think of the KL divergence as a distance measure (although it isn't a true metric, since it isn't symmetric) that quantifies the difference between two probability distributions. One common scenario where this is useful is when we are working with a complex distribution. Rather than working with the distribution directly, we can make our lives easier by approximating it with a simpler, well-understood distribution and using the KL divergence to measure how good the approximation is.
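For discrete distributions P and Q over the same support, the KL divergence is defined as D(P‖Q) = Σ P(x) log(P(x)/Q(x)). As a minimal sketch of what that looks like in code (the two distributions below are made-up illustrative values, not data from this article), we can translate the definition directly into NumPy and cross-check against scipy.stats.entropy, which returns the KL divergence when given two distributions:

```python
import numpy as np
from scipy.stats import entropy

# Two discrete probability distributions over the same three outcomes.
# The values are illustrative; any two distributions that each sum to 1 work.
p = np.array([0.10, 0.40, 0.50])
q = np.array([0.80, 0.15, 0.05])

# Direct translation of the definition: sum of p(x) * log(p(x) / q(x)).
kl_pq = np.sum(p * np.log(p / q))

# scipy.stats.entropy(p, q) computes the same quantity (in nats).
print(kl_pq, entropy(p, q))        # both ≈ 1.336

# The divergence is not symmetric, which is why it isn't a true metric.
print(np.sum(q * np.log(q / p)))   # ≈ 1.401, not 1.336
```

Note that the result is in nats because np.log is the natural logarithm; using a base-2 logarithm would express the same divergence in bits.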
