Why Is Cross Entropy Equal to KL-Divergence?

J. Rafid Siddiqui, PhD
Towards Data Science
4 min read · Feb 6, 2022


Figure 1: Two probability distributions sampled from a normal distribution (Image by author)

It is common practice to use cross-entropy in the loss function when constructing a Generative Adversarial Network [1], even though the original formulation calls for KL-divergence. This often creates confusion for people new to the field. In this article we go through the concepts of entropy, cross-entropy, and Kullback-Leibler divergence [2] and see in what sense minimizing one is equivalent to minimizing the other.
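As a quick preview of the relationship the article explores, here is a minimal NumPy sketch (the two distributions are made up purely for illustration, not taken from the article) showing numerically that cross-entropy decomposes into entropy plus KL divergence:

```python
import numpy as np

# Illustrative discrete distributions (assumed values, not from the article):
# p is the "true" distribution, q the approximating one.
p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])

entropy       = -np.sum(p * np.log(p))        # H(p)
cross_entropy = -np.sum(p * np.log(q))        # H(p, q)
kl_divergence =  np.sum(p * np.log(p / q))    # D_KL(p || q)

print(cross_entropy)               # H(p, q)
print(entropy + kl_divergence)     # H(p) + D_KL(p || q) -> same value
```

Because H(p) is fixed for a given target distribution p, minimizing the cross-entropy H(p, q) over q is the same optimization as minimizing D_KL(p || q).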


AI Research Scientist, Educator, and Innovator. Writes about Deep Learning, Computer Vision, Machine Learning, AI, & Philosophy. bit.ly/MLMethodsBook