![The Kullback–Leibler divergence between continuous probability distributions](https://blogs.sas.com/content/iml/files/2020/06/KLDivCont3-640x336.png)
In a previous article, I discussed the definition of the Kullback-Leibler (K-L) divergence between two discrete probability distributions. For completeness, this article shows how to compute the Kullback-Leibler divergence between two continuous distributions. When f and g are discrete distributions, the K-L divergence is the sum of f(x)*log(f(x)/g(x)) over all x in the support of f.
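As a quick refresher before moving to the continuous case, the discrete sum can be sketched in a few lines of Python. This is a minimal illustration, not the article's own code; the function name and example distributions are my own. By convention, terms where f(x) = 0 contribute zero to the sum.

```python
import numpy as np

def kl_divergence_discrete(f, g):
    """Discrete K-L divergence: sum of f(x)*log(f(x)/g(x)) over the support.

    f and g are probability vectors on the same support. Terms with
    f(x) = 0 contribute 0 by the usual convention.
    """
    f = np.asarray(f, dtype=float)
    g = np.asarray(g, dtype=float)
    mask = f > 0                       # skip zero-probability terms
    return np.sum(f[mask] * np.log(f[mask] / g[mask]))

# Example: a skewed distribution vs. the uniform distribution on {0, 1, 2}
f = [0.5, 0.3, 0.2]
g = [1/3, 1/3, 1/3]
print(kl_divergence_discrete(f, g))    # ≈ 0.069
```

Note that the K-L divergence is zero when f and g are identical and positive otherwise, which is why it is often used as a measure of how much g diverges from f.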