# Evidence lower bound

In statistics, the **evidence lower bound** (**ELBO**, also **variational lower bound** or **negative variational free energy**) is the quantity optimized in Variational Bayesian methods. These methods handle cases where a distribution $q(z)$ over unobserved variables $z$ is optimized as an approximation to the true posterior $p(z \mid x)$, given observed data $x$. Then the *evidence lower bound* is defined as[1]

$$\mathcal{L}(q) := \mathbb{E}_{z \sim q}\big[\ln p(x, z)\big] - \mathbb{E}_{z \sim q}\big[\ln q(z)\big] = -H\big(q;\, p(x, \cdot)\big) + H(q),$$

where $H(q; p) := -\mathbb{E}_{z \sim q}[\ln p(z)]$ denotes cross entropy and $H(q)$ the entropy of $q$. Since the log-evidence decomposes as $\ln p(x) = \mathcal{L}(q) + D_{\mathrm{KL}}\big(q \,\|\, p(\cdot \mid x)\big)$, where $\mathcal{L}(q)$ denotes the evidence lower bound and the divergence is nonnegative, the ELBO is indeed a lower bound on $\ln p(x)$. Maximizing the evidence lower bound therefore minimizes $D_{\mathrm{KL}}\big(q \,\|\, p(\cdot \mid x)\big)$, the Kullback–Leibler divergence, a measure of the dissimilarity of $q$ from the true posterior. The primary reason why this quantity is preferred for optimization is that it can be computed without access to the posterior, given a good choice of $q$.
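These properties can be checked numerically on a toy conjugate Gaussian model (a sketch; the model, the observed value, and all function names below are illustrative and not from the article): with prior $z \sim N(0,1)$ and likelihood $x \mid z \sim N(z,1)$, the evidence $p(x) = N(x; 0, 2)$ and the posterior $p(z \mid x) = N(x/2, 1/2)$ are known in closed form, so a Monte Carlo estimate of the ELBO can be compared against the exact log-evidence.

```python
import math
import random

# Illustrative conjugate model (not from the article):
# prior z ~ N(0,1), likelihood x|z ~ N(z,1), observed scalar x.
random.seed(0)
x = 1.3

def log_normal(z, mu, var):
    """Log-density of N(mu, var) evaluated at z."""
    return -0.5 * math.log(2 * math.pi * var) - (z - mu) ** 2 / (2 * var)

def elbo_mc(m, v, n=100_000):
    """Monte Carlo estimate of E_{z~q}[ln p(x,z) - ln q(z)] for q = N(m, v).

    Note it evaluates only the joint p(x,z) and q -- never the posterior.
    """
    total = 0.0
    for _ in range(n):
        z = random.gauss(m, math.sqrt(v))
        log_joint = log_normal(x, z, 1.0) + log_normal(z, 0.0, 1.0)
        total += log_joint - log_normal(z, m, v)
    return total / n

# Exact log-evidence: marginally x ~ N(0, 2).
log_evidence = log_normal(x, 0.0, 2.0)

# With q equal to the true posterior N(x/2, 1/2) the bound is tight
# (the log-ratio is constant in z); any other q gives a smaller value.
print(elbo_mc(x / 2, 0.5), elbo_mc(0.0, 1.0), log_evidence)
```

Because the estimator touches only the joint and $q$, it illustrates why the ELBO can be optimized without access to the posterior; when $q$ happens to equal the true posterior, every sampled log-ratio equals $\ln p(x)$ and the bound is attained.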

For other measures of dissimilarity that can be optimized to fit $q$, see Divergence (statistics).[2]

## References

- Yang, Xitong. "Understanding the Variational Lower Bound" (PDF). *Institute for Advanced Computer Studies*, University of Maryland. Retrieved 20 March 2018.
- Minka, Thomas (2005). *Divergence measures and message passing* (PDF).
- Bishop, Christopher M. (2006). "10.1 Variational Inference". *Pattern Recognition and Machine Learning* (PDF).