---
tags:
  - information_theory
---

# Kullback–Leibler divergence

Kullback–Leibler divergence (also called KL divergence or relative entropy) is defined for two probability distributions $P$ and $Q$ as:

$$ D_{KL}(P || Q) = H(P, Q) - H(P) = \mathbb{E}_{x \sim P(x)} \left[\log \frac{P(x)}{Q(x)} \right], $$

where $H(P, Q)$ is the cross-entropy and $H(P)$ is the entropy. KL divergence tells us how much extra surprise we incur, on average, if we assess the surprise of events according to $Q$ instead of the true distribution $P$.
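
A minimal sketch of the definition for the discrete case, assuming $P$ and $Q$ are given as probability vectors over the same finite set of outcomes (the function name `kl_divergence` and the example values are illustrative, not from the original note):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x)) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only outcomes with P(x) > 0 contribute; the convention 0 * log(0/q) = 0 applies.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# KL divergence is asymmetric: swapping P and Q gives a different value.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ~0.511 nats
print(kl_divergence(q, p))  # ~0.368 nats
```

The same quantity can be checked against the identity above: computing the cross-entropy $H(P, Q) = -\sum_x P(x) \log Q(x)$ and subtracting the entropy $H(P) = -\sum_x P(x) \log P(x)$ gives the same result.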