
Denoted as $D_{\text{KL}}(P \parallel Q)$.

Definition

A statistical distance measuring how a model probability distribution $Q$ differs from a true probability distribution $P$:

$$D_{\text{KL}}(P \parallel Q) = \sum_{x \in \mathcal{X}} P(x) \log\left(\frac{P(x)}{Q(x)}\right)$$
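As a concrete check of the discrete sum, here is a minimal NumPy sketch; the function name `kl_divergence` and the two example distributions are illustrative, not from the source:

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as probability arrays.

    Uses the convention 0 * log 0 = 0, so outcomes with P(x) = 0
    contribute nothing to the sum. Result is in nats.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # skip outcomes with zero probability under P
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

print(kl_divergence([0.5, 0.3, 0.2], [0.4, 0.4, 0.2]))  # ≈ 0.0253 nats
```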

Alternative form 1:

$$\begin{aligned} \text{KL}(p \parallel q) &= E_{x \sim p}\left[\log \frac{p(x)}{q(x)}\right] \\ &= \int_x p(x) \log \frac{p(x)}{q(x)} \, dx \end{aligned}$$
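The expectation form suggests a simple Monte Carlo estimator: sample from $p$ and average the log-ratio. A sketch assuming SciPy's `norm` for the densities; the choice of two unit-variance Gaussians is arbitrary, and their divergence has the closed form $(\mu_p - \mu_q)^2/2 = 0.5$:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
p, q = norm(loc=0, scale=1), norm(loc=1, scale=1)

x = p.rvs(size=100_000, random_state=rng)      # draw x ~ p
estimate = np.mean(p.logpdf(x) - q.logpdf(x))  # average of log p(x)/q(x)
print(estimate)  # ≈ 0.5, matching the closed form
```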

The relative entropy is defined only if, for all $x$, $Q(x) = 0 \implies P(x) = 0$, i.e. $P$ is absolutely continuous with respect to $Q$.
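To see why this condition matters: a single outcome with $P(x) > 0$ but $Q(x) = 0$ makes the sum infinite. A two-outcome sketch:

```python
import numpy as np

p = np.array([0.5, 0.5])
q = np.array([1.0, 0.0])  # Q puts zero mass on an outcome P can produce

with np.errstate(divide="ignore"):  # silence the log(1/0) warning
    print(np.sum(p * np.log(p / q)))  # inf: absolute continuity fails
```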

For distributions $P$ and $Q$ of a continuous random variable, the KL divergence is:

$$D_{\text{KL}}(P \parallel Q) = \int_{-\infty}^{+\infty} p(x) \log \frac{p(x)}{q(x)} \, dx$$

where $p$ and $q$ denote the probability densities of $P$ and $Q$.
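The integral can be checked numerically against the standard closed form for two Gaussians, $\log(\sigma_2/\sigma_1) + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2} - \frac{1}{2}$; a sketch with arbitrary parameters, assuming SciPy for the quadrature and densities:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

p, q = norm(loc=0, scale=1), norm(loc=1, scale=2)

# Integrand p(x) log(p(x)/q(x)); (-20, 20) covers essentially all the mass.
integrand = lambda x: p.pdf(x) * np.log(p.pdf(x) / q.pdf(x))
numeric, _ = quad(integrand, -20, 20)

closed = np.log(2 / 1) + (1**2 + (0 - 1)**2) / (2 * 2**2) - 0.5
print(numeric, closed)  # both ≈ 0.4431
```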