1 KLD

Given two probability distributions \(P(x)\) and \(Q(x)\) over the same random variable \(x\), the difference between them can be measured with the Kullback-Leibler (KL) divergence. Note that the KL divergence is asymmetric in general, \(D_{KL}(P \| Q) \neq D_{KL}(Q \| P)\), so it is not a true distance metric.

\[ \begin{align} D_{KL}(P \| Q) &= \mathbb{E}_{x \sim P}\left[\log \frac{P(x)}{Q(x)}\right] \\ &= \mathbb{E}_{x \sim P}[\log P(x) - \log Q(x)] \\ &= \int_{x} P(x) (\log P(x) - \log Q(x)) \, dx \end{align} \]
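As a quick illustration of the discrete form of the formula above, here is a minimal Python sketch; the function name `kl_divergence` and the two example distributions are chosen for illustration, not taken from any particular library.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D_KL(P || Q) = sum_x P(x) * (log P(x) - log Q(x)).

    Assumes p and q are probability vectors over the same support.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Clip to avoid log(0) when a distribution assigns zero mass to an outcome.
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * (np.log(p) - np.log(q))))

# Hypothetical example distributions over three outcomes.
p = [0.1, 0.4, 0.5]
q = [0.8, 0.15, 0.05]
print(kl_divergence(p, q))  # D_KL(P || Q)
print(kl_divergence(q, p))  # differs from the above: KL divergence is asymmetric
```

For reference, `scipy.stats.entropy(p, q)` computes the same quantity for discrete distributions.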
