大地小神
You are all big fools, and I am the world's biggest winner
-
Read more… -
Scatter, Covariance, Correlation Matrix
Read more… -
KL Divergence and JS Divergence
1 KLD
Given two probability distributions \(P(x)\) and \(Q(x)\) over the same random variable \(x\), the difference between them can be measured with the Kullback-Leibler (KL) divergence.
\[
\begin{align}
D_{KL}(P \| Q) &= \mathbb{E}_{x \sim P} \left[ \log \frac{P(x)}{Q(x)} \right] \\
&= \mathbb{E}_{x \sim P} [\log P(x) - \log Q(x)] \\
&= \int_{x} P(x) \left( \log P(x) - \log Q(x) \right) \, dx
\end{align}
\]
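In the discrete case the expectation above reduces to a sum over outcomes. Below is a minimal NumPy sketch of that sum; the function name kl_divergence and the example distributions P and Q are illustrative only, not part of the original post.

import numpy as np

def kl_divergence(p, q):
    """Discrete D_KL(P || Q) = sum_x P(x) * (log P(x) - log Q(x))."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # convention: terms with P(x) = 0 contribute nothing
    return np.sum(p[mask] * (np.log(p[mask]) - np.log(q[mask])))

# KL divergence is asymmetric: D_KL(P || Q) != D_KL(Q || P) in general.
P = [0.1, 0.9]
Q = [0.5, 0.5]
print(kl_divergence(P, Q))  # ~0.368
print(kl_divergence(Q, P))  # ~0.511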
2 References
Read more… -
(Draft)RL-rlpyt
Read more… -
LSTM
-
GRU
-
Torch Memory
-
Opera mp4 play
# Install the chromium-ffmpeg snap, which bundles the proprietary media codecs
sudo snap install chromium-ffmpeg
# Copy the bundled codec files into Opera's library directory (the snap revision and version numbers vary)
cp /snap/chromium-ffmpeg/15/chromium-ffmpeg-95241 /usr/lib/x86_64-linux-gnu/opera
-
(Draft)Tensor-scatter gather
Read more… -
(Draft)Tensor-index slice
Read more…