Gdshen's Blog


basic math for deep learning

Posted on 2017-03-22

Calculus

Linear Algebra

  • scalar
  • vector
    • row vector
    • column vector
  • matrix
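
A minimal NumPy sketch of these objects (the shapes and values below are arbitrary examples chosen just for illustration):

```python
import numpy as np

s = 3.0                                  # scalar: a single number
row = np.array([[1.0, 2.0, 3.0]])        # row vector, shape (1, 3)
col = np.array([[1.0], [2.0], [3.0]])    # column vector, shape (3, 1)
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])               # 2 x 2 matrix

print(s * row)    # scalar multiplication scales every entry
print(row @ col)  # inner product: (1, 3) @ (3, 1) -> (1, 1)
print(col @ row)  # outer product: (3, 1) @ (1, 3) -> (3, 3)
print(A @ A)      # matrix-matrix product
```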

Probability and Statistics

Information theory

KL divergence (or relative entropy)

Kullback-Leibler divergence, a.k.a. relative entropy

For discrete probability distributions $P$ and $Q$ defined on the same sample space, the Kullback-Leibler divergence from $Q$ to $P$ is defined to be

$$D_{\mathrm{KL}}(P \parallel Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}$$

For distributions $P$ and $Q$ of a continuous random variable, with densities $p$ and $q$, the Kullback-Leibler divergence is defined to be the integral:

$$D_{\mathrm{KL}}(P \parallel Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx$$

The quantity above is also called the entropy of $P$ relative to $Q$.
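
For the discrete case, here is a small NumPy sketch of the definition (the helper name `kl_divergence` and the example distributions are just illustrative):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x)) for discrete P, Q.

    Terms with P(x) = 0 contribute 0 by convention; Q must be positive
    wherever P is, otherwise the divergence is infinite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_divergence(p, q))  # >= 0, and in general != kl_divergence(q, p)
```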

Cross entropy

The cross entropy between two distributions $p$ and $q$ over the same underlying set of events is defined as

$$H(p, q) = -\sum_{x} p(x) \log q(x)$$

Note that $p$ usually denotes the true distribution and $q$ the model (or "unnatural") distribution used to approximate it.
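
A similar sketch for cross entropy, which also checks the identity $H(p, q) = H(p) + D_{\mathrm{KL}}(p \parallel q)$ (function names and example distributions are again just illustrative):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(P) = -sum_x P(x) * log P(x)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    """Cross entropy H(P, Q) = -sum_x P(x) * log Q(x)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return -np.sum(p[mask] * np.log(q[mask]))

p = np.array([0.5, 0.3, 0.2])  # true distribution
q = np.array([0.4, 0.4, 0.2])  # model ("unnatural") distribution

# cross_entropy(p, q) - entropy(p) recovers D_KL(P || Q)
print(cross_entropy(p, q), entropy(p), cross_entropy(p, q) - entropy(p))
```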

make static blog on github pages from scratch

Posted on 2017-03-22

Procedure

  1. initialize a repository on GitHub
  2. run hexo init
  3. install dependencies
  4. set up Travis CI for automatic deployment
Read more »