Deep learning currently dominates the machine-learning realm and will likely continue to do so for the near future. However, probabilistic methods, especially Bayesian methodologies, remain useful in many areas. Interestingly, many deep learning methodologies closely resemble Bayesian counterparts; one example is the relationship between Recurrent Neural Networks and Recursive Bayesian Estimation. Probabilistic methods have also directly assisted deep learning: the classic Expectation Maximization algorithm underlies the routing procedure of the recently popular matrix capsule networks, and the Gumbel-Max trick helps re-parametrize Softmax distributions. In this class, we will go through some popular probabilistic models and draw connections to recently popular deep learning methodologies.
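As a taste of one of these connections, the Gumbel-Max trick mentioned above can be sketched in a few lines: adding independent Gumbel(0, 1) noise to the logits and taking the argmax yields an exact sample from the corresponding Softmax (categorical) distribution. The probabilities and sample count below are illustrative choices, not values from the course material.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_max_sample(logits, rng):
    # Gumbel(0,1) noise via inverse transform: g = -log(-log(U)), U ~ Uniform(0,1).
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    # argmax(logits + gumbel) is distributed as softmax(logits).
    return int(np.argmax(logits + gumbel))

# Target categorical distribution (illustrative): P = [0.1, 0.2, 0.7].
logits = np.log(np.array([0.1, 0.2, 0.7]))
counts = np.bincount(
    [gumbel_max_sample(logits, rng) for _ in range(20000)], minlength=3
)
freqs = counts / counts.sum()  # empirical frequencies approach [0.1, 0.2, 0.7]
```

Replacing the hard argmax with a temperature-controlled softmax over the perturbed logits gives the differentiable Gumbel-Softmax relaxation used for re-parametrization.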

  1. Expectation Maximization and matrix capsule networks
  2. Determinantal Point Processes and neural network compression
  3. LSTMs and the Kalman filter
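To preview the last pairing, a scalar Kalman filter is a recursive Bayesian estimator: it carries a Gaussian belief (mean and variance) forward and updates it with each observation, much as an LSTM updates its hidden state per input. The noise variances and constant-signal setup below are illustrative assumptions, not part of the course material.

```python
import numpy as np

def kalman_1d(zs, q, r):
    """Scalar Kalman filter over observations zs.

    q: process-noise variance, r: observation-noise variance (assumed known).
    Each iteration is a Bayesian update of a Gaussian belief, the probabilistic
    analogue of an RNN's per-step hidden-state update.
    """
    x, p = 0.0, 1.0  # prior belief: mean and variance
    means = []
    for z in zs:
        p = p + q            # predict: belief spreads by process noise
        k = p / (p + r)      # Kalman gain: how much to trust the observation
        x = x + k * (z - x)  # update the mean toward the observation
        p = (1.0 - k) * p    # posterior variance shrinks after the update
        means.append(x)
    return means

rng = np.random.default_rng(1)
zs = 1.0 + rng.normal(0.0, 0.3, size=200)  # noisy readings of a constant signal
est = kalman_1d(zs, q=0.0, r=0.09)         # filtered estimate settles near 1.0
```

With zero process noise the filter effectively averages all observations, so the estimate converges to the true constant; a nonzero `q` instead lets the filter track a drifting state.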