Bayesian Methods for Machine Learning
Introduction to Bayesian methods
001. Think Bayesian & Statistics review (7:26)
002. Bayesian approach to statistics (5:25)
003. How to define a model (3:22)
004. Example thief & alarm (11:10)
005. Linear regression (10:49)
Conjugate Priors
006. Analytical inference (3:50)
007. Conjugate distributions (2:59)
008. Example Normal, precision (5:34)
009. Example Bernoulli (4:30)
Latent Variable Models
010. Latent Variable Models (11:32)
011. Probabilistic clustering (6:32)
012. Gaussian Mixture Model (10:14)
013. Training GMM (10:41)
014. Example of GMM training (10:35)
Expectation Maximization Algorithm
015. Jensen's inequality & Kullback-Leibler divergence (9:40)
016. Expectation-Maximization algorithm (10:50)
017. E-step details (12:21)
018. M-step details (6:34)
019. Example EM for discrete mixture, E-step (10:23)
020. Example EM for discrete mixture, M-step (12:17)
021. Summary of Expectation Maximization (6:46)
Applications and examples
022. General EM for GMM (12:37)
023. K-means from probabilistic perspective (9:44)
024. K-means, M-step (7:06)
025. Probabilistic PCA (13:07)
026. EM for Probabilistic PCA (7:24)
Variational Inference
027. Why approximate inference (5:15)
028. Mean field approximation (13:59)
029. Example Ising model (15:39)
030. Variational EM & Review (5:52)
Latent Dirichlet Allocation
031. Topic modeling (5:21)
032. Dirichlet distribution (6:44)
033. Latent Dirichlet Allocation (5:57)
034. LDA E-step, theta (11:46)
035. LDA E-step, z (8:50)
036. LDA M-step & prediction (13:22)
037. Extensions of LDA (5:09)
MCMC
038. Monte Carlo estimation (12:46)
039. Sampling from 1-d distributions (13:29)
040. Markov Chains (13:07)
041. Gibbs sampling (12:31)
042. Example of Gibbs sampling (7:54)
043. Metropolis-Hastings (8:17)
044. Metropolis-Hastings: choosing the critic (8:43)
045. Example of Metropolis-Hastings (9:56)
046. Markov Chain Monte Carlo summary (8:50)
047. MCMC for LDA (15:22)
048. Bayesian Neural Networks (11:05)
Variational Autoencoders
049. Scaling Variational Inference & Unbiased estimates (6:25)
050. Modeling a distribution of images (10:32)
051. Using CNNs with a mixture of Gaussians (8:00)
052. Scaling variational EM (15:08)
053. Gradient of decoder (6:16)
054. Log derivative trick (6:43)
055. Reparameterization trick (7:58)
Variational Dropout
056. Learning with priors (5:50)
057. Dropout as Bayesian procedure (5:57)
058. Sparse variational dropout (5:42)
Gaussian Processes and Bayesian Optimization
059. Nonparametric methods (6:02)
060. Gaussian processes (8:04)
061. GP for machine learning (5:35)
062. Derivation of main formula (11:19)
063. Nuances of GP (12:11)
064. Bayesian optimization (10:15)
065. Applications of Bayesian optimization (5:05)