Gaussian Mixture Model | Intuition & Introduction | TensorFlow Probability
If your (univariate) distribution has more than one mode (peak), there is a good chance you can model it with a Gaussian Mixture Model (GMM), a Mixture Distribution of Gaussian/Normal components. This is helpful for soft clustering of points in one dimension. To build the model, you select the number of modes you expect (= the number of peaks); this then corresponds to the number of (latent) classes as well as the number of Gaussians that have to be defined.
In this video, I provide an intuition for this by looking at the grade distribution after an exam, with a first peak at 2.5 and a second peak at the grade corresponding to a fail. We then implement this model in TensorFlow Probability.
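Below is a minimal sketch of the kind of model built in the video. It assumes the tfd.MixtureSameFamily route that the "batched Normal" step in the timestamps suggests (the video may instead use tfd.Mixture with a list of components), and the mixture weights, locations, and scales are illustrative placeholders, not the video's exact values:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Mixture coefficients pi_k for the two latent classes (one per peak).
# Illustrative values, not the ones from the video.
cat = tfd.Categorical(probs=[0.6, 0.4])

# A batched Normal: one Gaussian per peak, e.g. one around grade 2.5
# and one around the failing grade (assumed here to be 5.0).
components = tfd.Normal(loc=[2.5, 5.0], scale=[0.5, 0.3])

# The GMM density is p(x) = sum_k pi_k * Normal(x | mu_k, sigma_k).
gmm = tfd.MixtureSameFamily(
    mixture_distribution=cat,
    components_distribution=components,
)

# Evaluate the probability density on a grid of grades for plotting.
grades = tf.linspace(1.0, 6.0, 200)
density = gmm.prob(grades)

# Draw samples from the mixture.
samples = gmm.sample(1000)
```

MixtureSameFamily marginalizes out the latent class, so gmm.prob already evaluates the weighted superposition of the component densities.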
-------
Timestamps:
00:00 Introduction
00:38 A Multi-Modal Distribution
01:10 Clustering of Points
02:04 A Superposition of Gaussians?
03:59 Using Mixture Coefficients
05:05 A special case of Mixture Distributions
05:33 The Directed Graphical Model
07:52 Alternative Model with plates
08:45 The joint
10:28 TFP: Defining the Parameters
11:27 TFP: The Categorical
12:12 TFP: The batched Normal
13:13 TFP: GMM in Principle
14:13 TFP: Using the TFP Mixture Distribution
15:15 TFP: Plotting the probability density
17:05 Outro