Gaussian Mixture Model (GMM): Introduction [E12]

In this video, I give an introduction to the Gaussian Mixture Model and talk about maximum likelihood, which will be elaborated in the next lecture, where we will find the parameters: the means, the covariance matrices, and the mixing coefficients pi_k.
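
For reference, the model and the likelihood mentioned above are presumably the standard Gaussian mixture formulation (notation assumed here; the video's symbols may differ slightly):

p(x) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x \mid \mu_k, \Sigma_k), \quad \text{with } \sum_{k=1}^{K} \pi_k = 1, \ \pi_k \ge 0

\ln p(X \mid \pi, \mu, \Sigma) = \sum_{n=1}^{N} \ln \Big( \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x_n \mid \mu_k, \Sigma_k) \Big)

Maximum likelihood chooses the means \mu_k, covariance matrices \Sigma_k, and mixing coefficients \pi_k that maximize this log-likelihood.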

Notes link:
Comments

Very clean explanation. Thanks Pratik.

RAJANKUMAR-miib

Thank you Pratik for an amazing explanation

mowlanicabilla

Bro, the main formula you mentioned, the probability that a test sample falls under a particular Gaussian: how do we get that? I mean, is there a method to derive it, or do we just have the formula, and does it stay the same no matter how many Gaussian curves there are?

Please explain a bit.

dipankarnandi
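
For context on the question above: the probability that a sample falls under a particular Gaussian is presumably the posterior responsibility, which follows from Bayes' rule applied to the mixture and keeps the same form no matter how many components K there are:

\gamma(z_{nk}) = p(z_k = 1 \mid x_n) = \frac{\pi_k \, \mathcal{N}(x_n \mid \mu_k, \Sigma_k)}{\sum_{j=1}^{K} \pi_j \, \mathcal{N}(x_n \mid \mu_j, \Sigma_j)}

The numerator is the prior weight of component k times its density at x_n; the denominator is the same quantity summed over all components, so the responsibilities for one sample sum to 1.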

Hi Pratik. P(x) is the joint probability, right? Why did you say it is the likelihood? Can you please explain how to interpret P(x)?

sudeshnadutta
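
One standard way to interpret this (assuming the usual GMM notation): p(x) is the marginal density of a single observation, obtained by summing the joint p(x, z) over the latent component z, and the likelihood is this same quantity viewed as a function of the parameters over the whole dataset:

p(x) = \sum_{k=1}^{K} p(z_k = 1) \, p(x \mid z_k = 1) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x \mid \mu_k, \Sigma_k)

L(\pi, \mu, \Sigma) = \prod_{n=1}^{N} p(x_n \mid \pi, \mu, \Sigma)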

Thanks for such a nice explanation. Please suggest a platform where I can solve numerical problems on machine learning.

mdzeeshan