AI Week 10 - Expectation-Maximization algorithm.
Expectation-Maximization algorithm.
After this lecture, a student shall be able to:
• define and explain the task of maximum likelihood estimation;
• explain why we can maximize the log-likelihood instead of the likelihood, and describe the advantages of doing so;
• describe the issues we face when trying to maximize the likelihood in the case of incomplete data;
• explain the general high-level principle of the Expectation-Maximization (EM) algorithm;
• describe the pros and cons of the EM algorithm, especially what happens to the likelihood in one EM iteration;
• describe the EM algorithm for mixture distributions, including the notion of responsibilities;
• explain the Baum-Welch algorithm, i.e. the application of EM to HMMs: what parameters are learned and how (conceptually).