Principal Component Analysis (PCA) - THE MATH YOU SHOULD KNOW!

In this video, we are going to see exactly how we can perform dimensionality reduction with a famous feature extraction technique: Principal Component Analysis (PCA). We'll get into the math that powers it.

Comments

Goated, mind-blowing brainstorming. I am crying, what spectacular content, my brother.

gajendrasinghdhaked

You definitely deserve more subscribers, mate. Keep up the good work. Absolutely clear and easy to follow.

saadmalik

Have been following your channel for a while. I think your videos are really useful, as there aren't many videos around explaining the math involved in various techniques. Please keep doing what you are doing! Thanks!

wlxxiii

Thanks. Just a question: if you have multiple groups of points to separate, which ML method could solve this problem? I need an unsupervised algorithm.

fadydawra

Hi,
Can you please give some intuition for the information equation? Why is it equal to (1/N) Z^T Z at 2:14?

harry
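
A minimal numerical sketch of the question above, assuming Z is the N x M matrix of projected data and that the data has been mean-centered, as PCA assumes: for centered data, (1/N) Z^T Z is exactly the sample covariance matrix, which is why it serves as the "information" being preserved.

import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 3))               # N = 500 samples, M = 3 dimensions
Z = Z - Z.mean(axis=0)                      # mean-center each column

info = (Z.T @ Z) / Z.shape[0]               # (1/N) Z^T Z
cov = np.cov(Z, rowvar=False, bias=True)    # population covariance of the columns

print(np.allclose(info, cov))               # True: for centered data they coincide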

You are the best teacher on planet earth and the neighboring planets. Cheers!

darasingh

Beautiful as always, keep up the good work!

PerpetuityLJW

There are lots of concepts I need to recall and review from my undergraduate studies in order to fully understand this.

MrZidane

I am confused: if S is the covariance matrix of Z, which itself has M dimensions, then how could we end up selecting the first M eigenvalue/eigenvector pairs? It should be that S is the covariance matrix of X, which has D dimensions, and then by diagonalizing S we select M pairs.

rajatshrivastav
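
A hedged sketch of the usual ordering behind the confusion above, with illustrative variable names: S is built as the D x D covariance matrix of the original data X, it is diagonalized, the top M eigenvectors are stacked into U, and only then is the M-dimensional Z formed.

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))            # N = 200 samples, D = 5 features
Xc = X - X.mean(axis=0)                  # center the data

S = (Xc.T @ Xc) / Xc.shape[0]            # D x D covariance matrix of X
eigvals, eigvecs = np.linalg.eigh(S)     # diagonalize the symmetric matrix S

order = np.argsort(eigvals)[::-1]        # sort eigenpairs, largest variance first
M = 2
U = eigvecs[:, order[:M]]                # D x M: top-M principal directions

Z = Xc @ U                               # N x M projected data
print(Z.shape)                           # (200, 2)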

How do you determine which value of M you want? And how do you figure out how accurately the new M-dimensional data set reflects the original data set? Do you just compare variances?

rexwinn
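
One common (not the only) answer to the question above, sketched under the assumption that a cumulative explained-variance threshold such as 95% is acceptable: keep the smallest M whose leading eigenvalues account for that fraction of the total variance; the threshold itself is a modeling choice.

import numpy as np

def choose_m(X, keep=0.95):
    # Smallest M whose top eigenvalues explain `keep` of the total variance.
    Xc = X - X.mean(axis=0)
    S = (Xc.T @ Xc) / Xc.shape[0]                    # covariance matrix of X
    eigvals = np.sort(np.linalg.eigvalsh(S))[::-1]   # variances along principal axes
    explained = np.cumsum(eigvals) / eigvals.sum()   # cumulative explained-variance ratio
    return int(np.searchsorted(explained, keep) + 1)

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 10)) @ rng.normal(size=(10, 10))   # correlated features
print(choose_m(X))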

Great video, keep it up! If you want to try an agile software package to perform PCA, use CAT (Chemometric Agile Tool)!

cat-chemometricagiletool

Same process as SVD transforming a hypersphere into a 2D ellipse.
But I cannot understand the feature mapping part. Will you make a video with an example and show the calculation?

ccuuttww

Based on the dimensions of Z, U and X, the expression Z=UX does not hold. Can you clarify this point?

MultiPRAKS

Too fast to follow. If I already knew the topic, I could definitely watch this video as a quick reference; otherwise, I'm still in limbo.

ssshukla

At 2:08, UX = (D by M) (N by D) => what is the shape of Z?

jasdnbsad
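
A quick shape check for the question above, assuming the convention where the rows of U are the M chosen eigenvectors (M x D) and the columns of X are the N data points (D x N); then Z = UX is M x N, one M-dimensional code per data point. If X is instead stored as N x D, as the typo noted in the next comment suggests, the projection is written Z = X U^T.

import numpy as np

N, D, M = 100, 5, 2
rng = np.random.default_rng(4)
X = rng.normal(size=(D, N))     # columns are data points
U = rng.normal(size=(M, D))     # rows are the chosen eigenvectors

Z = U @ X
print(Z.shape)                  # (2, 100): M x N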

2:06 There is a typo in the dimensions of X and Z: their row and column dimensions are flipped.


2:16 Something is off when he says S is the covariance matrix of Z.
Note that
Cov(a, b) = E(ab) - E(a)E(b) for any random variables a and b,
so this S is only the first term of the covariance matrix, not the whole thing. Indeed, the complete covariance matrix is
Cov(Z) = E(ZZ') - uu' = (1/N) ZZ' - (1/N^2) Z 1 1' Z',
where u_i is the mean of z_i (u := [u_1, ..., u_M]) and the 1's are column vectors of ones. For comparison, see 5:40. Note that the total variance is \sum_i Var(z_i) = trace(Cov(Z)), and hence his claim that the sum of the eigenvalues equals the total variance is correct. However, again, Cov(Z) != (1/N) ZZ' unless Z is centered.

CTT
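
A small numerical check of the point made above, assuming the commenter's layout with the M variables in the rows of Z (M x N): (1/N) Z Z' matches Cov(Z) only after the mean term u u' is subtracted, i.e. only when the data is centered.

import numpy as np

rng = np.random.default_rng(3)
Z = rng.normal(loc=5.0, size=(3, 1000))    # M = 3 variables in rows, N = 1000 samples, nonzero mean
N = Z.shape[1]

raw = (Z @ Z.T) / N                        # (1/N) Z Z'
u = Z.mean(axis=1, keepdims=True)          # column vector of row means
cov = raw - u @ u.T                        # E[ZZ'] - u u'
reference = np.cov(Z, bias=True)           # numpy covariance, variables in rows

print(np.allclose(raw, reference))         # False: the uncentered product is not the covariance
print(np.allclose(cov, reference))         # True: subtracting u u' recovers it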

Hands down the best and most accurate explanation of PCA. Wow.

RAJATTHEPAGAL

Why "information" is a covariance matrix?

Cat_Sterling

Maybe good for intuition, but YouTube videos are generally a bad choice of source for scientific papers, theses, etc., especially given uncorrected mistakes and the lack of sources for where the PCA calculations come from (apart from the eigenvector and diagonalization references). I recommend Google Scholar, e.g., I. Jolliffe, "Principal Component Analysis" (Springer, 2011), for a sound mathematical treatment that one can understand based on the intuition of, e.g., StatQuest's explanation here on YouTube. Better sources exist for both the mathematical formulas and the intuition; this video, while not awful, doesn't do either really well, to be honest.

driverg

This is not so much explaining as tabulating.

lizimoodyspecter