Eigenvalues and Eigenvectors (PCA): Dimensionality Reduction | Lecture 15 @ Applied AI Course

For more information please visit
#ArtificialIntelligence,#MachineLearning,#DeepLearning,#DataScience,#NLP,#AI,#ML
Comments

Nice explanation of eigenvectors and their corresponding eigenvalues, and also of the geometric picture of the spread of the information when going from 2D to 1D!

rajmaheshwarreddy

Thank you sir. Brilliant methods and explanations to help a near beginner trying to grasp the nuance of eigenvalues and eigenvectors as applied in PCA and unsupervised machine learning. I'm not there yet, but I sure feel like I can and will after listening to you. Who needs machines ... lol?

MuctaruKabba

Applied AI Course
At 1:57 the covariance matrix is written as Cov(X) = S = Xᵀ·X, but in the previous video we saw Cov(X) = S = (1/n)·(Xᵀ·X). Why this difference?

debanjandas
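A note on the question above: the two formulas differ only by the constant factor 1/n. Scaling the covariance matrix by a constant rescales every eigenvalue by that constant but leaves the eigenvectors (the principal directions) unchanged, so PCA selects the same components either way. Below is a minimal NumPy sketch of this, using a made-up standardized data matrix X rather than anything from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))               # made-up data: 100 points, 3 features
X = (X - X.mean(axis=0)) / X.std(axis=0)    # column-standardize

S_unscaled = X.T @ X                        # covariance without the 1/n factor
S_scaled = (X.T @ X) / X.shape[0]           # covariance with the 1/n factor

# Eigen-decomposition of both versions (eigh: for symmetric matrices)
vals_u, vecs_u = np.linalg.eigh(S_unscaled)
vals_s, vecs_s = np.linalg.eigh(S_scaled)

# Eigenvalues differ exactly by the factor 1/n ...
print(np.allclose(vals_s, vals_u / X.shape[0]))      # True
# ... while the eigenvectors (principal directions) are identical up to sign
print(np.allclose(np.abs(vecs_s), np.abs(vecs_u)))   # True
```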

Thank you very much.
It covers:
1. Column-standardized matrix
2. Covariance matrix
3. Eigenvectors
4. Eigenvalues
5. Orthogonal matrix of v1, v2, ..., vd
6. Projection
7. Projection at 100%, 75%, 60%, 50%
In general, for any matrix, the eigenvectors are NOT always orthogonal. But for a special type of matrix, a symmetric matrix, the eigenvalues are always real and the eigenvectors are always orthogonal.

AJ-fohp
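To tie together the steps listed in the comment above, here is a rough NumPy sketch of that pipeline; the data, the variable names, and the choice of k = 2 components are illustrative assumptions, not taken from the lecture:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))                 # made-up raw data: 200 points, 4 features

# 1. Column-standardize the matrix
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# 2. Covariance matrix
S = (X_std.T @ X_std) / X_std.shape[0]

# 3-5. Eigenvalues and eigenvectors; the columns of V = [v1, v2, ..., vd] are orthogonal
eigvals, V = np.linalg.eigh(S)                # eigh: for symmetric matrices
order = np.argsort(eigvals)[::-1]             # sort by decreasing eigenvalue
eigvals, V = eigvals[order], V[:, order]

# 6. Projection onto the top-k eigenvectors
k = 2
X_proj = X_std @ V[:, :k]

# 7. Fraction of the variance (spread) retained by the top-k components
explained = eigvals[:k].sum() / eigvals.sum()
print(f"variance retained with k={k}: {explained:.0%}")
```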

At 7:57: always true only if it's a symmetric matrix.

rishabhshirke
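A quick numerical check of this point, using made-up matrices (a sketch, not the lecture's example): a symmetric matrix gives orthonormal eigenvectors, while a generic non-symmetric matrix generally does not.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 3))          # a generic (non-symmetric) matrix

S = A + A.T                          # symmetrize it: eigenvalues real, eigenvectors orthogonal
_, V = np.linalg.eigh(S)
print(np.allclose(V.T @ V, np.eye(3)))         # True: columns of V are orthonormal

_, W = np.linalg.eig(A)              # eigenvectors of the non-symmetric matrix
print(np.allclose(W.conj().T @ W, np.eye(3)))  # almost surely False: not orthogonal
```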

Sir, what is the purpose of using the covariance matrix?

msravya
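One way to see the role of the covariance matrix (a sketch assuming mean-centered data and an arbitrary illustrative direction u, not the lecture's example): the variance of the data projected onto any unit vector u equals uᵀ S u, so S encodes the spread of the data in every direction, and maximizing uᵀ S u over unit vectors leads to the top eigenvector.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 3))
X = X - X.mean(axis=0)                 # mean-center the data

S = (X.T @ X) / X.shape[0]             # covariance matrix

u = np.array([1.0, 2.0, -1.0])         # arbitrary illustrative direction
u = u / np.linalg.norm(u)              # make it a unit vector

var_projection = np.var(X @ u)         # variance of the data projected onto u
quadratic_form = u @ S @ u             # u^T S u
print(np.allclose(var_projection, quadratic_form))   # True
```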

I didn't understand anything from this. Do I need to watch any prior video related to PCA in order to understand this video's concepts?

gurudakshin

Sir, why can't we take v2 as our vector having maximum variance in a particular direction?

akhilkrishna
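Regarding the question above: v1 is, by definition, the eigenvector with the largest eigenvalue, so the spread retained along v1 is at least as large as along v2, whose eigenvalue is the second largest. A small sketch with made-up correlated 2-D data (not the lecture's example):

```python
import numpy as np

rng = np.random.default_rng(4)
# made-up correlated 2-D data
X = rng.multivariate_normal(mean=[0, 0], cov=[[3.0, 1.5], [1.5, 1.0]], size=1000)
X = X - X.mean(axis=0)

S = (X.T @ X) / X.shape[0]
eigvals, V = np.linalg.eigh(S)          # eigenvalues in ascending order
v1, v2 = V[:, -1], V[:, -2]             # v1: largest eigenvalue, v2: second largest

print(np.var(X @ v1), np.var(X @ v2))   # variance along v1 > variance along v2
# Picking v2 instead of v1 would throw away the direction of maximum spread.
```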