Dimensionality Reduction | Principal Component Analysis

Here is a detailed explanation of dimensionality reduction using Principal Component Analysis.

Please subscribe to the channel.

You can buy my book, where I provide a detailed explanation of how to use Machine Learning and Deep Learning in Finance with Python.

Comments

This is one of the best videos on the Internet for this topic.

Can't thank you enough sir.

shushantgambhir

I think it's better to specify how much variance you want to keep rather than the number of components. For example:
PCA(0.80)
# this will create as many principal components as needed to retain 80% of the variance.

Hope this is helpful
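A minimal sketch of this idea with scikit-learn, using the iris dataset as assumed example data (not from the video itself): passing a float in (0, 1) as n_components tells PCA to keep the smallest number of components whose cumulative explained variance reaches that fraction.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Standardize first: PCA is sensitive to feature scale.
X = StandardScaler().fit_transform(load_iris().data)

# A float in (0, 1) means "keep enough components to retain
# this fraction of the total variance".
pca = PCA(n_components=0.80)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape[1])                    # components actually kept
print(pca.explained_variance_ratio_.sum())   # fraction retained, >= 0.80
```

For standardized iris, two components already capture well over 80% of the variance, so the reduced matrix has two columns.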

vineetsansi

Thank you Krish, for the concise and clear explanation!

kevinkennynatashawilfredpa

Thanks for making this type of content. You explain things in a very clear and easy way

kamalkantverma

Sir, the video is very helpful, and the analysis is excellent.

beakaiwalyakhairnar

Excellent!! Your full channel is extremely helpful. Very well explained.

srashtisingh

Thanks for the video, Krish.
But I'm wondering: a fresher like me could get puzzled by so many feature selection techniques. It would be great if you could tell us which technique to use and when.

Regards
Pritam

pritamgorain

Super explanation, Anna.
You rock at data science.

AkshaykumarPatilAkki

Best explanation of PCA. Could you please make a video on Linear Discriminant Analysis? Also, please explain the eigenvector and eigenvalue concepts behind PCA.

sunilc

Amazingly explained video, sir. Keep it up.

hassamsiddiqui

Very good explanation. Thank you so much!

jazzorcazz

How can you determine the optimal number of components you should reduce your features to? Love your tutorials, btw!

the_imposter_analyst

Great. Now I have successfully completed my practice inside a Jupyter notebook. Cheers!

sandipansarkar

Wonderful. Thank you for doing it, sir.

betanapallisandeepra

Thank you for putting the video back :)

adityasingh

Using PCA, the number of dimensions can be reduced, but could you please tell us on what basis these dimensions/variables are reduced? Is it the entropy value, or something else?
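A small sketch addressing this question, with the iris dataset assumed as example data: PCA ranks directions by variance, i.e., by the eigenvalues of the covariance matrix, not by entropy. The components dropped are the ones with the smallest eigenvalues.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X = load_iris().data

# Eigenvalues of the sample covariance matrix, sorted descending.
# These are exactly the per-component variances PCA reports.
eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]

pca = PCA().fit(X)

# PCA's explained_variance_ matches the covariance eigenvalues,
# confirming that variance (not entropy) is the ranking criterion.
print(np.allclose(eigvals, pca.explained_variance_))  # True
```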

manojnahak

You should go into more of the maths and how it works... anyone can call fit and transform.

TheMangz

Thanks for the nice video; I had one doubt. How do we decide when to apply PCA? Say the data has 2, 3, or more than 3 features: is there some threshold number of features, and can you explain the math behind it? Kudos and cheers, mate!

kushkumar

Hi Krish,
Your videos are very useful; thank you for them.
I have a doubt: the reason we are doing PCA is to reduce the number of features, right? So how will we know which features from the given data are useful when applying different models to our data?

ramleo

I have some doubts. First, can we apply PCA to categorical data? Second, how can we calculate the optimal number for n_components? Do we have to compute the explained variance by manually trying out different values of n_components?
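One common way to answer the second doubt, sketched with scikit-learn (the digits dataset and the 95% threshold are my assumed examples): fit a full PCA once, then read the cumulative explained variance ratio instead of manually trying different n_components values.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = StandardScaler().fit_transform(load_digits().data)

# Fit a full PCA once; explained_variance_ratio_ gives the variance
# fraction captured by each component, so no manual loop is needed.
pca = PCA().fit(X)
cumulative = np.cumsum(pca.explained_variance_ratio_)

# Smallest number of components retaining at least 95% of the variance.
n_components = int(np.argmax(cumulative >= 0.95)) + 1
print(n_components)
```

Plotting `cumulative` against the component index (a "scree"-style curve) makes the same choice visually.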

soumendradash