PCA with Python

1) We explain the important input and output parameters.
2) We show how PCA finds components in linearly correlated data. We generate synthetic data in this fashion; the first principal component explains 97% of the variability.
3) We demonstrate PCA on the iris dataset. When we use the first two principal components, the three classes are well separated; when we use the last components, the classes are completely mixed.
4) We demonstrate PCA on MNIST and draw the cumulative explained-variance graph, which helps us select the number of components.
5) We demonstrate that 87 components explain 90% of the variability in the data (originally 784 features), and the accuracy is affected by a mere 3.5% when a model is trained on these features.
6) We explain the different variants: Randomized (more efficient), Incremental (for large datasets, using mini-batches), Sparse (interpretable components), and Kernel PCA (finds non-linear combinations).
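The video's exact code is not shown here, so the following is a minimal sketch of the demo in points 1–2: fitting scikit-learn's PCA on synthetic linearly correlated data and reading off the key output attributes (`components_`, `explained_variance_ratio_`). The slope and noise level are illustrative choices, not taken from the video.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
# y is a linear function of x plus small noise, so the points lie
# close to a line and one direction captures most of the variance
y = 2 * x + rng.normal(scale=0.3, size=n)
X = np.column_stack([x, y])

pca = PCA(n_components=2)  # n_components is the key input parameter
pca.fit(X)

print(pca.components_)                 # principal directions (rows)
print(pca.explained_variance_ratio_)   # first ratio is close to 1
```

With correlation this strong, the first ratio comes out near 0.98, matching the video's point that one component explains almost all the variability.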
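For point 3, a sketch of the iris demonstration: iris has four features, so PCA yields four components, and the explained-variance ratios show why a scatter plot of the first two separates the species while the last two do not. (The plotting itself is omitted so the snippet stays self-contained.)

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

iris = load_iris()
pca = PCA(n_components=4)
Z = pca.fit_transform(iris.data)

# Almost all variance sits in Z[:, 0] and Z[:, 1]; a scatter plot of
# those two columns separates the three species well, while plotting
# Z[:, 2] vs Z[:, 3] shows the classes thoroughly mixed.
print(pca.explained_variance_ratio_)
```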
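Points 4–5 rely on the cumulative explained-variance curve to pick the number of components. A sketch of that selection logic, using scikit-learn's built-in digits dataset as a small stand-in for MNIST (64 features instead of 784, so the resulting component count differs from the video's 87). Passing a float to `n_components` makes PCA select the component count for you:

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X = load_digits().data  # 64 features; a small stand-in for MNIST's 784
pca = PCA().fit(X)      # keep all components to build the full curve
cumulative = np.cumsum(pca.explained_variance_ratio_)

# Smallest number of components whose cumulative ratio reaches 90%
k = int(np.searchsorted(cumulative, 0.90) + 1)
print(k, cumulative[k - 1])

# Equivalent shortcut: a float n_components asks PCA to choose the
# minimal number of components explaining that fraction of variance
pca90 = PCA(n_components=0.90).fit(X)
print(pca90.n_components_)
```

Plotting `cumulative` against the component index reproduces the elbow-style graph mentioned in point 4.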
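The variants listed in point 6 all live in `sklearn.decomposition`. A sketch of how each is instantiated (the component count, batch size, and kernel choice below are illustrative assumptions, not values from the video):

```python
import numpy as np
from sklearn.decomposition import PCA, IncrementalPCA, SparsePCA, KernelPCA

X = np.random.default_rng(0).normal(size=(200, 10))

# Randomized solver: faster approximate SVD, useful with many features
rand_pca = PCA(n_components=3, svd_solver="randomized", random_state=0).fit(X)

# Incremental: fits in mini-batches, for datasets too large for memory
# (also exposes partial_fit for streaming chunks)
inc_pca = IncrementalPCA(n_components=3, batch_size=50).fit(X)

# Sparse: components with many zero loadings, easier to interpret
sp_pca = SparsePCA(n_components=3, random_state=0).fit(X)

# Kernel: non-linear component directions via the kernel trick
k_pca = KernelPCA(n_components=3, kernel="rbf").fit(X)

print(rand_pca.components_.shape, inc_pca.components_.shape)
```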