Linear discriminant analysis (LDA) - simply explained

In this video, we will see how we can use LDA to combine variables to predict if someone has a viral or bacterial infection. We will also compare LDA and PCA (07:50), discuss separation (11:50), the math behind LDA (14:40), and how to calculate the standardized coefficients (21:00).
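
A minimal sketch of the idea, assuming scikit-learn and made-up data (the video's infection dataset and exact coefficients are not available): it fits LDA and PCA on a synthetic two-class problem and prints the raw coefficients alongside standardized ones like those discussed at 21:00.

```python
# Hypothetical two-marker, two-class data standing in for the video's
# viral-vs-bacterial example; not the original dataset.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = np.vstack([rng.normal([2.0, 30.0], [1.0, 8.0], size=(50, 2)),   # "viral"
               rng.normal([4.0, 45.0], [1.0, 8.0], size=(50, 2))])  # "bacterial"
y = np.repeat(["viral", "bacterial"], 50)

# LDA finds the axis that best separates the classes;
# PCA finds the axis of largest overall variance, ignoring the labels.
lda = LinearDiscriminantAnalysis(n_components=1, store_covariance=True).fit(X, y)
pca = PCA(n_components=1).fit(X)
print("LDA axis (raw coefficients):", lda.scalings_.ravel())
print("PCA axis (loadings):        ", pca.components_.ravel())

# One common convention for standardized coefficients: multiply each raw
# coefficient by the pooled within-class standard deviation of its variable.
within_sd = np.sqrt(np.diag(lda.covariance_))
print("Standardized coefficients:  ", lda.scalings_.ravel() * within_sd)
```
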
Comments

The best explanation on the entire internet. Thank you a lot.

hopelesssuprem

Thanks for sharing. This is amazingly crafted and easy to follow. I finally understood the math behind LDA. Brilliant!

nikeforo

I like that your videos always come with subtitles, so I can better understand what you are saying.

wtoenmy

You really are a genius, and your help will be remembered until my last breath.

ratnakarbachu

I went to the MANOVA video, and it said I needed to first understand the LDA video, so now I am here, and now it says I have to go to the PCA video, hehe.

gwendolyneortiz

Thank you so much for this very useful video.

saifh.al-nimer

Thank you for the video. Could you please make an extended video on LDA? I mean videos like the ones you did for PCA.

danialb

Your videos have been amazing. Can you please upload content related to Bayesian analysis?

rekhapriya

Hello sir, can you please explain why the first eigenvector is used for LDA?

spp

Can you please provide the dataset you worked with?

ritiksuri

Thank you very much for these helpful videos. However, you mentioned that you used software to calculate the eigenvectors. Could you recommend which software to use (the simpler, the better)?
Again, thanks a lot for your videos.

ahmadalmomani
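
Not the author's answer, but for anyone looking for simple (and free) software: NumPy can compute the eigenvectors directly. A minimal sketch, assuming the within-class and between-class scatter matrices have already been computed as in the video (the numbers below are made up):

```python
import numpy as np

# Hypothetical 2x2 scatter matrices; replace with the ones from the video.
S_W = np.array([[13.2,  6.1],
                [ 6.1, 11.8]])   # within-class scatter
S_B = np.array([[ 8.4,  9.9],
                [ 9.9, 11.7]])   # between-class scatter

# LDA solves the eigenproblem S_W^-1 S_B w = lambda w.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)

# The eigenvector with the largest eigenvalue is the first discriminant (LD1).
order = np.argsort(eigvals.real)[::-1]
print("eigenvalues:  ", eigvals[order].real)
print("LD1 direction:", eigvecs[:, order[0]].real)
```

This also touches on the earlier question about why the first eigenvector is used: it is the direction with the largest eigenvalue, i.e. the largest ratio of between-class to within-class scatter.
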

In your calculation of the matrix W, when the sizes of the classes or groups are unequal, what are the variables n_1 and n_2 (i.e., in the equation shown at 15:43)? Also, thanks so much for these videos (and the PCA ones). Well explained, with good examples, and you did it in half the time everyone else takes!

jimjohnson
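
Not a substitute for the author's reply, but in the usual formulation n_1 and n_2 are simply the number of samples in each group; they weight each group's covariance when pooling, which is what makes the formula work for unequal group sizes. A minimal sketch of the standard pooled within-class covariance (the exact constants shown at 15:43 may differ):

```python
import numpy as np

def pooled_within_class_cov(X1, X2):
    # n1 and n2 are just the group sample sizes; unequal sizes are handled
    # by weighting each group's covariance before pooling.
    n1, n2 = len(X1), len(X2)
    S1 = np.cov(X1, rowvar=False)   # covariance of group 1
    S2 = np.cov(X2, rowvar=False)   # covariance of group 2
    return ((n1 - 1) * S1 + (n2 - 1) * S2) / (n1 + n2 - 2)

# Hypothetical data with unequal group sizes.
rng = np.random.default_rng(1)
X_viral = rng.normal(size=(30, 2))
X_bacterial = rng.normal(size=(55, 2))
print(pooled_within_class_cov(X_viral, X_bacterial))
```
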

I greatly appreciate the video. I only have one question: Does this linear discriminant analysis approach rely on the Bayesian, Fisher or some other approach?

MariaMartinezGarcia-kyru

Hello! Great video. However, I have a question/need clarification: is the LD1 found at 18:19 the actual line onto which the data is projected so that the two classes are best separated? And are the following calculations in the video the data being projected onto that line, or is this line found somewhere else? To clarify: I'm looking specifically for the line that best separates the two classes.

iwwyl
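
In the standard formulation, LD1 is exactly that: a direction (axis) in the original variable space, and the subsequent calculations are the data being projected onto it to get one score per sample. A minimal sketch of the projection step, using a made-up LD1 vector rather than the one found at 18:19:

```python
import numpy as np

# Hypothetical LD1 direction (in practice, the first eigenvector of S_W^-1 S_B).
w = np.array([0.8, 0.6])
w = w / np.linalg.norm(w)          # normalize to a unit-length axis

# Projecting each sample onto that axis gives its 1-D LDA score; points from
# the two classes end up at opposite ends of this axis when separation is good.
X = np.array([[2.1, 31.0],
              [4.3, 46.0],
              [3.0, 38.0]])        # hypothetical samples
scores = X @ w
print(scores)
```
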

Hello, thank you for the excellent explanation! So a variable contributes more to the groups' separation when it has a high weight. How do we interpret the negative weights of the variables (in LDA)?

silesoul

Hello! I would like to ask what the null hypothesis is for LDA, or for DA (discriminant analysis) in general? And also, would you know what the hypothesis is when MANOVA and DA are used together?

paolopanlaqui

Hey, can you tell me how you calculated 0.11 and 0.70?

upanshisharma

Could you please check whether the legend colors for bacterial and viral are correct in the figures (e.g., at 6:21)? With scikit-learn I get low values for the bacterial samples and high values for the viral samples after the transformation. Therefore, the viral data points, transformed with LDA, should be above the bacterial ones.

sergeypigida
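
One way to check this yourself (a sketch on made-up data, not the video's): fit scikit-learn's LDA and compare the mean transformed score per class. Note that the sign of the discriminant axis is arbitrary, so which class ends up "above" can flip between implementations without changing the separation itself.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
X = np.vstack([rng.normal([2.0, 30.0], 1.0, size=(40, 2)),   # "viral"
               rng.normal([4.0, 45.0], 1.0, size=(40, 2))])  # "bacterial"
y = np.repeat(["viral", "bacterial"], 40)

z = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, y)
for label in ("viral", "bacterial"):
    print(label, "mean LDA score:", float(z[y == label].mean()))
```
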

Thanks! This video is so helpful! Coffee on me :)

ig

How do you assign the alpha1 and alpha2 values?

sunitharamcse