13.0 Introduction to Feature Selection (L13: Feature Selection)

This video gives a brief introduction to why we care about dimensionality reduction and introduces feature selection as a subcategory that we will cover in more detail in the upcoming videos.

-------

This video is part of my Introduction to Machine Learning course.

-------

Comments

You taught my Introduction to Biostats and 451 at UW. You are one of the best professors I have ever encountered. Thank you for providing all of this extra material. It is greatly appreciated.

emilwalleser

Commitment level: 1000.
Thanks for adding additional topics and making the lecture videos publicly available :-)
Really looking forward to the new edition of your book 👀

pulkitmadan

Absolutely love your content! Best machine learning lectures I've watched so far on YouTube :)

zaynnicholas

Great to find your YouTube channel; it seems I'm following you on every platform. ❤️

UniverseGames

Amazing content ❤️ Keep it up; I have learnt so much from you, mate 🙂

wolfisraging

Thank you so much for uploading!
I have a dataset where feature selection/dimensionality reduction is vital, but in my class at university we were only scratching the surface of feature selection (and we didn't get those add-on lectures ;))

HannyDart

I have been following your videos for the past month, and I can say with confidence that they are some of the best I have seen so far on ML! Thanks a lot for all the effort you have put into making these lectures!!!
Are there any lectures/books that you have authored which are dedicated to forecasting (both time series and using regression models)? It would be of great help to me if you could point me to them.
Thanks again! :)

rahulpaul

I have been enjoying your ML lectures; they are very well explained, with a deep understanding of the concepts. Firstly, thank you for putting these up on YouTube.

I just had one question: I have heard that dimensionality reduction includes principal component analysis and anomaly detection. How is principal component analysis different from feature selection?

keyushshah
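
Regarding the question above: PCA and feature selection are both dimensionality-reduction techniques, but PCA is feature extraction (it constructs new features, each a combination of the original ones), whereas feature selection keeps an unchanged subset of the original features. Below is a minimal sketch of the contrast, assuming scikit-learn; the Iris data and the k=2 / n_components=2 settings are illustrative assumptions, not something from the video.

```python
# Minimal sketch (assumes scikit-learn is installed; k=2 is an arbitrary
# illustrative choice). Contrasts PCA (feature extraction) with
# univariate feature selection (keeping a subset of the original columns).
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)   # 150 samples, 4 original features

# PCA builds 2 new features; each one mixes all 4 original measurements.
X_pca = PCA(n_components=2).fit_transform(X)
print(X_pca.shape)                  # (150, 2) -- principal components

# Feature selection keeps the 2 original columns with the highest
# ANOVA F-score; the surviving columns are still the raw measurements.
selector = SelectKBest(score_func=f_classif, k=2).fit(X, y)
print(selector.transform(X).shape)  # (150, 2) -- 2 of the 4 original columns
print(selector.get_support())       # boolean mask over the original features
```

So PCA reduces dimensionality by constructing new axes, while feature selection reduces it by discarding columns, which keeps the remaining features directly interpretable.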

Respected professor, your videos are guiding me very well, and recently I have been learning how to select features. I have one big point of confusion: the ANOVA F-test is often used in the filter method for feature selection. Theory says ANOVA should be used for feature selection when the target is binary, but I have seen that in practice people also use ANOVA when the target is multi-class. So can the ANOVA F-test also be applied if our target is not binary and has multiple classes (say, data like Iris)?

Another question: ANOVA assumes the features are normally distributed, but in practice we mostly encounter data that is not fully normal. In such a case, does it matter much when applying it for feature selection, or is transforming the data to some distribution compulsory? Please clear up my confusion by answering these two questions.

beautyisinmind
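
On the ANOVA question above: the ANOVA F-test is not limited to binary targets; it compares a feature's mean across however many classes the target has, so it applies directly to a multi-class problem such as Iris. Below is a minimal sketch, assuming scikit-learn, whose f_classif function computes this per-feature ANOVA F-statistic:

```python
# Minimal sketch (assumes scikit-learn): ANOVA F-test used as a
# filter-method score on a multi-class target (Iris has 3 classes).
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)   # target y has 3 classes, not 2

# One F-statistic and p-value per feature, comparing the feature's
# mean across all 3 classes.
f_scores, p_values = f_classif(X, y)
print(f_scores)                     # higher score -> better class separation
print(p_values)

# As a filter: keep the k features with the largest F-scores.
X_top2 = SelectKBest(score_func=f_classif, k=2).fit_transform(X, y)
print(X_top2.shape)                 # (150, 2)
```

As for the normality assumption: in a filter setting the F-score is typically used only to rank features rather than for formal hypothesis testing, so mild departures from normality are usually tolerated in practice; transforming the data is optional rather than compulsory.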

Same here, following you on this platform as well, besides Twitter.

wagutijulius

Hello, are the slides from all lectures available somewhere?

bogdan