Feature Engineering Techniques For Machine Learning in Python



Subscribe if you enjoyed the video!


Timeline:
00:00 Introduction
01:44 Initial Setup
10:00 Dimensionality Reduction (PCA)
16:22 Preprocessing / Scaling
26:08 Categorical Encoding (Dummy / One-Hot)
33:09 Binning (Grouping / Aggregating)
37:56 Clustering (K-Means)
44:08 Feature Selection
Comments

Run a heat map of the correlations across all columns before running PCA; it reveals far more opportunities for dimensionality reduction.

roblee
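The heat-map suggestion above can be sketched as follows. This is a minimal, hypothetical example on synthetic data (the column names, seed, and 0.9 threshold are illustrative, not from the video):

```python
import numpy as np
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs headless
import matplotlib.pyplot as plt

# Synthetic numeric dataset; column "e" is a near-duplicate of "a".
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(200, 4)), columns=["a", "b", "c", "d"])
df["e"] = 2 * df["a"] + rng.normal(scale=0.1, size=200)

# Correlation matrix across all numeric columns.
corr = df.corr()

# Render it as a heat map so strongly correlated pairs stand out.
fig, ax = plt.subplots()
im = ax.imshow(corr, vmin=-1, vmax=1, cmap="coolwarm")
ax.set_xticks(range(len(corr.columns)))
ax.set_xticklabels(corr.columns)
ax.set_yticks(range(len(corr.columns)))
ax.set_yticklabels(corr.columns)
fig.colorbar(im)
fig.savefig("correlation_heatmap.png")

# Pairs with |corr| above some threshold are candidates for dropping
# (or for letting PCA collapse them) before modeling.
high = [(i, j) for i in corr.columns for j in corr.columns
        if i < j and abs(corr.loc[i, j]) > 0.9]
print(high)  # [('a', 'e')]
```

With `seaborn` installed, `seaborn.heatmap(corr, annot=True)` produces the same picture in one call.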

For info: if you delete the island column, then you should delete the rows containing the value 1 as well, or those rows will have zeros in all the other encoded columns.

e_hossam
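For context on the comment above: with pandas' `drop_first` dummy encoding, rows that are zero in every remaining dummy column are exactly the rows of the dropped category, so the information is kept implicitly and deleting those rows is usually unnecessary. A small sketch with made-up island values (the data here is hypothetical, not the video's dataset):

```python
import pandas as pd

# Hypothetical data standing in for the dataset's "island" column.
df = pd.DataFrame({"island": ["Biscoe", "Dream", "Torgersen", "Biscoe"]})

# drop_first=True drops the alphabetically first category ("Biscoe").
reduced = pd.get_dummies(df["island"], drop_first=True)
print(list(reduced.columns))  # ['Dream', 'Torgersen']

# Rows with 0 in every remaining dummy are exactly the "Biscoe" rows:
# the dropped category is still encoded, just implicitly. Dropping one
# column this way avoids the redundancy known as the dummy-variable trap.
is_biscoe = reduced.sum(axis=1) == 0
print(is_biscoe.tolist())  # [True, False, False, True]
```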

I've been lost in the feature engineering chapter of the machine learning book I'm currently reading, and then I found your video. In just 47 minutes I learned two or three new things and now understand the whole process much better, all thanks to you, Greg! Keep making these kinds of videos, because WE NEED YOU!!

nokroman

Been waiting for this one!! Amazing video! Thanks Greg

arsheyajain

Greg, your videos are absolutely lovely and reinforce everything I’m learning in my classes, thank you so much

Eizengoldt

I've waited for this video! Many Thanks! :)

mikekertser

Buddy, I have subscribed to you. Please keep uploading more videos; they help a lot.

crepantherx

You explain everything in such an easy-to-follow way! Thanks for the amazing video!

jacobsquires

This kind of approach is what we need outside the university classroom. Enough PCA theory from uni; let's code!

findoc

Really great, comprehensive video. It would be great if you did one on how to select features to get the best results for this specific problem.

ShadowDC

Amazing video as always! Super helpful

ashleyb

Best of the best!
Thank you, Greg, you are the man!

hsoley

Thanks a lot for these helpful experiments; they gave me a lot of ideas for data preprocessing!

rafik

At 4:30: after looking it up, "frac=1" sets the fraction of the dataset to shuffle to 100% of the total, instead of, say, 25%. The video didn't really explain that properly.

ShaneZarechian
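The commenter's reading is correct: `DataFrame.sample(frac=1)` draws 100% of the rows without replacement, which amounts to a full shuffle. A minimal sketch (the data and seed here are illustrative):

```python
import pandas as pd

df = pd.DataFrame({"x": range(5)})

# frac=1 samples all rows without replacement, i.e. a full shuffle;
# frac=0.25 would instead return a random 25% subset. reset_index
# discards the shuffled index so rows are renumbered 0..n-1.
shuffled = df.sample(frac=1, random_state=42).reset_index(drop=True)

print(len(shuffled), sorted(shuffled["x"]))  # 5 [0, 1, 2, 3, 4]
```

Same rows, new order: sorting the shuffled column recovers the original values.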

Thanks a lot, Greg, you have helped me a lot through these videos.

darshanprabhu

Wouldn't this be better called preprocessing transformation techniques instead? I thought Dimensionality Reduction would be separate from Feature Engineering, with Feature Scaling making up the third subtopic. Something like:

Dimensionality Reduction (removing features)
PCA
Clustering

Feature Engineering (creating/transforming features)
One-Hot
Binning

Feature Scaling (rescaling features)
Your scaling

russelllapua

Thank you so much. Your efforts are really appreciated.

dvcdbio

You're Gold !! Keep up the good work.

gpsvlogs

Thank you for posting this, I like all of your videos :)

kashishrajput

Great 🙌🏻, very helpful, keep making more videos like this.

prasadchavan