Feature Selection in Machine Learning | Variable Selection | Dimension Reduction

Feature selection is an important step in the machine learning model building process. The performance of a model depends on the following:

Choice of algorithm
Feature Selection
Feature Creation
Model Selection

So feature selection is one important driver of good performance. Feature selection methods are primarily of three types:

Filter Methods
Wrapper Methods
Embedded Methods

You will learn a number of techniques, such as variable selection through a correlation matrix, best subset selection, stepwise forward, stepwise backward, and hybrid methods. You will also learn regularization (shrinkage) methods such as lasso and ridge regression, which can likewise be used for variable selection (see the sketch below).
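As a taste of the shrinkage approach, here is a minimal R sketch using the glmnet package; the data are simulated for illustration and are not from the video:

# Lasso (alpha = 1) with a cross-validated lambda; ridge would use alpha = 0
library(glmnet)

set.seed(42)
X <- matrix(rnorm(100 * 10), nrow = 100)   # 100 observations, 10 candidate predictors
y <- X[, 1] - 2 * X[, 3] + rnorm(100)      # only predictors 1 and 3 truly matter

cv_fit <- cv.glmnet(X, y, alpha = 1)

# Coefficients at the cross-validated lambda; predictors shrunk exactly
# to zero have effectively been deselected
coef(cv_fit, s = "lambda.min")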

Finally, you will learn the difference between variable selection and dimension reduction.
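The contrast is easy to see in code: variable selection keeps a subset of the original columns, while dimension reduction (e.g., PCA) constructs new combined features. A minimal sketch with base R's prcomp on the built-in mtcars data:

# PCA replaces the original variables with linear combinations of them
pca <- prcomp(mtcars, scale. = TRUE)

summary(pca)        # proportion of variance explained by each component
head(pca$x[, 1:3])  # the first 3 components are new features, not original columns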


Comments

This is one of the best comprehensive videos I have watched on feature selection. But why do you have to make such a good video and then kill it with so many ads? Seems very counter-intuitive.

anitha_raman

For p=10, shouldn't the number of models be 1+((10*11)/2), which comes to 56 models? How did you get 211 models (at 29:00)?
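For reference, the standard count for forward stepwise selection matches the commenter's 56: the procedure fits the null model plus p − k candidate models at each step k. In LaTeX form:

% Models fit by forward stepwise selection with p predictors
\[
1 + \sum_{k=0}^{p-1} (p - k) \;=\; 1 + \frac{p(p+1)}{2}
\;=\; 1 + \frac{10 \cdot 11}{2} \;=\; 56 \quad \text{for } p = 10.
\]

(Best subset selection, by contrast, fits all $2^p = 1024$ models for $p = 10$.)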

nonegog

I need to find the accuracy of the model after finding the variable importance. Please tell me how to do this.
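One common pattern, sketched below in R with the caret package, is to refit on the selected variables only and then measure accuracy on held-out data; the selected variables and dataset here are illustrative assumptions, not from the video:

library(caret)

set.seed(1)
idx      <- createDataPartition(iris$Species, p = 0.7, list = FALSE)
train_df <- iris[idx, ]
test_df  <- iris[-idx, ]

# Suppose these came out on top of your variable-importance ranking
selected_vars <- c("Petal.Length", "Petal.Width")

fit  <- train(x = train_df[, selected_vars], y = train_df$Species, method = "rpart")
pred <- predict(fit, test_df[, selected_vars])

confusionMatrix(pred, test_df$Species)$overall["Accuracy"]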

poojamahesh

At 16:38, why can't we reverse the places of X1 and X2 to get the 4th model?

saketnair

Very informative about feature selection. Thanks for the video.

mahaveerthakur

Dear Sir,
Another way to do forward, backward and stepwise regression model selection in R:

# Fit the two boundary models first (df is your data frame, y your response)
null_model <- lm(y ~ 1, data = df)   # intercept-only model
full_model <- lm(y ~ ., data = df)   # model with all predictors

# Forward selection: start from the null model and add terms (k = 2 uses AIC)
reg_fwd <- step(null_model, scope = list(upper = full_model, lower = null_model), direction = "forward", k = 2, steps = 500)

# Backward elimination: start from the full model and drop terms
reg_back <- step(full_model, scope = list(upper = full_model, lower = null_model), direction = "backward", k = 2, steps = 500)

# Stepwise: start from the null model, adding and dropping terms
reg_both <- step(null_model, scope = list(upper = full_model, lower = null_model), direction = "both")

aniruddhg

Very useful and detailed. At 10:19, can you let me know what IV is?
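Assuming IV here stands for Information Value, a filter metric popular in credit scoring: it scores a binned predictor by how differently events and non-events distribute across its bins, via the weight of evidence. In LaTeX form:

% Information Value of a predictor, summed over its bins i
\[
\mathrm{IV} \;=\; \sum_i \left( \%\,\text{non-events}_i - \%\,\text{events}_i \right)
\ln\!\frac{\%\,\text{non-events}_i}{\%\,\text{events}_i}
\]

Rules of thumb treat an IV below roughly 0.02 as unpredictive and above roughly 0.5 as suspiciously strong.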

sandeepchavan

What if I have more than 500 variables?

rushikeshmore

Informative video on feature selection.

mychannel

I'm doing a project on feature selection using the wrapper method, where the search algorithms are ant colony, cuckoo search, and the cat algorithm, with SVM classification. Does anyone have any idea about this in Python?

sathishkumarrb

How is "Choice of algorithm" different from "Model Selection" ?

SanjayFGeorge

I have an important question. I am using backward elimination (SVM-RFE with 10-fold cross validation) on a small dataset of 200 observations. Should I split my dataset into training and validation sets and then apply backward elimination on the training set, or apply it on the whole dataset (200 observations)? If I should split, then after getting the final model (fit), how can I test its accuracy on real testing data?
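On the splitting question: selection should generally be run on training data only, with the cross-validation happening inside that training set and the test set untouched until the final accuracy check. A minimal R sketch, assuming caret's rfe with an SVM learner as a stand-in for SVM-RFE and illustrative data:

library(caret)

set.seed(7)
idx      <- createDataPartition(iris$Species, p = 0.75, list = FALSE)
train_df <- iris[idx, ]
test_df  <- iris[-idx, ]

# 10-fold CV happens inside rfe, on the training data only
ctrl <- rfeControl(functions = caretFuncs, method = "cv", number = 10)

rfe_fit <- rfe(x = train_df[, -5], y = train_df$Species,
               sizes = 1:4, rfeControl = ctrl, method = "svmLinear")

predictors(rfe_fit)   # the variables RFE kept

# Final, one-time accuracy estimate on the untouched test set
pred <- predict(rfe_fit, test_df[, -5])
pred_class <- if (is.data.frame(pred)) pred$pred else pred
mean(pred_class == test_df$Species)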

rawiasammout

How can I do this using Python instead of R?

kamalmustafa

What is RSS? Can you please explain it?
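RSS is the residual sum of squares: the total squared gap between the observed responses and the model's fitted values, and the quantity that subset-selection criteria compare across candidate models. In LaTeX form:

% Residual Sum of Squares over n observations
\[
\mathrm{RSS} \;=\; \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2
\]

Smaller RSS means a closer fit on the training data, which is why criteria such as AIC and adjusted R^2 penalize model size rather than rewarding low RSS alone.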

abhinavgupta

Good info.
I don't mind you making money with advertising. However, you exaggerated with so many ads; that's why I didn't finish watching it.

otroleonarbe

I guess if a feature has a high correlation with the target variable then we should remove that variable.
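A caveat on the comment above: high correlation with the target is usually what you want a feature to have; it is high correlation among the predictors themselves (multicollinearity) that motivates removal. A minimal sketch using caret's findCorrelation on illustrative data:

library(caret)

# Correlations among predictors only (treating mpg as the target)
cor_matrix <- cor(mtcars[, -1])

# Indices of predictors so correlated with the others that dropping them
# is suggested; the 0.9 cutoff is an illustrative choice
drop_idx <- findCorrelation(cor_matrix, cutoff = 0.9)
names(mtcars[, -1])[drop_idx]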

ravisrivastava

The hybrid selection explanation is not clear.

abhishek