Tutorial 28- Ridge and Lasso Regression using Python and Sklearn

Please join as a member of my channel to get additional benefits like materials in Data Science, live streaming for members, and many more.

#Regularization

Please do subscribe to my other channel too.

Connect with me here:
Comments

Watched the 2nd part just now... You're like a savior to me, as I have some deadlines due tomorrow and this helped me a lot, sir. Thank you very much. 💯💯

abhishekchatterjee

You have already trained the model on the whole data first and then split it for the prediction. Nothing wrong, but I don't think it's an ideal technique.

alakshendrasingh

I feel the steps for the regression process can be like this (sketched below):
1) Split the data into train and test.
2) Use the train data for cross-validation and find the parameter with minimum MSE.
3) Use the same parameter on the test data and check the accuracy of the different models.
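
A minimal sketch of those three steps, assuming a generic feature matrix X and target y (generated here with make_regression purely for illustration, not the data used in the video):

from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)

# 1) Split first, so the test set stays unseen
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# 2) Cross-validate on the training data only to find the alpha with minimum MSE
params = {"alpha": [0.001, 0.01, 0.1, 1, 5, 10, 20]}
search = GridSearchCV(Lasso(), params, scoring="neg_mean_squared_error", cv=5)
search.fit(X_train, y_train)

# 3) Evaluate the tuned model on the held-out test data
y_pred = search.best_estimator_.predict(X_test)
print(search.best_params_, mean_squared_error(y_test, y_pred))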

shindepratibha

I'm following 'the complete machine learning playlist', but you're jumping steps and skipping many details, saying 'hope you know this'. I love your teaching, though. Can you make a properly complete playlist??

ayankoley

I think a good example for regularization would be to show that the model's accuracy on training data is excellent while its accuracy on test data is bad, i.e. overfitting; then we can apply regularization and compare the results, roughly as sketched below.
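
Roughly how that comparison could look, as a sketch on synthetic data chosen so that plain linear regression overfits (few samples, many features); the exact numbers will vary:

from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split

# Few samples relative to the number of features, plus noise, so OLS overfits
X, y = make_regression(n_samples=60, n_features=50, noise=25.0, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=1)

for model in (LinearRegression(), Ridge(alpha=10.0)):
    model.fit(X_train, y_train)
    print(type(model).__name__,
          "train R2:", round(model.score(X_train, y_train), 3),
          "test R2:", round(model.score(X_test, y_test), 3))

Typically the unregularized model scores near 1.0 on the training data and much lower on the test data, while Ridge narrows that gap.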

Kumar-ohjl

A small suggestion: better differentiate the terms alpha (for Lasso) and lambda (for Ridge). It is confusing.
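
For what it's worth, scikit-learn exposes the penalty strength under the single name alpha for both estimators; it plays the role of the lambda in the theory:

from sklearn.linear_model import Lasso, Ridge

# sklearn calls the regularization strength "alpha" for both models;
# it corresponds to the lambda in the ridge/lasso cost functions
ridge = Ridge(alpha=1.0)   # L2 penalty: alpha * sum(coef**2)
lasso = Lasso(alpha=1.0)   # L1 penalty: alpha * sum(|coef|)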

esakkiponraj.e

Why did we split the data into train & test "after" doing the fitting in lasso regression? I mean, shouldn't we split the data before fitting and creating the model? 8:34

Amir-English

Beautifully explained. Really helpful. Thank you!

vishalvishwakarma

How do we select the value of alpha/lambda? What is the ideal value?
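
There is no single ideal value; one common approach (a sketch, not necessarily what the video does) is to cross-validate over a grid of candidates, which RidgeCV and LassoCV do directly:

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV, RidgeCV

X, y = make_regression(n_samples=300, n_features=8, noise=15.0, random_state=2)

alphas = np.logspace(-3, 2, 30)          # candidate regularization strengths
ridge_cv = RidgeCV(alphas=alphas, cv=5).fit(X, y)
lasso_cv = LassoCV(alphas=alphas, cv=5).fit(X, y)
print("best ridge alpha:", ridge_cv.alpha_)
print("best lasso alpha:", lasso_cv.alpha_)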

kushswaroop

I'm the first to watch your video... I was watching your bias and variance video and the notification came for this... Big fan of yours, sir...

iftekarpatel

Great explanation. Very nice and simple explanation of ridge and lasso. Both the theoretical and practical concepts are good. Keep making such videos.

lotmoretolearn-dataanalyti

Krish, thank you for this video, very informative. I have a question, though. Don't you think predicting on (x_test, y_test) with a model that was trained on (x, y) would just return memorized values? Wouldn't the prediction be more accurate and realistic if the model were trained on x_train and y_train rather than on x, y for testing purposes?

Emotekofficial

Is the method correct?
In my understanding we should:
1. Split the data into train and test set (and maybe, depending on the data's units, standardize as well).
2. Do the hyperparameter optimization on the train set with CV -> best model for the training data.
3. Predict with the best model on the test set -> realistic result for unseen data (see the sketch below).

You did:
1. Hyperparameter optimization on all the data using cross-validation -> best model for all the data.
2. Split all the data into training and test set.
3. Predict with the best model on the test set -> in my opinion your "best model" has already seen the data in the test set, hence the result should not be quite realistic. What do you think?
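
A sketch of that first workflow, with the standardization step wrapped in a pipeline so it is also fit only on the training folds (the data and parameter values here are just illustrative):

from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=400, n_features=12, noise=20.0, random_state=3)

# 1) Hold out a test set before any fitting
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=3)

# 2) Tune on the training set only; the scaler is refit inside each CV fold
pipe = Pipeline([("scale", StandardScaler()), ("ridge", Ridge())])
grid = GridSearchCV(pipe, {"ridge__alpha": [0.01, 0.1, 1, 10, 100]},
                    scoring="neg_mean_squared_error", cv=5)
grid.fit(X_train, y_train)

# 3) A realistic estimate of performance on unseen data
print("test MSE:", mean_squared_error(y_test, grid.predict(X_test)))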

MrPiickel

Can we use another scoring method rather than neg_mean_squared_error to solve the problem? If so, please suggest one... Please help me out.
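
For reference, scikit-learn's cross-validation accepts other standard scorer names as well; a small sketch on illustrative data:

from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=300, n_features=8, noise=15.0, random_state=4)
model = Ridge(alpha=1.0)

# "neg_mean_absolute_error" and "r2" are built-in alternatives to neg_mean_squared_error
for scoring in ("neg_mean_squared_error", "neg_mean_absolute_error", "r2"):
    scores = cross_val_score(model, X, y, scoring=scoring, cv=5)
    print(scoring, scores.mean())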

surajsoren

Hi Krish, you explained the concepts in a clear and easy-to-understand manner. If possible, can you please explain the other machine learning algorithms the way you explained the linear regression algorithm? I mean a theoretical explanation of all the important machine learning algorithms. Thank you.

rajeshthumma

Good videos. So far so good. In most of the videos, I feel the inference part is missing. What can we infer from the plots?
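
One inference that is often drawn in this setting (a sketch, not the exact plot from the video): Lasso drives some coefficients exactly to zero, so plotting them shows which features the model has effectively dropped:

import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso

X, y = make_regression(n_samples=200, n_features=15, n_informative=5,
                       noise=10.0, random_state=5)
coef = Lasso(alpha=5.0).fit(X, y).coef_

# Bars sitting at zero are features the Lasso penalty has removed entirely
plt.bar(range(len(coef)), coef)
plt.xlabel("feature index")
plt.ylabel("lasso coefficient")
plt.show()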

jordan

Hi.
I just want to know, if I am not wrong, we need to use the train_test_split method before training on the data, right?
But you trained on the data and then split it into train & test, which surely does not give us an accurate estimate of future predictions.

Please correct me if I am wrong.

Thank you.

pandya

Is the deep learning playlist complete? If not, please complete it 🙏🙏

techbenchers

Superb explanation. Need to get my hands dirty in a Jupyter notebook. Thanks.

sandipansarkar

Your videos are really nice, keep it up!

sinister_deamon