Get started with using TensorFlow to solve for regression problems (Coding TensorFlow)

Comments
Author

This is a very elegant piece of code. It's worthwhile to pause and study it.

jamesthompson
Author

I learned more from this single video than in my entire last semester; it looks easy, but it has a lot of hidden information. Thank you

MilindParvatia
Author

I can't give a better summary than this in such a short time. Great job! Thank you

antimatterwt
Author

Great!! There are too many classification videos but no regression videos. Nice to see this

oliverli
Author

At 3:14, there is another way to do the one-hot encoding, using the pandas get_dummies function:

dataset.Origin = dataset.Origin.map({1: 'USA', 2: 'Europe', 3: 'Japan'})
ohe = pd.get_dummies(dataset.Origin)  # one indicator column per origin
dataset = pd.concat([dataset.drop('Origin', axis=1), ohe], axis=1)

By the way, thanks for the videos!

asuagar
Author

The normalization shown does not map the features into the 0-1 range; rather, it centers them around 0 with a standard deviation of 1. It gives all the features equal variance so that no single weight gets too powerful (around the 6-minute mark).
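
For reference, here is a minimal sketch of what that standardization does, on made-up numbers (the video applies the same idea to the Auto MPG features):

import pandas as pd

# Illustrative data; the video works with the Auto MPG dataset.
df = pd.DataFrame({'Horsepower': [130.0, 165.0, 150.0, 98.0],
                   'Weight': [3504.0, 3693.0, 3436.0, 2130.0]})

# Z-score standardization: each column ends up with mean ~0 and
# standard deviation ~1, not values squeezed into the 0-1 range.
stats = df.describe().transpose()
normed = (df - stats['mean']) / stats['std']
print(normed.mean())  # ~0 per column
print(normed.std())   # ~1 per column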

genoalam
Author

I'm starting with TensorFlow and the video is amazing. Thanks a lot!

aragorngamer
Author

Thanks. It's nice to just explain it quickly and let people go back into the code and review. I'll check whether you have a classification model as well

CardioCareWellness
Author

This was so helpful in many ways in my studies. Thank you very much for the easy-to-understand explanations

akilaj
Author

A minor correction: what we did here during preprocessing is standardization (in terms of z-score), not normalization (min-max scaling). Great video though!
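
For reference, a tiny sketch contrasting the two transforms on illustrative data:

import numpy as np

x = np.array([10.0, 20.0, 30.0, 40.0])

# Min-max scaling ("normalization"): squeezes values into [0, 1].
min_max = (x - x.min()) / (x.max() - x.min())

# Z-score scaling ("standardization"): mean 0, standard deviation 1.
z_score = (x - x.mean()) / x.std()

print(min_max)                         # [0.  0.33...  0.66...  1.]
print(z_score.mean(), z_score.std())   # ~0.0 and 1.0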

anuragsharma
Author

Splendid, clear lesson. I love the delicate way he conveys the ideas through the Python instructions. Love it!

anthonyfernandezgonzalez
Author

I started out learning TensorFlow and ended up looking at 1970s car videos. How did that happen?

notarealhandle
Author

This is amazing. Thank you for sharing!

haseebtubing
Author

At 9:30, overfitting: at smaller sample sizes, e.g. via sampling of the total set, least-squares coefficients could be used as reference estimates (b = inv(X'X)X'y). At 10:40, why not plot the prediction errors to find outliers? And at 10:54, why not explore the residuals? Deviations from the normal distribution usually generate interesting information about the sample, information that can be used for model improvement.
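
A rough sketch of both suggestions, on synthetic data rather than the video's dataset (X, y, and the coefficients are made up for illustration):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=(200, 3))])  # intercept + 3 features
y = X @ np.array([1.0, 2.0, -1.0, 0.5]) + rng.normal(scale=0.3, size=200)

# Least-squares reference estimates via the normal equations: b = inv(X'X) X'y
b = np.linalg.solve(X.T @ X, X.T @ y)

# Residuals: strong deviations from a roughly normal distribution can point
# at outliers or at structure the model is missing.
residuals = y - X @ b
plt.hist(residuals, bins=20)
plt.title('Residuals')
plt.show()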

tmusic
Author

Hi, thank you for this video.
Please, why have you chosen 64 for the Dense layers?
Thank you
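
For context, a minimal sketch of this kind of model; 64 units is just a reasonable default width for a small tabular network, and the exact layer sizes below are an assumption rather than the notebook verbatim:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

def build_model(num_features):
    # Two hidden layers of 64 units each is a common starting point for a
    # small regression problem; wider or deeper is not automatically better.
    model = keras.Sequential([
        keras.Input(shape=(num_features,)),
        layers.Dense(64, activation='relu'),
        layers.Dense(64, activation='relu'),
        layers.Dense(1)  # single output: the predicted MPG
    ])
    model.compile(loss='mse',
                  optimizer=tf.keras.optimizers.RMSprop(0.001),
                  metrics=['mae', 'mse'])
    return model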

aomo
Author

Requesting videos on NLP using TF and deep learning. Thank you so much for sharing such wonderful videos.

SanataniAryavrat
Author

Great video!
Two questions:
- Where do you define that you're predicting MPG and not something else?
- My PrintDot function is not recognized; I get a NameError every time I try to run the code (I'm sure there are no typos). Does anyone know what the problem could be?
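
On both questions, a hedged sketch (variable names such as train_dataset are assumptions about the notebook): the target is whatever column gets popped off as the labels, and PrintDot has to be defined, with its cell executed, before model.fit references it, otherwise Python raises a NameError.

import tensorflow as tf

# The model predicts MPG because that column is split off as the label:
train_labels = train_dataset.pop('MPG')
test_labels = test_dataset.pop('MPG')

# Define PrintDot before passing it to model.fit; referencing an undefined
# name is exactly what triggers a NameError.
class PrintDot(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        if epoch % 100 == 0:
            print('')
        print('.', end='')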

pieterherings
Author

I appreciate the tutorial; it is great! I have a question just to make sure: is the percentage of data used 64 : 16 : 20 (training : validation : test), since the validation split is 0.2, i.e. 20% of the training data (which is 80% of the total data)? Thanks in advance :)
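
A tiny sketch of that arithmetic (the 80/20 split fractions are taken from the comment, not re-checked against the notebook):

train_frac = 0.8                  # 80% train / 20% test split of the full data
test_frac = 1.0 - train_frac
val_frac = train_frac * 0.2       # validation_split=0.2 of the training part
effective_train = train_frac - val_frac

print(round(effective_train, 2), round(val_frac, 2), round(test_frac, 2))
# -> 0.64 0.16 0.2, i.e. 64 : 16 : 20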

DimasAnggaFM
Author

What do I have to write in order to get the model to estimate the MPG of a single car, i.e. a single row? Like throwing in cylinders, weight, etc. and getting out the MPG of my Bugatti Veyron. :D

Thanks in advance, Vito.
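
A hedged sketch of single-row prediction; it assumes the notebook's trained model, train_dataset, and train_stats objects exist, and the column names and values below are illustrative:

import pandas as pd

# One car described with the same feature columns the model was trained on.
single_car = pd.DataFrame([{
    'Cylinders': 8, 'Displacement': 350.0, 'Horsepower': 165.0,
    'Weight': 3693.0, 'Acceleration': 11.5, 'Model Year': 70,
    'USA': 1, 'Europe': 0, 'Japan': 0
}])

# Match the training columns and order, reuse the training normalization,
# then predict.
single_car = single_car[train_dataset.columns]
normed_single = (single_car - train_stats['mean']) / train_stats['std']
predicted_mpg = model.predict(normed_single.values)
print(predicted_mpg[0][0])  # estimated MPG for this one row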

vitotonello
Author

Perfect! This tutorial helped me upgrade my old TensorFlow code to use Keras. Good tip about normalizing, too. That said, for some reason the early_stop callback stops training after about 10 epochs, which is way too early!
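
On the early_stop point: a larger patience value usually fixes premature stopping; a minimal sketch, with the monitored metric and the numbers being assumptions:

import tensorflow as tf

# Stop only after the validation loss has failed to improve for 50 epochs;
# a small patience (the default is 0) stops at the first plateau it sees.
early_stop = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=50)

# history = model.fit(normed_train_data, train_labels, epochs=1000,
#                     validation_split=0.2, callbacks=[early_stop])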

tripzero