Online Summer Training in Machine Learning and Data Science with Python | Class-8

Agenda-
Concepts: Gradient descent, Cost function, Mean square error, Learning rate
Save Model Using Joblib and Pickle
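A minimal sketch of the save/load step from the agenda, assuming scikit-learn and joblib are installed; the toy data and filenames are illustrative, not from the session:

```python
import pickle
import joblib
from sklearn.linear_model import LinearRegression

# Train a simple linear regression model on toy data (y = 2x exactly)
X = [[1], [2], [3], [4]]
y = [2, 4, 6, 8]
model = LinearRegression().fit(X, y)

# Save with pickle (Python's standard serialization module)
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# Save with joblib (often preferred for models holding large NumPy arrays)
joblib.dump(model, "model.joblib")

# Load a saved copy back and use it for prediction
with open("model.pkl", "rb") as f:
    loaded = pickle.load(f)
print(loaded.predict([[5]]))  # a value very close to 10
```

Either library round-trips the trained model; joblib is the common choice for scikit-learn estimators, while pickle works for arbitrary Python objects.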

Reference Lectures for this session:-

Attendance Rules:

1. Write a Session Summary below the YouTube Video after Every Class.

2. Solve the assignment after Every Class on:

3. Solve the Given Task and Share it on your LinkedIn Profile after every class.

Do attendance formalities with your Registration IDs.

****** Attendance Rules are Compulsory for Summer Training Certification.
#LinearRegression #MachineLearning #PythonOnlineTraining #PythonTraining #datasciencetraining

Comments

loved your explanation mam thank you so much...

manognanalluri

GO_STP_5968:
Today we saw the basic math behind linear regression. We discussed gradient descent and looked at the mean square error (the cost function). We wrote code and saw how it works. Lastly, we covered how to save our models using the joblib and pickle library functions.

spandanpal

Reg ID : GO_STP_6113
In today's session we discussed:
Gradient descent, the math behind the algorithm, the cost function, and how to save the trained model using pickle and joblib...

sivaprakash

Swati Priya
GO_STP_9067
Today we learnt the concepts of Gradient descent, Cost function, Mean square error, and Learning rate; saving the model using Joblib and Pickle, loading the model & prediction.

swatipriya

GO_STP_1600
This class was about gradient descent, mean squared error, the cost function, and how to save the implemented model using pickle and joblib.

santhoshmuruganantham

GO-STO-3383:

In this session we learnt about gradient descent, the cost function, mean square error, the learning rate, and saving a model using joblib and pickle.

NitishKumar-oshr

GO_STP_8298

SUMMARY

Gradient descent is an iterative optimization algorithm used in machine learning to minimize a loss function. The loss function describes how well the model performs given the current set of parameters (weights and biases), and gradient descent is used to find the best set of parameters. We use gradient descent to update the parameters of our model. The primary setup for learning neural networks is to define a cost function (also known as a loss function) that measures how well the network predicts outputs on the training set; the goal is then to find a set of weights and biases that minimizes the cost. With a large learning rate, we can cover more ground each step, but we risk overshooting the lowest point since the slope of the hill is constantly changing. With a very low learning rate, we can confidently move in the direction of the negative gradient, since we are recalculating it so frequently. A low learning rate is more precise, but taking so many small steps means it will take us a very long time to get to the bottom.
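The ideas in this summary can be sketched in a few lines of plain Python; the toy data, learning rate, and iteration count below are illustrative, not from the session:

```python
# Gradient descent for simple linear regression y = m*x + b,
# minimizing the mean squared error (MSE) cost function.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [5.0, 7.0, 9.0, 11.0, 13.0]  # exactly y = 2x + 3

m, b = 0.0, 0.0
lr = 0.01          # learning rate: step size along the negative gradient
n = len(xs)

for _ in range(20000):
    # Partial derivatives of MSE = (1/n) * sum((y - (m*x + b))**2)
    dm = (-2 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
    db = (-2 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
    m -= lr * dm   # step against the gradient
    b -= lr * db

print(round(m, 3), round(b, 3))  # approaches 2 and 3
```

A larger `lr` would converge in fewer steps but can overshoot and diverge; a smaller one is safer but needs many more iterations, exactly the trade-off described above.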

arnimamishra

GO_STP_13390:
In this session we learnt the concepts of gradient descent, cost function, mean square error, and learning rate, and how to save the model using Joblib and Pickle.

prasadampajalapu

GO_STP_4741


Today's class summary:
The concepts of Gradient descent, Cost function, Mean square error, and Learning rate; saving the model using Joblib and Pickle, loading the model & prediction.

shekhsohel

GO_STP_5324
In today's session we learnt about the cost function, mean square error, and saving and loading a model using pickle and joblib, etc... thank you

ashi

GO_STP_6527:
Today's session covered topics including gradient descent, cost function, mean square error, learning rate, saving and loading model using joblib and pickle.

shwetachaurasia

GO_STP_12194:
In this session we learnt about the math behind the gradient descent algorithm, mean square error, and saving a model using pickle.

ashokkumartankala

GO_STP_6613

In today's session we have learnt the concepts of gradient descent, cost function, mean square error, and learning rate; saving a model using Joblib and Pickle, loading the model & prediction.

DivyaSharma-kiso

GO_STP_10212
This session is about concepts of gradient descent, cost function, mean square error, learning rate and saving model using pickle and joblib.

vidhyasagarreddykeshepally

GO_STP_7313
NANDITA SWAMI
Today we learnt about gradient descent, the cost function, mean square error, the learning rate, and model saving.

nanditaswami

Name-Renu saran
Registration ID_GO_STP_8490
Today we learnt about matplotlib line plots and the different types of plots in matplotlib: bar graph, pie chart, histogram, scatter plot, and area plot.

renusaran

GO_STP_5272

Today's class summary:

The concepts of Gradient descent, Cost function, Mean square error, and Learning rate; saving the model using Joblib and Pickle, loading the model & prediction.

krupavaramkalla

GO_STP_9672

It was an informative class 8. I have learnt lots of things, like the concepts of Gradient descent, Cost function, Mean square error, and Learning rate, and saving a model using Joblib and Pickle in Python.
Thanks for such an informative class.

BoomBoom-zrqu

GO_STP_13460
Today we learnt about Gradient descent, Cost function, Mean square error, and Learning rate, and how to save a model using Joblib and Pickle.

manteshwaripipare

GO_STP_5791
° the mathematical part of linear regression
° how to save a model using the pickle and joblib libraries

paromitadutta