Cost Function in Neural Network | Types of Cost function we use in different applications

In this video, we will see what a Cost Function is, the different types of Cost Functions used in a Neural Network, which cost function to use, and why.

We will also see the Loss Function. The loss function represents the error for a single observation, while the cost function is the average of the errors over all observations.
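The loss-vs-cost distinction can be sketched in a few lines of Python (an illustrative sketch, not code from the video; squared error is used here only as an example loss):

```python
def squared_error_loss(y_true, y_pred):
    """Loss: the error for ONE observation (squared error as an example)."""
    return (y_true - y_pred) ** 2

def cost(y_true_list, y_pred_list):
    """Cost: the AVERAGE of the per-observation losses over all observations."""
    losses = [squared_error_loss(t, p) for t, p in zip(y_true_list, y_pred_list)]
    return sum(losses) / len(losses)

print(cost([1.0, 0.0], [0.5, 0.5]))  # mean of 0.25 and 0.25 -> 0.25
```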

We will see Cost Functions for 3 different types of problems:
1.) Regression or Linear Regression in Neural Network
2.) Binary Classification in Neural Network
3.) Multi-class Classification in Neural Network
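As a rough sketch of the three cost functions (assuming the standard formulas for mean squared error, binary cross-entropy, and categorical cross-entropy; the video's exact notation may differ), where m is the number of observations:

```python
import math

def mse(y, a):
    """Regression: mean squared error, (1/m) * sum (y_i - a_i)^2."""
    m = len(y)
    return sum((yi - ai) ** 2 for yi, ai in zip(y, a)) / m

def binary_cross_entropy(y, a, eps=1e-12):
    """Binary classification: -(1/m) * sum [y*log(a) + (1-y)*log(1-a)].
    eps clips predictions away from 0 and 1 to avoid log(0)."""
    m = len(y)
    total = 0.0
    for yi, ai in zip(y, a):
        ai = min(max(ai, eps), 1 - eps)
        total += yi * math.log(ai) + (1 - yi) * math.log(1 - ai)
    return -total / m

def categorical_cross_entropy(Y, A, eps=1e-12):
    """Multi-class: -(1/m) * sum over observations of sum_k y_k * log(a_k).
    Y holds one-hot target vectors, A the predicted probability vectors."""
    m = len(Y)
    total = 0.0
    for y_row, a_row in zip(Y, A):
        for yk, ak in zip(y_row, a_row):
            total += yk * math.log(max(ak, eps))
    return -total / m
```

Note that the (1/m) averaging factor appears in all three; a perfect prediction drives each cost toward 0, while a confidently wrong prediction makes the cross-entropy costs very large (the eps clip keeps them finite).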

Timestamps:
0:00 - Agenda of the video
0:28 - What is Cost Function
1:09 - Cost Function for Regression problem in Neural Network
3:14 - Binary classification Cost Function in Neural Network
6:43 - Multi-class classification Cost Function in Neural Network
9:09 - Summary

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

This is Your Lane to Machine Learning ⭐

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖


Comments

If you found this video helpful, then hit the *_like_* button👍, and don't forget to *_subscribe_* ▶ to my channel as I upload a new Machine Learning Tutorial every week.

MachineLearningWithJay

In a few years I can't help but think your videos on machine/deep learning concepts are going to be recommended by many teachers for basic understanding. Nicely explained.

naveenkv

Thanks a lot for This Amazing Introductory Lecture 😁
Lecture - 4 Completed from This Neural Network Playlist

PrithaMajumder

Thanks for making this. I feel like there is not enough explanation of this topic on YouTube.

adnanhowlader

Splendid work. The way you explained it with examples is a tremendous help.

tiyyob

You are a good teacher. Congratulations ❤

S.aliakbar.h

Thank you for your excellent explanation

ISMAIL-dlgw

The cost function for binary classification that you mentioned is also called cross-entropy, am I right?

manikantaperumalla

Thank you very much for this video! I appreciate your effort in simplifying this. I would like to check with you about the cost for multi-class classification: shouldn't there be a (1/m) in front of the sigma to average over the observations in the training set (9:00)? I saw you include (1/m) for the average in binary classification, but I did not see you include it for multi-class classification. Thank you in advance for your response. ^^

bonpagnakann

When you summarized the formula @9:00, won't there be a (1/m) up front? That's what you explained in the handwritten part at the top.

Ivaan_reminiscence

Great chapter, really appreciate your efforts... You can become another Sal Khan. One suggestion: the animated pointer is hard on the eyes; you may want to use a normal one instead.

learnhome

My doubt is: can't we get y = [1, 0, 1, 0] if we have the predicted output a = [0.5, 0.2, 0.8, 0.3]? What is the guarantee that only one element of a is >= 0.5?

PavanKumar-hpel

Can you suggest a book that you follow for classification and its cost functions?

shahfahad

For multi-class classification, do we not add a (1-y)log(1-a) term?

harshitjuneja

Can you please provide the slides you used for these videos?

md.enamulatiq

What if the actual value is 0 and the predicted value is 1?

Will the error be infinite?

AnbuArasu-fgss

This guy teaches better than my professor.

indrayudhmondal