Bayesian Neural Network | Deep Learning

Neural networks are the backbone of deep learning. In recent years, Bayesian neural networks have been gathering a lot of attention. Here we take a whistle-stop tour of the mathematics distinguishing Bayesian neural networks from the usual neural networks.
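The core distinction the video describes can be sketched in a few lines of NumPy (illustrative only, not the video's code; the weight values are made up): a standard layer has fixed weights and gives one deterministic output, while a Bayesian layer places a distribution over each weight, so every forward pass samples fresh weights and the outputs spread out.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)  # a single 3-dimensional input

# Standard layer: one fixed weight vector -> one deterministic output.
w = np.array([0.5, -1.0, 2.0])
y_det = w @ x

# Bayesian layer: each weight is a distribution (here a Gaussian), so
# every forward pass samples fresh weights and the output varies.
w_mu = np.array([0.5, -1.0, 2.0])    # posterior means
w_sigma = np.array([0.1, 0.1, 0.1])  # posterior standard deviations
samples = [(w_mu + w_sigma * rng.normal(size=3)) @ x for _ in range(1000)]

# The sample mean is the prediction; the spread is the model's uncertainty.
print(np.mean(samples), np.std(samples))
```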
Comments

This is one of the best explanations I have seen on Bayesian neural networks. Thanks!

sibyjoseplathottam

Only two minutes in, and I can already say with confidence that this is the best explanation of B-CNN I've ever seen. Thanks a lot!

M-

One of the best and most concise descriptions of BNNs for newcomers such as myself

waleedkhan

You explained the Bayesian NN in the easiest possible way that I can think of... excellent work

arjunroyihrpa

Great explanation. Simple and to the point

myusernameis

I enjoy your videos, and I have a suggestion to improve the audio quality. You should build or buy a pop filter to put in front of the microphone to eliminate the puffing sounds. This way you can talk even closer to the microphone and the audio will improve.

softerseltzer

Best explanation ever; presenting it side by side with a standard NN shows exactly how the two differ

Must

Hi, sorry, I don't know much about what was talked about here in the video, but I found it intriguing. I will be entering college soon and am deciding on my major. Is this statistics? Or computer science? Or mathematics? It felt like a combination of all of them (all three fields overlap heavily anyway).

anantsharma

Best explanation on Bayesian neural nets

makamsidhura

Yep, I finally understood it. Thanks!

andreymanoshin

How do you backprop in Bayesian neural networks?

Trubripes
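For anyone else wondering about the backprop question above: a common answer is the reparameterization trick used in Bayes by Backprop (Blundell et al.), where each weight is written as w = mu + sigma * eps with eps ~ N(0, 1), so the randomness moves into eps and mu and sigma get ordinary gradients. A toy NumPy sketch with a single made-up one-weight model (hand-derived gradients, purely illustrative):

```python
import numpy as np

# Reparameterize: w = mu + sigma * eps, eps ~ N(0, 1). Then dL/dmu and
# dL/dsigma are plain chain-rule gradients, despite w being random.
rng = np.random.default_rng(1)
x, target = 2.0, 3.0   # one input, one regression target
mu, sigma = 0.0, 1.0   # variational parameters of the single weight
lr = 0.05

for step in range(500):
    eps = rng.normal()
    w = mu + sigma * eps       # sampled weight, differentiable in mu, sigma
    y = w * x
    dL_dw = 2.0 * (y - target) * x  # L = (y - target)**2
    mu -= lr * dL_dw                # dw/dmu = 1
    sigma -= lr * dL_dw * eps       # dw/dsigma = eps
    sigma = max(sigma, 1e-3)        # keep the std dev positive

print(mu)  # should approach target / x = 1.5
```

(A real BNN also adds a KL term between the variational posterior and the prior to this data-fit loss; it is omitted here to keep the sketch short.)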

This is really helpful. So the ensemble is kind of a simplified bayesian ?

canalfishing

Thanks a lot. Can you share some of your published papers that I can go through?

mohdata

Lovely video. One small point: KL divergence is not a distance measure! (KL (a||b) != KL (b||a)). Cheers

MLDawn
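The asymmetry the commenter points out is easy to check numerically; a small sketch with two made-up discrete distributions:

```python
import numpy as np

# KL divergence between two discrete distributions. Swapping the
# arguments gives a different value, so KL fails the symmetry axiom of
# a distance metric (it also fails the triangle inequality).
def kl(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

p = [0.9, 0.05, 0.05]
q = [1 / 3, 1 / 3, 1 / 3]
print(kl(p, q), kl(q, p))  # the two values differ
```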

Awesome video!! Nice and clear explanation! It would be perfect if the recording equipment was better👏

RangoKe

Wow this is way better than blog posts!!

phoenixUtube

Can we use them for regression problems?

SamiaToor

It's a really helpful video, thanks a lot

faatemehch

3:55 Here my years of college crept in asking: Why is it equivalent? Doesn't this assume a particular loss function?

I'm not quite sure - and perhaps the question is banal.

But thank you very much, the video is incredibly helpful!


Edit: Sorry I didn't pay attention... you mentioned that only some Loss-functions adhere to this criterion ^^

It is so satisfying to feel that those years of statistics finally pay off.

Thank you very much!

walterreuther
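The equivalence the commenter asks about can be verified in a couple of lines: for a Gaussian likelihood with fixed variance, the negative log-likelihood is an affine function of the mean squared error, so minimizing one minimizes the other (a quick numerical check with made-up data, not from the video):

```python
import numpy as np

# With fixed sigma, NLL = 0.5*log(2*pi*sigma**2) + MSE / (2*sigma**2),
# i.e. NLL and MSE differ only by constants that don't move the argmin.
rng = np.random.default_rng(7)
y_true = rng.normal(size=50)
y_pred = rng.normal(size=50)

mse = np.mean((y_true - y_pred) ** 2)
sigma = 1.0
nll = np.mean(0.5 * np.log(2 * np.pi * sigma**2)
              + (y_true - y_pred) ** 2 / (2 * sigma**2))

assert np.isclose(nll, 0.5 * np.log(2 * np.pi) + 0.5 * mse)
```

This is exactly why the equivalence holds only for loss functions that correspond to some likelihood (squared error to a Gaussian, cross-entropy to a categorical, and so on).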

Can you share an example for a BNN regression problem? I need to build a BNN with 6 inputs and 1 output. Thanks

excursion
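Regarding the regression questions above: once a BNN's posterior is trained, prediction is just Monte Carlo averaging over sampled weights. A minimal NumPy sketch for the 6-input, 1-output setup asked about (the posterior parameters here are stand-ins; in practice they come from training):

```python
import numpy as np

rng = np.random.default_rng(42)
n_in, n_samples = 6, 2000

# Stand-in posterior parameters for one linear Bayesian layer
# (hypothetical values; a trained BNN would supply these).
w_mu = rng.normal(size=n_in)
w_sigma = np.full(n_in, 0.05)
b_mu, b_sigma = 0.3, 0.05

x = rng.normal(size=n_in)  # one 6-dimensional test point

# Each forward pass samples weights from the posterior; the mean of the
# sampled outputs is the prediction, the spread is the uncertainty.
preds = []
for _ in range(n_samples):
    w = w_mu + w_sigma * rng.normal(size=n_in)
    b = b_mu + b_sigma * rng.normal()
    preds.append(w @ x + b)

preds = np.array(preds)
print("prediction:", preds.mean(), "+/-", preds.std())
```

So yes, BNNs handle regression directly; the extra payoff over a standard NN is the "+/-" error bar on each prediction.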