Generative Modeling - Normalizing Flows

In the second part of this introductory lecture I will be presenting Normalizing Flows.
Comments

wow... this is absolutely brilliant. Due to the bijective nature of the normalizing flow, you're constrained to only utilizing bijective functions (which is quite limiting indeed). However, by designing the NN structure in this way, you're able to offload parameter learning to an entire internalized NN, where the NN outputs parameters for a bijective function. Mind blown, crazy stuff!

After all this is complete the learning piece simply being MLE makes a ton of sense.

Dank je wel! (Thank you!)

MathStuff
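The coupling-layer trick described above (a small network outputs the parameters of a bijection, so the network itself never needs to be inverted) can be sketched in a few lines of numpy. The split sizes and the toy "network" below are illustrative assumptions, not the lecture's code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "network": any function of x1 works, because inversion only needs
# to re-run it on x1, never to invert it. Weights are arbitrary here.
W_s, W_t = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))

def net(x1):
    # predicts log-scale and shift for the second half of the input
    return np.tanh(x1 @ W_s), x1 @ W_t

def coupling_forward(x):
    x1, x2 = x[:2], x[2:]
    log_s, t = net(x1)
    y2 = x2 * np.exp(log_s) + t      # affine bijection on the second half
    return np.concatenate([x1, y2])  # first half passes through unchanged

def coupling_inverse(y):
    y1, y2 = y[:2], y[2:]
    log_s, t = net(y1)               # y1 == x1, so parameters are recomputable
    x2 = (y2 - t) * np.exp(-log_s)
    return np.concatenate([y1, x2])

x = rng.normal(size=4)
y = coupling_forward(x)
assert np.allclose(coupling_inverse(y), x)
```

Because `exp(log_s)` is always positive, the transform of the second half is invertible by construction, and the log-determinant of the Jacobian is simply `log_s.sum()`, which is what makes maximum-likelihood training cheap.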

This is the greatest explanation of coupling layers I've seen. Thank you

heyasmusic

This explanation is amazing... it asks for an understanding of basic concepts of linear algebra and statistics, but it is still clear enough to follow even when knowledge of these subjects is based more on intuition than on in-depth education.
Thanks a lot for this, it's really great!

svart-rav

Great intuitive explanation, thank you.
Currently taking Stanford XCS236 “Deep Generative Models”. Your video was very helpful in clarifying some of the math, particularly the role of the determinant.

alexezazi

Hello Hans, this video series is great! I really appreciate that you break down each component of the formula and try to make sense of it.

zxynj

This is an incredibly nice explanation! Thank you so much

moon_dragon

Thank you so much for such a great, clear, and easy-to-follow explanation. I like the comparison between flow-based models, GANs, and VAEs at the end of the video! Also, the math explanation is very clear :)

sehaba

Thank you so much for this video. I'd watched several videos on flows before this one, but this is where it really clicked for me. I echo @MathStuff1234, absolutely brilliant.

CalebCranney

This is the most useful lecture for getting started with normalizing flows!

tkkitk

Thanks for your generative model series!

seunghyeonjeon

Great explanation! Straight to the point and clear!

piotrkaminski

Hello, I really enjoyed the explanation. It was easy to follow and the analogy was very useful!

nathanwong

Very nice explanation for me as a data science student, thank you!

simonhradetzky

It's like today's diffusion models! Great!

jakewong

I think at 9:00, because of the chain rule, we must evaluate the Jacobians not all at x, but like this (example for two functions): Df(x) = Df_1(f_2(x)) Df_2(x).

jcamargo
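The chain-rule point above can be checked numerically. The sketch below uses an affine inner map and an elementwise cubic outer map (both are illustrative choices, not functions from the lecture), and confirms that the outer Jacobian must be evaluated at f_2(x), not at x:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3))      # Jacobian of the inner affine map f2
b = rng.normal(size=3)

f2 = lambda x: A @ x + b         # inner map: Df2(x) = A
f1 = lambda z: z**3 + z          # outer elementwise map: Df1(z) = diag(3z^2 + 1)
f = lambda x: f1(f2(x))          # composition f = f1 ∘ f2

def jac_numeric(g, x, eps=1e-6):
    # central finite-difference Jacobian of g at x
    n = x.size
    J = np.empty((g(x).size, n))
    for i in range(n):
        e = np.zeros(n); e[i] = eps
        J[:, i] = (g(x + e) - g(x - e)) / (2 * eps)
    return J

x = rng.normal(size=3)
z = f2(x)
Df1_at_z = np.diag(3 * z**2 + 1)          # evaluated at f2(x), NOT at x
chain = Df1_at_z @ A                      # Df(x) = Df1(f2(x)) Df2(x)
assert np.allclose(jac_numeric(f, x), chain, atol=1e-4)
```

For flows this matters because the log-determinant of the composed Jacobian then splits into a sum of per-layer log-determinants, each evaluated at that layer's own input.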

Hello, thanks for your explanation.
I don't understand how shuffling is an invertible function. Do you have to remember the places where you shuffled your points?

blacksages

Hi, that is an amazing lecture. Thank you so much for the video. Could you please post the lecture powerpoints?

ShuaimingJing-ueqn

Nice video again! However, I could not wrap my head around one thing: if I use random shuffling, how can it still be invertible?

praveen

Are diffusion models a specific implementation of this, or something else?

BartoszBielecki