CS480/680 Lecture 23: Normalizing flows (Priyank Jaini)

Comments

For me, this is the best lecture on the whole internet about normalizing flows. Thanks a lot!

mohannadbarakat

27:12 Autoregressive Flows (AR)
29:56 AR with Gaussian Conditionals
33:49 Masked Autoregressive Flows (MAF)
37:15 Real-NVP
40:12 Real-NVP (for real this time)
41:25 Neural Autoregressive Flows (NAF)
43:37 Sum-of-Squares Polynomial Flows (SOS)
47:35 Glow: invertible 1x1 convolutions
59:40 The catch of normalizing flows
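The chapter list above names several flow architectures. As a rough illustration of the Real-NVP idea it mentions, here is a minimal NumPy sketch of an affine coupling layer: half the input passes through unchanged and parameterizes a scale and shift for the other half, so the layer is exactly invertible and the Jacobian is triangular. The toy scale/shift functions below are made up for illustration, not taken from the lecture.

```python
import numpy as np

def coupling_forward(x, scale_net, shift_net):
    """Real-NVP-style affine coupling: y1 = x1, y2 = x2 * exp(s(x1)) + t(x1)."""
    x1, x2 = np.split(x, 2)
    s = scale_net(x1)            # log-scale, so the transform is invertible
    t = shift_net(x1)
    y2 = x2 * np.exp(s) + t
    log_det = np.sum(s)          # log|det J| of the triangular Jacobian
    return np.concatenate([x1, y2]), log_det

def coupling_inverse(y, scale_net, shift_net):
    """Exact inverse: recovers x2 from y2 because y1 == x1."""
    y1, y2 = np.split(y, 2)
    s = scale_net(y1)
    t = shift_net(y1)
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2])

# Toy "networks" (hypothetical): any functions of x1 work, since x1 is untouched.
scale_net = lambda h: np.tanh(h)   # keep log-scales bounded
shift_net = lambda h: 2.0 * h

x = np.array([0.5, -1.0, 2.0, 0.1])
y, log_det = coupling_forward(x, scale_net, shift_net)
x_rec = coupling_inverse(y, scale_net, shift_net)
print(np.allclose(x, x_rec))  # True: the coupling layer is exactly invertible
```

The cheap, closed-form log-determinant is the point: it makes the change-of-variables likelihood tractable without computing a full Jacobian.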

maurocamaraescudero

(3:38) "of course, this does not apply to Keanu Reeves" :D

mananlalit

What a great lecture! Thank you for publishing this.

josephedappully

It is a good explanatory lecture. Thank you.

aniwatpha

Thank you very much for providing this great lecture. Just a note: in the slide title, it is "conservation of probability mass", not "conversation of.."

lorenzoservadei

Very good explanation. Where can I get the slides for this lecture?

sonalgarg

How do you tell what the latent space encodes?

matthewpublikum

Can anyone tell me where to start: from a Gaussian distribution or the x distribution?

Rajkumarhz

On this channel this video has the highest views, and I am pretty sure this is because of the Indian name (Priyank Jaini).
India has totally overflowed in the number of engineers it produces every year.

umairalvi

Not even a chuckle for the Keanu Reeves reference!!

brycejohnson