Generative Adversarial Networks

A lecture on Generative Adversarial Networks (GANs). Topics include generative modeling, latent spaces, semantically meaningful arithmetic in latent space, the minimax optimization formulation of GANs and its theory, the Earth mover distance, Wasserstein GANs, and the challenges of training GANs.
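As a small companion to the Earth mover distance portion of the lecture: in one dimension, the Wasserstein-1 (Earth mover) distance between two equally sized, uniformly weighted samples reduces to the mean absolute difference between their sorted values, since sorting gives the optimal transport pairing in 1-D. A minimal sketch (the function name `wasserstein_1d` is illustrative, not from the lecture):

```python
def wasserstein_1d(u, v):
    """Earth mover (Wasserstein-1) distance between two 1-D samples of
    equal size with uniform weights: pair the sorted values and average
    the absolute differences (the optimal transport plan in 1-D)."""
    assert len(u) == len(v), "this sketch assumes equally sized samples"
    return sum(abs(a - b) for a, b in zip(sorted(u), sorted(v))) / len(u)

# Moving uniform mass at {0, 1, 3} onto {5, 6, 8} costs 5 units on average.
print(wasserstein_1d([0, 1, 3], [5, 6, 8]))  # → 5.0
```

For samples with arbitrary weights, `scipy.stats.wasserstein_distance` computes the same quantity in the general 1-D case.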

This lecture is from Northeastern University's CS 7150 Summer 2020 class on Deep Learning, taught by Paul Hand.

References:

Goodfellow et al. 2014:

Goodfellow, Ian, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, and Yoshua Bengio. "Generative adversarial nets." In Advances in neural information processing systems, pp. 2672-2680. 2014.

Radford et al. 2016:

Radford, Alec, Luke Metz, and Soumith Chintala. "Unsupervised representation learning with deep convolutional generative adversarial networks." arXiv preprint arXiv:1511.06434 (2015).

Ulyanov et al. 2018:

Ulyanov, Dmitry, Andrea Vedaldi, and Victor Lempitsky. "It takes (only) two: Adversarial generator-encoder networks." In Thirty-Second AAAI Conference on Artificial Intelligence. 2018.

Karras et al. 2018:

Karras, Tero, Timo Aila, Samuli Laine, and Jaakko Lehtinen. "Progressive growing of GANs for improved quality, stability, and variation." arXiv preprint arXiv:1710.10196 (2017).

Lucas et al. 2018:

Lucas, Alice, Michael Iliadis, Rafael Molina, and Aggelos K. Katsaggelos. "Using deep neural networks for inverse problems in imaging: beyond analytical methods." IEEE Signal Processing Magazine 35, no. 1 (2018): 20-36.

Arjovsky et al. 2017:

Arjovsky, Martin, Soumith Chintala, and Léon Bottou. "Wasserstein GAN." arXiv preprint arXiv:1701.07875 (2017).

Park et al. 2020:

Park, Sung-Wook, Jun-Ho Huh, and Jong-Chan Kim. "BEGAN v3: Avoiding Mode Collapse in GANs Using Variational Inference." Electronics 9, no. 4 (2020): 688.
Comments

bluestar: I have watched many YouTube videos on GANs, but this is by far one of the best at explaining them. Thank you and keep up the good work!

ayankashyap: Loved this video, thanks a lot. Will patiently wait for the next one :)

robwasab: Loved your video on VAEs, and I really like this one for vanilla GANs, but I couldn't hang in there with the math for the Wasserstein GAN.

steffen: Very interesting. Thanks a lot for making these videos. I think I will need to rewatch this to fully understand everything, but I already learned a lot!

alexandramalyugina: Love your channel, Paul. You should make more videos; you're really great at explaining things! Can't wait for the next one!

ahmedkotb: Please explain attention models with neural networks and transformers; your explanations are very good ❤️