Lesson 7: Practical Deep Learning for Coders

EXOTIC CNN ARCHITECTURES; RNN FROM SCRATCH

This is the last lesson of part 1 of Practical Deep Learning For Coders! This lesson is in two parts:

1) We look at a range of more ‘exotic’ CNN architectures to learn how to deal with multiple inputs (such as incorporating metadata), multiple outputs (including predicting bounding boxes for localization), creating heatmaps, and handling larger images. These architectures all draw on Keras’ functional API, which we learnt about in the last lesson, so be sure to study up on that powerful tool before diving into this lesson; a minimal sketch of such a multi-input, multi-output model follows this list.
2) We build a recurrent neural network from scratch in pure Python/NumPy, including implementing the gradient calculations and SGD updates ourselves (see the second sketch after this list). Then we build a gated recurrent unit (GRU) in Theano.
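The sketch below illustrates the kind of multi-input, multi-output model the functional API makes possible: an image plus a metadata vector going in, and a class label plus a bounding box coming out. It is not the lesson's actual architecture; the lesson used Keras 1.x on the Theano backend, whereas this sketch uses tensorflow.keras purely for illustration, and every shape, layer size, and name is an illustrative assumption.

```python
# A minimal multi-input, multi-output functional-API sketch (not the lesson's model).
from tensorflow.keras import layers, Model

image_in = layers.Input(shape=(224, 224, 3), name="image")   # the image itself
meta_in = layers.Input(shape=(4,), name="metadata")          # extra per-sample metadata

# A small convolutional trunk for the image input.
x = layers.Conv2D(32, 3, activation="relu")(image_in)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(64, 3, activation="relu")(x)
x = layers.GlobalAveragePooling2D()(x)

# Merge image features with the metadata vector.
merged = layers.concatenate([x, meta_in])
merged = layers.Dense(128, activation="relu")(merged)

# Two heads: a class prediction and a bounding box (x, y, w, h) for localization.
class_out = layers.Dense(10, activation="softmax", name="label")(merged)
bbox_out = layers.Dense(4, name="bbox")(merged)

model = Model(inputs=[image_in, meta_in], outputs=[class_out, bbox_out])
model.compile(optimizer="adam",
              loss={"label": "categorical_crossentropy", "bbox": "mse"},
              loss_weights={"label": 1.0, "bbox": 0.001})
model.summary()
```

Weighting the two losses (here the regression loss is scaled down, an illustrative choice) is a common way to stop one output from dominating training in a multi-output model.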
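The second sketch shows the flavour of the "RNN from scratch" part: a vanilla RNN whose forward pass, backpropagation-through-time gradients, and SGD updates are all written by hand in NumPy. This is not the lesson's notebook; the toy task (predicting the next token of a repeating sequence), the shapes, and the hyperparameters are illustrative assumptions.

```python
# A minimal vanilla-RNN sketch with hand-written gradients and plain SGD.
import numpy as np

rng = np.random.default_rng(0)
vocab, hidden = 8, 16                         # vocabulary and hidden sizes
Wxh = rng.normal(0, 0.01, (hidden, vocab))    # input  -> hidden weights
Whh = rng.normal(0, 0.01, (hidden, hidden))   # hidden -> hidden weights
Why = rng.normal(0, 0.01, (vocab, hidden))    # hidden -> output weights
bh, by = np.zeros(hidden), np.zeros(vocab)    # biases

def step(inputs, targets, h_prev):
    """One forward + backward pass over a sequence of token ids."""
    xs, hs, ps = {}, {-1: h_prev}, {}
    loss = 0.0
    # Forward: tanh recurrence, softmax output, cross-entropy loss.
    for t, (i, tgt) in enumerate(zip(inputs, targets)):
        xs[t] = np.zeros(vocab)
        xs[t][i] = 1.0                         # one-hot encode the input token
        hs[t] = np.tanh(Wxh @ xs[t] + Whh @ hs[t - 1] + bh)
        logits = Why @ hs[t] + by
        ps[t] = np.exp(logits - logits.max())
        ps[t] /= ps[t].sum()
        loss -= np.log(ps[t][tgt])
    # Backward: accumulate gradients through time.
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dbh, dby, dh_next = np.zeros_like(bh), np.zeros_like(by), np.zeros(hidden)
    for t in reversed(range(len(inputs))):
        dy = ps[t].copy()
        dy[targets[t]] -= 1.0                  # d(loss)/d(logits) for softmax + CE
        dWhy += np.outer(dy, hs[t]); dby += dy
        dh = Why.T @ dy + dh_next              # gradient flowing into h_t
        dh_raw = (1.0 - hs[t] ** 2) * dh       # back through tanh
        dWxh += np.outer(dh_raw, xs[t])
        dWhh += np.outer(dh_raw, hs[t - 1])
        dbh += dh_raw
        dh_next = Whh.T @ dh_raw
    return loss, (dWxh, dWhh, dWhy, dbh, dby), hs[len(inputs) - 1]

# Plain SGD on a toy repeating sequence: predict the next token.
seq = [0, 1, 2, 3, 4, 5, 6, 7] * 4
h = np.zeros(hidden)
for epoch in range(200):
    loss, grads, h = step(seq[:-1], seq[1:], h)
    for param, grad in zip((Wxh, Whh, Why, bh, by), grads):
        param -= 0.1 * np.clip(grad, -5, 5)    # clipped gradients, lr = 0.1
print(f"final loss: {loss:.3f}")
```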

These more advanced topics are designed as a stepping stone towards part 2 of the course, which will be taught at the Data Institute at USF from Feb 27, 2017, and will be online sometime around May 2017. Getting a complete understanding of this week’s lesson may take more study than some previous weeks; indeed, we will be revising some of this week’s material and discussing it in more depth in the next part of the course. In the meantime, we hope that you’ve got plenty to keep you busy! Many thanks for investing your time in this course!

If you’ve got something out of it, perhaps you could give back by contributing to the Fred Hollows Foundation? As we discussed in the class, just $25 is enough to restore sight to one eye. Since we’ve learnt to create software that can see, let’s help more people see too. :)
Comments

Jeremy, you are the best teacher ever and a role model. I wish that someday I am able to organize the thoughts in my head with this much clarity and explain them to people in such a simplified (yet detailed) manner. I also wish that someday I become a successful data scientist like you and make a huge impact on people's lives.
All the best for part two. Looking forward to joining it.

singlasahil

In the last couple of months I went through 7 full DL courses on Coursera, Udemy, etc. and read ~11 DL-related books, and nothing was even close to your series. Absolutely stunning course, really well explained. I loved the Excel spreadsheet exercises. I'll watch the remaining lessons very soon. Einstein said: "If you can't explain it simply, you don't understand it well enough." You, sir, have nailed it. Greetings from the Bay Area!

greko

Just finished Lectures 1-7. Fantastic course! And thank you so much for your generosity in creating and sharing this content!!

RamaRamakrishnan

How do I clap on this YouTube thing? Great job, guys! I can't wait for the next series.

big_whopper

This is an incredible course. I have learnt so much by working through these videos.

rajatiitian

Wonderful series. Thank you for making it available.

carinmeier

This was an excellent lecture series! Thank you so much for doing this.

jamescope

Thank you very much, Jeremy. The series was more than excellent; it was outstanding. I have never learnt something that interesting. Someone mentions Game of Thrones below, but for me it is more interesting to watch your lectures than to watch Game of Thrones.

tnaduc

Done, thank you. I'm looking forward to watching the second part.

IRSOG

Really great series. I learned quite a lot. Will part 2 be freely available?

DSWithSanjaya

If and when you experiment with ResNet for transfer learning, can you please update us on the results? Or perhaps write a blog post with some instructions?

masterdon

Jeremy,
The Fred Hollows Foundation website does not process payments from the US - it requires a suburb name. Is there another way to make a donation?

MrNikosido

58:15 - 59:18, especially 58:45 - 59:02

williamchamberlain

Is the sound a bit echoey and funny with this one, or is it just me?

zhubarb