Deep Learning Tutorial with Python | Machine Learning with Neural Networks [Top Udemy Instructor]

In this video, Deep Learning Tutorial with Python | Machine Learning with Neural Networks Explained, Udemy instructor Frank Kane helps demystify the world of deep learning and artificial neural networks with Python!

In less than 3 hours, you can understand the theory behind modern artificial intelligence, and apply it with several hands-on examples. This is machine learning on steroids! Find out why everyone’s so excited about it and how it really works – and what modern AI can and cannot really do.

In this course, we will cover:

• Deep Learning Prerequisites (gradient descent, autodiff, softmax)
• The History of Artificial Neural Networks
• Deep Learning in the TensorFlow Playground
• Deep Learning Details
• Introducing TensorFlow
• Using TensorFlow
• Introducing Keras
• Using Keras to Predict Political Parties
• Convolutional Neural Networks (CNNs)
• Using CNNs for Handwriting Recognition
• Recurrent Neural Networks (RNNs)
• Using an RNN for Sentiment Analysis
• The Ethics of Deep Learning
• Learning More about Deep Learning

At the end, you will face a final challenge: build your own deep learning / machine learning system that predicts whether real mammogram results are benign or malignant, using an artificial neural network you have learned to code from scratch with Python.
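As a rough preview, here is a minimal sketch of the kind of binary classifier that final challenge calls for, assuming TensorFlow/Keras is installed. The random arrays are placeholders standing in for the real, cleaned mammogram data; this is not the course's solution.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

X = np.random.rand(200, 4).astype('float32')   # placeholder feature matrix (synthetic)
y = np.random.randint(0, 2, size=(200,))       # placeholder labels: 0 = benign, 1 = malignant

model = Sequential([
    Dense(16, activation='relu', input_shape=(X.shape[1],)),  # small hidden layers
    Dense(16, activation='relu'),
    Dense(1, activation='sigmoid'),                           # probability of malignancy
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X, y, epochs=20, batch_size=32, validation_split=0.2)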

Separate the reality of modern AI from the hype – by learning about deep learning, well, deeply. You will need some familiarity with Python and linear algebra to follow along, but if you have that experience, you will find that neural networks are not as complicated as they sound. And how they actually work is quite elegant!

This is a hands-on tutorial with real code you can download, study, and run yourself.

#Udemy
#ITeachOnUdemy

Share your story with #BeAble
Comments

I have a doubt. Isn't the formula for softmax e^x / sum(e^x) over each value of x? And isn't the formula shown in the video (9:30) for softmax, 1 / (1 + e^-x), actually the sigmoid?
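For reference, those are the standard definitions, and the difference is easy to check with a few lines of NumPy (this snippet is illustrative, not taken from the video's code):

import numpy as np

def sigmoid(x):
    # squashes each score independently into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # normalizes a whole vector of scores so the outputs sum to 1
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])
print(sigmoid(scores))   # element-wise; does not sum to 1
print(softmax(scores))   # sums to 1; usable as class probabilities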

ramsp

Completed the whole tutorial, exercises, and project. Thanks for putting this up on YouTube. This has given me the little push I needed to go further into deep learning.

naveed

Thanks for the video. My first journey into ML.
Btw: for the spiral model, changing the activation to "ReLU" gives a good spiral fit with just 3 hidden layers of 6, 4, and 2 neurons :). The examples are a great, fun way to learn.
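For anyone who wants to try that configuration outside the browser, here is a rough Keras equivalent of the Playground setup; the synthetic two-spiral data below is an approximation, not the Playground's exact dataset.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Generate two interleaved spirals as a stand-in for the Playground's spiral data.
rng = np.random.default_rng(0)
n = 500
theta = np.sqrt(rng.random(n)) * 3 * np.pi
spiral_a = np.column_stack([theta * np.cos(theta), theta * np.sin(theta)]) + rng.normal(0, 0.3, (n, 2))
spiral_b = -np.column_stack([theta * np.cos(theta), theta * np.sin(theta)]) + rng.normal(0, 0.3, (n, 2))
X = np.vstack([spiral_a, spiral_b]).astype('float32')
y = np.concatenate([np.zeros(n), np.ones(n)])

# ReLU activations with hidden layers of 6, 4, and 2 units, as the comment suggests.
model = Sequential([
    Dense(6, activation='relu', input_shape=(2,)),
    Dense(4, activation='relu'),
    Dense(2, activation='relu'),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X, y, epochs=200, verbose=0)
print(model.evaluate(X, y, verbose=0))   # [loss, accuracy] on the training data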

solomonfekaduhailu

57:35 Using PyCharm 2019 CE, and the tf.Session() commands aren't working. I've tried almost everything and have been attempting to get this to work for half the day. Is there something I'm missing? Any help would be great.
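This usually means TensorFlow 2.x is installed: the TF1-style session API used in the video moved under tf.compat.v1. A commonly used workaround, assuming TF 2.x, looks like this:

import tensorflow as tf

tf.compat.v1.disable_eager_execution()   # restore TF1-style graph execution

a = tf.constant(2)
b = tf.constant(3)
with tf.compat.v1.Session() as sess:     # instead of tf.Session()
    print(sess.run(a + b))               # prints 5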

drakejohnson

Awesome video, thanks ☺️ Could you please share how to realize a communication system with deep learning?

ofijanabu

On the spiral pattern, why even consider a NN? A simple k-NN with 3-5 closest neighbors would have done an amazing job on this problem. No need to force a NN. Amazing video btw! You are a rock star!
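For comparison, a k-NN baseline along those lines takes only a few lines of scikit-learn; the two-moons dataset here is a simple stand-in for the Playground spiral, not the same data.

from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic two-class, non-linear data as a stand-in for the spiral pattern.
X, y = make_moons(n_samples=1000, noise=0.2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

knn = KNeighborsClassifier(n_neighbors=5)   # 3-5 neighbors, as the comment suggests
knn.fit(X_train, y_train)
print("test accuracy:", knn.score(X_test, y_test))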

YiannisPi

Amazing content, I really learnt a lot, thanks Frank. Please keep producing such in-depth material.

egnatious

Thanks for this video. Much appreciated. Just wondering whether an RNN is the only neural network option for stock prediction. Please advise me on this point.

satheeshkrishnankannaiyan

'Gradient descent' has a pre-computer history: it was conceived as part of linear programming's maxima-minima methods way back before computers.

frankx

Awesome video! Just when I was thinking of learning CNNs and RNNs.
Please can you make a video on Graph Neural Networks? Thank you.

planet

What version of TensorFlow is he using? I'm having lots of trouble with the setup.
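If you are unsure which version you have installed locally, you can check it directly; this works for both the 1.x and 2.x lines:

import tensorflow as tf
print(tf.__version__)   # e.g. '1.15.0' or '2.4.1'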

alexfagan

I really like the advice, sir Frank!!
Good effort... thanks.
My notebook says TensorFlow has no attribute named global_variables_initializer.

robinbartsch

The spooky part mentioned in the RNN explanation is because this course doesn't explain in depth how the algorithms work. So it seems like magic, but it isn't. It's good for getting your feet wet in ML, but it feels like after this course I will simply use pre-made models and not be completely sure why they work, just that they do. Either way, thanks, this helped a bit!

miicro

It would be good to update the parts specific to TensorFlow v1 to v2. Or maybe, instead of redoing the video, provide v2 code versions for the appropriate examples.

michaelsalmon

I think you made a mistake when you use Keras to solve the house-votes-84 dataset problem.
Your code (at 1:42:37):
all_features =
But feature_names already contains the 'party' column, and you just use the 'party' column as the label below.
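A sketch of the fix this comment points at, assuming the data has been read into a pandas DataFrame; the tiny DataFrame and the variable names here (df, all_features, all_classes) are hypothetical stand-ins, not the video's actual code. The point is to keep the label column out of the feature list:

import pandas as pd

# Hypothetical stand-in for the house-votes-84 data.
df = pd.DataFrame({
    'party':  ['democrat', 'republican', 'democrat'],
    'vote_1': [1, 0, 1],
    'vote_2': [0, 1, 1],
})

feature_names = [c for c in df.columns if c != 'party']   # exclude the label column
all_features = df[feature_names].values                   # features only
all_classes = df['party'].values                          # 'party' used only as the label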

ianyang

You have a wonderful, relaxing voice tone which I think is perfect for this kind of video. Great job, sir.

Skandawin

Very well done, and thank you for sharing. I liked the mix of morals and tech; it's important, and I appreciated that you brought it into the video.

StokkerGold

Free idea for a deep learning project: Make early phonograph recordings clear and steady so that they sound almost like modern recordings.

frankx

My notebook says TensorFlow has no attribute named global_variables_initializer.
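This is the same TF1-vs-TF2 mismatch mentioned in earlier comments: in TensorFlow 2.x the initializer lives under the compat module. A minimal sketch, assuming TF 2.x is installed:

import tensorflow as tf

tf.compat.v1.disable_eager_execution()
w = tf.compat.v1.get_variable("w", shape=[2], initializer=tf.compat.v1.zeros_initializer())
init = tf.compat.v1.global_variables_initializer()   # instead of tf.global_variables_initializer()
with tf.compat.v1.Session() as sess:
    sess.run(init)
    print(sess.run(w))   # [0. 0.]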

arnabpersonal

Note: you can add more features instead of more neurons to get a faster and better result!

mel_cosentino