Let's Unfold RNN | Recurrent neural network explained | Recurrent neural network complete tutorial

Let's Unfold RNN | Recurrent neural network explained | Recurrent neural network tutorial
#RNN #machinelearning #datascience #unfolddatascience
Hello,
My name is Aman and I am a Data Scientist.

Follow on Instagram: unfold_data_science

Topics for the video:
recurrent neural network,
recurrent neural network for time series prediction,
recurrent neural network explained,
recurrent neural network tutorial,
recurrent neural network python,
recurrent neural network pytorch,
recurrent neural network example,
rnn deep learning,
rnn example,
rnn explained,
rnn lstm,
rnn architecture,

About Unfold Data Science: This channel helps people understand the basics of data science through simple examples, in an easy way. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos are not very technical in nature and hence can be easily grasped by viewers from different backgrounds.

Book recommendation for Data Science:

Category 1 - Must Read For Every Data Scientist:

Category 2 - Overall Data Science:

Category 3 - Statistics and Mathematics:

Category 4 - Machine Learning:

Category 5 - Programming:

My Studio Setup:

Join the Facebook group:

Follow on Twitter: @unfoldds

Watch python for data science playlist here:

Watch the statistics and mathematics playlist here:

Watch End to End Implementation of a simple machine-learning model in Python here:

Learn Ensemble Model, Bagging, and Boosting here:

Build Career in Data Science Playlist:

Artificial Neural Network and Deep Learning Playlist:

Natural language Processing playlist:

Understanding and building a recommendation system:

Access all my codes here:

Comments

Amazing explanation, boss. The way you have peeled the concept! After listening to 5 videos from experts, I could finally understand the concept.

yosupa

I referred to so many videos regarding RNN, but only yours is clear and in depth. True mentor. I salute you, sir.

suganyaramu

Watched numerous videos to get this type of knowledge.

SINDAVALAMBASAVA

Very well explained. Thanks for your hard work in understanding the concepts well, practicing, and explaining them to all of us.

nagarajtrivedi

Excellently presented. Even StatQuest couldn't teach it better. Kudos!!!

jsridhar

Amazing explanation. I am really surprised there are so few likes here!!! Please keep it up.

piusranjan

You're so good, Aman, and you cover the basics. Thank you very much for sharing these videos.

SurendraY-gk

Very nice Aman sir... Thank you for your help...

swathiangamuthu

Wonderful way of starting the video.

deepakdodeja

Sir, good morning. It's an excellent contribution for all categories of people related to this field. It's wonderful, thanks a lot. Just a small doubt, sir: in this video you mentioned adding the bias term (at 17:25) in the formula. My doubt is, where in the network is the bias added, at the end or at the O2 level? Please clear my doubt. Thank you once again.

PratapaReddyYakkaluri
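For anyone with the same doubt about the bias: in the standard vanilla-RNN formulation (an assumption here; the video's exact notation may differ), there are usually two bias terms, one added inside the hidden-state update at every time step and one added at the output node (the O2 level in the question). A minimal NumPy sketch of a single time step, with illustrative sizes and variable names of my own choosing:

import numpy as np

# Illustrative sizes only.
n_in, n_hidden, n_out = 3, 4, 2

x_t    = np.random.randn(n_in)            # input at time step t
h_prev = np.zeros(n_hidden)               # hidden state from time step t-1
W_xh   = np.random.randn(n_hidden, n_in)
W_hh   = np.random.randn(n_hidden, n_hidden)
W_hy   = np.random.randn(n_out, n_hidden)
b_h    = np.zeros(n_hidden)               # hidden bias: added inside the recurrence
b_y    = np.zeros(n_out)                  # output bias: added at the output layer

h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)   # b_h enters here, at every time step
y_t = W_hy @ h_t + b_y                            # b_y enters here, at the output node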

Sir, please help: the part at 14:45 where the output of the recurrent node is passed to the next time step of the same node BUT ALSO passed to the other nodes in the hidden layer. I didn't understand that part; please explain to me intuitively/mathematically how it works ❤

SelfBuiltWealth
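On the 14:45 question above: in a vanilla RNN the whole hidden vector from the previous time step is multiplied by a dense hidden-to-hidden matrix, so each unit's previous output reaches every unit of the same layer (including itself) at the next time step. A hedged NumPy sketch (names and numbers are mine, not from the video):

import numpy as np

n_hidden = 3
W_hh   = np.random.randn(n_hidden, n_hidden)   # dense: every unit connects to every unit
h_prev = np.array([0.5, -0.2, 0.1])            # previous outputs of all 3 hidden units

# Recurrent contribution to unit i at step t is sum over j of W_hh[i, j] * h_prev[j],
# i.e. unit i receives the previous output of EVERY unit j, not just its own.
recurrent_input = W_hh @ h_prev
print(recurrent_input)   # one value per hidden unit, each mixing all previous outputs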

Fantastic job, Aman. Please create a video on LSTM also.

geekyprogrammer

Nice explanation, I have one question:

1) Will the data/output from neurons of the same layer be passed to another neuron in the same layer?

keshav
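On the same-layer question: within a single time step the hidden units do not feed each other directly; they interact only through the recurrent connection, because each unit's next state depends on the full previous hidden vector of that layer (see the sketch above). Across layers in a stacked RNN, the sequence of hidden states of one recurrent layer is fed as input to the next. A small Keras sketch, assuming TensorFlow/Keras and illustrative sizes:

import tensorflow as tf

# 10 time steps, 8 features per step; sizes are illustrative.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 8)),
    # return_sequences=True passes the hidden state of every time step
    # on to the next recurrent layer (layer-to-layer connection).
    tf.keras.layers.SimpleRNN(16, return_sequences=True),
    # Within each SimpleRNN layer, units share information only across
    # time steps: h_t depends on the full h_{t-1} of that same layer.
    tf.keras.layers.SimpleRNN(16),
    tf.keras.layers.Dense(1),
])
model.summary()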

Hi Aman, I need one suggestion. I need to convert xaml files to atmx files. Is it possible? How do I develop the model, which model should I use, and how do I build the dataset? Kindly guide me on this.

gopinathsrimatthirumala

Please do upload the previous video, the one that is 39 minutes long.

karanmehta

Hi Aman, thanks for the video.
There will be three weights, in common notation:
Waa for the previous word,
Wax for the input word,
Wya for the output...
Correct me if I am wrong.
Thanks. Can we expect the derivations also in the next video? 😊

tejkiran
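The notation in the comment above is a widely used convention, with the small correction that Waa multiplies the previous activation (hidden state) rather than the previous word: a_t = tanh(Waa * a_{t-1} + Wax * x_t + ba) and y_t = Wya * a_t + by. A tiny worked example in Python with made-up 1x1 weights, so the arithmetic can be checked by hand:

import numpy as np

Waa = np.array([[0.5]])   # hidden-to-hidden: multiplies the previous activation a_{t-1}
Wax = np.array([[1.0]])   # input-to-hidden: multiplies the current input x_t
Wya = np.array([[2.0]])   # hidden-to-output
ba, by = 0.1, 0.0

a_prev = np.array([0.0])
x_t    = np.array([0.3])

a_t = np.tanh(Waa @ a_prev + Wax @ x_t + ba)   # tanh(0 + 0.3 + 0.1) = tanh(0.4) ≈ 0.38
y_t = Wya @ a_t + by                           # 2 * 0.38 ≈ 0.76
print(a_t, y_t)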

Here, you didn't explain how each node of the hidden layer processes (that part we know) -> passes (to other hidden nodes of the same layer and to other layers) -> how it stores the output hidden state of each node,

how it processes the next timestamp,

and finally how the preceding part gives the dense layer the multiple hidden states, how it uses those hidden states and finally gives the output.

Also, RNN has multiple types of architecture (for many-to-one); when does the output layer come into play?

Please explain these doubts with the logic and sample code, along with a sample calculation (we want the process only, that's enough, not exact numbers).

Even if it takes a long time in a video, please upload it as a single video.

Every source on the internet gives the outline process of RNN, not the depth level. Can you please?

ClipsforQalb
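On the many-to-one question in the last comment: the recurrent layer runs over the whole sequence, a hidden state is produced (and stored) at every time step, but only the final hidden state is passed to the dense output layer, which produces the single output. A hedged NumPy sketch of the forward pass with tiny, made-up sizes and random weights (the process only, not exact numbers):

import numpy as np

np.random.seed(0)
T, n_in, n_hidden = 4, 2, 3        # 4 time steps, 2 input features per step, 3 hidden units

X    = np.random.randn(T, n_in)    # one input vector per time step
W_xh = np.random.randn(n_hidden, n_in)
W_hh = np.random.randn(n_hidden, n_hidden)
W_hy = np.random.randn(1, n_hidden)
b_h, b_y = np.zeros(n_hidden), np.zeros(1)

h = np.zeros(n_hidden)             # initial hidden state
hidden_states = []                 # a hidden state is produced at every time step
for t in range(T):
    h = np.tanh(W_xh @ X[t] + W_hh @ h + b_h)
    hidden_states.append(h)

# Many-to-one: only the LAST hidden state goes to the dense output layer.
y = W_hy @ hidden_states[-1] + b_y
print(y)                           # a single output for the whole sequence

In Keras terms this corresponds to a SimpleRNN layer with return_sequences left at its default of False, followed by a Dense(1) layer.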