Encoder Decoder | Sequence-to-Sequence Architecture | Deep Learning | CampusX

In this video, we unravel the complexities of the Encoder-Decoder architecture, focusing on its application in sequence-to-sequence tasks. Whether you're a student, developer, or tech enthusiast, join us on this learning journey as we break down the fundamentals of this powerful model.

============================
Do you want to learn from me?
============================

📱 Grow with us:

👍If you find this video helpful, consider giving it a thumbs up and subscribing for more educational videos on data science!
Share your thoughts, experiences, or questions in the comments below. I love hearing from you!

Thank you for joining us on this exploration of the Encoder-Decoder architecture. Stay curious, and let's decode the future together! 🚀🤖

⌚Time Stamps⌚

00:00 - Intro
01:22 - SEQ2SEQ Data
08:04 - Things to Know Before You Start
10:05 - High Level Overview
13:43 - What's under the hood?
19:25 - Training the Architecture using Backpropagation
48:44 - Prediction
55:35 - Improvement 1 - Embeddings
1:00:30 - Improvement 2 - Deep LSTMs
1:10:45 - Original Research Paper
1:10:58 - The Sutskever Architecture
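The flow covered in these chapters (an encoder compresses the input sequence into a context vector; the decoder then generates one token at a time until it emits an end marker) can be sketched with a toy NumPy RNN. All names, dimensions, and token-id conventions below are illustrative assumptions, not taken from the video, and the random weights stand in for what a real model would learn:

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, HID, EMB = 10, 8, 6   # toy sizes (illustrative)
START, END = 1, 2            # special token ids (assumed convention)

# Randomly initialised toy parameters (a trained model would learn these).
E = rng.normal(size=(VOCAB, EMB))           # embedding table
W_xh = rng.normal(size=(EMB, HID)) * 0.1    # input -> hidden
W_hh = rng.normal(size=(HID, HID)) * 0.1    # hidden -> hidden
W_hy = rng.normal(size=(HID, VOCAB)) * 0.1  # hidden -> vocab logits

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def encode(src_ids):
    """Run a plain RNN over the source; the final hidden state is the context."""
    h = np.zeros(HID)
    for t in src_ids:
        h = np.tanh(E[t] @ W_xh + h @ W_hh)
    return h

def greedy_decode(context, max_len=10):
    """Generate tokens one at a time until END (or max_len), as at prediction time."""
    h, tok, out = context, START, []
    for _ in range(max_len):
        h = np.tanh(E[tok] @ W_xh + h @ W_hh)
        probs = softmax(h @ W_hy)   # probability over the target vocabulary
        tok = int(np.argmax(probs)) # greedy choice of the next token
        if tok == END:
            break
        out.append(tok)
    return out

translation = greedy_decode(encode([3, 4, 5]))
print(translation)
```

The video uses LSTMs rather than this vanilla RNN cell, but the encode-then-decode loop and the softmax over the vocabulary are the same idea.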

✨ Hashtags✨
#EncoderDecoder #SequenceToSequence #DeepLearning #MachineLearning #CampusX #TechExplained #AI #NeuralNetworks #LSTM #AlgorithmExplained
Comments

You create videos that are worth a thousand others, and I understand that it takes time. Your content is so polished, so please don't rush to complete this series.

rabingiri

Bruh, this is something else. This is the first time someone is teaching by taking reference from a research paper. Keep giving us the content; this is million-dollar content, not just thousands. Good work, bro. Mad love to you.

gouravgulia

I doubt there's any video on YouTube providing as much value as you do in each video of yours. Thank you, and you have my respect, Nitish Sir!

virajkaralay

"This is the only YouTube channel where I click the like button first and then watch the video, because I know the video is going to be amazing" 😄

bhushanbowlekar

Lots of love and respect from Pakistan ❤❤❤

abdulqadar

Sir, I really want to thank you from the bottom of my heart. I got my first job a few days ago as a Machine Learning Engineer. Thank you, sir; I have been following your channel for the last 6 months, and that changed my life and career as well. ❤🙏

kashifhabib

Finally! Thanks a lot, sir, for uploading the Encoder-Decoder architecture. I was really searching for a source to learn from, then your video came, and I bet it's better than all the courses out there. YOUR WAY OF TEACHING IS JUST AMAZING ❤

HarshSingh-zpjb

Excitement for this playlist is increasing day by day. I eagerly wait for each video. ❤ Thanks, sir!

shaukat

I will like the video first and then watch it. 😀 Because I know the playlist is wonderful and one of the best 🙂

MaheshKumar-kmek

Awesome. You put such great effort into your videos, and it really shows.

somdubey

You have made me really feel what machine learning and deep learning are.

RahulKumar-tmc

Thank you so very much for posting the lecture. I eagerly wait every day for NLP videos and check the playlist almost daily. Please give the playlist more time and complete the content.
You are one of the finest data science tutors I have come across on YouTube. 🎉

riyatiwari

Thank you for sharing knowledge. You deserve the best teacher award.

bananamaker

The best tutorial that I have ever watched. So much respect for you, Nitish Sir! 💌

mimjamammonmoy

Thanks a lot, sir. I've learned more from your channel than from so-called best-selling courses. Love from Pakistan.

NoumanKhan-ugks

Best explanation ever. Eagerly waiting for the Transformer architecture video. Keep it up, @Nitesh bro!

DS_AIML

00:02 Introduction to Encoder Decoder Architecture
03:03 Sequence to sequence data presents challenges for neural network architecture
09:31 Simple overview of encoder-decoder architecture
12:35 Overview of Encoder-Decoder architecture
19:03 Decoder stops producing output when it sees the 'end' token.
21:31 Training encoder-decoder on English-Hindi dataset
27:20 Understanding the encoder-decoder seq2seq architecture
30:13 Softmax layer generates probabilities for words.
36:10 Explaining the process of calculating the output using the softmax layer.
38:45 Output from softmax in step 3 is a vector representing probabilities.
44:10 Backpropagation involves gradient calculation and parameter updates
46:40 Training process involves forward propagation, loss calculation, gradient updates, and weight optimization.
52:05 The process of teacher forcing during training and handling input without correct labels.
55:08 Improvement in basic architecture: use of embeddings
59:43 Seq2Seq architecture involves encoder and decoder parts with specific functionalities.
1:02:17 Deep LSTM architecture makes it easier to handle long-term dependencies.
1:07:11 Deep LSTM architecture shows better results than single layered LSTM
1:09:56 Reversing input sequences can improve translation quality for certain language pairs
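Two points from this summary (teacher forcing at 52:05, and input reversal at 1:09:56) are simple enough to sketch directly. During training, the decoder is fed the ground-truth previous target token rather than its own prediction, and the Sutskever paper found that reversing the source sequence improves translation quality. The helper names and token-id conventions below are made up for illustration:

```python
# Teacher forcing: at each decoder step the *ground-truth* previous target
# token is fed in, not the model's own (possibly wrong) prediction.

def teacher_forced_inputs(target_ids, start_id=1):
    """Decoder inputs during training: <start> followed by the shifted target."""
    return [start_id] + target_ids[:-1]

# Trick from the Sutskever et al. paper: reverse the source sequence so the
# first source words end up closest to the first target words.
def prepare_source(src_ids):
    return list(reversed(src_ids))

target = [5, 6, 7, 2]                 # assumed convention: 2 = <end>
print(teacher_forced_inputs(target))  # [1, 5, 6, 7]
print(prepare_source([3, 4, 5]))      # [5, 4, 3]
```

At prediction time there are no ground-truth labels, so the decoder must consume its own previous output instead, which is exactly the gap the video discusses around 52:05.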

janvibhardwaj-ng

Bro, you've explained these topics better than some renowned AI researchers like Andrew Ng.
I would recommend doing a video on the attention mechanism and transformers.

abrarahmed

Such rich information and knowledge you are teaching for free. As an Indian, your videos are like a 0-layered neural network: no need for embeddings...

akashkarn

Sir, I just viewed the whole video! It was amazing. You taught it in such an easy and engaging way that when I started to read the paper, it seemed so easy to read and was genuinely understandable. Thanks a lot, sir, and I appreciate your help! 🙏 ♥

HarshSingh-zpjb