Encoder-decoder seq2seq architecture | encoder-decoder model | encoder-decoder neural network

Encoder-decoder seq2seq architecture | encoder-decoder model | encoder-decoder neural network
#machinelearning #chatgpt #datascience #ai
Hello,
My name is Aman and I am a Data Scientist.

Follow on Instagram: unfold_data_science

Topics for the video:
encoder decoder model,
encoder decoder sequence to sequence architecture,
encoder decoder neural network,
encoder decoder architecture,
encoder decoder in deep learning,
encoder decoder in Tamil,
encoder decoder model in image processing,
encoder decoder in transformer,
encoder decoder in NLP,
encoder decoder machine translation,
encoder decoder transformer model,
python encoder decoder transformer,
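The topics above center on the encoder-decoder (seq2seq) idea: an encoder compresses the source sequence into a single context vector, and a decoder generates the output token by token, starting from a START token and stopping at EOS. A minimal sketch in plain NumPy, with hypothetical toy sizes and untrained random weights (all names and numbers here are illustrative, not from the video):

```python
import numpy as np

# Toy sizes (illustrative): vocabulary of 6 tokens, hidden size 8, embedding size 4.
VOCAB, HIDDEN, EMB = 6, 8, 4
rng = np.random.default_rng(0)

# Parameters, randomly initialized; training is out of scope for this sketch.
E = rng.normal(size=(VOCAB, EMB))          # embedding table
W_xh = rng.normal(size=(EMB, HIDDEN))      # input-to-hidden weights
W_hh = rng.normal(size=(HIDDEN, HIDDEN))   # hidden-to-hidden weights
W_hy = rng.normal(size=(HIDDEN, VOCAB))    # hidden-to-vocab logits

def rnn_step(x_emb, h):
    # One vanilla RNN step: new hidden state from the input and previous state.
    return np.tanh(x_emb @ W_xh + h @ W_hh)

def encode(src_ids):
    # Encoder: fold the whole source sequence into one context vector.
    h = np.zeros(HIDDEN)
    for t in src_ids:
        h = rnn_step(E[t], h)
    return h  # the "context vector" handed to the decoder

def decode(context, start_id=0, eos_id=5, max_len=10):
    # Decoder: starts from the context vector and the START token, then
    # feeds each predicted token back in as the next input (greedy decoding).
    h, token, out = context, start_id, []
    for _ in range(max_len):
        h = rnn_step(E[token], h)
        token = int(np.argmax(h @ W_hy))   # greedy pick of the next token
        if token == eos_id:
            break
        out.append(token)
    return out

ctx = encode([1, 2, 3])
print(decode(ctx))
```

Training (backpropagation through both networks) is omitted; the point is the data flow: encode the source, hand over the context vector, then decode autoregressively until EOS or a length limit.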

About Unfold Data Science: This channel helps people understand the basics of data science through simple examples in an easy way. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos are not very technical in nature and hence can be easily grasped by viewers from different backgrounds.

Book recommendation for Data Science:

Category 1 - Must Read For Every Data Scientist:

Category 2 - Overall Data Science:

Category 3 - Statistics and Mathematics:

Category 4 - Machine Learning:

Category 5 - Programming:

My Studio Setup:

Join the Facebook group:

Follow on Twitter: @unfoldds

Watch python for data science playlist here:

Watch the statistics and mathematics playlist here:

Watch an end-to-end implementation of a simple machine-learning model in Python here:

Learn Ensemble Model, Bagging, and Boosting here:

Build Career in Data Science Playlist:

Artificial Neural Network and Deep Learning Playlist:

Natural Language Processing playlist:

Understanding and building a recommendation system:

Access all my codes here:

Comments

The best explanation I have ever seen, thank you!

АбдурахмонАбдухамидов-щь

Never have I seen an explanation as easy as this; you've earned a subscriber.

AbhishekVerma-kjhd

Excellent, brilliant explanation of transformers!

charleskangai

Wow, you made a complex subject extremely simple. Thank you so much 🔥

pythondemon

Thank you so much for the simple and clear explanation

maruthiprasad

Latest and greatest at its finest, understandable, with professional-style videos. Thanks so much.

vasutke

You are doing a really good job. Keep it up!
Since you are breaking the concepts into very granular blocks, it is really helpful for anyone who is starting out or revising.

sounishnath

What an amazing explanation sir, I really appreciate your effort to simplify this concept.

bhavikdudhrejiya

Good explanation, understandable to a layman. Can you also explain what the predicted probability/output vector for EOS will be?

chithrasrinivasan

Waiting for the next one, please continue the series ❤ 😊

AnkitGupta-rjyy

So good, please continue with LLMs, LangChain, RAG, Gemini

KhairulMia-trjv

Awesome videos, I want to learn about attention and transformers

rakeshkumarsharma

Can you do a complete series on Generative AI?

jallaaswini

Do you have videos on self-attention and transformers?

roversncoders

Can I use this for a recommendation model, for understanding user click patterns?

yasink

Sir, please make a video on the current market situation in the data science domain.

tvssarkar

Hello sir, are you going to continue making sessions on attention, transformers, and BERT? When are you uploading the next video? Please reply.

mahipatil

Sir, I did not understand the START word, and also what happens at the morneg output?

sg

Hi brother, can you please continue this playlist by uploading transformer, BERT, and so on?

yp

Hi Aman, those links are character-to-character encoding. Can we also do word-to-word encoding, right? Which is better, char2char or word2word?

Can you also explain how words get predicted in the testing phase, because we don't have target words, right? We only pass the context vector of the input sentence to the decoder.

Thanks

tejkiran