BERT vs GPT

#machinelearning #shorts #deeplearning #chatgpt #neuralnetwork #datascience
Comments

One is for natural language understanding and the other is for natural language generation.

darshantank

This is very useful. Just wanted to add that the GPT decoder doesn't have cross-attention in its transformer block.

VarunTulsian
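The point above is worth seeing concretely. Here is a minimal numpy sketch (all function names are illustrative, not from the video; causal masking and the feed-forward sublayer are omitted for brevity): the original transformer decoder block runs self-attention and then cross-attention over the encoder's output, while a GPT-style block keeps only the self-attention, since there is no encoder to attend to.

```python
import numpy as np

def attention(q, k, v):
    # Scaled dot-product attention: softmax(QK^T / sqrt(d)) V
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def original_decoder_block(x, enc_out):
    # Transformer (encoder-decoder) style: self-attention, THEN
    # cross-attention over the encoder output.
    x = x + attention(x, x, x)               # self-attention
    x = x + attention(x, enc_out, enc_out)   # cross-attention
    return x

def gpt_block(x):
    # GPT decoder block: no encoder exists, so no cross-attention.
    x = x + attention(x, x, x)               # self-attention only
    return x

x = np.random.randn(4, 8)    # 4 decoder tokens, model dim 8
enc = np.random.randn(6, 8)  # 6 encoder tokens (only the full decoder uses these)
print(original_decoder_block(x, enc).shape)  # (4, 8)
print(gpt_block(x).shape)                    # (4, 8)
```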

Great explanation. For example, if I have to read all the client emails, understand their requirements, and auto-create tasks based on that prediction, which model should I go for: BERT or GPT?

maninzn

So BERT doesn't have a decoder? Did I misunderstand?

nicholaszustak
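You read it right: BERT is encoder-only and GPT is decoder-only. In practice the visible difference is the attention mask. A small numpy sketch of that contrast (illustrative only, not code from the video): BERT lets every token attend in both directions, while GPT masks out future positions so each token only sees its past.

```python
import numpy as np

n = 5  # sequence length

# BERT (encoder-only): every token may attend to every other token,
# so the attention mask is all ones (fully bidirectional).
bert_mask = np.ones((n, n), dtype=int)

# GPT (decoder-only): token i may only attend to positions <= i,
# so the mask is lower-triangular (causal).
gpt_mask = np.tril(np.ones((n, n), dtype=int))

print(bert_mask)
print(gpt_mask)
```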

What if I stack both encoders and decoders? Do I get some BERTGPT hybrid?

vladislavkorecky

Rubber •🐥• ducky • you • are • the • one. You • make • 🛀 • 🧼 • bath • time • lots • of • fun.
[BERT _learning_ ERNIE]

JohnBerry-qh

Bert also drives a Trans Am!

Jokes aside I do appreciate your videos!

Dr_Larken

I hadn't known that BERT was an acronym and had been wondering why the Swedish LLM was called Bert. I wonder if this is why. Thanks for the info!

JillRhoads

Transformer models are usually run in parallel, right?

saimadhaviyalla
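Yes, and this is a key advantage over recurrent networks: an RNN must step through a sequence token by token, while a transformer layer processes every position in one batched matrix operation, with dependencies handled by attention rather than recurrence. A minimal numpy contrast (illustrative names and shapes, not from the video):

```python
import numpy as np

seq = np.random.randn(10, 8)  # 10 tokens, model dim 8
W = np.random.randn(8, 8)

# RNN-style: inherently sequential; step t needs the state from step t-1.
h = np.zeros(8)
rnn_states = []
for x_t in seq:
    h = np.tanh(x_t @ W + h)
    rnn_states.append(h)

# Transformer-style: one matrix multiply handles all 10 positions at once,
# so training parallelizes across the whole sequence.
transformer_proj = seq @ W

print(np.array(rnn_states).shape, transformer_proj.shape)  # (10, 8) (10, 8)
```

Note that GPT-style *generation* is still sequential, since each new token depends on the previously generated ones; the parallelism applies to processing a known sequence, as during training.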

Can you please explain their training process?

contactdi
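In short, the two models are pretrained with different self-supervised objectives: BERT with masked language modeling (hide some tokens, predict them from both directions) and GPT with causal language modeling (predict the next token). A stdlib-only sketch of how the training pairs are built (function names, the 15% mask rate, and the fixed seed are illustrative assumptions):

```python
import random

tokens = ["the", "cat", "sat", "on", "the", "mat"]

# BERT pretraining (masked language modeling): hide roughly 15% of the
# tokens and train the model to predict the hidden originals, using
# context from both sides.
def mlm_example(tokens, mask_prob=0.15, seed=1):
    rng = random.Random(seed)
    inputs, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            inputs.append("[MASK]")
            labels.append(tok)   # predict the original token here
        else:
            inputs.append(tok)
            labels.append(None)  # no loss at unmasked positions
    return inputs, labels

# GPT pretraining (causal language modeling): at every position, predict
# the next token; the targets are just the inputs shifted by one.
def clm_example(tokens):
    return tokens[:-1], tokens[1:]

print(mlm_example(tokens))
print(clm_example(tokens))
```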