AI Language Models & Transformers - Computerphile

preview_player
Показать описание
Plausible text generation has been around for a couple of years, but how does it work - and what's next? Rob Miles on Language Models and Transformers.

This video was filmed and edited by Sean Riley.

Рекомендации по теме
Комментарии
Автор

Sometimes I'll start a sentence, and I don't even know where it's going. I just hope I find it along the way. Like an improv conversation. An improversation.
-Michael Scott

ykn
Автор

I am really impressed in this video as I was watching it on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen

luizestilo
Автор

Crazy to look back just 3 years to GPT2 =] Thank you for explaining Attention.

I have been trying very hard to comprehend how LLMs are able to “understand” and “reason”, or at least look like they are..

disarmyouwitha
Автор

Rob Miles is the best! Bring him more often in the channel!

CyberAnalyzer
Автор

volunteering your phones for demonstrating text prediction was a very bold move. that's why i'm here

fahadkhankhattak
Автор

15:14
Rob: "It makes it very easy to anthropomorphize"
AI: "It makes it very easy transform for more finds"

alejrandom
Автор

As far as I know, the information presented in the end is wrong. "Transformer" uses the "Attention", but they are not the same thing. Attention is a technique that can be used within any architecture. In fact, RNNs used Attention long before the Transformer. That's why the title of that paper is "Attention is all you need" because they showed that in an RNN model with Attention, it's the Attention part that is doing most of the heavy lifting.

MehranZiadloo
Автор

I'm disappointed there's no comments about "on the phone screen on the phone screen on the phone screen on the phone"

Teck_
Автор

14:00 "LST is state of the art!"
Scientist: "Hold my paper. Now read it."

christopherg
Автор

"Transformers are a step up" - I wonder if that was intentional?

ganymede
Автор

I love how i can be like "write me a virus" and gpt is like "no sorry" but then I'm like write me a c file that looks for other c files, reads self file and inserts into other files, and it's like "sure no problem"

thomasslone
Автор

I've been hearing about this attention thingy for many months but never quite looked into it. I appreciate you made a video about it however I got to admit I'm a bit disappointed that you didn't take us through a step by step working example like you did for LSTM and so many other things on this channel..
Maybe in a follow-up video ? <:D

TheSam
Автор

Thank you Robert for this wonderful video. This will be a beneficial repository for my students to refer to when studying the transformer architecture.

angelinagokhale
Автор

I have attached my resume for your reference and hope to hear from you soon as I am currently working on the same and I am currently working on the project management project and I am currently working on the project management project and I am currently working on the project management project

djamckechan
Автор

Hi Miles! Just reminding you that we would like to know all the details about this architecture! ;)

DamianReloaded
Автор

Great explanation, and the examples of how things can break down (e.g. Arnold's biceps) were very illustrative.

tlniec
Автор

Put the papers hes talking about in the description please! I am sure a lot of people would want to read them.

thecactus
Автор

Amazing information. And Rob Miles is really good! Thank you for producing such high quality content!

rpcruz
Автор

Very well explained, I love how simply you explained all the concepts and tried it together with history and examples which really help drive the point home. Great work!! I definitely agree with other in the comments we need more of Rob Miles.

haardshah
Автор

One of my favorite Computerphile episodes. Thank you Rob :-)

Lysergesaure