How word vectors encode meaning

Comments

Never did I think I would hear Grant say "what is Hitler + Italy - Germany"

mohammadjadallah

This is the wildest start to a short I’ve ever seen

danielnelson

Honestly the fact that human knowledge can be encoded this way is worthy of an entire video on its own. All of the magic of machine learning could be understood purely in those terms.

ophello

I love this. I alternate between thinking this is mundane and the most natural way to encode a language, and being astounded that this is actually how the LLM does it

johnchessant

Finally… we have quantified the Italians

afraidofmoths

"Hitler + Italy - Germany ~ Mussolini" makes perfect sense.

pontus_qwerty
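
The analogy arithmetic in the comment above ("Hitler + Italy - Germany ~ Mussolini") can be sketched in a few lines. The words and 3-dimensional numbers below are invented purely for illustration, not taken from any real model; the point is only the mechanism of adding and subtracting vectors and then looking for the nearest word.

```python
import numpy as np

# Hand-made 3-D "embeddings" purely for illustration; real models learn
# vectors with hundreds of dimensions from text, not these invented numbers.
vecs = {
    "germany":   np.array([1.0, 0.0, 0.2]),
    "italy":     np.array([0.0, 1.0, 0.2]),
    "hitler":    np.array([1.0, 0.1, 0.9]),
    "mussolini": np.array([0.1, 1.0, 0.9]),
    "pasta":     np.array([0.0, 0.9, -0.5]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest(query, exclude):
    """Return the stored word whose vector is most similar to `query`."""
    candidates = {w: v for w, v in vecs.items() if w not in exclude}
    return max(candidates, key=lambda w: cosine(candidates[w], query))

# The analogy from the video: hitler - germany + italy ≈ ?
query = vecs["hitler"] - vecs["germany"] + vecs["italy"]
print(nearest(query, exclude={"hitler", "germany", "italy"}))  # mussolini
```

For real pretrained vectors, libraries such as gensim expose the same kind of query via most_similar(positive=[...], negative=[...]).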

It's insanely cool that this idea worked out this way.

LetondAtreides

I love how I’ve been working with LLMs for quite a while now but when I saw you’re making a series on this I was like a little kid on Christmas morning.

Rateddany

There is a very compelling theory that the capacity for abstract thought originally evolved from the same part of our brain that processes navigation and spatial orientation. In a certain sense, just about every practical problem in logic can be encoded as an equivalent geometric problem, so you can reuse the same structures to solve totally abstract problems that have nothing to do with moving around. If this is true, it would be extremely insightful - when someone says they can't "visualize" something, maybe what they really mean is that they can't build a space in their brain to navigate.

DistortedSemance

OMG Grant, thank you! I just watched a few Computerphile videos about LLMs and AI art, and they kept talking about "embedding". I kept thinking of embedding metadata into an image or video, or similar concepts. I also knew that computers read text and image contents as numbers, but for some reason I could not make the connection between the numbers and "embedding".

This short saved my sanity. Thinking about concepts as vectors in higher-dimensional space was actually _easier_ to grasp than whatever muck I was trying to imagine before!

This new perspective, along with your amazing capacity for teaching, visualizing, intuiting, and relating these concepts has enhanced my life in so many ways. Thank you!

sabinrawr

-sees short while scrolling YouTube
-clicks on channel
-clicks on first short vid
-“what is Hitler+Italy-Germany”
-horror

Boykisser_Certified

Yup. My mind was blown when I first learned about word2vec.
And it's interesting to think that there is enough information encoded in the placement of words in a sentence, namely how close two words appear relative to each other throughout an entire corpus of sentences, to be able to cluster words in n-dimensional space in a way that encodes these relationships/associations.

CoughSyrup
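
A minimal sketch of the co-occurrence idea in the comment above: count how often words appear near each other across a tiny made-up corpus, then compress each word's counts into a short dense vector. The corpus, the window of 2 words, and the 2-dimensional output are all arbitrary choices for illustration; word2vec itself learns its vectors with a predictive objective rather than by factoring raw counts.

```python
import numpy as np

# A tiny toy corpus, invented only to show the mechanics.
corpus = [
    "the king rules the kingdom".split(),
    "the queen rules the kingdom".split(),
    "the cat chases the mouse".split(),
    "the mouse fears the cat".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within a window of 2 words on either side.
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 2), min(len(sent), i + 3)):
            if j != i:
                counts[index[w], index[sent[j]]] += 1

# A truncated SVD turns each sparse count row into a dense 2-D vector.
u, s, _ = np.linalg.svd(counts, full_matrices=False)
embeddings = u[:, :2] * s[:2]

def similarity(a, b):
    """Cosine similarity between the 2-D vectors of two words."""
    va, vb = embeddings[index[a]], embeddings[index[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

# Words that show up in similar contexts should land close together.
print(similarity("king", "queen"))
print(similarity("king", "mouse"))
```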

Super cool, and totally makes sense to visualize it this way. The more dimensions you have in this vector space representing words, the more meaning they can encode.

csdrew

Ngl the fact you started the short with an insanely crazy question to keep people engaged in watching the full short is nothing short of brilliant. Bravo

lazylavalamp

It makes a whole lot of sense actually: the AI draws literal parallels in its own mind, and they can be closer to or further from the center point of the axis. The real problem is imagining a graph where literally all the words are visible, unlike this one, which probably had the AI find a local point that helps it draw that parallel accurately. I don't know if I'm rambling; this subject is really interesting

NexusBecauseWhyNot
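
One way to get the "graph where literally all the words are visible" that the comment above asks for is to project the high-dimensional vectors down to two coordinates per word. The sketch below uses random 8-dimensional vectors as stand-ins for real embeddings (the word list and numbers are placeholders) and a plain PCA-via-SVD projection; a real model's vectors could be dropped in the same way.

```python
import numpy as np

# Random 8-D vectors stand in for learned embeddings, purely for illustration.
rng = np.random.default_rng(0)
words = ["king", "queen", "man", "woman", "rome", "berlin"]
vectors = rng.normal(size=(len(words), 8))

# Classic PCA via SVD: center the vectors, keep the top two directions.
centered = vectors - vectors.mean(axis=0)
u, s, _ = np.linalg.svd(centered, full_matrices=False)
coords_2d = u[:, :2] * s[:2]

# Each word now has an (x, y) position that could be plotted on one chart.
for word, (x, y) in zip(words, coords_2d):
    print(f"{word:8s} {x:+.2f} {y:+.2f}")
```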

Kicking this one off with a brand new sentence 😂

w

🤣🤣 I wasn’t ready for the beginning lmao

kwiky

this was the COOLEST thing I've seen in years. thank you

pedrojesus

Love the direction this is heading, to reverse-engineer logarithmic probability vector-values in metastable memory associations to reconstitute word meaning of self-defining time-timing sync-duration identification conglomerations of pure-math derivatives <cause-effect> meaning.

davidwilkie

It’s so interesting how you can use groups of singularities to represent such complex equations

DeeDoo-oo