Deep learning requires fundamentally new kinds of hardware | Jim Keller and Lex Fridman

Please support this podcast by checking out our sponsors:

GUEST BIO:
Jim Keller is a legendary microprocessor engineer, previously at AMD, Apple, Tesla, Intel, and now Tenstorrent.

PODCAST INFO:

SOCIAL:
Comments

Jim and Lex always have really great discussions.

johnkost

So this is the other way around. Classically, coders have to optimize their code for specific hardware. But here they are optimizing the hardware to run their (PyTorch) programs most efficiently.

ramanmono

Excellent sound, tangible. A tactile sphere of coherence within symbiosis in biomorphic and molecular structures, such as spectrums of tone and form throughout (Singularity) Upgrade!

dougbaker

I don’t even know what just happened..

markjackson

Can someone explain like I'm 5: "the future of software is data networks... the programs"?
Is he literally saying the future of software is basically pools of already-functioning data networks, networked together to form a larger solution? Like his example of pixels running their programs, but the pixels themselves having no idea what the other pixels are doing... and this will solve that.

Did I understand that correctly?

smudge

Jensen says you just need to buy more 3090s.

bobtoo

A fast Hadamard transform chip could be made using a few thousand transistors. Even a neural network chip based on that would need fewer transistors (56,000) than a Z80 CPU from the 1970s. E.g. fast-transform fixed-filter-bank neural networks.

hoaxuan

The problem with not getting the basics down 100%, with not doing the scientific methodology 100% from the ground up, is that very expensive, elaborate hardware gets built when something far simpler would be better.

hoaxuan

I learn more listening to Mr. Keller than from my school.

ndx

My favourite topic: AI accelerators and ML on chip.

bhuvaneshs.k

The Tenstorrent chip sounds like parallel processing running interrelated, spatial threads. Getting closer to how the human mind works... it's only a matter of time now, bois. Tenstorrent --> realizing the capability for Software 2.0 --> Skynet.

brasidas

What he's trying to say is that the von Neumann architecture will not natively work with neural networks.

HavokR

Anyway, we already live in a world of talking sand. Magic, eh?

hoaxuan

Fun fact: Jim Keller is the brother-in-law of Jordan B. Peterson.

miroslavdanilov

I feel dumber just from listening to him talk... the IQ of everyone listening just dropped 50 points.


Jk

drhilltube