Extropic Beff Jezos on AGI Computing | Better than Quantum Computing? | Accelerate or Die

Learn AI With Me:
Join my community and classroom to learn AI and get ready for the new world.

LINKS:

Garry Tan:
This chip will accelerate AI compute way past Moore's Law

#ai #openai #llm

BUSINESS, MEDIA & SPONSORSHIPS:
Wes Roth Business @ Gmail . com

Just shoot me an email at the above address.
Comments

Okay, that part about the reason for the French being involved in AI was funny af

clinteastwood

Native French speaker here — I am SHOCKED by your flawless logic. There are several childhood jokes around the idea of getting someone to spell GPT out loud, somewhat similar to getting someone to read a sofa king whatever phrase in English.

BenoitStPierre

That analysis of the French motivation is not just plausible but 100% fact.

saltyBANDIT

"Stay Alive" is the best advice you can give to the world.

If we can hang in there for the next 5-10 years... damn.

OscarTheStrategist

We need *variable* input/output transistors: not just 1 or 0, on or off, but levels of on-ness (see the sketch below). This will vastly increase the amount of compute possible without quantum computing.

chadx
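
A minimal sketch of the multi-level idea in the comment above (an illustration, not anything from the video): a device that reliably distinguishes N levels carries log2(N) bits per symbol, the same trick NAND flash already plays with MLC/TLC/QLC cells.

```python
import math

def bits_per_symbol(levels: int) -> float:
    # A device that reliably distinguishes `levels` states carries log2(levels) bits.
    return math.log2(levels)

for levels in (2, 4, 8, 16):
    print(f"{levels:>2} levels -> {bits_per_symbol(levels):.1f} bits per device")
# 2 levels is an ordinary binary transistor; 16 levels packs 4 bits into one device.
```

The catch is that distinguishing more levels demands a better signal-to-noise ratio, which is why multi-level and analog compute trades precision for density.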

The French were among the earliest adopters of the internet (specifics excluded). They were ahead of most Americans even though the early forms of the internet were created in the US.

Rockyzach

I love your sense of humor, Wes. 😂 In Quebec, we have Yoshua Bengio and the MILA institute. 😊 Keep up the great vids man!

WenRolland

I wonder what Gill Bates is thinking about all this... 🤔

milesprowr

Wes that was soooo funny! I love your dry humor… thanks again for putting out such awesome content dude 😎

almostoffthegrid

Guillaume Verdon is from Quebec. That's akin to saying an American is from the UK.

lemelou

Thank God you weren't shook this time.

eternalcold

One thing to also remember: just because we're approaching the size limit doesn't mean we're in trouble, even if a new method hasn't been discovered yet. Once we hit the smallest feasible size, the next way to get faster, more powerful computers is to scale in space: occupy more volume and expand the system with more units in parallel. That's the step after we've found the best method and reached the most energy we can extract from the smallest unit of spacetime. At that point we can't make things smaller, only expand in space (see the sketch below).

businessmanager
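
A minimal sketch of the scaling-in-space point above, using Amdahl's law: adding parallel units only speeds up the parallelizable fraction of a workload, so any serial fraction sets a hard ceiling on the gains.

```python
def amdahl_speedup(parallel_fraction: float, units: int) -> float:
    # Amdahl's law: overall speedup when only `parallel_fraction` of the
    # work can be spread across `units` parallel processors.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / units)

for units in (1, 10, 100, 1000):
    print(f"{units:>4} units -> {amdahl_speedup(0.95, units):5.1f}x speedup")
# With 95% parallel work the speedup tops out near 20x no matter how many
# units you add -- good news for embarrassingly parallel workloads like
# AI inference, less so for inherently serial ones.
```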

🎶CEO, entrepreneur,
Born in 1964.
Jeeeffrey,
Jeffrey Beeezos.🎶

Now I have to sub... Damn you Wes, damn you, you man of culture. 😌

jericolandry

Aww, that brings back memories of my dearly departed cat. I'd often confess to my cat that I'd farted. I hope he rests in peace. ❤

AxisSage

Now I need a generated image of ChatGPT as a farting cyborg cat wearing a French beret. But yeah, this is fascinating news. I really hope we can develop consumer grade chips that will let us run inference at home. Whether it's by Groq or Extropic, or whoever else out there is building AI-focused chips. That would be amazing.

AmandaFessler

As a French person, I LOVED your bit at 6:40. It made me laugh so much.

MindBlowingXR

Veritasium has a few interesting videos about analog chips: "The Most Powerful Computers You've Never Heard Of" and "Future Computers Will Be Radically Different (Analog Computing)".

jamieyoung

Whether it's these guys or one of the many others, the incredible ferment in hardware is really exciting!

DaveEtchells

My other thought is that complicated dynamical conflicts in personal and interpersonal situations are far more easily resolved using a thermodynamic stack. Less human, perhaps, but less suffering.

Geosynchronus

The idea of using superconducting circuits for computers has been talked about for years, but none have appeared. If they do, it would only be in a big server farm, since they'd need liquid nitrogen or similar for cooling... and how do you connect the circuitry to normal room-temperature connections? Not very mobile.

gregzambo