How much energy AI really needs. And why that's not its main problem.


Artificial Intelligence consumes a lot of energy, both during training and during operation. We’ve heard a lot about this. Indeed, Sam Altman, the CEO of OpenAI, recently said that we’ll need small modular nuclear reactors just to power all those AIs. Well, hold that thought. Today I want to look at how much energy these AIs really need and explain why I think this isn’t the main problem.

🔗 Join this channel to get access to perks ➜

#science #sciencenews #technews #tech
Comments

OK super intelligent AI, solve our energy crisis:

-AI shuts itself off

mqbgofjzkkonzx

Makes my brain look incredibly efficient! It can compose sentences, draw pictures, and drive cars, and it only consumes a few sandwiches and a beer for energy.

MarksElectricLife
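[Editor's note: a quick back-of-envelope on the comparison above, with loudly approximate numbers. The ~20 W figure for the human brain is a standard physiology estimate; the ~1,300 MWh figure is a published estimate for training GPT-3 once (Patterson et al., 2021). Other models will differ, but the gap is instructive.]

```python
# Rough, order-of-magnitude comparison only. The 20 W brain figure is a
# standard physiology estimate; 1,300 MWh is a published estimate for
# training GPT-3 once (Patterson et al., 2021).
BRAIN_POWER_W = 20
GPT3_TRAINING_MWH = 1_300
HOURS_PER_YEAR = 24 * 365.25

brain_mwh_per_decade = BRAIN_POWER_W * HOURS_PER_YEAR * 10 / 1e6  # Wh -> MWh
print(f"Brain, one decade: {brain_mwh_per_decade:.2f} MWh")
print(f"GPT-3 training ≈ {GPT3_TRAINING_MWH / brain_mwh_per_decade:.0f} brain-decades")
```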

Sam didn't mention that AI fully depends on a small nation called Taiwan, where all the Nvidia chips are produced. So AI depends on energy, money, and actual political stability.

pirixyt

So brains are actually far cheaper, even if not perfect: maybe we should start training those.

logieman

3:21 A picture really is worth a thousand words.

badroad

1950s: Computers are so expensive that there will only be a few owned by big companies. 2024: AI is so expensive that there will only be a few owned by big companies.

rogercarlson

Now that phone call ending gave me a big laugh; we both have the same sense of humour, it seems.

adriang

The pairing of the video clips with the text "training of the model" and "its regular use" is just brilliant 😂

smartpowerelectronics

I love the rationalizations offered by Altman and others whenever they're asked about the energy problem. The answer: maybe it will push us to finally make fusion work!!

Translation: ADAPT OR DIE, subtext: OR BOTH

alieninmybeverage

The Concorde was really fast, and it really made you think air travel was going to take off. However, sooner or later the costs catch up with you.

whatwherethere

At least using humans as batteries doesn't seem to be worth it.

utkua

I’ll just continue to chuckle at the way the algorithms go off the rails because of programming bias.

johnwollenbecker

AI's real biggest problem: garbage in/garbage out

TedSeeber

I read a premise that computing alone will require as much energy by 2050 as the whole world uses now, and that the ideal location for it is outer space: 24-hour sunlight and other space-based benefits. Then again, outer space can be highly energetic, with particles that erode materials and penetrating radiation that could produce product defects.

danielmcwhirter

You are basically right. I've read estimates predicting that data centers will consume about 30% of the energy worldwide within a few years. Some think that we can't build power plants fast enough to keep up. Any kind of power plant, mind you.
On the other hand, there are many companies working on solutions for edge computing and on much more efficient chips for inference workloads. Memory will be a constraining factor here if models keep getting bigger, as they have in recent years. But there's also work being done on this; increasing the sparsity of models is a very active field of research.

bastiangugu
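[Editor's note: since sparsity comes up above, here is a toy illustration of the simplest version of the idea, magnitude pruning, which zeroes the smallest weights so that sparse kernels can skip them at inference time. This is a sketch of the general technique, not any specific paper's method; real pipelines typically prune structured blocks and fine-tune afterwards.]

```python
# Toy magnitude pruning: zero out the smallest-magnitude weights.
# Illustrative only; production sparsity methods are far more involved.
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero the fraction `sparsity` of weights with the smallest magnitude."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, sparsity=0.75)
print("nonzero before:", np.count_nonzero(w), "after:", np.count_nonzero(pruned))
```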

Don't forget they will have to retrain these models on a regular basis too. How regularly? We don't know yet.

mikespangler

I think it is super-important to keep in mind that large neural networks are obviously an intermediate step on the trip to a more generalized AI. And extrapolating from that current technology to what the future might hold is probably pretty risky. Keep in mind that biological systems, not just mammal brains, seem to be orders of magnitude more efficient learners than neural networks with orders of magnitude less power consumption. We have so much to learn and discover.

davidbonn

Perhaps we should invest more time and money in how to generate more energy.

AnnNunnally

@Sabine We are training an LLM here on our premises for our own use, and one of the most common uses of our supercomputer is training AI models. We are also involved in measuring the energy consumption of processes (software, not only hardware) and are working on several optimization projects, so we may soon have some hard numbers for that.
In fact, SLURM, the scheduler, can report the amount of energy (in joules or kWh) used per computing job.

VFella
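[Editor's note: for readers who want to try this, a minimal sketch of querying SLURM's per-job energy accounting from Python. The sacct field ConsumedEnergyRaw is real SLURM functionality, but it only returns data if the cluster's energy-gathering plugin (RAPL, IPMI, etc.) is configured; the job id below is hypothetical.]

```python
# Minimal sketch: query SLURM's accounting database for the energy a finished
# job consumed. Assumes sacct is on PATH and the cluster collects energy data
# (AcctGatherEnergyType configured); otherwise the field comes back empty.
import subprocess

def job_energy_joules(job_id: str) -> int | None:
    """Return the energy (in joules) consumed by a job, or None if unknown."""
    out = subprocess.run(
        ["sacct", "-j", job_id, "-X", "--noheader",
         "--format=ConsumedEnergyRaw"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return int(out) if out else None

print(job_energy_joules("123456"))  # hypothetical job id
```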

Also, there is the cooling of all those data centers guzzling electricity like there's no tomorrow. The amount of water that has to be diverted to them is astonishing.

cheesium