The True Cost of Running ChatGPT: Hardware, Electricity, and More!

Important: This video is based on GPT-3.5, not GPT-4 or GPT-4o, because there is no reliable data for those models.

Ever wondered what powers ChatGPT and how much it costs to run? In this video, we break down the incredible hardware, electricity, and overall expenses behind this advanced AI. From NVIDIA A100 GPUs to staggering monthly bills, discover the true cost of keeping ChatGPT operational. Don't forget to like, subscribe, and share your thoughts in the comments!

This video is part of a series that unveils the true scale, cost, and more behind huge tech companies and projects.

Check out some of the other episodes of the series:

A new episode every Friday — other videos will be listed here as soon as they are released!

Comments

Nice video.
I always wanted to know how much it costs.

Also, in one video Sam Altman said they have reduced the cost of running ChatGPT on GPT-3.5 by around 10x.

What do you think about inference parallelism (batching) while serving requests? The idea is that multiple requests, say 32, can run simultaneously: if 1 request takes 1 second, then 32 batched requests might take about 1.2 seconds (I may be wrong; Perplexity has articles on it).

nithinbhandari
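
The throughput claim in the comment above can be sanity-checked with some quick arithmetic. A minimal sketch, using the commenter's hypothetical numbers (1 request in 1.0 s, a batch of 32 in 1.2 s):

```python
# Illustrative throughput math for batched inference, using the
# hypothetical numbers from the comment above (not measured data).
single_latency_s = 1.0   # one request served alone
batch_size = 32
batch_latency_s = 1.2    # 32 requests served together

sequential_time_s = batch_size * single_latency_s   # 32.0 s one by one
throughput_rps = batch_size / batch_latency_s       # requests per second, batched
speedup = sequential_time_s / batch_latency_s       # gain from batching

print(f"Batched throughput: {throughput_rps:.1f} req/s")
print(f"Speedup over sequential serving: {speedup:.1f}x")
```

Under these assumed numbers, batching turns roughly 1 request/s into about 27 requests/s, which is why serving providers batch aggressively even though each individual request gets slightly slower.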

You forgot to mention cooling of the GPUs, which costs almost as much as the TDP itself.

giorgim

So about a billion a year? Or a $700 million hardware investment up front, then $300 million per year in electricity costs.

olabassey
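
The arithmetic in the comment above can be made explicit. A back-of-envelope sketch using its figures ($700M one-time hardware, $300M/year electricity) plus an assumed 3-year hardware amortization period, which is a common but hypothetical choice:

```python
# Back-of-envelope cost totals from the figures in the comment above.
# The 3-year amortization period is an assumption, not a sourced number.
hardware_capex = 700e6          # one-time hardware investment, USD
electricity_per_year = 300e6    # recurring electricity cost, USD/year
years = 3                       # assumed useful life of the hardware

total_cost = hardware_capex + electricity_per_year * years
amortized_per_year = hardware_capex / years + electricity_per_year

print(f"Total over {years} years: ${total_cost / 1e9:.1f}B")
print(f"Amortized cost per year: ${amortized_per_year / 1e6:.0f}M")
```

On these assumptions the effective annual cost lands closer to $500-550M than a full billion, since the hardware spend is spread over its useful life rather than repeated every year.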