How Nvidia Became America’s Third-Most Valuable Company


Last week, Nvidia saw the biggest single-day gain in Wall Street history, fuelled by the ongoing AI boom. But its rise does look a bit unstable, so we're going to take a look at what's going on with the American tech giant, and whether the boom can really last.

Our mission is to explain news and politics in an impartial, efficient, and accessible way, balancing import and interest while fostering independent thought.

TLDR is a completely independent & privately owned media company that's not afraid to tackle the issues we think are most important. The channel is run by a small group of young people, with us hoping to pass on our enthusiasm for politics to other young people. We are primarily fan sourced with most of our funding coming from donations and ad revenue. No shady corporations, no one telling us what to say. We can't wait to grow further and help more people get informed. Help support us by subscribing, engaging and sharing. Thanks!

//////////////////////

00:00 - Introduction
00:49 - Who is Nvidia?
02:41 - Nvidia’s Recent (and Ridiculous) Boom
05:42 - Can it Last?
08:15 - Sponsored Content
Comments

CUDA is not fine-tuning software. It is a framework used to write software that makes the best use of NVIDIA chips, and once a workload is written in CUDA it takes effort to port to other manufacturers' hardware. Each manufacturer (AMD, Intel) has its own such framework, and none of them are compatible with one another. Most existing software is written using CUDA because for some time it was the only choice.

ValeryMeleshkin

Funny story: My HPC professor has Nvidia shares (at least he told us as much when we started the class). And what does he teach in class? Besides OpenMP for CPU parallelism, he also teaches CUDA.

He's definitely nurturing his investment.

cyberrb

Your explanation of the difference between CPUs and GPUs is incorrect.

GPUs do not simply process in parallel - CPUs also do this.

GPUs are specifically designed for certain types of calculations used in generating images for a screen, in particular vector calculations. These calculations are also widely used in large language models, hence why Nvidia made the switch to focussing on AI.

So compared to existing large chip manufacturers (e.g. Intel, AMD), Nvidia has been tailoring their chips towards AI development for decades, which is why they're well placed to cash in on the AI boom compared to their competitors.
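The data-parallel pattern described in this comment can be sketched in Python. NumPy's vectorised arithmetic is only a CPU-side stand-in for the GPU's wide vector hardware, used here as an illustrative assumption rather than an actual CUDA invocation:

```python
import numpy as np

# GPUs apply one operation across many data elements at once.
# NumPy's vectorised arithmetic mimics that data-parallel pattern
# (on the CPU) for illustration.
n = 1_000_000
a = np.arange(n, dtype=np.float32)
b = 2.0 * np.arange(n, dtype=np.float32)

# Serial, CPU-loop style: one element per iteration.
serial = [float(a[i] + b[i]) for i in range(3)]

# Data-parallel style: the whole array in one expression --
# the access pattern GPU hardware is built around.
parallel = a + b

print(serial)        # [0.0, 3.0, 6.0]
print(parallel[:3])  # [0. 3. 6.]
```

The point of the vector form isn't just brevity: expressing the work as "the same operation over every element" is exactly what lets hardware with thousands of simple cores run it all at once.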

joemcmahon

If anyone questions "Why Nvidia and not AMD?", the answer is CUDA. It's proprietary and became the industry standard a decade ago. It's like job listings asking for knowledge of Microsoft Office or Adobe software rather than an open-source alternative.

jmtradbr

Microsoft, Alphabet, Amazon, and Meta DON'T like dealing with monopolies? I'm sure that's unpleasant; I wouldn't know

corwin

As someone in tech, I appreciate the effort, but I do agree with the other comments about CUDA and CPU vs. GPU. I would suggest that in future you get a technical expert to check over scripts to root out any issues.

Also 74.3% != ~4/5
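The rounding complaint checks out; a quick arithmetic check in plain Python, using the figure from this comment:

```python
# 74.3% is almost exactly three quarters, not "about four fifths" (80%).
share = 0.743

print(abs(share - 3/4))  # ~0.007: very close to 3/4
print(abs(share - 4/5))  # ~0.057: far from 4/5

assert abs(share - 3/4) < abs(share - 4/5)
```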

metalhead

Some inaccuracies:
Nvidia cards weren't particularly great at crypto and Ethereum mining; it's just that demand far outstripped supply, so any card was better than no card. Because Nvidia had by far the greater market share at the time, they were producing more product to sell and had a larger profit margin, so they benefitted far more than AMD.

The strength of CUDA is that they seeded it into universities decades ago, investing heavily to ensure software developers knew CUDA really well, which supported the efforts to get software written for CUDA. Work is just coming to fruition to enable CUDA code to run (imperfectly) on AMD cards, so that advantage may diminish rapidly now as well.

Covid should also have been mentioned: it bridged the gap between crypto and AI, ensuring demand remained extremely high.

ChrispyNut

it's the "bit spenny" for me 😂😂😂

MC_aigorithm

I quite like most of the simplifications in this video, but I think explaining CUDA as just a way to optimise the compute units is a little misleading, or at least oversimplified.

james

NVDA could squander their lead (Intel did; Tesla seems primed to), but even so it would take a while, and, imo, the value would plateau rather than plummet.

epbrown

Nvidia does have their own models, but they're mainly tied to their consumer GPUs, as features that set them apart from competitors, not something meant to be widely used like an LLM.

AndersHass

NVIDIA is great at producing powerful chips, but not necessarily the most efficient ones. NVIDIA GPUs are literally used as space heaters by some gamers. That's where Apple and Amazon may have an advantage: because their scale is so large, they have to design their chips with system efficiency as a requirement. Apple's Mac chips are already approaching NVIDIA GPU performance in some cases at 1/3 the power. It's insane how inefficient NVIDIA's chips are.

shanep.

This will be a short-lived story 😅 They are profiteering right now, charging $40k for what costs $3k to produce! That never lasts. Expect the valuation to fall back to $1T.

thierry-le-frippon

I don't think anything other than the hallucination problem can stop AI; it can be slowed, but not stopped.

prembagui

A serious miss in the story was glossing over Nvidia's domination of the software side of things. Other companies can make chips with specs as good as or better than NVIDIA's, but they are generally a decade behind on software. CUDA is the standard for AI, and it doesn't work on non-NVIDIA chips. Other companies can make boards with similar specs, but they start from ground zero in programming for those boards. NVIDIA compounds this because its software ecosystem is so widely used that it's far bigger than anything one company could reasonably build internally. Their advantage is like a snowball rolling downhill: because the ecosystem is so large, more devs choose it and expand the library further, which in turn attracts more devs. And that snowball only works on NVIDIA boards.

Of all the companies in direct competition, realistically only Google probably matters, as they have been manufacturing boards specifically for AI longer than others by a few years, and of course have been internally involved with AI for a long time. Other companies can make specialized chips for very narrow applications, which is what companies like Tesla plan, but that won't threaten the general AI-focused GPU market.

This also ties into the biggest threat to NVIDIA: a change in the best way to approach machine learning. Just as NVIDIA benefited when GPUs displaced CPUs as the preferred hardware for machine learning, if the preferred approach changes again, NVIDIA no longer has the moat built from all that software targeting CUDA.

Kyrephare

Why would you round 74.25%, almost exactly 3/4, up to 80%?

aspacelex

Groq will kill Nvidia. LLMs and other GenAI models aren't trained often, but they're used for inference much more frequently. That's where GPUs seem to struggle and Groq's LPUs are much better. The majority of companies won't train LLMs but will rent them from companies like OpenAI, Anthropic, or Meta.

rajbiswas

AI vs. Crypto
Abundance vs. scarcity
$0.01 for any picture vs. a $250k monkey pic

antman

Nvidia right now sounds a lot like a monopoly.

mguitarte

Nvidia is one of those companies that sorta seems boring.

But when you look at it in detail, kinda like this video does, you realize that yes, they really only make one product.

But that one product has use cases in just about every hyped industry of the last 30 years.

RealMacJones