Why Apple Uses Google TPUs While the World Relies on NVIDIA Chips for AI Training

While the world relies on NVIDIA chips for AI model training, Apple is breaking the mold by choosing Google's Tensor Processing Units (TPUs) instead. This bold move sees Apple renting server space from cloud providers rather than building its own data centers, potentially revolutionising the cost and efficiency of AI processing. Could this decision challenge NVIDIA's dominance in the AI chip market? Watch this video to learn more about Apple's innovative strategy and its implications for the future of AI!

#google #apple #nvidia #ai #artificialintelligence

Mint is an Indian financial daily newspaper published by HT Media. The Mint YouTube channel brings you cutting-edge analysis of the latest business and financial news. With in-depth market coverage, explainers and expert opinions, we break down and simplify business news for you.

Click here to download the Mint App
Comments

Nvidia may be the one making AI chips and the CUDA architecture.
But every manufacturer making AI chips relies to some extent on Google's tensor processing work and architecture.
Even NVIDIA uses TensorRT-LLM and many tensor-based processes in its chips to some degree.
It makes sense to go directly to Google for these chips, since Google has adapted nearly the same architecture for mobile, quantum and server use cases.
Believe it or not, Google has leading in-house technologies in everything needed for further invention.

ram_chopade_cr

Apple is always trying to avoid being dominated by any other company or vendor.

jiyeoulryoo