Trillium TPU, built to power the future of AI

To deliver the next frontier of models and enable you to do the same, we’re excited to announce Trillium, our sixth-generation TPU, the most performant and most energy-efficient TPU to date.

More than a decade ago, Google recognized the need for a first-of-its-kind chip for machine learning. In 2013, we began work on the world’s first purpose-built AI accelerator, TPU v1, followed by the first Cloud TPU in 2017. Without TPUs, many of Google’s most popular services — such as real-time voice search, photo object recognition, and interactive language translation, along with the state-of-the-art foundation models such as Gemini, Imagen, and Gemma — would not be possible.

Trillium TPUs achieve an impressive 4.7X increase in peak compute performance per chip compared to TPU v5e. We doubled the High Bandwidth Memory (HBM) capacity and bandwidth, and also doubled the Interchip Interconnect (ICI) bandwidth over TPU v5e. Additionally, Trillium is equipped with third-generation SparseCore, a specialized accelerator for processing ultra-large embeddings common in advanced ranking and recommendation workloads. Trillium TPUs make it possible to train the next wave of foundation models faster and serve those models with reduced latency and lower cost. Critically, our sixth-generation TPUs are also our most sustainable: Trillium TPUs are over 67% more energy-efficient than TPU v5e.
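For context on how these chips are typically programmed: Cloud TPUs are commonly driven through JAX, which compiles array code with XLA and runs it on the attached TPU cores. The snippet below is a minimal illustrative sketch, not Trillium-specific and not from the announcement; it assumes a Cloud TPU VM with the jax[tpu] package installed, enumerates the visible devices, and runs a small jit-compiled bfloat16 matrix multiply.

    # Minimal sketch, assuming a Cloud TPU VM with jax[tpu] installed.
    # Shapes, dtypes, and device counts are placeholders and will vary
    # with TPU generation and topology.
    import jax
    import jax.numpy as jnp

    devices = jax.devices()          # list of accelerator devices (TPU cores)
    print(f"found {len(devices)} accelerator device(s)")

    @jax.jit                         # compile once with XLA, reuse thereafter
    def matmul(a, b):
        return a @ b

    key = jax.random.PRNGKey(0)
    a = jax.random.normal(key, (4096, 4096), dtype=jnp.bfloat16)
    b = jax.random.normal(key, (4096, 4096), dtype=jnp.bfloat16)
    out = matmul(a, b)               # runs on the default (first) device
    print(out.shape, out.dtype)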

Comments

What about the successor of Google Coral? I'm asking for us mortals

returnedinformation

Thanks for NotebookLM. It's very useful.

arirajuns

Will there be special programs/offers for university research institutes?

Atom

TPUs, although very useful and a big gain in speed for training large AI models, are only available on GCP. So if you're not on that cloud, you can't use them.

joe_hoeller_chicago

The narration feels like it's been made by a vlogger who just started yesterday in his basement.

sharex

Not 1 solar panel on Google's building. Shame on them

EarthCreature.

All of this power, all of this technology...for watching dogs go surfing.

andreamanninfiaschi

TPUs. Those are my creations; we made them.

Google is mine.

jjgerald

That facility reminds me of other, earlier and also larger platforms for intelligence.

cystarkman

I want a TPU on my Pi that interacts more with local 1-3B LLMs.

Unineil

The future of AI hardware technology should be such that we can also develop micro intelligent robots. How will a GPU or TPU in any way fit into such micro robots?

dineshlamarumba

Will it be used in Project Nimbus in the West Bank?


Excellent tool for Google Cloud.

julioconradomarinardila

Looking good! Accelerate is the way… to 1 Trillium FLOPS!

mcd

I think AI is played out and overhyped; I'm betting bigger on the metaverse and organoid intelligence.

matt

If only the real-time translation were good. It and the subtitles are horrible; no one uses them. But hey, thanks for forcing me to verify my login/location every 5 minutes.

syntaxed

No more CPUs or GPUs for humans anymore. 😢
Everything is for AI from now on. 🥺

sahilx

I have a doubt: the human brain doesn't have skill in every topic, and even in a single topic no one is excellent. Then how can we say that what we're doing is correct? 😅😊❤❤

GobalKrishnanV

There is no point in building an AI chip if your AI product sucks. Move fast or you'll be the Nokia of the 2020s.

HaiLeQuang