The Special Memory Powering the AI Revolution

Comments

I was an early adopter of HBM graphics cards, from the AMD Fury and then the AMD Vega 64. They were cutting edge and I wished HBM had caught on back in the day. I'm glad it's still alive and now in demand, and I can't wait to see what consumer applications will come out of it.

poi

AMD first used HBM1 in their flagship gaming GPU, the R9 Fury, in 2015. So gamers knew about HBM long before the current AI hype.

Hobbesever

I have zero knowledge about any of this, but you always make it so much more understandable. You're really good at teaching and presenting. Take this coming from a dyslexic like me who finds it difficult to understand new topics right away.

HeyHey

I've always called DRAM "dee-ram". Never heard anyone call it "dram".

qwaqwa

I really appreciated the comment about the ecosystem and "working through the newness". I work in cloud software development and we still see a *lot* of this problem in the field. Every once in a while the difficult part is the technology, but more often the difficult part is getting everyone to play nicely together so we can have nice things, like Kubernetes that works out of the box, or HBM3.

Sometimes I think we get too focused on the competition over who came up with a new technology first, or who implemented it best. But realistically, nobody's going to buy your thing if they can't make it work with their thing. You need a very cooperative mindset to work on the cutting edge.

tubaterry

I love all of your DRAM content. Your work effectively ends up being a seed crystal for my own research. Looking forward to a Micron and NAND manufacturing series one day.

qr

Little did I know that the Titan V I bought with award money in grad school would turn out to be such a sophisticated piece of hardware. HBM FTW!

craigcarlson

I've worked on a certain big automotive manufacturer's semiconductor project, and the architecture is designed to be modular and to grow around multiple Samsung HBM3 dies, in order to support their future autonomous driving tech. I can fairly say this tech is only headed to the moon from here, as development is in full swing. Server applications aside, automotive is where HBM3 is in dire demand.

moldytexas

@6:24 you say "compared to GDDR 64-bit", but the slide says x32-bit.

ToniT

You're the only one I've heard say it like "dram"; everybody else calls it "dee-ram", the same way SRAM is "ess-ram". HBM is used in data communications buffer applications as well. Thanks, keep up the great work!

swmike

Intel Knights Landing had stacked memory (MCDRAM), Xeon Max (SPR HBM) has HBM2, and Intel/Altera has Stratix 10 MX FPGAs with HBM2. It will be interesting to see if ML is the killer app that drives wider deployment.

poofygoof

Man, the Radeon VII just came out ahead of its time.

ksalieri

The other place I've seen HBM used is as an alternative to TCAM for routers that need route scale. Juniper Networks has done this with their Express series.

JoshHoppes

Nvidia's H100 uses HBM3 (HBM2e on the PCIe variant). They recently created the H200 with HBM3e to compete, as AMD's first foray into AI was always planned around HBM3.

dummiesgoogr

I wish I could use my Radeon VII for AI, but it doesn't have CUDA cores. Its 16 GB of HBM would come in handy.

na_

Thank you for your new video :)

You should also look at IBM's new AI chip "NorthPole"… the processing cores are sprinkled between the RAM blocks.

robertoguerra

I'm reading through Computer Architecture: A Quantitative Approach at the moment and was just reading about HBM, and sure enough you've released a video on it. Thanks for the great explanation :). On another note, have you done a video on HAMR and MAMR HDDs? Would love to see a comparison of the competing technologies, especially since Seagate shipped the first commercial HAMR drives this year.

postmanpat

Isn't GDDR 32 bits per chip? With a 256-bit memory bus, a video card usually has 8 chips: 8 × 32 = 256.

nitroxinfinity
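To make the bus-width arithmetic in the comment above concrete, here is a minimal sketch in Python. It assumes the typical 32-bit interface of a single GDDR device and the 1024-bit interface of a single HBM stack; the chip and stack counts are illustrative examples, not figures from the video.

```python
# Back-of-the-envelope bus-width arithmetic (illustrative assumptions).
GDDR_BITS_PER_CHIP = 32     # a single GDDR device exposes a 32-bit interface
HBM_BITS_PER_STACK = 1024   # a single HBM stack exposes a 1024-bit interface

def gddr_bus_width(num_chips: int) -> int:
    """Total bus width of a GPU that places GDDR chips around the die."""
    return num_chips * GDDR_BITS_PER_CHIP

def hbm_bus_width(num_stacks: int) -> int:
    """Total bus width of a GPU with HBM stacks on an interposer."""
    return num_stacks * HBM_BITS_PER_STACK

print(gddr_bus_width(8))   # 8 chips  -> 256, the 256-bit bus from the comment
print(hbm_bus_width(4))    # 4 stacks -> 4096, which is why HBM buses are so wide
```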

As someone who has worked in big data for years: there's no revolution, lol. What there is, is some very loud marketing teams and scam investments.

Jaxck

I can always find the big-brain subject matter here, explained at a level I can almost begin to digest... and it still leaves me fascinated, ready for more. As always... thank you!

danytoob