AMD's Hidden $100 Stable Diffusion Beast!

Thanks to Gigabuster.EXE for his help!

**********************************
Check us out online at the following places!

-------------------------------------------------------------------------------------------------------------
Licensed under Creative Commons: By Attribution 3.0 License
Comments

Well, they _were_ $100... after this video posts, maybe a little more ;)

JeffGeerling

It would be very nice to see real ROCm support for the RX 5000 series and above. That would make it so much easier for students to experiment with AI while still having a decent gaming GPU. After all, most consumer motherboards don't have enough PCIe connectivity to install two separate cards and run them both at full speed.

garrettkajmowicz

AI is not the only workload that needs insane amounts of VRAM. A few of these Instinct MI25s would make for a very capable FluidX3D system for CFD simulations. Also compelling are the $200 Nvidia Tesla P40 24GB and the $600 AMD Instinct MI60 32GB.
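
As a ballpark illustration (my own napkin math, assuming the roughly 55 bytes per lattice cell that FluidX3D quotes for its default FP32 setup), 16GB of VRAM buys a lot of lattice:

```python
# Rough sizing only. Assumes ~55 bytes per lattice cell, the figure FluidX3D
# quotes for its default FP32/D3Q19 setup; real usage varies with the chosen
# velocity set and memory compression options.
vram = 16 * 1024**3          # Instinct MI25: 16 GB of HBM2
bytes_per_cell = 55
cells = vram // bytes_per_cell
side = round(cells ** (1 / 3))
print(f"~{cells / 1e6:.0f} million cells, roughly a {side}^3 cubic domain")
```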

ProjectPhysX

I was about to buy one next week and was hoping that no one else had noticed the low price.
I've already printed a fan adapter.

magfal

I wish they would add an option for Stable Diffusion to also use system memory as VRAM. While a game that uses system memory to supplement a lack of VRAM becomes nearly unplayable due to the frame rate drop, it would be good for doing a final high-res render if you like a specific iteration of an image. For example, let it supplement your 8-16GB of VRAM with something like 50GB of system memory and render a 4K version, even if it takes a few hours due to the slower system RAM.
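
Something in this direction already exists, as far as I can tell: the diffusers library can keep most of the model weights in system RAM and stream them to the GPU piece by piece. A minimal sketch, assuming a CUDA or ROCm build of PyTorch plus the accelerate package (the checkpoint name is only an example):

```python
import torch
from diffusers import StableDiffusionPipeline

# Example checkpoint; substitute whatever model you already use.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)

# Weights stay in system RAM; each submodule is copied to the GPU only while
# it runs. Much slower, but the VRAM footprint shrinks dramatically.
pipe.enable_sequential_cpu_offload()

image = pipe(
    "a mountain lake at dawn, highly detailed",
    height=1024, width=1024, num_inference_steps=50,
).images[0]
image.save("final-highres.png")
```

It's not the full 50GB-of-system-RAM scheme you describe, but for a one-off final render it's the same trade: a lot more time in exchange for fitting in less VRAM.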

Razor

Thank you SO MUCH for this video.

I was planning on running Stable Diffusion at home, but with only my own Vega 64 to start, it felt a bit complicated short of a whole, costly system upgrade.

I'll probably go this route if shipping from the US to Europe isn't too costly/complicated.

As a tech hobbyist on a low budget, I find your content so valuable.
Please keep up the good work.

hectorvivis

The only reason I've been holding off on AMD is that it's hard to get all the newer AI models to run properly. I might give it a shot with these, especially at that price. But LLMs are what I'd really like to see running on AMD hardware.

Brutaltronics

I actually picked up one of these for this exact reason a month or so ago, but haven't been able to get around to using it yet due to the cooling issue. Glad more folks thought the same and have been engineering solutions! Thanks for putting this out there (though hopefully this doesn't disrupt the market).

ColonelFrosting

Cool video. If you want more accurate Danny DeVito faces, you could make a DeVito LoRA for Stable Diffusion. Also, looking good, Wendell! I know it wasn't by choice, but it's a silver lining :p

Jimmy___

I *just* picked up one of these and I'm quite happy with the purchase. It's a lot of compute for the price point. I don't do ML, but for spectral DE solvers it's been great.

mikebutler

I've been using my 6700 XT with Stable Diffusion for a while. If anyone needs a hand, Reddit has some decent guides. I can maybe give some pointers too. ;)
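
For what it's worth, the pointer most of those guides seem to boil down to is telling ROCm to treat the 6700 XT's gfx1031 as the officially supported gfx1030. A quick sanity-check sketch, assuming a ROCm build of PyTorch (the override is a community workaround, not an official ROCm setting):

```python
import os

# Community workaround for RDNA2 cards ROCm doesn't officially list (such as
# the RX 6700 XT, gfx1031): report the supported gfx1030 instead. Must be set
# before the ROCm runtime initializes, i.e. before importing torch.
os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")

import torch

# ROCm devices are exposed through PyTorch's CUDA interface.
print("GPU visible:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```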

arugulatarsus

This is pretty cool for a budget option, but the extra work definitely requires some skill. Getting a used 5700 XT or non-XT is, I think, a more viable way to skip the extra work, or just get an RX 6600 for 200 USD with no work required at all.
Regardless of what you do, if you don't want to go the Linux route, Nod-AI's SHARK has a GPU-agnostic solution that runs on AMD, Intel, and Nvidia, even APUs.
If you're a non-normie, you'll appreciate their work.

TheSleppy

I'm looking forward to the day when I can run this kind of stuff, plus my own personal assistant on my own nearly silent server, without any need to send my information to greedy corporations...

unclerubo

Now AMD just has to support their more recent GPUs, like the RX 7900 XTX... four months out and still no ROCm support...

cromefire_

The results I've been getting with Easy Diffusion are insane. My 2070 laptop can generate a 512x768 picture with 40 steps in about 15 seconds.

xero

That thumbnail was really bugging me, but I didn't want to offend anyone just in case... glad you touched on it in the video.

BAD_CONSUMER

I got really excited for a minute, then I realised I had confused the MI25 with the MI210.

СусаннаСергеевна

Nvidia gets more attention in the consumer space from having a longer and generally much easier support tail. Maxwell is still supported by the current versions of CUDA, and Kepler will still work with older drivers on CUDA 11.

I'd love to see the AMD support improve. Hopefully the attention from this video will help that.

I saw this thread on the forum. One thing I wondered was how this card compares to the M40 or P40 cards, which are around the same price but a lot easier to get running.

tad

First experience with this channel. Informative, accessible, and well-presented, even by old media standards.

This would have fit comfortably on TechTV back in the day.

AnneHirow-bhyq

I got a Radeon VII as a hand-me-down from a Mac user's external GPU setup. The 16GB makes me interested in trying it with Stable Diffusion. I tried it before with my 1070 Ti; I wonder if it'll be better with twice the VRAM.

Machinationstudio