Run FLUX Under 12 GB VRAM

Tired of dealing with out-of-memory errors while trying to run FLUX? In this video, we'll show you exactly how to get the powerful FLUX model working on a GPU with just 12GB of VRAM.

Command: pip install bitsandbytes
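To see why 4-bit quantization (what bitsandbytes provides) is the key to fitting FLUX on a 12 GB card, here is a back-of-envelope sketch. The ~12 billion parameter count for FLUX's transformer is approximate, and this counts weights only — activations, the text encoders, and the VAE need additional memory on top:

```python
def weight_vram_gib(params_billion, bits_per_param):
    """Memory needed just to hold the model weights, in GiB."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 2**30

# fp16 weights: ~22 GiB -> out of memory on a 12 GB GPU
print(f"fp16: {weight_vram_gib(12, 16):.1f} GiB")

# 4-bit (NF4) weights: ~5.6 GiB -> fits with room for activations
print(f"nf4:  {weight_vram_gib(12, 4):.1f} GiB")
```

This is why the full-precision checkpoint OOMs on a 12 GB card while the quantized one runs with headroom to spare.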

Chapters:
00:00 : Intro
00:09 : Install Manager and Model Loader
01:53 : Add Model Loader Node
02:30 : Outro
Comments

Error occurred when executing KSampler:

Attempted to call `variable.set_data(tensor)`, but `variable` and `tensor` have incompatible tensor type.

Somebody got a fix for that?

Hypesx.

I have 8 GB of VRAM and it only uses 5.7 GB (loaded in lowvram mode: 5696.2). It could use more than 7 GB of VRAM.

LinusBuyerTips

I have followed all the steps, but this error appears when generating an image:

Error occurred when executing SamplerCustomAdvanced:
'ForgeParams4bit' object has no attribute 'module'

I have CUDA 12.1 and PyTorch 2.1.2 (ComfyUI installed via Stability Matrix).

pedroj.fernandez

You didn't mention how long it takes to generate an image.

fairyroot