How to Pick the Best Flux Models for YOU and Use in Forge UI

Flux Forge UI tutorial covering my current picks of the best Flux models available, and the fundamental understanding and considerations you need to choose and use the right model for YOUR specific needs and hardware specs.

The focus of this video is squarely on helping you understand the factors to consider when choosing the best Flux models for YOU, so that you can save time and find the sweet spot between image quality and generation speed.
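
If you want a quick sanity check before downloading, you can compare a checkpoint's file size against your GPU's VRAM. Below is a minimal, ballpark-only sketch (the function name, example path and 2GB overhead allowance are illustrative assumptions, and Forge can still run models larger than VRAM by offloading to system RAM / virtual memory, just more slowly):

import os
import torch  # assumes an NVIDIA GPU with a CUDA build of PyTorch installed

def rough_vram_fit_check(checkpoint_path, overhead_gb=2.0):
    # File size is a reasonable first approximation of the memory the weights need;
    # overhead_gb is a guessed allowance for the VAE, text encoders and activations.
    model_gb = os.path.getsize(checkpoint_path) / 1024**3
    vram_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
    needed_gb = model_gb + overhead_gb
    print(f"Checkpoint: {model_gb:.1f} GB | VRAM: {vram_gb:.1f} GB | rough need: {needed_gb:.1f} GB")
    if needed_gb <= vram_gb:
        print("Likely fits fully in VRAM.")
    else:
        print("Expect offloading to system RAM / virtual memory (slower generations).")

# Example with a hypothetical path:
# rough_vram_fit_check(r"Forge\webui\models\Stable-diffusion\flux1-dev-fp8.safetensors")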

LINKS:

Dev FP16

Dev FP8

Dev FP8

Dev GGUF Q8

Dev GGUF Q4

Dev NF4 v2

Schnell FP16

Schnell FP8

Schnell FP8

Schnell GGUF Q8

Schnell GGUF Q4

Schnell NF4

VAE

CLIP Text Encoder

T5 Text Encoder

T5 Text Encoder

PATHS:

Save Models in:
Forge\webui\models\Stable-diffusion

Save VAE in:
Forge\webui\models\VAE

Save CLIP and T5 Text Encoders in:
Forge\webui\models\text_encoder
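
To double-check that everything landed where Forge expects it, a small script like the one below can list what each folder contains. This is just a sketch; the Forge root path and the file extensions are assumptions, so adjust them to your own install and the variants you downloaded:

from pathlib import Path

FORGE_ROOT = Path(r"C:\Forge\webui")  # assumption: change to your Forge install location

EXPECTED = {
    "models/Stable-diffusion": (".safetensors", ".gguf"),  # Flux checkpoints
    "models/VAE": (".safetensors",),                       # VAE (ae)
    "models/text_encoder": (".safetensors",),              # CLIP and T5 text encoders
}

for subdir, extensions in EXPECTED.items():
    folder = FORGE_ROOT / subdir
    if not folder.exists():
        print(f"{folder}: folder not found")
        continue
    files = sorted(f.name for f in folder.iterdir() if f.suffix in extensions)
    print(f"{folder}: {', '.join(files) if files else 'no model files found'}")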

CHAPTER TIMESTAMPS:

00:00 Pick the Best Flux Models for You and Use in Forge UI
00:32 My Current Best Flux Model Picks
01:27 Flux Dev vs Flux Schnell Pros and Cons
04:31 Flux Model Size vs GPU VRAM, RAM, Virtual Memory
08:05 Flux ‘All-In-One’ vs Flux ‘Model-Only’
10:47 Download and Save Flux Model, VAE, CLIP, T5
13:22 Selecting Flux Model, VAE, CLIP, T5 in Forge UI

COPYRIGHT NOTICE:

Images, audio and video created and owned by FoxtonAI.
COMMENTS:

Thanks for the class, please do a ComfyUI tutorial too

thiagoreis

Don't forget that with Flux and lower VRAM, some users are reporting errors with LoRAs: they aren't loaded when ComfyUI goes into lowvram mode. They fixed the FP8 lowvram mode issue, but not for the other models.

nodewizard

Can the model and text encoder be mixed and matched? e.g. Dev GGUF Q8 + the default T5 FP16, and vice versa. Also, I was told to use Dev GGUF or FP8 when using LoRAs even though I have 24GB of VRAM. To confirm, is that what you were explaining in this section: 5:56-7:00?

Brandon-xvk

Where do you find the Realistic Vision V5.1 & SDXL Faetastic V2.4 Stable Diffusion models? I don't see them linked in the description to download them into the Stable-diffusion folder.

liberty

I couldn't run the original Flux.1 Dev, which is 22GB+, with 16GB of RAM (24GB VRAM, RTX A5000); I had to go up to 32GB RAM...

SparkyRih

Sorry for the late question, I just found your excellent tutorial. In your opinion, if image quality is the most important aspect, with a 24GB RTX 3090 video card, 64GB RAM and Win 11, what configuration is best? For example, is there a better one than this?
flux1-dev-fp8; 30 steps; ae.safetensors; clip_l.safetensors; t5xxl_fp16.safetensors

Maybe it's because I use ComfyUI, but my Dev FP8 with everything only peaks at around 11GB of VRAM when running. By your math it should require way more. I was even able to generate fairly quickly on an 8GB card too.

HForceClan

Flux Dev 1 BNB NF4 v2 is good, but it's bad with text

GaijinGamerGirl

Sorry, I'm not clear. Which version is best for a 4060 with 8GB VRAM and 16GB RAM?

josemtb