FLUX in Very Low VRAM with LoRA

Tired of dealing with out-of-memory errors while trying to run FLUX, and want to use LoRA as well? In this video, we show you exactly how to get the powerful FLUX model working on a GPU with very low VRAM while still using LoRA with it. Sketches of the download step and the node wiring follow the chapter list below.

Chapters:
00:00 : Intro
00:02 : Download all the models
00:36 : Place all the model files into their folder
01:39 : Add and connect nodes
02:20 : Set new models
03:01 : Use LoRA
03:22 : Outro
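
The download and file-placement steps (00:02 and 00:36) amount to fetching GGUF-quantized FLUX files and dropping them into ComfyUI's model folders. Below is a minimal sketch of that step, assuming the city96 GGUF repositories on Hugging Face, a Q4_K_S quant, and a default ComfyUI install path; none of these names are confirmed by the video, so substitute the files it actually uses.

```python
# Hedged sketch: download GGUF-quantized FLUX files and copy them into the
# ComfyUI model folders used by the ComfyUI-GGUF custom nodes.
# Repo names, file names, and the ComfyUI path are assumptions.
import shutil
from pathlib import Path

from huggingface_hub import hf_hub_download

COMFY = Path("ComfyUI/models")  # adjust to your ComfyUI install

files = {
    # (repo_id, filename): destination subfolder under ComfyUI/models
    ("city96/FLUX.1-dev-gguf", "flux1-dev-Q4_K_S.gguf"): "unet",
    ("city96/t5-v1_1-xxl-encoder-gguf", "t5-v1_1-xxl-encoder-Q4_K_S.gguf"): "clip",
    ("comfyanonymous/flux_text_encoders", "clip_l.safetensors"): "clip",
    ("black-forest-labs/FLUX.1-dev", "ae.safetensors"): "vae",
}

for (repo_id, filename), subfolder in files.items():
    local = hf_hub_download(repo_id=repo_id, filename=filename)
    dest = COMFY / subfolder
    dest.mkdir(parents=True, exist_ok=True)
    shutil.copy(local, dest / filename)  # copy out of the HF cache
    print(f"{filename} -> {dest}")

# Your LoRA .safetensors files go in ComfyUI/models/loras.
```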
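For the node steps (01:39 to 03:01), the graph is the standard FLUX text-to-image workflow with the GGUF loader nodes swapped in and a LoraLoader patched between the loaders and the sampler. The sketch below expresses that graph in ComfyUI's API-prompt format and posts it to the default local server; the node class names come from core ComfyUI and the ComfyUI-GGUF custom nodes, while the file names, prompt text, and sampler settings are placeholder assumptions.

```python
# Hedged sketch of the video's workflow as a ComfyUI API-format prompt.
# UnetLoaderGGUF / DualCLIPLoaderGGUF are from the ComfyUI-GGUF custom
# nodes; file names, prompt text, and sampler settings are assumptions.
import json
import urllib.request

graph = {
    "1": {"class_type": "UnetLoaderGGUF",            # quantized FLUX unet
          "inputs": {"unet_name": "flux1-dev-Q4_K_S.gguf"}},
    "2": {"class_type": "DualCLIPLoaderGGUF",        # T5-XXL (GGUF) + CLIP-L
          "inputs": {"clip_name1": "t5-v1_1-xxl-encoder-Q4_K_S.gguf",
                     "clip_name2": "clip_l.safetensors",
                     "type": "flux"}},
    "3": {"class_type": "VAELoader",                 # FLUX 16-channel VAE
          "inputs": {"vae_name": "ae.safetensors"}},
    "4": {"class_type": "LoraLoader",                # patches model and CLIP
          "inputs": {"model": ["1", 0], "clip": ["2", 0],
                     "lora_name": "my_flux_lora.safetensors",  # hypothetical
                     "strength_model": 1.0, "strength_clip": 1.0}},
    "5": {"class_type": "CLIPTextEncode",
          "inputs": {"clip": ["4", 1], "text": "a photo of a cat"}},
    "6": {"class_type": "CLIPTextEncode",            # ignored at cfg 1.0
          "inputs": {"clip": ["4", 1], "text": ""}},
    "7": {"class_type": "EmptySD3LatentImage",       # FLUX uses 16-ch latents
          "inputs": {"width": 1024, "height": 1024, "batch_size": 1}},
    "8": {"class_type": "KSampler",
          "inputs": {"model": ["4", 0], "positive": ["5", 0],
                     "negative": ["6", 0], "latent_image": ["7", 0],
                     "seed": 0, "steps": 20, "cfg": 1.0,
                     "sampler_name": "euler", "scheduler": "simple",
                     "denoise": 1.0}},
    "9": {"class_type": "VAEDecode",
          "inputs": {"samples": ["8", 0], "vae": ["3", 0]}},
    "10": {"class_type": "SaveImage",
           "inputs": {"images": ["9", 0], "filename_prefix": "flux_lora"}},
}

req = urllib.request.Request(
    "http://127.0.0.1:8188/prompt",  # default local ComfyUI API endpoint
    data=json.dumps({"prompt": graph}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(req)
```

FLUX.1-dev is guidance-distilled, so cfg stays at 1.0 and the empty negative prompt has no effect; note also that decoding needs FLUX's own 16-channel VAE (ae.safetensors), not a 4-channel SD 1.5/SDXL VAE.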

#flux #comfyui
Comments

Hi there, just here to say that was a useful video :) It will really improve my stack, thanks!

AIBizarroTheater

Thanks for this very helpful tutorial! I saw that you have the same GPU as me, an RTX 3060 with 12 GB. Which models would you advise me to use? Q4?

hatuey

I worked through all of the steps you describe, but the process is failing for me at the VAE Decode node with the following message: Given groups=1, weight of size [4, 4, 1, 1], expected input[1, 16, 128, 128] to have 4 channels, but got 16 channels instead. Any idea what could be causing this?

willch

How long does it take to generate an image? Did you speed up the video?

sacome_tv

Can it work with my specs?

NVIDIA 3060, 20 GB total memory, 4 GB dedicated memory.

CPU: i7, 32 GB RAM

chouawarasteven

How is this different from FLUX Schnell? Asking as a noob 🤥

sirjcbcbsh

Can I run it with 16 GB system RAM? 😢 I have an RTX 3060 12 GB.

TrungHieuVo-bu

Will it run on my 4 GB graphics card, an NVIDIA GeForce RTX 3050, with a 12th gen i5???! 🥹

aswinrathan