Dreambooth Stable Diffusion: local training advice for cards with low VRAM.

Recorded this one in one go; it's a messy video, I might make another later.

#dreambooth
#stablediffusion
#nvidia
#aitraining
#advice
#helpful
#interesting
Comments

"If you have low GPU RAM available, make sure to add a  after sending it to cuda for less VRAM usage (at the cost of speed)."
So the question is, where do I enter this code? Straight into the prompt? I have 4 gigabytes of VRAM, and SD 2 just complains that there is not enough CUDA memory.

JimPoisonFishingPlanet
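Editor's note: the tip quoted in the comment above matches the low-VRAM advice from the Hugging Face diffusers documentation, where the call that goes "after sending it to cuda" is `pipe.enable_attention_slicing()`. A minimal sketch, assuming the diffusers library; the model id and half-precision setup here are illustrative assumptions, and actually running it requires a CUDA GPU:

```python
# Sketch: enable attention slicing right after moving the pipeline to CUDA.
# This lowers peak VRAM usage at some cost in generation speed.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example model id (assumption)
    torch_dtype=torch.float16,         # half precision also saves VRAM
)
pipe = pipe.to("cuda")
pipe.enable_attention_slicing()  # the call the quoted tip refers to
```

This answers the "where to enter this code" question: it is Python, not a prompt, so it belongs in the script or notebook where the pipeline is created, immediately after `.to("cuda")`.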
Автор

After creating the classifiers, it runs out of memory.

wlf
Автор

Does it work with a GTX 1050 Ti (4 GB VRAM)? I tried earlier and got a "CUDA out of memory" error right before training finalized.

thevaldis

I have an RTX 3070 (8 GB), but even with it I couldn't get past the first step of creating the model (Create Model). The webui says "Error", and the console says "IndexError: tuple index out of range". Does anyone know how to fix it?

jorgetinoco

I have an RTX 3050 with 4 GB VRAM; will it work?

victorproff

In my memory attention option I only have "flash_attention"; there is no xformers option. I'm running SD 1.5, and Dreambooth is on the latest version.

rilijohnmichael

I have a 3060 Ti with 8 GB VRAM, using the kohya_ss UI with the same settings (xformers), but training errors out. I tried LoRA training and it runs, but checkpoint training is not working.

sukhpalsukh

Hey, I did exactly what you did and got the error "Please configure some concepts." Do you know why?

Igor-rqmp

I have an AMD GPU, but it gives me an error when creating the model.

eliasherreradg

I am thinking of selling my kidney to buy an A6000 with 48 GB :D. No, just kidding.

mlnima