ComfyUI: Intro to Control Net- Sketch to Render (Part III)

In this video I will go over how you can develop more advanced workflows in ComfyUI with ControlNet, allowing more structural control based on input images. These techniques are used in popular workflows such as sketch-to-image AI generation.

This follows on from a previous video on the basics of ComfyUI, so please take a look at that for more background. ControlNet is a core extension that I often use as part of my architectural workflows, so it is definitely worth getting acquainted with.

00:00 Intro
00:35 ControlNet Set-up
01:34 ControlNet Model Install
03:30 Sketch to Image Workflow
08:18 Using Multiple ControlNets
12:02 Comparisons in ComfyUI
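As a rough illustration of the kind of graph covered in the video, here is a minimal ControlNet workflow expressed in ComfyUI's API (prompt JSON) format. This is a sketch, not the exact workflow from the video: the checkpoint and ControlNet filenames are placeholders for whatever models you have installed, the node IDs are arbitrary, and it uses an empty latent (text-to-image plus ControlNet) rather than the VAE-encoded source image shown in the sketch-to-image section.

```python
import json

# Minimal ComfyUI workflow in API (prompt) format wiring a ControlNet into
# the conditioning path. All model filenames below are placeholders.
workflow = {
    "1": {"class_type": "CheckpointLoaderSimple",
          "inputs": {"ckpt_name": "sd15.safetensors"}},   # placeholder checkpoint
    "2": {"class_type": "LoadImage",
          "inputs": {"image": "sketch.png"}},             # your input sketch
    "3": {"class_type": "ControlNetLoader",
          "inputs": {"control_net_name": "control_scribble.safetensors"}},
    "4": {"class_type": "CLIPTextEncode",
          "inputs": {"text": "modern house, concept render",
                     "clip": ["1", 1]}},
    "5": {"class_type": "CLIPTextEncode",
          "inputs": {"text": "blurry, low quality",
                     "clip": ["1", 1]}},
    "6": {"class_type": "ControlNetApply",                # injects structural guidance
          "inputs": {"conditioning": ["4", 0],
                     "control_net": ["3", 0],
                     "image": ["2", 0],
                     "strength": 0.8}},
    "7": {"class_type": "EmptyLatentImage",
          "inputs": {"width": 768, "height": 512, "batch_size": 1}},
    "8": {"class_type": "KSampler",
          "inputs": {"model": ["1", 0], "positive": ["6", 0],
                     "negative": ["5", 0], "latent_image": ["7", 0],
                     "seed": 0, "steps": 20, "cfg": 7.0,
                     "sampler_name": "euler", "scheduler": "normal",
                     "denoise": 1.0}},
    "9": {"class_type": "VAEDecode",
          "inputs": {"samples": ["8", 0], "vae": ["1", 2]}},
    "10": {"class_type": "SaveImage",
           "inputs": {"images": ["9", 0],
                      "filename_prefix": "sketch2render"}},
}

# To run it, queue the graph against a locally running ComfyUI server
# (default port 8188), e.g.:
#   requests.post("http://127.0.0.1:8188/prompt", json={"prompt": workflow})
print(json.dumps(workflow["6"], indent=2))
```

The key idea is that ControlNet does not touch the model itself: `ControlNetApply` takes the positive conditioning plus the reference image and passes the modified conditioning on to the sampler.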

#ai #architecture #generativedesign #parametric #stablediffusion #animation #comfyui #image #design #painting
Comments

Congratulations! Nice tutorial!!! When are we going to have more tutorials about the Multi-ControlNet Stack?

Arquitetod

The creativity level is set by CFG, while denoise determines how much of the original image can be redrawn. Using the Advanced ControlNet instead of the regular one seems pointless unless you are fine-tuning the advanced node's values. I didn't quite get the purpose of the Scribble Lines node; is it just to resize the source image? I believe that's done automatically depending on the output size anyway.
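The CFG/denoise distinction the comment above describes maps onto two inputs of ComfyUI's KSampler node. A hedged sketch, with illustrative values (not taken from the video) and assumed node IDs:

```python
# In ComfyUI's KSampler, "cfg" scales how strongly the prompt is followed,
# while "denoise" controls how far the sampler departs from an encoded
# source image (1.0 = start from pure noise; lower = stay closer to it).
# Node IDs referenced in the links below are assumptions for illustration.
img2img_sampler = {
    "class_type": "KSampler",
    "inputs": {
        "model": ["1", 0],
        "positive": ["6", 0],       # conditioning after ControlNetApply
        "negative": ["5", 0],
        "latent_image": ["11", 0],  # a VAEEncode of the sketch, not an empty latent
        "seed": 42,
        "steps": 20,
        "cfg": 7.0,                 # prompt adherence ("creativity" trade-off)
        "sampler_name": "euler",
        "scheduler": "normal",
        "denoise": 0.6,             # redraw partially, preserving source structure
    },
}
```

With `denoise` below 1.0 the sampler keeps part of the source latent's structure, which is why img2img sketch workflows feed a VAE-encoded image rather than an empty latent.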

andresz

Nice tutorial. Why did we have to copy the reference image and connect one to the VAE Encode and another to the preprocessor? Does it mean that for every new image we change, we need to have two copies?

abiodunshonibare

Good video, I watched basically all of them, but I have a problem with accessing my ControlNetLoader in ComfyUI. Can you maybe help me through Discord or something?

bimpf