Stable Diffusion ControlNet Adapters Explained in Detail | Color and Style Adapter ControlNet

#stablediffusionart #stablediffusion #stablediffusionai

In this video I explain how to install and use the ControlNet adapters recently added to the ControlNet extension, with an in-depth look at how the Color adapter and the Style adapter work.

ControlNet is one of the best extensions currently available for Stable Diffusion Automatic1111.

I hope you like this video.
There is still a lot to explore in Stable Diffusion.
Let me know in the comments below what Stable Diffusion videos I should make next.

Follow Me On
Discord: ZooFromAI# 0737

Note:
Model Used - Art And Eros Model
Sampler Used - Euler a

Links To Downloads And Videos:-

#stablediffusion #stablediffusiontutorial #stablediffusionai #stablediffusioncolab #stablediffusiongooglecolab #stablediffusionaitutorial

_Music In this Video_
Upbeat Corporate Podcast by Infraction [No Copyright Music] / Marketing:
Comments
Author

This is a detailed video in which I explain two ControlNet adapters:
1. Color adapter, starting at 03:29
2. Style adapter, starting at 05:52
You can download these adapter models from this link:

CHILDISHYTofficial

What amazes me is how quickly these updates roll out; you can hardly keep up with them, while Adobe has kept the same AI Photoshop features in "beta" for the past two years...

DJVARAO

Upscalers next, please? Like how to improve small details?

luminvade

What is the difference between t2iadapter_color_sd14v1.pth and ...? They are located on Hugging Face, in the shared folder with the models.

User-pqyn

The color adapter works for me, but the style adapter does not. The Guidance Start (T) value doesn't change anything; the result is the same as when ControlNet is turned off. Can you tell me how to fix this? Thank you!

User-pqyn

Loving your stuff...
But this won't work for me. Everything is installed, but clip_vision > t2iadapter_style does nothing; Guidance Start from zero to one gives me the same image as with ControlNet disabled. Any clue?
Thanks in advance

marcelooliveira

Guidance Start and End work basically the same as the prompt editing feature. Start is the percentage of generation at which the ControlNet adapter begins influencing your image, and End is the percentage at which it stops doing so. Think of it as if the Enable button were tied to the generation progress bar: with Start 0.5 and End 0.9, the style isn't used until 50% of the way through generation, and once 90% of generation is reached it stops being used again.
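That mapping from Guidance Start/End fractions to sampler steps can be sketched as a small helper (a hypothetical illustration of the idea, not ControlNet's actual code; the function name and step math are assumptions):

```python
def adapter_active_steps(total_steps: int, start: float, end: float) -> list[int]:
    """Steps (0-indexed) on which the adapter influences generation,
    given Guidance Start/End as fractions of overall progress."""
    return [
        step for step in range(total_steps)
        if start <= step / total_steps < end
    ]

# With 20 sampling steps, Start 0.5 and End 0.9, the adapter only
# applies from step 10 up to (but not including) step 18.
print(adapter_active_steps(20, 0.5, 0.9))  # → [10, 11, 12, 13, 14, 15, 16, 17]
```

So a Start of 0 and End of 1 means the adapter runs for the whole generation, which is why changing Start alone may appear to do nothing if the adapter itself isn't loading.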

JohnVanderbeck

Great video man!
I have just one problem: clip_vision isn't working for me. It just generates from the prompt, but the other adapters work. What could it be?

miguelarce

Wow, style is amazing. You did a great job explaining how much it affects the final outcome.

I need to update my controlnet!

clearandsweet

There is no color option in the Preprocessor list. How can I get it?

PSandmore

You should have reused the seed for better comparability between different parameter settings, instead of always generating with a random seed.
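The reason seed reuse matters: with a fixed seed the starting noise is identical between runs, so any difference in the output comes from the parameter you changed. A toy sketch of the idea, with Python's `random` module standing in for the sampler's latent noise generator:

```python
import random

def initial_noise(seed: int, n: int = 4) -> list[float]:
    """Stand-in for the initial latent noise: seeding the RNG makes
    the starting point identical across runs."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) for _ in range(n)]

# Same seed → same starting noise, so outputs are directly comparable.
assert initial_noise(1234) == initial_noise(1234)
# A different (or random) seed changes the starting point entirely.
assert initial_noise(1234) != initial_noise(5678)
```

In Automatic1111 this just means typing a fixed number into the Seed field (or clicking the recycle button) before varying the ControlNet settings.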

Tignite

get an error message: RuntimeError: pixel_unshuffle expects height to be divisible by downscale_factor, but input.size(-2)=683 is not divisible by 8
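That error says the control image's height (683 px) isn't divisible by 8, which `pixel_unshuffle` requires. Resizing or cropping to the nearest multiple of 8 before generation avoids it; a minimal sketch of the rounding (the helper name is my own):

```python
def snap_to_multiple(value: int, factor: int = 8) -> int:
    """Round a dimension down to the nearest multiple of `factor`,
    satisfying pixel_unshuffle's divisibility requirement."""
    return (value // factor) * factor

# The failing height from the error message:
print(snap_to_multiple(683))  # → 680
print(683 % 8, snap_to_multiple(683) % 8)  # → 3 0
```

Any image editor's crop/resize (or the resize options in the img2img tab) can be used to apply the corrected dimension.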

TobinatorXXL

I had an error using style adapter that was solved by setting my batch to 2 - Thank you for the great guide to the latest addition, I'm having more fun already!

autonomousreviews

Can't seem to get these models working properly. From the whitepaper it looks like they were trained with SD1.4, it looks like you're using a custom model in the video. Have you tried it with any others like Dreamshaper or Illuminati?

benpalmer

You are really helpful with all these Stable Diffusion updates; very, very good content!

aginbayu

Very informative. I hadn't seen this covered anywhere else yet. Thanks.

coda

Where do you get the YAML for the Tencent adapters?

Because_Reasons

I don't really remember if you mentioned it in the video, but is it just me, or does it only work in square formats? Every time I try 4:3 or anything that isn't 1:1, it gives me this: RuntimeError: The size of tensor a (22) must match the size of tensor b (21) at non-singleton dimension 2
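A plausible source of that mismatch (my assumption, not confirmed in the video): the adapter and the UNet downsample height and width by large factors, and a dimension that isn't a multiple of the overall factor gets rounded up in one path and down in the other, producing sizes like 22 vs 21. The numbers and factor below are illustrative:

```python
import math

def feature_sizes(pixels: int, factor: int = 32) -> tuple[int, int]:
    """Feature-map size when one downsampling path rounds up and
    another rounds down (illustrative; the factor is an assumption)."""
    return math.ceil(pixels / factor), pixels // factor

# 683 px is not a multiple of 32, so the two paths disagree: 22 vs 21.
print(feature_sizes(683))  # → (22, 21)
# A multiple of 64 keeps every downsampling stage consistent.
print(feature_sizes(704))  # → (22, 22)
```

Sticking to width/height values that are multiples of 64 (as for most SD 1.x workflows) should sidestep the error even for non-square aspect ratios.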

hugoruix_yt

The safetensors format doesn't work for me 🤪

unknowngodsimp

Very good video tutorial, I tell you. Thanks

bertoness