FLUX: The First Ever Open Source txt2img Model That Truly Beats Midjourney & Others - FLUX Is the Awaited SD3
FLUX is the first time ever that an open source txt2img model truly surpasses and produces better-quality, better prompt-following images than #Midjourney, Adobe Firefly, Leonardo AI, Playground AI, Stable Diffusion, SDXL, SD3 and DALL-E 3. #FLUX is developed by Black Forest Labs, whose team is mainly composed of the original authors of #StableDiffusion, and its quality is mind-blowing. When I say these words I am not exaggerating; you will see that after watching the tutorial. In this tutorial I will show you how to very easily download and use FLUX models on your PC, and also on the cloud services Massed Compute, RunPod, and a free Kaggle account.
🔗 FLUX Instructions Post (public no need login) ⤵️
🔗 FLUX Models 1-Click Robust Auto Downloader Scripts ⤵️
🔗 Main Windows SwarmUI Tutorial (Watch To Learn How to Use) ⤵️
🔗 Cloud SwarmUI Tutorial (Massed Compute - RunPod - Kaggle) ⤵️
🔗 SECourses Discord Channel to Get Full Support ⤵️
🔗 SECourses Reddit ⤵️
🔗 SECourses GitHub ⤵️
Video Chapters
0:00 Introduction to FLUX, the truly SOTA open-source txt2img model
5:01 How we are going to install the FLUX model into SwarmUI and use it
5:33 How to accurately download FLUX models manually
5:54 How to download the FP16 and optimized FP8 FLUX models automatically with 1 click
6:45 Which precision and type of FLUX model is best for your case, and what the differences are
7:56 Which folder you need to put FLUX models in
8:07 How to update your SwarmUI to the latest version for FLUX support
8:58 How to use FLUX models after SwarmUI has started
9:44 How to use the CFG scale with the FLUX model
10:23 How to see what is happening at any moment in the server debug logs
10:49 Turbo model image generation speed on RTX 3090 Ti GPU
10:59 Sometimes the turbo model may generate blurry images
11:30 How to generate images with development model
11:53 How to use the FLUX model in FP16 instead of the default FP8 precision in SwarmUI
12:31 What the differences are between the development and turbo FLUX models
13:05 Generating native 1536x1536 images, testing FLUX's high-res capability, and how much VRAM it uses
13:41 Image generation speed of 1536x1536 resolution FLUX image on RTX 3090 Ti GPU with SwarmUI
13:56 How to check if you are using any shared VRAM - this slows down generation speed significantly
14:35 How to use SwarmUI and FLUX on cloud services - no PC or GPU is required
14:48 How to use pre-installed SwarmUI on an amazing 48 GB Massed Compute GPU for 31 cents per hour with the FLUX dev FP16 model
16:05 How to download FLUX models on a Massed Compute instance
17:15 FLUX model download speed on Massed Compute
18:19 How much time it takes on Massed Compute to download all the very best FP16 FLUX and T5 models
18:52 How to update and then start SwarmUI on Massed Compute with 1 click
19:33 How to use SwarmUI started on Massed Compute in your PC's browser with ngrok - you can even use it on your phone this way
21:08 Comparing Midjourney image to open source FLUX with same prompt
22:02 How to set DType to FP16 to generate better quality images on Massed Compute with FLUX
22:12 Comparing FLUX generated image with the Midjourney generated image for same prompt
23:00 How to install SwarmUI and download FLUX models on RunPod
25:01 Step speed and VRAM of Turbo model vs Dev model of FLUX
26:04 How to download FLUX models on RunPod after SwarmUI is installed
26:55 How to start SwarmUI after you restart your pod or turn it off and on
27:42 How to fix the SwarmUI CFG scale panel if it is not displayed properly
27:54 Comparing FLUX quality with the very best Stable Diffusion XL (SDXL) models via a popular CivitAI image
29:20 FLUX image generation speed on L40S GPU - FP16 precision
29:43 Comparing FLUX image vs CivitAI popular SDXL image
30:05 Does increasing the step count improve image quality significantly?
30:33 How to generate a bigger 1536x1536 pixel image
30:45 How to install nvitop and check how much VRAM 1536px resolution with FP16 DType uses
31:25 How much speed drops when increasing image resolution from 1024px to 1536px
31:42 How to use SwarmUI and FLUX models on a free Kaggle account, just like on your local PC
32:29 How to join the SECourses Discord channel and contact me for any help and to discuss AI