Kasucast #19 - Stable Diffusion: Worldbuilding an IP with 3D and AI ft. MVDream | Part 1

#MVdream #sdxl #ComfyUI #stablediffusion #conceptart #worldbuilding #aiart #blender #substance3d #bytedance #tiktok
This video primarily goes over MVDream, a unique text-to-3D model. It is also the first part of a series I plan on finishing: worldbuilding an intellectual property with 3D and AI.
It focuses heavily on the theory, setup, and usage of the MVDream repositories. The second half uses the exported meshes in various concepting workflows, with the objective of designing the interior of a character's living space.
The final 3D design of the character's room isn't finished in this video, as it would take too long. I elected to break this premise up into several parts, so please look forward to it.
Practical tools:
Generating a 3D obj from a single image:
Theory/Discussions:
MY DOCKER ENVIRONMENT SETTINGS:
Timestamps:
00:00 Intro
01:21 MVDream: what is it/what does it solve?
03:38 Dataset Overview
05:00 Camera settings explanation
06:01 Multiview perspective
06:51 Multiview 2D code dive
07:44 Camera embedding
09:36 Camera utils
10:05 Setting up MVDream 2D environment
11:54 Trying out the Gradio server
15:26 Start of MVDream-threestudio
17:49 Setting up Docker environment for MVDream-threestudio
27:10 Explaining why the Gradio server for 3D is not usable
34:37 Generating NeRFs through CLI
38:25 Exporting meshes
40:20 Evaluating MVDream mesh fidelity
42:35 Second stage refinement and why I don't recommend it
44:09 Redesign from refinement = unusable
44:57 Showing some other NeRF to 3D mesh objects
47:17 Rendering out a 3D object
48:10 Using 3D renders as ControlNet guides
50:34 Worldbuilding overview (context)
52:32 Potential room designs
53:33 Potential chair designs
54:36 Generating albedo map texture in ComfyUI
56:19 Using Adobe Substance 3D Sampler to convert albedo into PBR textures
58:52 Quick setup of converted PBR textures
1:02:00 Using the same process to generate metal textures
1:03:33 Quick overview of using Cube by CSM to convert a picture to a mesh
1:05:18 Checking refined mesh from Cube
1:05:50 Closing thoughts
🎉 Social Media:
Images/processes may be fabricated and therefore not real. I am unaware of any illegal activities. Documentation will not be taken as admission of guilt.