Image to Mesh using ComfyUI + Texture Projector

Today's video discusses an image-to-mesh workflow, including 3D reconstruction from a single image or from multiple images.

00:00 Introduction
00:46 ComfyUI Layer Diffuse
01:52 3D reconstruction solutions
04:54 CRM introduction
05:52 CRM diagram
07:19 ComfyUI-3D-Pack
07:41 CRM Image to Mesh workflow in ComfyUI
12:26 Wonder3D Image to Mesh workflow in ComfyUI
13:24 Import CRM mesh into 3ds Max
14:42 Mesh comparison of CRM, TripoSR, Wonder3D+NeuS
15:38 Mesh optimization
16:56 Comparison of retopology and optimization process
22:02 ZBrush ZRemesher
24:13 UV
24:52 Create outline texture for mesh using Texture Projector in UE
27:22 Texture refinement
29:33 Project textures to mesh using Texture Projector in UE
30:59 Bake texture using Property Baker in UE
32:06 Single image to mesh final results
32:48 Why the reference image must be close to the frontal view
34:43 Use depth ControlNet to control the view angle
37:56 Multi-view images to mesh workflow in ComfyUI
42:18 Gaussian Splatting + DMTet
43:02 Restriction of Multi-view images to mesh workflow in ComfyUI
44:04 Summary
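
For anyone who wants to drive the Img2Mesh graph from a script rather than the browser UI, below is a minimal sketch that queues a workflow through ComfyUI's standard HTTP API. It assumes a local ComfyUI server on the default port 8188 and a graph exported with "Save (API Format)"; the file name img2mesh_workflow_api.json is only a placeholder for whatever you export, not a file shipped with this video.

import json
import urllib.request

COMFYUI_URL = "http://127.0.0.1:8188/prompt"    # default local ComfyUI server
WORKFLOW_FILE = "img2mesh_workflow_api.json"    # placeholder name for your exported graph

# Load the workflow graph that was exported in API format from the ComfyUI web UI.
with open(WORKFLOW_FILE, "r", encoding="utf-8") as f:
    workflow = json.load(f)

# Queue the graph; the server replies with a prompt_id you can later look up via /history.
payload = json.dumps({"prompt": workflow}).encode("utf-8")
request = urllib.request.Request(
    COMFYUI_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))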

Music: Sunny Skies (by Suno)

Create various textures using Texture Projector and Stable Diffusion
-----------------------------------
ComfyUI Jake Upgrade:

ComfyUI Img2Mesh workflow:

-----------------------------------
MARS Texture Projector:

MARS Property Baker:

MARS Master Material:
-----------------------------------
Houdini Lego Mesh:

Gaussian Splatting:

-----------------------------------
#imagetomesh #3dreconstruction #sv3d #triposr #unreal #textureprojector #stablediffusion #comfyui
Comments
Author

The custom nodes and workflow are uploaded to GitHub; the links can be found in the description.

kefuchai

This is mindblowing! Thank you very much for sharing this!

luizcesarleite

That was super-easy to follow, and very well presented. Kudos!

OriBengal

You could do all of those dozens of steps... OR... just use the front and side images for reference and simply build the model like you would any other. You've put so much work into saving time, you've definitely made it harder.

MaxSMoke

methodical presentation. Very well done

michaelmurrillus

Awesome topology comparison! This video is gold

EqualToBen

Hi Jake from State Farm, I'm also an AI named Jake.

akuunreach

impressive. the amount of work you put in this video...well done. subscribed.

soma

good for hard surface/static/prop assets. For organic and animated models it's a big no-no; it will be a nightmare for the animator

artmosphereID

basically it's an overglorified base mesh.

Zamundani

I have a feeling this has way more potential.

Is there any way to deconstruct the image into parts, then compare these parts to a set of variants, pick the closest and construct a composition of these? Like a reference model to help the process?

I imagine you'd need a few basic head shapes, ears, chins, eyebrows and perhaps even hairdos. Then have sets of images (angles or a lora model?) for archetype heads (or complete bodies).
Like a bland base model, one with extreme elvish ears, one with a pronounced chin, one with exaggerated brows, etc. Then you'd need to reconstruct the mix you want from these archetype sets. I'm not sure you can use interpolation that easily (iirc ControlNet had options?), but if you use the same process on each image of these consistent sets, the resulting set should be consistent too, right?
Then if each archetype has a nicely fixed 3d model, you could also generate one for the mixed composition.

Would such a process to create an approximation or base model be doable?
Could you use this and the actual image (in iterations?) to create a consistent 3D model without any manual fixes?

teambellavsteamalice

Beautiful!! Great video. A lot of work and time went into this great explanation.

vivigomez

They could use metaballs instead of voxels for the point cloud mesh reconstruction; it's super light and fast for a generated volume, and Blender handles the metaball-to-polygon conversion very well...
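
A rough Blender Python sketch of that idea, for anyone who wants to experiment with it (run from Blender's Scripting tab): it builds metaballs from a handful of sample points and converts the blended result to a polygon mesh. The points list is placeholder data standing in for an actual sampled point cloud.

import bpy

points = [(0.0, 0.0, 0.0), (0.3, 0.1, 0.2), (0.5, 0.4, 0.1)]  # placeholder point data

# Create a metaball datablock and an object that uses it.
mball = bpy.data.metaballs.new("CloudMeta")
mball.resolution = 0.1          # lower value = denser polygonization
obj = bpy.data.objects.new("CloudMeta", mball)
bpy.context.collection.objects.link(obj)

# One metaball element per sample point; radius controls how strongly neighbours blend.
for co in points:
    elem = mball.elements.new()
    elem.co = co
    elem.radius = 0.25

# Convert the blended metaball volume into an ordinary polygon mesh.
bpy.context.view_layer.objects.active = obj
obj.select_set(True)
bpy.ops.object.convert(target='MESH')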

Meteotrance

cool song, totally unexpected out of the blue!

brianmcquain

Hello, I like your video very much, congratulations. But I am a beginner, and I tried to install everything necessary to follow your video but I can't.
I installed ComfyUI, but after that, when I try to install the ComfyUI-3D-Pack, the installation fails (I am probably doing something wrong in the process).
It would be great if you could make a tutorial starting from the beginning: how to install ComfyUI, then your plugins, and then ComfyUI-3D-Pack.
I sincerely hope you decide to do that, to help beginners like me. Regards.

evanpiccirillo

The workflow links are not valid; can you share them again?

Hemanthkumar-imch

Baked lighting and no PBR textures are a showstopper for game dev; the results also have that AI look and an unoptimised mesh. That's far from useful.

Rahviel

tbh I would be faster sculpting and painting this myself than doing all those shenanigans

MrGATOR

"looks cool and effective" for the untrained eye.. in reality this is like 20 times more complicated and time consuming than a regular 3d workflow getting a result that it's not even usable...

cj

Hi, where can I find the JK_workflow?
Thanks in advance

PixelPoetryxIA