Low Poly Model Alignment for PBR Texture Baking in Blender

#photogrammetry #scanning #tutorial #blender #workflow #lowpoly

In this video I show how to create and align a low-poly plane for subsequent PBR texture baking in Blender.
Hope someone finds it useful. If you like it and want me to create more content like this one, please leave a thumbs up, drop a comment, or even subscribe to my channel.
Big thanks to all who have done so already, as it really motivates me to produce more content like this.

Stay safe and till the next one
Cheers!
G.

-------------------------------
0:00 Intro
0:32 Basics
4:32 Some Theory
10:58 Blender in Practice
19:27 Summary
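
For anyone who prefers to script the core step from the video, here is a minimal bpy sketch of the basic idea: add a plane, give it a real-world size, and conform it to the imported high-poly scan with a Shrinkwrap modifier before baking. The object name and the 2 m size are assumptions, not values taken from the video.

```python
import bpy

# Hypothetical name for the imported high-poly scan; adjust to your scene.
high_poly = bpy.data.objects["scan_highpoly"]

# Add a 2 m x 2 m low-poly plane roughly where the scan sits.
bpy.ops.mesh.primitive_plane_add(size=2.0, location=high_poly.location)
plane = bpy.context.active_object
plane.name = "bake_lowpoly"

# Conform the plane to the scan so the bake rays have a surface to hit.
mod = plane.modifiers.new(name="Conform", type='SHRINKWRAP')
mod.target = high_poly
mod.wrap_method = 'PROJECT'
mod.use_project_z = True
mod.use_negative_direction = True
```

From there, the usual high-to-low bake applies: select the high poly, then the plane, and bake with "Selected to Active" enabled.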
Comments

I also work with Blender. Your video is very helpful. I hope you can post videos like this one often. Thanks for sharing, dear. All the best to you. Have a wonderful day and see you next time. Many greetings from Germany ♥

outdoorstours

6:00 that was an astonishing example to illustrate the difficulty of getting the scale of an object 😍

manmadeartists

Think I'm first. Sweet.
Love your vids. Great content.

cobeer

Really well-explained video. Thx
I haven't found a 3D scan program that works for me yet.
I saw Substance Alchemist now has a 3D scan feature as well.
I might check this out some day.

lawrence

I’d love to know how people are creating roughness maps from their photogrammetry scans, since by default the roughness is uniform (obviously not ideal)
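Not an answer from the video, but one quick-and-dirty stand-in people use is to derive a roughness map from the baked albedo by inverting and remapping its luminance. A rough Python sketch, where the file names and the remap range are made up and should be tuned per material:

```python
import numpy as np
from PIL import Image

# Load the baked albedo and take its luminance (0..1).
albedo = np.asarray(Image.open("albedo.png").convert("L"), dtype=np.float32) / 255.0

# Purely an eyeballed approximation: invert and squeeze into a plausible range.
rough = 1.0 - albedo
rough = 0.35 + 0.5 * rough   # keeps values between 0.35 and 0.85

Image.fromarray((rough * 255.0).astype(np.uint8)).save("roughness_approx.png")
```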

peterallely

In Blender you can flip UVs in any direction you like: go into the UV editor, hit U, and you have Flip, either horizontal or vertical.
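If the menu route doesn't work for you, the same flip can also be scripted. A small sketch that mirrors the active object's UVs horizontally (run in Object Mode; assumes the mesh already has a UV map):

```python
import bpy

obj = bpy.context.active_object
uv_layer = obj.data.uv_layers.active

# Mirror every UV coordinate around u = 0.5 (horizontal flip).
# Use loop.uv.y instead for a vertical flip.
for loop in uv_layer.data:
    loop.uv.x = 1.0 - loop.uv.x
```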

abdelkarimkeraressi

Hi Grzegorz! Great video. I have two quick questions. In this video, when you take photos on location, it looks like your camera isn't parallel to the ground but is at about a 45-degree angle. Is that the case?

Regarding the equipment itself, I have a full-frame sensor camera and a Canon 50mm lens that I use in many situations. If I use it for photogrammetry, does this simply mean I would need to take more photos to cover the area? Or would you recommend photographing with, say, a 35mm lens (which I don't have)?
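Not an answer from the author, but the trade-off can be put in numbers: on a full-frame sensor (36 mm wide), a 50mm lens sees a noticeably narrower field of view than a 35mm, so each photo covers less ground and you need more frames for the same overlap. A quick sketch of the math:

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal field of view of a rectilinear lens on a given sensor."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

print(horizontal_fov_deg(50))  # ~39.6 degrees
print(horizontal_fov_deg(35))  # ~54.4 degrees
```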

logred--

Not first but happy to see you back to posting videos! I have a workflow question. I used to use Unity's AI-based ArtEngine for a lot of my photogrammetry texture processing, especially for tiling a texture with a specific pattern (like a brick wall or stone tiles). Nowadays I struggle to get results that are as good using manual techniques like painting out seams and tiling my materials in Painter, for example. It's never as good and it takes so much longer. What do you use today to remove seams and make materials tileable? Would love to hear more about this. Looking forward to more videos from you.

Hwr

Hi, I'm trying to follow the video and I got a little bit confused: when you almost have the correct scale on the added plane in Blender (2x2 meters, almost 180x180 cm) and then scale the plane to match the imported HP, don't you lose the scale if you do that? Or is this because you need to match the low poly to the insanely high poly that you can't import? Hmm, I might have answered my own question...

Jonte

Excellent video. Really great quality editing, diagrams, and images to help explain everything 👍👍👍

Barnyz

Brilliant! As others have been saying, it's great to see your photogrammetry workflow within Blender! And the extra effort in editing and explaining is above and beyond! Awesome on every bake level! :D

CreativeShrimp

Great video. Seeing you give Blender a chance in your photogrammetry workflow is excellent.
I have a very similar workflow for processing surface scans, except I made a geometry node setup which aligns an unwrapped plane based on the average normal of the subject, projects it on the subject and scales it based on the rulers in the scan. This speeds up the whole process. Generally, it takes longer to export the scan from Metashape than to prepare the projection plane.
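Not the commenter's actual geometry node setup, but for anyone curious, the "align to the average normal" step looks roughly like this as a Python sketch. Object names are assumptions, and it assumes the scan has uniform scale:

```python
import bpy
from mathutils import Vector

scan = bpy.data.objects["scan_highpoly"]        # hypothetical names
plane = bpy.data.objects["projection_plane"]

# Area-weighted average normal of the scan, rotated into world space.
rot = scan.matrix_world.to_3x3()
avg = Vector((0.0, 0.0, 0.0))
for poly in scan.data.polygons:
    avg += (rot @ poly.normal) * poly.area
avg.normalize()

# Point the plane's local +Z along that average normal.
plane.rotation_mode = 'QUATERNION'
plane.rotation_quaternion = avg.to_track_quat('Z', 'Y')
```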

HerrWaffell

Your videos are pure gold. What do you think about the Nvidia NeRF?

Your explanation is so clear I love it! Why did you settle for 2mkw?

Gringottone

You should *not* subdivide the plane (unless you have a non-organic surface with structure/lines). No matter your subdivisions, when you conform/project, edges will appear in your 32-bit floating-point depth map and normal map. Instead, high-pass the depth and the blue channel of the normal with a very large radius to eliminate those low-frequency "hills." Then run some basic math to reset the height to its natural range. All of this can be done in Substance Designer.

You should also be using a set of ground control points via fiducial markers (AprilTags) when scanning so you don't need a ruler. Great tutorial otherwise!
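For anyone who wants to try the high-pass idea outside Substance Designer, the core of it is just "subtract a heavily blurred copy, then re-range the result"; the same treatment applies to the blue channel of the normal map. A rough NumPy sketch, where the file names and blur radius are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Hypothetical 32-bit depth/height map saved as a NumPy array.
depth = np.load("depth_32bit.npy").astype(np.float32)

# High-pass: remove the low-frequency "hills" introduced by the projection.
low_freq = gaussian_filter(depth, sigma=200.0)   # very large radius
flat = depth - low_freq

# Reset the height back to a 0..1 range.
flat = (flat - flat.min()) / (flat.max() - flat.min())
np.save("depth_flattened.npy", flat)
```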

matterfield