Unity Shader Graph Basics (Part 4 - The Depth Buffer)

The depth buffer is instrumental in rendering objects correctly. Similarly, the depth texture is extremely helpful for creating certain effects. Learn how both work in Part 4 of this Shader Graph tutorial series!

I'm using Unity 2022.3.0f1, although these steps should look similar in previous and subsequent Unity versions.
Comments
Author

A longer explanation of how the depth buffer actually stores values, paraphrased from my shader book, since I cut a very long explanation of the "non-linear relationship" from the video:

Unity calculates the distance of a pixel from the camera, which we can call its z-value, hence "z-buffer". This value is between the near and far clip distances of your camera, because those are the minimum and maximum distances that actually get rendered.

This z-value is changed into a depth value that we can store in the depth buffer by transforming the [near, far] range to a [0, 1] range. The depth buffer stores floating-point numbers between 0 and 1.

If this mapping were linear, we could run into precision issues with close objects: a small difference in distance matters far more for nearby geometry than for distant geometry, yet a linear mapping would give both the same depth resolution. For small objects close together, we could feasibly get errors where surfaces are rendered in the wrong order (z-fighting).

To avoid that, we want to use as much precision as possible to represent close objects. The exact formula that is used for converting the z-value to a depth value is:

depth = ( 1/z - 1/near ) / ( 1/far - 1/near )

What you end up with is a curve. For the default Unity camera values where near = 0.3 and far = 1000, 70% of all information stored in the depth buffer represents objects up to a distance of one meter from the camera. Which is amazing when you consider the remaining 30% represents the other 999 meters!
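
To make the shape of that curve concrete, here is a minimal Python sketch of the mapping above. The function name and the default near/far values are illustrative (matching the default Unity camera), not Unity API:

```python
def z_to_depth(z, near=0.3, far=1000.0):
    """Map an eye-space distance z in [near, far] to a [0, 1] depth value."""
    return (1.0 / z - 1.0 / near) / (1.0 / far - 1.0 / near)

# The near plane maps to 0 and the far plane maps to 1:
assert abs(z_to_depth(0.3)) < 1e-12 and abs(z_to_depth(1000.0) - 1.0) < 1e-12

# ~70% of the [0, 1] range is spent on the first metre in front of the camera.
print(round(z_to_depth(1.0), 2))  # 0.7
```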

As mentioned in the video, those depth buffer values get copied to the depth texture, and then Shader Graph gives you the tools to decode this curve into two linear formats: Linear01, where values are linearized in the same 0-1 range, and Eye, where values are just the original z-values - distances from the camera - that we started with.
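
Going the other way, those two decoding modes can be sketched by inverting the formula above. These helper names are hypothetical, not Unity API, and Unity's own nodes work from precomputed _ZBufferParams constants rather than near/far directly; the assumption here is that Linear01 is eye depth divided by far:

```python
def depth_to_eye(depth, near=0.3, far=1000.0):
    """'Eye' mode: invert the non-linear mapping to recover the distance z."""
    return 1.0 / (depth * (1.0 / far - 1.0 / near) + 1.0 / near)

def depth_to_linear01(depth, near=0.3, far=1000.0):
    """'Linear01' mode (assumed convention): eye distance rescaled so far -> 1."""
    return depth_to_eye(depth, near, far) / far

# A raw depth of ~0.7 decodes back to roughly one metre from the camera,
# matching the 70% figure above.
print(round(depth_to_eye(0.7002), 2))  # 1.0
```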

Hope that gives a bit more context!

danielilett
Author

Wow, this is the best introductory video I've ever seen, and it's for the latest version of Unity. It was a huge help. I sincerely hope you can keep updating.🥰

拉拉菲尔
Author

Great explanation, hope you keep making this series!

semiterrestrial
Author

Such a brilliant and helpful series. Great stuff, Daniel. (Not least because that 3-second lerp explanation was the best and most concise I've seen.)

richardaen
Author

These videos are helping me better understand Shader Graph, thanks❤️

FarwalDev
Author

Thank you, this series is helping me a lot to understand shader graph

usercontent
Author

Thank you. I've learned exactly what I was looking for over the last few days.

antonovivan
Author

Once again nicely explained, thank you so much!

christianschneider
Author

Great explanations of the basics in your videos, thank you!

ripmork
Author

Great work! Looking forward to the vertex shader.

lpfonseca
Author

Like #100! Your tutorials are absolutely amazing. Please please keep going :)

dopinkus
Author

Hi Daniel! I have a question. I don't know why, but "Depth test 2:44" only works for me if "Surface Type" is Transparent. I don't understand why it doesn't work when it's Opaque, like in your video.

AlexLozanoAcerca
Author

Thanks for this tutorial, it's awesome! I just have one tiny question: if I have a second camera in my scene that renders to a RenderTexture with a Color Format of R8G8B8A8_UNORM and a Depth Stencil Format of D32_SFLOAT, and I pass that RenderTexture to a URP Shader Graph as a Texture2D, is there a way to read the depth values from the RenderTexture in the graph? I believe it is not possible, but I just wanted to confirm. It seems odd, but it looks like it's impossible :(

rockclimbermaca
Author

I have a problem: with a Canvas Group and some GameObjects with Images, the images on top become transparent, and the colors of the images behind mix with the colors in front, giving a bad result. Is there a good way to fix it? I tried with stencil, but then the aliasing screams in your face haha.

LuizMoratelli
Author

What about hair in URP? Will this work the same way, or could you cover a "Hair in URP" topic?

GameBit
Author

Is there a way to sample a pixel of the depth texture here?

DaveRune
Author

Thanks for the tutorial! Can I do this in the Built-in Pipeline?

StressedProgrammer
Author

Has this been changed in Unity 6, or am I just blind? I cannot find the Depth Texture setting.

anttiv
Author

When is the next part going to release?

nopepsi
Author

Can you show this with Shader Graph in the Built-in Render Pipeline?

harshadjoshi