Add a render mode to allow depth testing and screen reading transparent objects #10847
I don't know if this is technically feasible given how Godot's renderer works, especially since it currently only supports forward rendering and not deferred rendering.
I see. Before I posted this proposal, I did look at some other posts, and someone said they just decided to do it themselves in the engine, so I figured it was something easy to implement that was missing for some other reason. At least the screen reading part, that is.
We have many options for transparency. You have to be more specific about what you want and what your problem is, because it could be a genuine limitation, or you may just not have done your research.
I want a screen effect that occurs when the camera is contained within a water volume, which distorts the screen as well as layering other effects. The issue is that I cannot read the screen texture to distort it, as transparent shaders are not included in the screen texture. I also cannot depth test the water volume to determine how thick the fog effect should be, as you cannot depth test transparent shaders. I think there is a solution using the compositor, but that is far above my current knowledge (though I am trying to understand it). If there were something simple like a render mode that exposed this functionality, it would make creating these effects extremely easy for less experienced developers. I updated the original post to better describe the issue I am having.
You want a post-processing effect for this. See this page in the docs for more info. You don't actually need to get the depth of transparent objects to do this (unless you have transparent things underwater, maybe, but that's a separate problem).
That docs page is what I used to build the shader. The issue is (again, this could be the wrong way to go about what I want) that inside my quad shader that covers the camera, I want to read the depth so I can determine how transparent a given fragment should be: objects up close are 0 ALPHA, and objects beyond a certain range are 1 ALPHA. The problem arises because I cannot get the depth of transparent materials. So if the only thing between my camera and the sky (or a distant object) is a transparent waterfall, it gets covered because the quad shader displays 1 ALPHA there. If the surface of the waterfall were included in the depth test, my quad screen shader would render at, say, 0.2 ALPHA on that fragment, allowing the waterfall to be visible. I am very new to shader programming, and this could be the wrong way to go about what I want to achieve (a screen effect which adds fake fog, where the further away a fragment is from the camera, the higher its alpha value).
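For reference, the depth-based fog quad described above can be sketched roughly like this in Godot 4's shading language, following the depth-reconstruction approach from the advanced post-processing docs. The uniform names and fog parameters here are illustrative assumptions, not from the original post:

```glsl
shader_type spatial;
render_mode unshaded;

uniform sampler2D depth_texture : hint_depth_texture, filter_linear;
uniform float fog_near = 2.0;
uniform float fog_far = 40.0;
uniform vec3 fog_color : source_color = vec3(0.5, 0.6, 0.7);

void fragment() {
	// Raw depth from the depth buffer (opaque objects only, which is
	// exactly the limitation this proposal is about).
	float depth = texture(depth_texture, SCREEN_UV).r;

	// Reconstruct view-space position from NDC (Vulkan depth range [0, 1]).
	vec3 ndc = vec3(SCREEN_UV * 2.0 - 1.0, depth);
	vec4 view = INV_PROJECTION_MATRIX * vec4(ndc, 1.0);
	view.xyz /= view.w;
	float linear_depth = -view.z;

	// Near fragments -> transparent fog, far fragments -> opaque fog.
	ALBEDO = fog_color;
	ALPHA = smoothstep(fog_near, fog_far, linear_depth);
}
```

Because `hint_depth_texture` is captured before the transparent pass, a waterfall in front of the sky would read as "far away" here and get fogged over, which is the behavior being reported.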
I've also worked with screen-reading effects that both needed to use the depth texture and needed to be able to see transparent objects. Godot doesn't have great options for this right now (as you've discovered), but they are currently in development (see the Rendering Compositor issue, which Calinou mentioned above). If you need a way to work around this right now (before the Rendering Compositor is finished), here are a few options:
If you decide to go with this third option, I'd be happy to provide more details. I've done it before, so I know it can work. But it is a little finicky to get everything set up correctly.
Wow, that is a beautiful write-up, and it helps immensely. I have been putting off learning compute shaders to get the effect I want and have been working on other parts of my game. I definitely think solution 3 seems the most convenient for me, as I am still hesitant about delving into the compositor.
I decided to go ahead and make a simple project showcasing the third method (passing screen textures via viewports). It's copied from another project I had been working on and stripped down to its essentials. (I did include several material properties on the object shader which might not be essential, such as metallic, roughness, and emission.) Hopefully this can be helpful for you! https://github.com/thompsop1sou/custom-screen-buffers
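The viewport approach above boils down to rendering the scene a second time in a SubViewport whose camera mirrors the main one, then sampling that SubViewport's texture instead of `hint_screen_texture`. A minimal sketch of the consuming shader, with an assumed uniform name (the texture would be assigned from script, e.g. via `material.set_shader_parameter("screen_with_alpha", $SubViewport.get_texture())`):

```glsl
shader_type spatial;
render_mode unshaded;

// Assumed uniform: fed from a SubViewport that renders the same scene,
// so its texture DOES include transparent objects like waterfalls.
uniform sampler2D screen_with_alpha : filter_linear;
uniform float distortion_strength = 0.02;

void fragment() {
	// Simple sine-wave distortion for an underwater look.
	vec2 uv = SCREEN_UV;
	uv.x += sin(SCREEN_UV.y * 40.0 + TIME * 2.0) * distortion_strength;
	ALBEDO = texture(screen_with_alpha, uv).rgb;
}
```

The cost of this workaround is rendering the scene twice, which is why the linked example project strips things down to essentials.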
Thank you, that's very helpful.
Yeah, so the easiest way to do that, especially if you only need it on the waterfall, is probably to just have a special material on the waterfall that fades as the player gets closer (i.e. fading based on the distance from the camera). If you need a more general solution, what thompsop1sou said applies.
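The camera-distance fade suggested here can be done with StandardMaterial3D's built-in Distance Fade, or sketched in a custom shader like this (parameter names and values are illustrative assumptions):

```glsl
shader_type spatial;

uniform float fade_start = 2.0; // fully transparent at or below this distance
uniform float fade_end = 6.0;   // fully opaque at or beyond this distance

void fragment() {
	// In fragment(), VERTEX is the view-space position, so its length
	// is the distance from the camera to this fragment.
	float dist = length(VERTEX);
	ALBEDO = vec3(0.4, 0.6, 0.9); // placeholder waterfall color
	ALPHA = smoothstep(fade_start, fade_end, dist);
}
```

This sidesteps the depth-texture problem entirely for the single waterfall case: the waterfall fades itself out near the camera instead of relying on the fog quad to reveal it.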
Describe the project you are working on
I am working on a project that heavily uses transparency.
Describe the problem or limitation you are having in your project
Transparency is very difficult to work with, especially when transparent objects overlap. I have issues with transparent objects blending improperly with each other and with the sky texture.
I want a screen effect that occurs when the camera is contained within a water volume, which distorts the screen as well as layering other effects.
The issue is that I cannot read the screen texture to distort it, as transparent shaders are not included in the screen texture.
I also cannot depth test the water volume to determine how thick the fog effect should be, as you cannot depth test transparent shaders.
Describe the feature / enhancement and how it helps to overcome the problem or limitation
Add a render_mode flag that allows a shader to depth test transparent objects, and to include transparent objects in screen reads by capturing the screen after the transparent pass.
This way, my screen-space shader can accurately get the depth and screen color of every object in the scene.
Describe how your proposal will work, with code, pseudo-code, mock-ups, and/or diagrams
I am unsure how it would work in the engine, but my assumption is performance would only be lost when actually using the render mode.
For example, add two render modes: depth_test_alpha and screen_read_alpha.
With depth_test_alpha, transparent objects are included in your depth texture.
With screen_read_alpha, transparent objects are included in your screen texture.
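To illustrate the proposal, a shader opting in to the new behavior might look like this. Note that these render modes are hypothetical; they do not exist in Godot today, and the exact names are this proposal's suggestion:

```glsl
shader_type spatial;
// Hypothetical flags from this proposal (not currently in Godot):
// capture the depth and screen textures after the transparent pass.
render_mode depth_test_alpha, screen_read_alpha;

uniform sampler2D screen_texture : hint_screen_texture;
uniform sampler2D depth_texture : hint_depth_texture;

void fragment() {
	// With the proposed flags, both samples would include transparent
	// objects such as waterfalls, so fog and distortion effects could
	// account for them.
	float depth = texture(depth_texture, SCREEN_UV).r;
	ALBEDO = texture(screen_texture, SCREEN_UV).rgb;
}
```

Existing shaders without these flags would keep the current pre-transparency capture, so any performance cost would only apply when the render modes are actually used.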
If this enhancement will not be used often, can it be worked around with a few lines of script?
I can't actually think of any solution whatsoever, which is why I am writing this. If there were a somewhat convenient workaround, I would use it (if one exists, let me know!).
Is there a reason why this should be core and not an add-on in the asset library?
I can't think of any downsides except performance, which is only a cost when the render mode is actively used. There are also many flaws in the engine's transparency handling, so it would be nice to have this built in.