Alpha Hash and Alpha2Coverage Implementation #40364
Regarding the hashing function, I had some questions. Why did you choose this technique? Setting stability aside, the truly random nature of the hash function means it looks visually noisy, which IMO is worse than simpler, cheaper methods that use a screen-space noise sample offset by a temporal factor.
Hi @s-ilent, regarding the dithering functions: I am not really an expert in the matter, nor have I invested much in comparing different dithering functions, but I can give some input based on what I have read in the papers.

First and foremost, I must apologize, because the preview video does not really demonstrate what the hash dithering is for. Although I believe you already understand the intent behind it, I'll clarify for others here. The Alpha Hash function (or any dithering in general) isn't really useful for the hair case I showed in the video, as the hair was opaque and faded to transparency only at the edges. Dithering is a technique to fake some form of OIT transparency where blending is required. One can argue that in the hair example blending isn't really required; we just want smoothing of the edges, since the hair strands themselves aren't transparent. Where dithering does shine is in situations where we want to emulate true transparency throughout the mesh, texture plane, etc. Slide 8 of the NVIDIA slides shows what we're trying to solve.

Many of those techniques do look nice, and it could be worthwhile to experiment with a few of them. It's hard to grasp from the paper how well they work for texture transparency, though; the examples I see are all technical visualizations of the noise, or deal with shadows. I'd love to see some real-world examples where alpha transparency is simulated outside of shadows.

As for the benefit of the hashing function: the temporal offsets can still lead to shimmering, as you said, which to some people, or for some use cases, is visually worse than the noise generated by the hashing. I don't believe its only benefit is its stability under TAA, but its stability in general: it looks consistent under camera movement.

One idea I got from reading those papers was to pre-dither transparency masks per texture instead of storing the smooth alpha values. There are also concerns I don't believe I saw addressed, for instance: how does it behave when zooming out or looking from a distance? I have a lot of questions about these techniques and would love to see more research into them. At the end of the day, though, the dithering functions are among the easier things to provide custom implementations for!
Being completely honest, I have the feeling there might be better alternatives to this problem that are not being explored. While for the most part I think this is good, I also hold that AlphaHash is mostly a technique devised for TAA, which does not make much sense in a forward renderer like the one in Godot. One alternative I would like to investigate myself is alpha slicing and sorting, which should work quite well for hair and other transparent objects (especially two-sided ones). On import, faces that intersect each other are cut/sliced, and then, before drawing, the index buffer is sorted using a compute shader (this can be enabled when geometry is closer than a given threshold). Aided by a depth prepass, the results should be as good as the reference image.
I think this can still be merged after being rebased, even if we implement other alternatives for improving alpha draw order. The only problem I see is what I mentioned here.
I can rebase this and fix it up, just give me a few days :)
Yeah, I would not use Alpha Hash specifically for things like hair. However, it can be very good for simulating anti-aliasing on faraway objects, so I don't think it's useless.
I'm accepting for now despite my comment about the hash function. If it turns out to be an issue, we can easily drop in a new hash function at any time, preferably one recommended in this recent paper: http://www.jcgt.org/published/0009/03/02/
Could you squash the commits into one (or more if relevant, but fixups should be melded into the original commit that requires them)?
Docs need to be added before this is merged. You can generate the shell by running the engine from the command line, from the root directory of the Godot project, with the argument --doctool.
Squashed and doc comments added.
Fixed typos in docs.
Thanks!
@@ -482,6 +512,8 @@ class BaseMaterial3D : public Material {
	TextureChannel ao_texture_channel;
	TextureChannel refraction_texture_channel;

	AlphaAntiAliasing alpha_antialiasing_mode;
Uninitialized variable:
scene/resources/material.cpp:1481:6: runtime error: load of value 3200171710, which is not a valid value for type 'BaseMaterial3D::AlphaAntiAliasing'
SUMMARY: UndefinedBehaviorSanitizer: undefined-behavior scene/resources/material.cpp:1481:6 in
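A minimal standalone sketch of the pattern flagged by UBSan and one way to fix it (this is a hypothetical mirror of the member, not Godot's actual class or its chosen fix): an in-class default initializer guarantees the enum holds a valid value even on constructor paths that forget to assign it.

```cpp
#include <cassert>

// Hypothetical mirror of the flagged member: an enum field that may be
// read (e.g. in a switch) before any constructor assigns it.
enum AlphaAntiAliasing {
	ALPHA_ANTIALIASING_OFF,
	ALPHA_ANTIALIASING_ALPHA_TO_COVERAGE,
	ALPHA_ANTIALIASING_ALPHA_TO_COVERAGE_AND_TO_ONE,
};

struct MaterialState {
	// Default member initializer: the field is never left holding an
	// arbitrary bit pattern that is not a valid enumerator.
	AlphaAntiAliasing alpha_antialiasing_mode = ALPHA_ANTIALIASING_OFF;
};
```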
@marstaik How does alpha hashing compare to interleaved gradient noise? Is it still relevant when we have interleaved gradient noise in … Also, I noticed there are references to both … (godot/servers/rendering/renderer_rd/shaders/scene_forward_clustered.glsl, lines 1708 to 1712, in 995093f).
@marstaik If you're still around, could you give some example values to use for …? In the future, couldn't the value of … be determined automatically?
@Calinou I haven't touched this in a while, but you can think of the alpha edge value as the width of the transparency band around the object that is then marked for MSAA. A higher value should give you better visuals, but it also marks more pixels for the MSAA process, and thus costs more.

If you are familiar with a radial soft brush like in Photoshop, think of the edge value as the band from the solid center to the faded-away edge. The best possible result would be to mark the entire range of faded pixels for MSAA, so they all get blended, but then the region that needs MSAA also grows drastically, because it is an area (the circumference of the object times the "width" of the edge band).

The alpha scissor is a hard cutoff. If it's 0.5, then any pixels with alpha less than 0.5 are discarded. The alpha edge then starts working offset from that cutoff. Back to the brush example: if I set the scissor to 0.5, the outer half of the blended radius of the brush completely disappears, as we scissor it off. We are then left with pixels from 100% to 50% opacity, or 0% to 50% transparency, depending on your viewpoint. The alpha edge value then only has those values left to play with to determine the band.

I would think it's incredibly difficult to determine automatically. There are different quality tiers to consider (such as wanting very high-quality player hair vs. random foliage), but there are also input texture issues. The artist dictates how much "edge" there is to work with, depending on how fast they decided to fade the values to zero. That is, did they step the alpha down by 10 percent every pixel, giving a gradient of 10 pixels from opaque to fully transparent, or by 1 percent, giving 100 pixels of "edge"? That, in addition to personal preference, determines how much you need to blend your edge to look acceptable.
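The interplay described above can be put in numbers with a small sketch (the struct and function names here are hypothetical, for illustration only): the scissor sets the lower bound of the surviving alpha band, and the edge value, offset from the scissor and clamped to [0, 1], sets its upper bound.

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical illustration of the scissor/edge interplay: alpha below
// `lower` is discarded outright; alpha between `lower` and `upper` is the
// band marked for MSAA blending; alpha above `upper` is fully opaque.
struct EdgeBand {
	float lower;
	float upper;
};

EdgeBand alpha_band(float scissor, float edge) {
	// The edge offset starts at the scissor threshold and is clamped
	// to [0, 1], mirroring the behavior described for the shader.
	float upper = std::clamp(scissor + edge, 0.0f, 1.0f);
	return { scissor, upper };
}
```

With a scissor of 0.5 and an edge of 0.25, only alphas in [0.5, 0.75] are blended; everything below 0.5 is cut, everything above 0.75 is solid.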
Also, with a few years of hindsight, it might be better to have a boolean toggle that makes alpha edge relative to alpha scissor (and default it to on), so that it is scaled by the alpha scissor and much more intuitive. That is, if the scissor is 0.5 and only half the alpha range remains, then instead of being additive we can remap the alpha edge [0, 1] range onto [0, 0.5].
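The relative-edge idea floated above (not current engine behavior, just a sketch of the proposal) is a one-line remap: scale the [0, 1] edge value into the alpha range that survives the scissor.

```cpp
#include <cassert>

// Sketch of the proposed "relative edge" mode: with scissor = 0.5, only
// alphas in [0.5, 1.0] survive, so an edge of 1.0 should span that whole
// remaining half (an effective width of 0.5) rather than being additive.
float remap_edge_relative(float scissor, float edge) {
	return edge * (1.0f - scissor);
}
```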
This commit adds the following features to the Vulkan rendering pipeline:
Alpha Hash, AlphaToCoverage + AlphaToOne, and "mipped" antialiased edges.
These techniques are very helpful for creating nice-looking hair and foliage, where alpha channels are used the most.
Preview Video:
https://youtu.be/zQKkUNvAAJ4
Alpha Hashing
Alpha Hashing is a dithered alpha-testing method from NVIDIA (see the "Hashed Alpha Testing" paper by Wyman and McGuire for more documentation).
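For illustration, here is a C++ port of the 2D/3D hash functions from the paper (the engine implements them in GLSL; this sketch only mirrors the math). The hash yields a pseudo-random threshold in [0, 1), and the dithered test keeps a fragment only when its alpha beats that threshold.

```cpp
#include <cmath>

// Fractional part, always in [0, 1), matching GLSL's fract().
static float fract(float x) {
	return x - std::floor(x);
}

// 2D hash from Wyman and McGuire's "Hashed Alpha Testing".
float hash_2d(float x, float y) {
	return fract(1.0e4f * std::sin(17.0f * x + 0.1f * y) *
			(0.1f + std::fabs(std::sin(13.0f * y + x))));
}

// 3D hash built by chaining the 2D hash.
float hash_3d(float x, float y, float z) {
	return hash_2d(hash_2d(x, y), z);
}

// The dithered alpha test: keep the fragment only if alpha >= hash.
bool hashed_alpha_test(float alpha, float x, float y, float z) {
	return alpha >= hash_3d(x, y, z);
}
```

Because the hash depends only on position (typically object-space coordinates scaled by ALPHA_HASH_SCALE), the dither pattern is deterministic and stable under camera movement.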
AlphaToCoverage and AlphaToOne
AlphaToCoverage is a technique that takes the fragment shader's alpha channel and ANDs it with the MSAA sample mask to produce additional areas for anti-aliasing.
AlphaToOne is an additional flag: after the alpha channel is used for the MSAA sample mask, the alpha value is set to the maximum.
It is the combination of these two techniques that allow for good-looking anti-aliased alpha testing.
But why both? Why is AlphaToCoverage by itself not enough?
In the preview video above, you may notice that when switching from "Alpha Edge Blend" to "Alpha Edge Clip" (the respective names in the UI for AlphaToCoverage and AlphaToCoverage + AlphaToOne), using AlphaToCoverage by itself still results in some bleed/halo effects around the textures. This is because AlphaToCoverage alone does not clamp the alpha value; that is, the resulting alpha of the fragment shader still has to be blended somehow, and in general this is an issue we have with alpha blending. All AlphaToCoverage does by itself is export the alpha channel as an area for MSAA to act upon.
This is where AlphaToOne comes in: after adding the alpha to the MSAA sample mask, the alpha value is set to the maximum, so no blending occurs anymore.
The final result is a fragment color output with no alpha channel, yet MSAA smooths the areas where the alpha channel was.
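Conceptually, AlphaToCoverage maps the fragment's alpha onto MSAA sample-mask bits. A minimal model for 4x MSAA (a simplification for illustration; real GPUs may also dither which bits are chosen) looks like this:

```cpp
#include <cmath>

// Simplified model of AlphaToCoverage at 4x MSAA: alpha selects how many
// of the 4 sample-mask bits stay set, so a partially transparent fragment
// covers proportionally fewer samples and the resolve step blends it.
unsigned alpha_to_coverage_mask_4x(float alpha) {
	int covered = (int)std::round(alpha * 4.0f); // 0..4 samples covered
	if (covered <= 0) {
		return 0x0; // fully transparent: no samples
	}
	if (covered >= 4) {
		return 0xF; // fully opaque: all four samples
	}
	return (1u << covered) - 1u; // e.g. 2 samples -> 0b0011
}
```

With AlphaToOne enabled, after this mask is produced the alpha written to the framebuffer is forced to 1.0, so the MSAA resolve does all the smoothing and no further alpha blending occurs.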
Alpha Edge Sharpening
This blog post by Ben Golus goes through some techniques for sharpening the anti-aliasing edge using mipmapping.
This technique has been implemented as a function called compute_alpha_antialiasing_edge in scene_high_end.glsl, and its usage is explained below. The Material3D system uses it by default when AlphaAntialiasing is enabled.
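The core of the sharpening trick from Golus's post can be sketched as follows (a C++ sketch of the math, not the engine's exact function; in GLSL the derivative would come from fwidth(alpha), passed in here as a parameter): rescale alpha so the transition crosses the threshold over roughly one pixel, which keeps the edge crisp even on small mips where the alpha gradient gets blurry.

```cpp
#include <algorithm>

// Rescale alpha around the threshold by the screen-space alpha derivative,
// so the 0-to-1 transition spans about one pixel regardless of mip level.
// The 0.0001f floor avoids division by zero on flat alpha regions.
float sharpen_alpha_edge(float alpha, float threshold, float alpha_fwidth) {
	float a = (alpha - threshold) / std::max(alpha_fwidth, 0.0001f) + 0.5f;
	return std::clamp(a, 0.0f, 1.0f);
}
```

At the threshold itself the result is exactly 0.5; alpha slightly above or below the threshold saturates to 1 or 0 within about one pixel's worth of gradient.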
Render Flags and Usage
AlphaToCoverage, AlphaToCoverage + AlphaToOne
In the spatial shader, the render flags alpha_to_coverage or alpha_to_coverage_and_one can be added to process the result of the shader's ALPHA with either AlphaToCoverage or AlphaToCoverage + AlphaToOne. When either of these is set, the blend_mode is overridden to BLEND_MODE_ALPHA_TO_COVERAGE, which is better suited for blending.
Alpha Scissor
If ALPHA_SCISSOR_THRESHOLD (float, [0,1]) is set in the shader, then alpha values less than the threshold will be discarded.
Alpha Hash
If ALPHA_HASH_SCALE (float, recommended (0,2]) is set in the shader, alpha hashing will be used and alpha values less than the hash are discarded. Alpha Hash Scale simply affects the dithering effect of the alpha hash.
Alpha Edge (Needs Texture)
Alpha Edge requires two variables:
ALPHA_ANTIALIASING_EDGE (float, [0,1]) - Affects the edge point of the edge sharpening. If ALPHA_SCISSOR_THRESHOLD is set, it is added to ALPHA_ANTIALIASING_EDGE and clamped to [0,1].
ALPHA_TEXTURE_COORDINATE (vec2) - The texture coordinate to use. Think uv_coordinate * alpha_texture_size.
Please feel free to ask questions and test it out!
I am looking forward to your feedback.
Thanks,
Marios S.
Bugsquad edit: This closes godotengine/godot-proposals#1273.