Save cluster render shader from being optimized out entirely #76832
On certain platforms (e.g., certain consoles), a fragment shader that doesn't output a fragment color can be heavily optimized, down to a no-op. In practice, I've seen that happen in the cluster render shader. It leverages rasterization but at the same time acts like a compute shader, in that it writes to a buffer but doesn't output any fragment color.
To solve that, this PR lets the RD driver report whether the current platform/driver exhibits such behavior and, where that's true, patches the aforementioned shader by adding an artificial data dependency between a newly added fragment color output and the result of the atomic operations, preventing that kind of heavy optimization from "rendering" the shader useless.
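
For illustration, here is a minimal GLSL sketch of the idea, not the actual patch: the buffer layout, the names (`cluster_data`, `write_pos`, `mask`), and the `USE_ATTACHMENT` define are assumptions. When the driver reports the problematic behavior, an otherwise meaningless color output is added and tied to the atomic result, so the compiler can no longer treat the fragment shader as having no observable effect.

```glsl
// Illustrative sketch only; names and layout are hypothetical, not the PR's code.
#version 450

layout(set = 0, binding = 0, std430) buffer ClusterData {
	uint data[];
} cluster_data;

// Artificial color output, only compiled in when the driver reports that
// fragment shaders without color outputs may be optimized away.
#ifdef USE_ATTACHMENT
layout(location = 0) out vec4 frag_color;
#endif

void main() {
	// Made-up addressing into the cluster buffer.
	uint write_pos = uint(gl_FragCoord.y) * 1024u + uint(gl_FragCoord.x);
	uint mask = 1u;

	// The shader's real work: side effects on a storage buffer via atomics.
	uint prev = atomicOr(cluster_data.data[write_pos], mask);

#ifdef USE_ATTACHMENT
	// Artificial data dependency: the color output depends on the atomic
	// result, so the compiler cannot prove the shader is output-free and
	// eliminate it. The contribution is negligible and effectively invisible.
	frag_color = vec4(0.0, 0.0, 0.0, float(prev & 1u) * 0.0000001);
#endif
}
```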