Are 8-bit outputs sRGB encoded? #988
Comments
I'm unsure this needs to be called out in the spec since that's already defined by canvas. |
It's defined as sRGB color space, but sRGB encoding is different: https://hackmd.io/@jgilbert/sRGB-WebGL |
Yeah, it seems there are two things here... There's the colour space, which is implicitly sRGB, but it might be worth making that clear, since the compositing path for immersive mode doesn't involve the browser compositor. There's also the texture format, which is trickier, since the browsers use linear formats, but the devices might have a preference for a non-linear one. |
The color management chain must not be any different than what is done for a regular web page. Chrome and Safari both have a color correct workflow so they can render with any colorspace into any color device and have the colors match. I believe Firefox doesn't have this. |
I'll highlight again that this is orthogonal to color space. It's a little
tricky, but basically "sRGB" is overloaded to mean both a color space
(colors look the same) and encoding (better darks in 8-bit). They are
technically orthogonal. I think I did a better job of explaining this in
the document I linked.
|
This is also a perf issue on some platforms, where both RGB and sRGB work, but sRGB goes directly to the OS's compositor, and RGB gets an extra blit. We won't be able to match native perf on those platforms if we don't support the sRGB format. |
I was under the impression that most systems allow you to write to an sRGB or linear sRGB buffer, so you wouldn't need an extra blit. |
That's not what we've been told by the device manufacturer 🙁 We've not done the experiments to find out though, partly because the extra blit is there in the conversion from side-by-side to texture arrays, so we need to switch to layers first. |
@asajeffrey does this mean you're looking into implementing layers? If so, I'd love to hear what feedback you have. |
Yeah, it's next on my queue. Our motivation is to get a blit-free path for immersive XR sessions. We'll see if that's possible! |
With Oculus browser's move to OpenXR, this implication that we write sRGB colors to an RGB texture has become an issue for us. I looked at the code that was written by Microsoft and they seem to be doing the same thing. What can we do to work around this? |
/facetoface |
To avoid confusion, please don't just use "sRGB" by itself in this issue. As @kdashg pointed out, it's ambiguous whether this means linear color values vs gamma-adjusted nonlinear values, or a distinction between specific color spaces such as sRGB vs CIE RGB or others. FWIW, https://www.w3.org/TR/webxr-ar-module-1/#xr-compositor-behaviors defines blending modes for compositing the rendered buffers, i.e. "source-over" vs "lighter" for alpha-blend and additive environment blending respectively. The linked formulas appear to assume linear color values. If the provided buffer were using gamma-adjusted nonlinear sRGB, blending would require a conversion to linear color values and then back to gamma-adjusted values for final output. |
That is indeed happening in the compositor: sRGB textures are linearized and then blended. |
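To make the linearize-then-blend point concrete, here is a small JavaScript sketch (illustrative only, not taken from any implementation) comparing a naive blend of 8-bit sRGB values with a blend done in linear light, using the standard sRGB transfer functions:

```js
// 50/50 "source-over" blend of white over black, done two ways.
const srgbToLinear = v => (v <= 0.04045 ? v / 12.92 : Math.pow((v + 0.055) / 1.055, 2.4));
const linearToSrgb = v => (v <= 0.0031308 ? v * 12.92 : 1.055 * Math.pow(v, 1 / 2.4) - 0.055);

// Wrong: averaging the nonlinear 8-bit values directly.
const naive = Math.round((0 + 255) / 2);                                // 128

// Right: decode to linear light, blend, then re-encode for display.
const blended = (srgbToLinear(0 / 255) + srgbToLinear(255 / 255)) / 2;  // 0.5 in linear light
const correct = Math.round(255 * linearToSrgb(blended));                // ≈ 188

console.log(naive, correct); // visibly different mid-grays
```

The two results differ by a visible amount, which is why the compositor has to know whether its inputs are sRGB-encoded or linear before blending.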
It looks as if there are multiple potential issues here:
|
I suspect that this is a web platform issue that applies equally to 2d content.
I'm unsure if that's needed. How is the blending done today on a 2D page? Isn't it the same?
These two points are more about color management. OpenXR has APIs for those and we picked one for WebXR. Would it make sense to expose them? |
Can we please add an unambiguous statement to the spec about what the color encoding (in the sense of linear vs sRGB "gamma" curve) is supposed to be for an immersive session, to avoid diverging implementations? According to @toji the expected behavior is that apps should use sRGB output encoding, matching what they'd use for plain 2D rendering to a canvas. Adding to the confusion, the WebXR Layers API appears to specify RGBA as the default color format for projection layers, which would be a linear encoding: https://www.w3.org/TR/webxrlayers-1/#xrprojectionlayerinittype

```webidl
dictionary XRProjectionLayerInit {
  // [...]
  GLenum colorFormat = 0x1908; // RGBA
};
```

It's also a bit of a trap that apparently OpenXR swapchains need an API-specific sRGB format to avoid being interpreted as linear, so I think an implementation on top of OpenXR could easily use the wrong encoding. I think this is causing real-world issues: according to aframevr/aframe#5444 the Apple Vision Pro appears to be treating WebXR color data as linear, causing dark colors to appear too bright and washed out. @AdaRoseCannon FYI. It would be unfortunate if apps start hardcoding compensating measures based on device name or similar, since then the result would look wrong if an implementation fixes this later. |
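For reference, a minimal sketch (assuming a WebGL2 context `gl` that has been made XR-compatible and an active immersive `session`) of how an app could opt in to a non-default sRGB color format through the Layers API; whether the default should be linear or sRGB is exactly what's being discussed here:

```js
const binding = new XRWebGLBinding(session, gl);
const layer = binding.createProjectionLayer({
  // Non-default: request an sRGB-typed swapchain instead of the RGBA default.
  colorFormat: gl.SRGB8_ALPHA8,
});
session.updateRenderState({ layers: [layer] });

// Each frame, render into the layer's color texture per view:
// const subImage = binding.getViewSubImage(layer, view);
// gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
//                         gl.TEXTURE_2D, subImage.colorTexture, 0);
```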
This is correct, as in: the OS should treat the buffer that is produced by WebXR (or WebGL canvas) as sRGB.
That spec is correct. All compositing in WebXR and WebXR layers is done in linear space.
What is that API specific sRGB format?
This is simply a bug in AVP's rendering pipeline. Their rendering in 3D should match what is done in 2D. WebGL draws the same pixels in either world.
yes, hopefully Apple can fix this soon so authors don't start working around it by adding code based on the user agent. :-@ |
I think we're in agreement here, but as far as I can tell the spec doesn't say this anywhere. Also, I think it would be useful to have a WebXR sample that tests this, for example by showing a dithered pattern next to a color gradient similar to http://www.lagom.nl/lcd-test/gamma_calibration.php . (This brings back memories of the discussions around premultiplied alpha for additive blend mode in immersive-web/webxr-ar-module#14 where there was also an implementation inconsistency.)
Those are two orthogonal issues. Yes, all compositing in general should be done in linear space, otherwise the result is incorrect. However, this does not mean that the inputs need to be provided in a linear buffer. As long as it's properly annotated in a way recognized by the compositor, the input layers can be linear 8-bit RGB, sRGB8, a floating point format, or anything else that's supported. If for example the input layer is marked as an SRGB8 texture, a texture read in the compositor shader will automatically convert it to linear for further processing. (I think I got this wrong in the earlier comment #988 (comment) - it's fine to specify blending algorithms in linear format as long as the input format conversion happens correctly before this, as is the case when a shader reads from a properly typed texture.) Overall, I think it seems better to think of SRGB8 as a weird number format, analogous to floating point, in the sense that it just stores numbers in a different way to preserve accuracy for low intensities.
See the link; it mentions that images submitted in sRGB color space must be created using an API-specific sRGB format. The note there is also relevant: OpenXR applications should avoid submitting linear encoded 8 bit color data (e.g. […]). |
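As a concrete version of the test-pattern idea mentioned above, here is a minimal 2D-canvas sketch (the `<canvas id="gammaTest">` element is purely illustrative): the left half is a black/white dither that averages to 0.5 in linear light, the right half is solid gray 188, the sRGB encoding of linear 0.5. On a display path that decodes the buffer as sRGB, the two halves should look roughly equally bright from a distance; note that browser zoom or display scaling can blur the dither and skew the comparison.

```js
const canvas = document.getElementById('gammaTest');
const ctx = canvas.getContext('2d');
const { width, height } = canvas;

// Left half: alternating 1px full-on / full-off columns (linear average = 0.5).
for (let x = 0; x < width / 2; ++x) {
  ctx.fillStyle = x % 2 ? '#ffffff' : '#000000';
  ctx.fillRect(x, 0, 1, height);
}

// Right half: solid gray at linearToSrgb(0.5) * 255 ≈ 188.
ctx.fillStyle = 'rgb(188, 188, 188)';
ctx.fillRect(width / 2, 0, width / 2, height);
```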
Do canvas 2d, webgl or css specify this? I feel that this is a generally underspecified corner.
I wasn't talking about the inputs; WebGL specifies the behavior there.
I'm unsure what you mean by that.
These are for swapchains for different graphics drivers.
Yes, that would be the case IF there was a conversion. |
I think this just started out as the expected default behavior because CRT monitors happened to have an approximately sRGB response curve, so the pixels that apps wrote into output buffers got interpreted as sRGB. I 100% agree that apps should look the same for 2D and XR output without needing to special-case their color handling for immersive sessions. For example, using the recommended color management in Three.JS adds a linear-to-sRGB conversion for the final rendering result. (See "Output color space" here.)
By "inputs" I meant inputs to the compositor, which is the output of the WebXR app. This does NOT need to be linear RGB. The format can be whatever the XR app and the compositor agree on. If the app provides an XRProjectionLayer with format "RGB", it's linear, and if it provides format "SRGB8", the data is stored in nonlinear format and the compositor converts it to linear when it's reading from the texture buffer. (Otherwise the colorFormat attribute to XRProjectionLayerInit would be pointless?)
Storing color intensities in 8 bits per channel means there are only 256 different values. If you store data as 8-bit linear and eventually convert that to sRGB for final display output, you lose a lot of precision for dark colors since there's no possible 8-bit linear input value to represent them. If you encode data as 8-bit sRGB, you effectively have more bits of precision for dark colors. Here's a JS demonstration of a lossy conversion when forcing colors into an 8-bit linear intermediate encoding:

```js
let p = [];
for (let i = 0; i < 256; ++i) { p[i] = i; }
function srgbToLinear(v) { return v <= 0.04045 ? v / 12.92 : Math.pow((v + 0.055) / 1.055, 2.4); }
function srgbToLinear8bit(v) { return Math.round(255 * srgbToLinear(v / 255)); }
function linearToSrgb(v) { return v <= 0.0031308 ? v * 12.92 : 1.055 * Math.pow(v, 1 / 2.4) - 0.055; }
function linearToSrgb8bit(v) { return Math.round(255 * linearToSrgb(v / 255)); }

p.map(srgbToLinear8bit);
// => [0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 5, 5, 5, 5, 6, 6, 6, 6, 7, 7, 7, 8, 8, 8, 8, 9, 9, 9, 10, 10, 10, 11, 11, 12, 12, 12, 13, 13, 13, 14, 14, 15, 15, 16, 16, 17, 17, 17, 18, 18, 19, 19, 20, 20, 21, 22, 22, 23, 23, 24, 24, 25, 25, 26, 27, 27, 28, 29, 29, 30, 30, 31, 32, …, 253, 255]

p.map(srgbToLinear8bit).map(linearToSrgb8bit);
// => [0, 0, 0, 0, 0, 0, 0, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 22, 22, 22, 22, 22, 22, 22, 22, 28, 28, 28, 28, 28, 28, 34, 34, 34, 34, 34, 38, 38, 38, 38, 42, 42, 42, 42, 46, 46, 46, 50, 50, 50, 50, 53, 53, 53, 56, 56, 56, 59, 59, 61, 61, 61, 64, 64, 64, 66, 66, 69, 69, 71, 71, 73, 73, 73, 75, 75, 77, 77, 79, 79, 81, 83, 83, 85, 85, 86, 86, 88, 88, 90, 92, 92, 93, 95, 95, 96, 96, 98, 99, …, 254, 255]
```

Note the data loss for low-intensity colors. A smooth color gradient from 0-28 only has 13 and 22 as intermediate values, leading to color banding. And this is an inherent problem no matter how the app tries to render. There's simply no way to get final sRGB intensities on screen between 1 and 12; your choices are just 0 or 13. If instead the app provides data to the compositor in a nonlinear sRGB 8-bit encoding, the dark colors keep their precision. (In exchange for having fewer distinct bright colors, but this is far less visually obvious.) Yes, the compositor will internally do linear calculations after reading data from an sRGB texture (and converting to linear), but it does NOT have to crush the values to 8 bits while doing its computations since the GPU does shader calculations at higher internal precision. At the end it has to convert to sRGB, and it's important to avoid having an 8-bit linear intermediate texture format in the path to that.
Yes, but I think the issue here is that textures can be typed as either linear RGB or nonlinear sRGB, and using sRGB is generally opt-in behavior because earlier APIs tended to silently assume linear encodings. Marking a texture as sRGB specifically means that a shader sampling that texture will do an sRGB-to-linear conversion automatically, so that the shader code can assume that all further operations happen in linear color space. The benefit of doing so is that the GPU typically does fragment computations in a higher-accuracy internal format, and the resulting data has more than 8 bits of linear precision for dark colors.
Marking a texture as sRGB specifically means that there is an auto-conversion happening when the texture gets read by the compositor's shader.
I don't understand what you mean here. Yes, if an app creates sRGB output values (as usual for final output) and the compositor interprets it as a type=linear texture, it will be too bright. The texture needs to be marked as sRGB so that the texture read correctly converts it to linear. That's supposed to be the default behavior, and I think should be the recommended method for the Layers extension by marking the projection layer as colorFormat=SRGB. |
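A short WebGL2 sketch of the opt-in texture typing described above (assuming an existing WebGL2 context `gl` and a decoded `image`): uploading the data with an sRGB internal format is what makes shader samples come back already converted to linear.

```js
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
// SRGB8_ALPHA8 declares "the stored bytes are nonlinear sRGB"; texture() in the
// shader then returns linear values, so lighting/blending math stays linear.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.SRGB8_ALPHA8, gl.RGBA, gl.UNSIGNED_BYTE, image);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
// With a plain RGBA8 internal format the same bytes would be sampled as-is,
// i.e. treated as if they were already linear.
```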
For the email record, I just edited the previous comment to add a missing section: [...] your choices are just 0 or 13. If instead the app provides data to the compositor in a nonlinear sRGB 8-bit encoding, the dark colors keep their precision. (In exchange for having fewer distinct bright colors, but this is far less visually obvious.) Yes, the compositor will internally do linear calculations after reading data from a sRGB texture (and converting to linear), but it does NOT have to crush the values to 8 bits [...] |
That's a bit misleadingly phrased. That's the case for a traditional display pipeline, but in the end it's up to the GPU and display hardware how the final output gets turned into light. The point is that it is normally able to show distinct colors for low-intensity values beyond what an 8-bit linear encoding can represent. |
OpenXR made the decision to composite in sRGB so it's still the case there. There are of course steps after the final composite to map it to the current display profile. |
This is not what I said.
No, that would be incorrect. The default of the layers API should match the default WebGL canvas behavior which is rgb; regular WebXR has the same default. |
Thanks for the discussion. I can hardly follow :). My two cents: regardless of the specifics of a solution, I agree behavior should be consistent between regular WebGL and immersive, and also across browsers. A-Frame might implement a user agent check as an interim solution unless a fix is coming soon (on the AVP side?). There are no mentions of Safari in the Vision OS 1.1 beta. I imagine a fix might take at least weeks or a few months. |
Please wait for Apple to fix this obvious bug. Don't add temporary workarounds that might have other side effects. |
+1. (I guess if someone wants to develop an application to be ready for a future fixed browser version, it would be OK to temporarily add a local workaround for that, but it shouldn't be deployed on an end user facing web site.)
That sounds wrong, and I don't know if it's just a terminology mismatch. I think we're in agreement that the web app writes its output into the opaque framebuffer using sRGB encoding, same as if it was targeting a 2D display. The browser needs to send this data to the system compositor, and basically has two choices:
When you say "the browser creates linear output values", do you mean it produces numbers that the compositor can use as-is? That's correct assuming that the compositor is expecting sRGB.
I don't get this part. Regular WebGL canvas is interpreted as sRGB; that's why, for example, the recommended Three.JS rendering setup puts a linear-to-sRGB conversion at the end of its render pipeline.

I think it would be wrong if a Layers extension XRProjectionLayer with colorFormat=RGBA would treat the data as sRGB and for example pass it on to the compositor without conversion to a GL_SRGB8_ALPHA8 OpenXR swapchain buffer. While that would match the default WebXR rendering, I think it doesn't make sense - what would colorFormat=SRGB8 do differently then? To the best of my knowledge, the default WebXR (and 2D canvas) rendering is equivalent to SRGB8 and would match being used with an OpenXR SRGB8 swapchain. |
I was being sloppy in the last paragraph about alpha channel details - the important part is nonlinear sRGB vs linear RGB, and either one can have an alpha channel added if needed. The core question is, what's the behavior difference between colorFormat=RGBA and colorFormat=SRGB_ALPHA_EXT? If the default of colorFormat=RGBA is intended to match default WebXR behavior (which should match 2D canvas which expects sRGB data), what does colorFormat=SRGB_ALPHA_EXT do differently? |
That may be "wrong" but that is how browsers work.
No, we are not in agreement. The framebuffer is in linear RGB same as WebGL. The swapchain is in sRGB.
No, the browser creates linear values. Technically this is wrong but that is how they're exposed to the page.
It only does that conditionally and looking at the code, only for sRGB render targets.
No, changing the default to sRGB would break each WebXR experience because now drawing with RGB will get gamma correction applied.
/agenda discuss linear vs sRGB |
See Updates to Color Management in three.js r152 - the point is that while it's natural to do rendering in a linear space, treating the output as linear is wrong, leading to plasticky-looking unrealistic shading. Setting the renderer's output color space to sRGB is the recommended configuration there. |
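For the record, the relevant three.js configuration is roughly this (a sketch assuming three.js r152 or later, where sRGB output is the default):

```js
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer({ antialias: true });
// Lighting and blending happen in linear space internally; the final shader pass
// encodes to sRGB before writing to the (linear-typed, sRGB-interpreted) buffer.
renderer.outputColorSpace = THREE.SRGBColorSpace;
renderer.xr.enabled = true; // the same output handling is expected for WebXR
```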
Regardless of what three.js did, colors between WebGL and WebGL under WebXR must match. Marking the destination as sRGB will break this. |
Ah, I think I figured out a potential disconnect. GPU Gems 3 - The Importance of Being Linear says in 24.4.2:
What I was talking about was that reading from an sRGB buffer will automatically apply an sRGB-to-linear conversion. However, it appears that writing to an sRGB framebuffer will have the GPU auto-apply gamma correction, and we indeed don't want that for WebXR content. So the canvas and default WebXR projection layer contain sRGB data, and need to be treated as sRGB when being read by a compositor, or when sent directly to a display, but the conversion to sRGB needs to happen manually for historical reasons. So marking an output texture as sRGB would be wrong if this activates the GPU's auto gamma correction, since we don't want that.

Basically, we need a hybrid texture that appears as linear RGB when being written to (because the writer already should have done gamma correction), contains sRGB data, and should be read by the compositor as an sRGB texture. Does that clear things up? Apologies for the misunderstanding on my part, I wasn't aware of the auto sRGB gamma correction when writing to declared sRGB textures, assuming that that's what's happening here.

So in the Layers API, declaring the texture as SRGB8_ALPHA8 would mean that the GPU converts to sRGB on write, and the app should skip doing its own gamma correction. In either case the content of the texture is sRGB format color values, and would be read as such by the compositor. |
This interpretation matches the OpenXR backend in Chrome - it tells OpenXR that the swapchain is DXGI_FORMAT_R8G8B8A8_UNORM_SRGB, but sets up the browser-side shared image as linear RGBA8888. In any case, I think this kind of hybrid buffer is rather surprising and I think the spec would benefit from being clearer about that. (Or is this obvious to everyone else and I'm just being dense?) |
+1
Yes, this is exactly the way it's implemented in the Quest browser. I suspect that the developers on AVP Safari missed this step.
no need for apologies. It IS a confusing subject :-) |
It is NOT obvious, which is why I raised this issue in the past. I don't know how to fix it in spec text though, since this behavior is not limited to just WebXR. |
@toji pointed me towards https://registry.khronos.org/OpenGL/extensions/ARB/ARB_framebuffer_sRGB.txt which explains the reading and writing behavior. Again, sorry about causing confusion by not being aware of the conversion-on-write behavior, but it still seems weird to me to be working with allegedly linear buffers containing sRGB data. From the overview:
(The following sections go into detail how sRGB transforms are applied on read and write.) |
At a very high level, maybe something along these lines?

WebXR framebuffers always contain sRGB data, meaning that the pixels are intended to be interpreted as a nonlinear intensity curve roughly similar to a traditional CRT display. However, to remain compatible with 2D canvas rendering, the default behavior is to NOT do any automatic color curve conversion when drawing to the buffer. Applications need to do their own gamma correction. The recommended method is to do internal lighting calculations in a linear space (see The Importance of Being Linear), and then apply a linear-to-sRGB conversion as a final shader step when writing the output fragment. The browser and GPU will NOT do this for you automatically. If you are using a rendering software package, you should set it to produce nonlinear sRGB output.

Optionally, when using the Layers extension, applications can set up an XRProjectionLayer using a non-default sRGB format. Doing so enables automatic gamma correction: the linear internal shader data is auto-converted to sRGB when being written by the GPU. In this mode, do NOT add your own linear-to-sRGB conversion; for example, you should configure your rendering engine to use a linear output color space.

Remember that the final buffer content must always be sRGB; the distinction is simply whether you do the conversion to sRGB yourself, or ask the GPU to do it automatically. When using a linear RGB colorFormat, your application must apply gamma correction and produce sRGB data from its shaders. When using a nonlinear sRGB colorFormat, the GPU does gamma correction automatically, and your shaders are expected to produce linear data so that they can be converted correctly. (Yes, this sounds backwards and is confusing.)

The default WebXR behavior when not using the Layers API is equivalent to colorFormat=RGBA, meaning that the application must do gamma correction and produce sRGB format data from its shaders. WebXR does not provide any way to produce buffers whose content would be interpreted as linear by the system compositor. (This would also be undesirable since linear 8-bit buffers typically produce color banding due to poor granularity for dark colors.)

For implementers: when not using the Layers API, or when using a Layers API linear RGB output format, the WebXR opaque framebuffer needs to be set up as a hybrid texture. It contains sRGB data, but needs to be exposed to client rendering code as a linear RGB texture so that the GPU does not apply an automatic linear-to-sRGB conversion on write. When this data is passed on to a system compositor or other reader, it must be declared as being in sRGB format (matching its actual data content) so that compositing correctly applies an sRGB-to-linear conversion in the GPU when reading the texture.

When implementing the Layers API, it should work to directly use the provided colorFormat as the texture format for the framebuffer that the client is rendering into, and this should cause the GPU to automatically apply linear-to-sRGB conversion for sRGB colorFormats. However, the texture must always be assumed to contain sRGB data and must be passed to the system compositor as an sRGB format as-is (with no separate data conversion step), using the appropriate vendor format such as GL_SRGB8_ALPHA8 or DXGI_FORMAT_R8G8B8A8_UNORM_SRGB. |
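To illustrate the "manual gamma correction as a final shader step" wording above, here is a sketch of a GLSL ES 3.00 fragment shader embedded as a JS string; the names are illustrative and not taken from any spec or sample:

```js
const fragmentSrc = `#version 300 es
precision highp float;
in vec3 vLinearColor;   // lighting and blending were done in linear space upstream
out vec4 fragColor;

vec3 linearToSrgb(vec3 c) {
  // Piecewise sRGB encode; pow(c, vec3(1.0/2.2)) is a common cheaper approximation.
  return mix(c * 12.92,
             1.055 * pow(c, vec3(1.0 / 2.4)) - 0.055,
             step(vec3(0.0031308), c));
}

void main() {
  // The default (linear-typed) WebXR framebuffer will not encode for us,
  // so the shader writes sRGB-encoded values itself.
  fragColor = vec4(linearToSrgb(vLinearColor), 1.0);
}`;
```

With an sRGB-typed Layers colorFormat, the call to linearToSrgb would be dropped, since the GPU then performs the encode on write.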
I don't think we should say that. That implies that WebXR WebGL is different from regular WebGL.
This is not correct. The projection layer has to be in RGB; just like it is in XRWebGLLayer. (You can query the format in GL)
There is no reason to call out layers. Default projection layer uses the same code path as XRWebGLLayer.
That would be up to the system compositor. I don't think we should say this.
No difference between Layers and XRWebGLLayer. As you pointed out, android and desktop chrome are doing this as well.
This is also system-dependent, so we can't put that in the spec... Maybe we can say that colors created in WebXR or WebXR Layers with a (linear) RGBA color format MUST be processed as if they are in the sRGB color format. That is basically what our implementations are doing. |
The point is that this is the same as regular WebGL, and that's also what a normal canvas WebGL application should be doing. (See the three.js documentation I had linked earlier.) Setting Three.JS to use sRGB output encoding looks correct both on a 2D screen and in WebXR. (It currently looks wrong in Apple Vision Pro since that does NOT interpret the data as sRGB. It treats the data as linear, meaning it looks much too bright for dark colors, and doesn't match the 2D mode.)
I meant that the pixel data is going to be interpreted as sRGB once compositing is complete. The format as queried by GL needs to be RGB, but technically that's a lie since it doesn't match the pixel interpretation. |
Even if you don't set sRGB encoding, the output will be different in Vision Pro. You can see it in the WebXR samples that draw directly without an encoding. |
I think that's expected; as far as I understand there's no way to actually set an encoding on a WebGL canvas or a WebXR render buffer (other than using Layers with an SRGB colorFormat). Three.js's output encoding and color management is purely internal to the app; it just changes internal calculations. If the Vision Pro treats the WebXR buffer as linear, that would be expected to affect all WebXR applications consistently. (If it's inconsistent, that would mean something else is going on.)

There are various ways to get correct-looking results in a WebGL/WebXR app. Modern Three.JS prefers to work in linear space internally and reads from source textures by marking them as sRGB (most image sources use that), which makes the GPU convert from sRGB to linear when sampling them. It does lighting and blending in linear space. Then, it does a final linear-to-sRGB conversion in a shader when writing output pixels, assuming it's configured to use sRGB output, which is the normal setting in this mode.

Alternatively, an old-style renderer may be completely oblivious to sRGB vs linear issues. It can read data from textures marked as linear RGB even though they contain nonlinear data, and write the result to the output framebuffer without doing any conversion. That way, the data gets written in its nonlinear form to the output buffer, and it looks (more or less) correct when the output interprets it as sRGB data. This works fine for playing videos for example, but isn't quite right when doing blending or lighting since that should happen in linear space. I get the impression that the WebXR sample for video playback does this - as far as I can tell the input video texture is just GL.RGBA and gets written to the output buffer as-is, with no gamma correction, so it writes the video's presumably nonlinear pixels directly to the output. (I may have missed some post-conversion, but couldn't find any relevant-looking matches for […].)

However, I am confused by the PBR shader's gamma correction being disabled - does the rest of the PBR shader already work in sRGB space internally? The glTF spec says that colors are specified as sRGB, so maybe it just works with those directly? However, it's supposed to convert to linear for lighting and then back to nonlinear for display, and I'm a bit confused how this works. @toji, can you chime in? Your commit message that disabled gamma correction just says "Look, just... yeah. It's complicated." I understand the sentiment ;-) |
Apologies for the triple post (on top of being annoyingly verbose about this in general). Github had appeared to fail to submit the post, but had actually done so quietly in the background. |
WebGL is supposed to use the 'srgb' predefined color space for its drawing buffer by default. But I'm not confident in how that's interpreted? It looks like the default framebuffer is allocated with a linear texture format initially, though it can be overridden to an explicitly sRGB format by calling drawingBufferStorage(). (It is worth mentioning that Three.js does not call drawingBufferStorage() at all, so I think we should assume that Three.js is always writing to a linear drawing buffer.)
The typical way that this works is that when loading a glTF file you would place images like the baseColorTexture in a texture that uses an sRGB format, and images like normal and metallic/roughness maps in textures with a linear RGB format. All graphics APIs will convert from the color encoding of the texture to linear when values are sampled in the shader, so as long as you used the right formats for each texture the conversion you mentioned will be done for you. Similarly, when writing out to a framebuffer the values are always given linearly and converted to the encoding of the framebuffer format internally. If your render target is an sRGB format then the conversion is implicit. But, as I pointed out above, it looks like by default the WebGL framebuffer will be allocated with a linear format.
I apologize for not responding directly so far, this continues to be an area of significant complexity and nuance that I find I have to re-learn every time it comes up. My confidence level in the correctness of my understanding is middling. |
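A heavily hedged sketch of the drawingBufferStorage() path mentioned above (this is a fairly new WebGL API, and the exact calls and supported formats should be treated as assumptions that may vary by browser):

```js
const canvas = document.querySelector('canvas');
const gl = canvas.getContext('webgl2');

// Default situation: a linear-typed RGBA8 drawing buffer whose pixels are
// nevertheless interpreted as sRGB at composite time, so the app must do its
// own gamma correction in the shader.
gl.drawingBufferColorSpace = 'srgb';

// Opt-in: explicitly sRGB-typed backing storage; the GPU would then apply the
// linear-to-sRGB encode on write and shaders should output linear values.
gl.drawingBufferStorage(gl.SRGB8_ALPHA8, canvas.width, canvas.height);
```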
I added some clarifying text to the WebXR Layers spec. It matches the logic of both Chrome and the Quest browser.
Rik's PR language matches my understanding as I gave above in terms of what's actually happening. (Putting aside whether or not that's the "best" way to handle it.) One thing that I guess I'm still trying to wrap my head around, though, is how that logic actually gets applied in a typical compositor. For example, something like OpenXR. If we look at https://registry.khronos.org/OpenXR/specs/1.0/man/html/XrSwapchain.html it says "Images submitted in sRGB color space must be created using an API-specific sRGB format (e.g. DXGI_FORMAT_R8G8B8A8_UNORM_SRGB, GL_SRGB8_ALPHA8, VK_FORMAT_R8G8B8A8_SRGB) to apply automatic sRGB-to-linear conversion when read by the runtime. All other formats will be treated as linear values." So does that imply that a bit-exact copy from the linear-but-containing-sRGB-values WebGL texture to the OpenXR swapchain sRGB texture must occur? Maybe Rik could give some insight into how that bit works in the Meta browser? |
No, there is no copy. The same underlying texture data is used by the RGB and sRGB textures.
Klaus pointed out above how this works in Chrome. |
Thank you Rik, I think your https://github.com/immersive-web/layers/pull/305/files addresses the main concern. I think it would be good to mention somewhere that the default WebXR projection layer (when not using Layers API) acts the same as colorFormat=RGBA/RGB, but I don't know a good place to put that.
I think we've reached a consensus here? I feel the important point is to distinguish between "buffer contains linear vs nonlinear pixels" (the mapping from numbers to intended brightness) vs "buffer's declared type is SRGB/RGB" which controls the GPU's automatic gamma conversion to/from sRGB when reading from or writing to a buffer. In those terms, the default WebGL/WebXR buffers are expected to contain nonlinear pixels. For WebGL/WebXR rendering purposes it needs to be declared as a linear RGB/RGBA type for historical compatibility to ensure the GPU doesn't do gamma correction on output (the app must handle this). The system compositor needs to consume it as sRGB pixels. (Typically this would be done by marking it as a SRGB type and letting the GPU convert to linear on read for further compositing, but that's an implementation detail as long as the end result is equivalent.) |
That should likely be in the WebXR spec. I'll see if I can find a place for it. |
I am quite confident that my understanding of this part of rendering is noticeably less than Brandon's. Much of this is touching on work within Khronos where I start to rely on other experts. What I do know is the following:
If this is an important discussion item, we should try to get Emmett to attend. |
What you list is mostly on the input side, so it doesn't apply. |
They probably should be spec'd as such, unless there's major hardware that otherwise supports XR but doesn't support sRGB outputs.