[Windows] Unable to create non-SRGB framebuffer #1175
Can you call this function and see if you actually got an sRGB context on Linux? https://docs.rs/glutin/0.22.0-alpha1/glutin/struct.ContextWrapper.html#method.get_pixel_format
Can you call this function to check if you got an EGL context on Linux? https://docs.rs/glutin/0.22.0-alpha1/glutin/struct.Context.html#method.get_egl_display
Currently sRGB support should work with GLX, but not EGL. No clue about WGL. On macOS I have a burning suspicion it is broken (#1160). |
Executing the following immediately after creating the display:

use crate::glium::glutin::os::ContextTraitExt;
println!("{:#?}", display.gl_window().get_pixel_format());
println!("{:?}", unsafe { display.gl_window().get_egl_display() });

... with
With
EDIT: I guess it would be useful to see the output of |
Right, hmmm. On Linux, can you check if it thinks it's sRGB? https://community.khronos.org/t/check-color-encoding-in-the-default-framebuffer-draw-buffer-for-srgb/72854
Also, can you provide some hardware info: |
Thanks for answering so fast! :)
I never executed raw OpenGL calls when using
Did you mean
|
I've never used glium, sorry. |
So ok, I ran this code (using glutin and the gl crate directly):

let events_loop = glutin::EventsLoop::new();
let wb = glutin::WindowBuilder::new();
let gl_window = glutin::ContextBuilder::new()
    .with_srgb(false) // <------------------------------------------
    .build_windowed(wb, &events_loop)
    .unwrap();
let gl_window = unsafe { gl_window.make_current().unwrap() };

gl::load_with(|symbol| gl_window.get_proc_address(symbol) as *const _);

let mut out: gl::types::GLint = 0;
unsafe {
    gl::GetFramebufferAttachmentParameteriv(
        gl::DRAW_FRAMEBUFFER,
        gl::BACK,
        gl::FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING,
        &mut out as *mut _,
    );
}

println!("out: {}", out);
if out == gl::LINEAR as i32 {
    println!(" linear");
} else if out == gl::SRGB as i32 {
    println!(" srgb");
}

It always prints srgb, regardless of the value passed to with_srgb.
I only checked this on Ubuntu now, but I might check it on Windows later. In any case, this seems like a bug somewhere, right? The framebuffer is sRGB but it shouldn't be. Any idea what could be wrong here? EDIT: I tested it on Windows and it gets stranger. Regardless of the
The shown color is
I uploaded my test program now: https://github.com/LukasKalbertodt/srgb-test |
So on Windows (with WGL) what might be happening is that the function
From there, any call to
@zegentzy Should |
@sumit0190 If the claim is that the format is not actually sRGB but
On Windows, the format is decided by calling
If a format is found after calling that function, we use that format, as we cannot change it, see the remarks for SetPixelFormat:
-- https://docs.microsoft.com/en-us/windows/desktop/api/wingdi/nf-wingdi-setpixelformat
If not found (as in previously unset), we either call
Looking over the code I see one potential issue which should be investigated by someone in possession of a Windows computer: |
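For readers who haven't touched WGL, the decision flow described above looks roughly like the sketch below. The helper names (decide_pixel_format, choose_pixel_format_arb, choose_pixel_format) are hypothetical stand-ins for glutin's internals and the Win32 calls (GetPixelFormat, wglChoosePixelFormatARB, ChoosePixelFormat); this illustrates the flow and is not actual glutin code.

```rust
/// Hypothetical sketch of the Windows pixel-format decision flow described above.
struct PixelFormatRequirements {
    srgb: bool,
}

fn decide_pixel_format(
    already_set: Option<i32>, // what GetPixelFormat reported for the window's DC
    arb_available: bool,      // is WGL_ARB_pixel_format usable?
    reqs: &PixelFormatRequirements,
) -> Result<i32, ()> {
    if let Some(format) = already_set {
        // A format was set on this window earlier. Per the SetPixelFormat remarks,
        // it cannot be changed afterwards, so it is used as-is -- whatever it is.
        return Ok(format);
    }
    if arb_available {
        // Build a 0-terminated list of (attribute, value) pairs -- this is where
        // FRAMEBUFFER_SRGB_CAPABLE_ARB/EXT would go -- and let the driver pick.
        choose_pixel_format_arb(reqs)
    } else {
        // Legacy fallback; knows nothing about sRGB.
        choose_pixel_format(reqs)
    }
    // The chosen format is then applied exactly once with SetPixelFormat.
}

// Dummy stand-ins so the sketch compiles on its own.
fn choose_pixel_format_arb(_reqs: &PixelFormatRequirements) -> Result<i32, ()> {
    Ok(1)
}
fn choose_pixel_format(_reqs: &PixelFormatRequirements) -> Result<i32, ()> {
    Ok(1)
}

fn main() {
    let reqs = PixelFormatRequirements { srgb: false };
    println!("{:?}", decide_pixel_format(None, true, &reqs));
}
```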
Numerous things:
|
Good point. I changed it but it didn't change anything on Ubuntu or Windows. I pushed the change to the repo I linked above. I also checked
I'm not quite sure what you want to say by that, sorry! And could someone clarify what state this issue is in? Like:
|
There's definitely something funky going on here, although my cursory knowledge of OpenGL/glutin may not be enough to figure it out and @zegentzy might have to help me out in the end. Interestingly, at least on my machine, changing |
@sumit0190 Can you check that |
@zegentzy So here's some more interesting stuff.
So the
Then I decide to use my own dummy descriptor, with some random values for all attributes. This is what that looks like:

let descriptor =
    [
        gl::wgl_extra::DRAW_TO_WINDOW_ARB as i32, 1 as i32,
        gl::wgl_extra::SUPPORT_OPENGL_ARB as i32, 1 as i32,
        gl::wgl_extra::DOUBLE_BUFFER_ARB as i32, 1 as i32,
        gl::wgl_extra::PIXEL_TYPE_ARB as i32, gl::wgl_extra::TYPE_RGBA_ARB as i32,
        gl::wgl_extra::COLOR_BITS_ARB as i32, 32 as i32,
        gl::wgl_extra::DEPTH_BITS_ARB as i32, 24 as i32,
        gl::wgl_extra::STENCIL_BITS_ARB as i32, 8 as i32,
        // gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_EXT as i32, pf_reqs.srgb as i32,
        0 as i32, // End
    ];

Still the same result though; a format of
But if I uncomment the commented line, suddenly things change: now, a format of
Now that makes me think: it looks like
So I make these modifications:

// Find out if sRGB is needed and explicitly set its value to 0 or 1.
if extensions
    .split(' ')
    .find(|&i| i == "WGL_ARB_framebuffer_sRGB")
    .is_some()
{
    out.push(
        gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_ARB as raw::c_int,
    );
} else if extensions
    .split(' ')
    .find(|&i| i == "WGL_EXT_framebuffer_sRGB")
    .is_some()
{
    out.push(
        gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_EXT as raw::c_int,
    );
} else {
    return Err(());
}
out.push(pf_reqs.srgb as raw::c_int);

And this should do the same thing as my dummy descriptor. But even though I can see the attribute ( |
// Find out if an sRGB extension is present then explicitly set its value to 0 or 1.
if extensions
    .split(' ')
    .find(|&i| i == "WGL_ARB_framebuffer_sRGB")
    .is_some()
{
    out.push(
        gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_ARB as raw::c_int,
    );
    out.push(pf_reqs.srgb as raw::c_int);
} else if extensions
    .split(' ')
    .find(|&i| i == "WGL_EXT_framebuffer_sRGB")
    .is_some()
{
    out.push(
        gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_EXT as raw::c_int,
    );
    out.push(pf_reqs.srgb as raw::c_int);
} else {
    if pf_reqs.srgb {
        return Err(());
    }
}
Nvm, didn't read hard enough, you said it still doesn't work in the end... hmmmm. |
Can you comment out the code specifying things like alpha, stencil bits, etc., then add them back one by one? The goal is to discover if one of the other options is conflicting with/overriding what we set |
We'd still need to modify it anyway, since if it's left unspecified it's somehow assumed to be true (this is contrary to the documented behavior; I should try this on Linux sometime). Anyway, I was doing exactly what you suggested and narrowed down the culprit to
So just to be clear: if |
Can you run this program and check the available pixel formats: http://realtech-vr.com/admin/glview This just strikes me as shitty intel drivers. |
Available pixel formats seem right to me. And I did my investigation on an Nvidia Quadro with recent-ish drivers (although theirs have been known to be shitty as well, so you never know). Also, I just verified that setting |
@zegentzy Here's an interesting link that chronicles similar oddities:
Note that while the thread itself is from 2014, the last post (by which time the issue hadn't been resolved yet) is from a year ago. Interestingly, on my Linux VM,
@sumit0190 Is there a format with non-zero alpha bits which is also not-srgb? |
@zegentzy Not that I can find. Anyway, from what I can observe, this behavior is visible outside of
I created this table to better document this weird behavior. Note that while this experiment was done with the C++ version,
Without further ado:
Default attributes:
Pixel formats:
Observations:
Summary:
@zegentzy Thoughts? |
Silly driver issues are silly. Not much we can do. Please file this as a PR: #1175 (comment) I'm opening separate issues for X11 and macOS, so that this issue doesn't get too cluttered. Edit: anyways, I'll close this issue as wontfix (as there is nothing we can do) once you got that PR filed. Thanks for your time :) |
Apologies, @LukasKalbertodt, you sorta got drowned out.
What I'm saying is, while that code is satisfactory for testing if the FBO is linear/sRGB, actually drawing to it would cause silly stuff. I just wanted to point that out, although it wasn't actually relevant to what it was demonstrating.
Yeah, something looks wrong, although I'm not all too familiar with how sRGB works. Yeah, it's not your fault.
Yeah, I've experienced sRGB issues. I usually fiddle with the supplied colors until I get my expected color.
Most likely there is little we can do. If the drivers are broken, the drivers are broken. Feel free to scheme up workarounds like that in #1175 (comment). If they aren't too intrusive, I'm fine with including them. |
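As an aside on the "fiddle with the supplied colors" workaround mentioned above: the fiddling can be done exactly rather than by eye. If the framebuffer ends up sRGB-encoding your output even though you asked for a linear one, supplying the linear value whose sRGB encoding equals the color you actually want lands on that color in one step. A minimal sketch using the standard sRGB transfer function (my own illustration, not glutin/glium API):

```rust
/// Inverse sRGB transfer function: given the color you want to see on screen
/// (as a 0..1 sRGB value), return the linear value to supply from the shader
/// so that the framebuffer's linear -> sRGB encoding lands on the desired color.
fn srgb_to_linear(srgb: f32) -> f32 {
    if srgb <= 0.04045 {
        srgb / 12.92
    } else {
        ((srgb + 0.055) / 1.055).powf(2.4)
    }
}

fn main() {
    // Want #808080 (0.5) on screen although the framebuffer sRGB-encodes?
    // Then output roughly 0.214 instead of 0.5.
    println!("{:.3}", srgb_to_linear(0.5)); // prints 0.214
}
```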
@zegentzy I'll have that PR ready by tomorrow. It was fun investigating this; I hope to be able to contribute more to the project! @LukasKalbertodt With my PR, sRGB rendering will be controlled by a |
Well, with your specific drivers, yes. Other drivers might behave differently, so users should still set |
@zegentzy Well that's the thing: I tested this with Nvidia, Intel and AMD and got the same results on all three when testing with Windows 10 (with the latest drivers). OP's Intel behaves the same way. All this makes me think that it's at least partially due to WGL, but I could be wrong. So yeah, I don't think this has anything to do with the driver version. |
@sumit0190 Older/newer implementations might behave differently, idk. Maybe I'm reading the extension wrong. Maybe none of the drivers bothered caring. Nevertheless, users should set |
Sorry for the super late reply!
No problem! Thank you two for investigating this so quickly! I unfortunately do not understand everything you wrote as I have no idea about the windowing side of OpenGL. As I understand all comments: "drivers are bad & strange, set
So I think this all is pretty unfortunate because AFAICT most glium programs on most machines are just "wrong" then. Meaning: colors are incorrectly converted. I mean, from the artistic point of view it doesn't matter because you just tweak your colors until you like it. But returning |
@LukasKalbertodt Is this still an issue? |
I just tested this again with more or less the code from my first comment (glium 0.27.0):

#[macro_use]
extern crate glium;

use glium::Surface;

fn main() {
    let events_loop = glium::glutin::event_loop::EventLoop::new();
    let wb = glium::glutin::window::WindowBuilder::new()
        .with_inner_size(glium::glutin::dpi::LogicalSize::new(1024.0, 768.0))
        .with_title("Hello world");
    let cb = glium::glutin::ContextBuilder::new().with_srgb(true);
    let display = glium::Display::new(wb, cb, &events_loop).unwrap();

    #[derive(Copy, Clone)]
    struct Vertex {
        position: [f32; 2],
    }
    implement_vertex!(Vertex, position);

    let shape = vec![
        Vertex { position: [-1.0, -1.0] },
        Vertex { position: [-1.0, 1.0] },
        Vertex { position: [ 1.0, -1.0] },
        Vertex { position: [ 1.0, 1.0] },
    ];
    let vertex_buffer = glium::VertexBuffer::new(&display, &shape).unwrap();
    let indices = glium::index::NoIndices(glium::index::PrimitiveType::TriangleStrip);

    let vertex_shader_src = r#"
        #version 140
        in vec2 position;
        void main() {
            gl_Position = vec4(position, 0.0, 1.0);
        }
    "#;
    let fragment_shader_src = r#"
        #version 140
        out vec4 color;
        void main() {
            color = vec4(vec3(0.5), 1.0);
        }
    "#;

    let program = glium::Program::new(
        &display,
        glium::program::ProgramCreationInput::SourceCode {
            vertex_shader: vertex_shader_src,
            tessellation_control_shader: None,
            tessellation_evaluation_shader: None,
            geometry_shader: None,
            fragment_shader: fragment_shader_src,
            transform_feedback_varyings: None,
            outputs_srgb: false, // MARK B ----------------------------------
            uses_point_size: false,
        }
    ).unwrap();

    loop {
        let mut target = display.draw();
        target.draw(&vertex_buffer, &indices, &program, &uniform!{}, &Default::default()).unwrap();
        target.finish().unwrap();
    }
}

And I indeed still get exactly the same colors for the four scenarios I described. So yes, this issue still exists as far as I can tell. I just tested on Ubuntu 20.04 now, though. |
@LukasKalbertodt In my situation, it worked as expected to add
Btw, why does no conversion from sRGB to RGB happen in the case of "with_srgb(false) and outputs_srgb: true"?
@Boscop Yes adding
Regarding your question, see my first comment, in particular:
|
@LukasKalbertodt Thanks, I read that, but why does that mean that no conversion happens (if the framebuffer is RGB but the shader outputs sRGB)? |
To be honest, I already forgot most of the details. I can't tell you why this is, but that's apparently how OpenGL works. For example, also see this StackOverflow answer.
|
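To Boscop's question above: as far as I understand OpenGL's sRGB rules (my own summary, not something stated in this thread), the only conversion GL ever applies when writing fragment output is linear -> sRGB, and only when the color attachment is sRGB-capable and GL_FRAMEBUFFER_SRGB is enabled. There is no sRGB -> linear conversion on the write path, so output hitting a linear framebuffer is stored exactly as written, regardless of which color space the shader "thinks" it is in. A tiny sketch of that rule:

```rust
/// Sketch (not glium/glutin code) of OpenGL's write-time sRGB rule.
fn write_conversion(attachment_is_srgb_capable: bool, framebuffer_srgb_enabled: bool) -> &'static str {
    if attachment_is_srgb_capable && framebuffer_srgb_enabled {
        "fragment output is treated as linear and encoded to sRGB on write"
    } else {
        // Every other combination -- including "sRGB shader output, linear
        // framebuffer" -- writes the value exactly as the shader produced it.
        "fragment output is written unchanged"
    }
}

fn main() {
    // In the with_srgb(false) + outputs_srgb: true case discussed above, glium
    // leaves GL_FRAMEBUFFER_SRGB disabled, so no conversion happens either way:
    println!("{}", write_conversion(false, false));
}
```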
Fixed in #1435, I guess. |
Hello there!
I have found a few other issues related to sRGB, but they simply do not help me and I am still incredibly confused. My problem is that ContextBuilder::with_srgb does not seem to have any effect. Check out this minimal example (using glium):

There are two places of interest which I will explain later:

.with_srgb(_) // MARK A
outputs_srgb: _, // MARK B

My test setup is the following: I run this program and take a screenshot and then inspect what color value the pixels have. For the four possible configurations, I get the following values:
|  | .with_srgb(false) | .with_srgb(true) |
| --- | --- | --- |
| outputs_srgb: false | #bbbbbb | #bbbbbb |
| outputs_srgb: true | #808080 | #808080 |
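For reference (my own arithmetic, not part of the original report): #bbbbbb is roughly what you get when the linear 0.5 the shader writes is run through the standard sRGB transfer function, while #808080 is 0.5 written without any conversion. A quick check:

```rust
/// Standard linear -> sRGB transfer function.
fn linear_to_srgb(linear: f32) -> f32 {
    if linear <= 0.0031308 {
        linear * 12.92
    } else {
        1.055 * linear.powf(1.0 / 2.4) - 0.055
    }
}

fn main() {
    let encoded = linear_to_srgb(0.5);
    // Prints roughly "0.735 -> #bc": close to the observed #bbbbbb (small
    // rounding/driver differences), whereas unconverted 0.5 stays at #80.
    println!("{:.3} -> #{:02x}", encoded, (encoded * 255.0).round() as u8);
}
```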
So as far as I understand: there is the GL_FRAMEBUFFER_SRGB flag. If that flag is false, OpenGL does not perform any conversion from fragment shader to frame buffer. If it is enabled, however, OpenGL assumes that the shader output is linear RGB and will thus -- if the frame buffer has an sRGB format -- convert the shader output to sRGB. In glium, GL_FRAMEBUFFER_SRGB is controlled by the outputs_srgb parameter of the program. If the latter is false, GL_FRAMEBUFFER_SRGB is set to true and the other way around.

Additionally, I would expect glutin to create a framebuffer with the format specified by .with_srgb(_). As such, I have the following expectations:

- with_srgb(false) and outputs_srgb: false: the framebuffer should be linear RGB, meaning that no conversion should happen, regardless of GL_FRAMEBUFFER_SRGB. As such, I would expect #808080, but I got #bbbbbb.
- with_srgb(true) and outputs_srgb: false: the framebuffer should be sRGB and since the shader does not output sRGB, I expect OpenGL to convert. As such, I expect #bbbbbb, which I actually got.
- with_srgb(false) and outputs_srgb: true: as above, the framebuffer should be linear RGB, meaning that no conversion should happen, regardless of GL_FRAMEBUFFER_SRGB. As such, I would expect #808080, which I luckily also got.
- with_srgb(true) and outputs_srgb: true: the framebuffer is sRGB, but the shader also outputs in that color space, so no conversion should happen. As such, I expect #808080, which I got.

This issue is about the first situation. As far as I can tell, this is just wrong.
I tested the above on several platforms: Ubuntu 18.04, MacOS and Windows. I always got the same results (well, on MacOS and Windows the #bbbbbb was slightly off, but still way more than #808080).

(I also did a test with GLFW in case that's interesting. I used the simple example and changed line 63 to " gl_FragColor = vec4(vec3(0.5), 1.0);\n". Running that, I get a #808080 triangle.)

Am I doing something wrong? Am I completely misunderstanding color spaces? Is this a bug in glutin/glium/...? Would be super great if someone could help me!