
Requesting "at least" a certain OpenGL context version #58

Open
dolda2000 opened this issue Jul 23, 2022 · 3 comments

@dolda2000

Perhaps I'm just being blind and this feature already exists, but it seems to me that, if I specify an explicit OpenGL version (like 3.3) in the GLData, then that is exactly the version I get, rather than it being used to request "this version or any later version".

As for why I need this, perhaps some brief background is useful. I'm in the process of adding LWJGL support to a project that thus far has only used JOGL, where I can create a context with the maximal version that supports the core profile, so when I use it on a system that supports OpenGL 4.6, I get 4.6, and when I use it on a system supporting 3.3, I get 3.3.

There are two main reasons why this is nice. For one thing, it lets me optionally use functionality that is only available in OpenGL versions higher than what I strictly require. If I specify OpenGL 3.3 with LWJGL-AWT, the JVM crashes if I call e.g. glObjectLabel, whereas if I request the maximal OpenGL version with JOGL, I can detect at runtime whether it's available and only call it if it is.
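To illustrate the kind of runtime check I mean, here is a minimal sketch using LWJGL's GLCapabilities flags (the DebugLabels class and labelBuffer method are hypothetical names, just for illustration):

```java
import org.lwjgl.opengl.GL;
import org.lwjgl.opengl.GL43;
import org.lwjgl.opengl.GLCapabilities;

public class DebugLabels {
    // Label a buffer object only if the current context actually provides
    // glObjectLabel, i.e. OpenGL 4.3 core or the KHR_debug extension. With a
    // context fixed at 3.3 this branch is never taken; with a "3.3 or later"
    // context it is taken whenever the driver offers more than the minimum.
    public static void labelBuffer(int buffer, String label) {
        GLCapabilities caps = GL.getCapabilities();
        if (caps.OpenGL43 || caps.GL_KHR_debug) {
            GL43.glObjectLabel(GL43.GL_BUFFER, buffer, label);
        }
    }
}
```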

The other thing is that I can usually run on a lower OpenGL version than I "technically" require. For instance, going by the spec, my program requires OpenGL 3.3, since it uses (among other things) glVertexAttribDivisor. In reality, however, the majority of OpenGL implementations down to 3.0 support that function even though it only became core in 3.3. If I just request the maximal supported OpenGL version, I can run on the vast majority of OpenGL 3.2 implementations with no issue, whereas if I request 3.2 with LWJGL-AWT, that function becomes unavailable to me.
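To make the fallback concrete, this is roughly what the JOGL path lets me do, sketched here with LWJGL's capability flags (the class and method names are hypothetical):

```java
import org.lwjgl.opengl.ARBInstancedArrays;
import org.lwjgl.opengl.GL;
import org.lwjgl.opengl.GL33;
import org.lwjgl.opengl.GLCapabilities;

public class InstancingSupport {
    // Use the core 3.3 entry point when available, otherwise fall back to
    // the widely implemented ARB_instanced_arrays extension found on many
    // 3.0-3.2 drivers.
    public static void vertexAttribDivisor(int index, int divisor) {
        GLCapabilities caps = GL.getCapabilities();
        if (caps.OpenGL33) {
            GL33.glVertexAttribDivisor(index, divisor);
        } else if (caps.GL_ARB_instanced_arrays) {
            ARBInstancedArrays.glVertexAttribDivisorARB(index, divisor);
        } else {
            throw new IllegalStateException("instanced rendering not supported");
        }
    }
}
```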

I should also add that LWJGL-AWT technically already supports this, just not on Win32. I only realized this after someone else tested on Windows (I had only tested on Linux myself): apparently only the Win32 implementation calls validateAttributes. The Linux and macOS versions don't seem to call it, so I could specify GLData.profile = GLData.Profile.CORE without specifying majorVersion or minorVersion (leaving them at zero), and that worked fine; I got an OpenGL 4.6 core profile and everything worked as expected. Since the Win32 implementation does call validateAttributes, however, it throws an exception complaining that a Core profile may only be specified if {major,minor}Version have been set to at least 3.2. I know the version technically needs to be at least 3.2, but actually specifying that pins the context version down to 3.2 instead of using the latest available version.
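For reference, this is the setup I mean (a sketch; the surrounding class is hypothetical and the initGL/paintGL bodies are stubbed):

```java
import org.lwjgl.opengl.awt.AWTGLCanvas;
import org.lwjgl.opengl.awt.GLData;

public class CoreNoVersionDemo {
    public static AWTGLCanvas makeCanvas() {
        // Request a Core profile but leave majorVersion/minorVersion at
        // their defaults (0). On Linux and macOS this gave me the highest
        // available core context; on Win32, validateAttributes rejects it
        // before any context is created.
        GLData data = new GLData();
        data.profile = GLData.Profile.CORE;
        return new AWTGLCanvas(data) {
            @Override public void initGL() { /* context is current here */ }
            @Override public void paintGL() { swapBuffers(); }
        };
    }
}
```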

@hageldave
Contributor

I do not specify anything in the GLData object, i.e., I use new GLData() and pass that to the constructor of AWTGLCanvas. This way I get the highest possible version (on Linux, that is). I believe it's the same on Windows, but I'm too lazy to switch OS and check. Looking at the usages of GLData.majorVersion, you should be able to find code in the respective PlatformGLCanvas implementations that populates the fields of the effective GLData object. So you can check what the effective version is by looking at the protected field AWTGLCanvas.effective after the context has been created, e.g. during initGL() or paintGL().
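For example, something like this (a sketch; the class name is made up):

```java
import org.lwjgl.opengl.GL;
import org.lwjgl.opengl.awt.AWTGLCanvas;
import org.lwjgl.opengl.awt.GLData;

public class EffectiveVersionDemo {
    public static AWTGLCanvas makeCanvas() {
        // Pass an empty GLData and inspect the 'effective' field once the
        // context exists, i.e. inside initGL().
        return new AWTGLCanvas(new GLData()) {
            @Override
            public void initGL() {
                GL.createCapabilities();
                System.out.println("effective GL version: "
                        + effective.majorVersion + "." + effective.minorVersion
                        + ", profile=" + effective.profile);
            }
            @Override
            public void paintGL() {
                swapBuffers();
            }
        };
    }
}
```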

@dolda2000
Author

dolda2000 commented May 9, 2023

Sorry for not following up on this until now. It has been a bit of a background project for me.

The issue with not filling in {major,minor}Version is that (as I briefly alluded to) I do want to specify a Core profile (i.e. GLData.profile = GLData.Profile.CORE), and when I do that, GLUtil.validateAttributes throws an error saying I also need to request at least OpenGL 3.2, since I haven't set the version explicitly. That led me to conclude that I'm supposed to always fill in the version numbers. Should this be considered a bug, in that unspecified version numbers aren't handled correctly when other attributes of GLData are specified?

I'm not sure if you're telling me to just use an empty GLData and then check that I actually get a Core profile back? I could certainly do that, but wouldn't that risk getting a Compatibility profile on some implementations, even ones that could have given me a Core profile had I asked for it explicitly?

@dolda2000
Author

I'm not sure if you're telling me to just use an empty GLData and then check that I actually get a Core profile back? I could certainly do that

Actually, having tried that, I'm not so sure it can work. If I don't request a profile, I just get null for the profile in the effective GLData. I also tried querying the profile with glGetInteger(GL_CONTEXT_PROFILE_MASK), and it too returns 0 when no profile has been requested at creation.
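For reference, this is how I checked (a sketch; the wrapper class is hypothetical):

```java
import static org.lwjgl.opengl.GL11.glGetInteger;
import static org.lwjgl.opengl.GL32.*;

public class ProfileCheck {
    // Query the profile mask of the current context. On the contexts
    // described above (no profile requested at creation), the mask comes
    // back as 0, so neither profile bit is set.
    public static String currentProfile() {
        int mask = glGetInteger(GL_CONTEXT_PROFILE_MASK);
        if ((mask & GL_CONTEXT_CORE_PROFILE_BIT) != 0) return "core";
        if ((mask & GL_CONTEXT_COMPATIBILITY_PROFILE_BIT) != 0) return "compatibility";
        return "unknown (mask=" + mask + ")";
    }
}
```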
