
GLSL 1.50 not supported #137

Closed
orderorder opened this issue Jul 1, 2014 · 32 comments

@orderorder

I'm using the mesa-10.2.2 drivers. When I try to launch the game, the terminal shows the following error:

error: GLSL 1.50 is not supported. Supported versions are: 1.10, 1.20, 1.30, and 1.00 ES

The sound works, but the screen doesn't show anything. Does the game work with Mesa drivers? Is GLSL the problem? How can I change the GLSL version?

Thanks

@motorsep

motorsep commented Jul 1, 2014

Mesa drivers can be a problem. I recall that the open source drivers do not support S3TC compression, and Doom 3 BFG has all of its textures compressed with various S3TC methods.

Please try AMD's drivers.

@orderorder
Author

With the nonfree AMD drivers it works very well, but I was trying to see how fast it runs with the Mesa drivers, or whether it is possible to use Mesa instead of the closed drivers. ;)

@motorsep

motorsep commented Jul 3, 2014

Do you mean there are AMD drivers you have to pay for? :P

@orderorder
Author

You know what I mean ;D

@auledoom1

This issue and issue #107 are symptoms of the same problem: rbdoom runs fine with the AMD closed source drivers, but the Mesa implementation on radeon doesn't meet the engine's OpenGL requirements.

@ChristophHaag

Sorry, but it is simply false that Mesa does not support the features Doom 3 BFG requires.

The problem is that with compatibility profiles Mesa will only ever support OpenGL 3.0. With core profiles Mesa supports OpenGL 3.3, including GLSL 1.50.
If you set MESA_GL_VERSION_OVERRIDE=3.3 MESA_GLSL_VERSION_OVERRIDE=330 you can observe that all the shaders compile on radeonsi. BUT: if you override the version like that, you will run into the message that GL_ARB_multitexture is not supported, because these overrides are just a hack.

If Doom 3 BFG could switch to a core context instead of a compatibility one, there is a good chance that it would "just work".
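
For illustration, here is a minimal, self-contained sketch (plain SDL2, not the engine's actual sdl_glimp.cpp code; window title and size are made up) of requesting a 3.2 core context, under which Mesa reports 3.2/3.3 instead of the 3.0 compatibility limit:

    // Minimal sketch, assuming SDL2 and Mesa: request an OpenGL 3.2
    // core profile context and print the resulting GL version.
    #include <stdio.h>
    #include <SDL.h>
    #include <SDL_opengl.h>

    int main( int argc, char** argv )
    {
        SDL_Init( SDL_INIT_VIDEO );

        // Profile attributes must be set before the window is created.
        SDL_GL_SetAttribute( SDL_GL_CONTEXT_MAJOR_VERSION, 3 );
        SDL_GL_SetAttribute( SDL_GL_CONTEXT_MINOR_VERSION, 2 );
        SDL_GL_SetAttribute( SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE );

        SDL_Window* window = SDL_CreateWindow( "core profile test",
            SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
            1280, 720, SDL_WINDOW_OPENGL );
        SDL_GLContext context = SDL_GL_CreateContext( window );

        // On Mesa this should report 3.2+ core instead of 3.0 compatibility.
        printf( "GL_VERSION: %s\n", glGetString( GL_VERSION ) );

        SDL_GL_DeleteContext( context );
        SDL_DestroyWindow( window );
        SDL_Quit();
        return 0;
    }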

On Intel it seems to choose OpenGL ES shaders instead. Those compile, and it looks like this:
[screenshot: doom3]

@ChristophHaag

Okay, I have successfully switched to the core profile by:

  • Using SDL2: -DSDL2=ON in cmake-eclipse-linux-profile.sh
  • Enabling glewExperimental = GL_TRUE; in sdl_glimp.cpp
  • Updating GLEW to 1.11
  • Changing the condition from && to ||: glConfig.textureCompressionAvailable = GLEW_ARB_texture_compression != 0 || GLEW_EXT_texture_compression_s3tc != 0;. I don't know if both are needed. In a core context Intel supports GL_ARB_texture_compression_bptc and GL_ARB_texture_compression_rgtc, and S3TC is supported through the dxtn library.

Now doom3 starts on intel and radeonsi with a core context and works.

Using GLEW 1.11.0
OpenGL Version  : 3.3
OpenGL Vendor   : Intel Open Source Technology Center
OpenGL Renderer : Mesa DRI Intel(R) Ivybridge Mobile 
OpenGL GLSL     : 3.3
OpenGL Driver Type     :   1

It renders pretty much okay on radeonsi:
[screenshot: doom3radeon]

But it spews these error messages:

Mesa: User error: GL_INVALID_ENUM in glDisable(GL_DEPTH_BOUNDS_TEST_EXT)
caught OpenGL error: GL_INVALID_ENUM in file /home/chris/build/rbdoom3-bfg-git/src/rbdoom3-bfg/neo/renderer/RenderSystem.cpp line 783

And when starting there is this error:

Mesa: User error: GL_INVALID_ENUM in glGetString(GL_EXTENSIONS)

I think I'll fork the repo and put all the stuff there, so you can review the changes and see whether I did everything correctly, because I have almost no experience with this stuff.

edit: Here are my commits: https://github.com/ChristophHaag/RBDOOM-3-BFG/commits/master

@BielBdeLuna

Great! Christoph, with an Ivy Bridge chip, do the shadow maps work? Have you tried leaving them on? On Sandy Bridge (the generation before Ivy Bridge) rendering them was too costly, so Robert disabled them for Mesa. But maybe they work OK with the newer Ivy Bridge, or with Haswell (the generation after Ivy Bridge)...

@ChristophHaag

You mean commenting it out like this in RenderSystem.cpp?

    // RB: turn off shadow mapping for OpenGL drivers that are too slow
    switch( glConfig.driverType )
    {
        case GLDRV_OPENGL_ES2:
        case GLDRV_OPENGL_ES3:
        //case GLDRV_OPENGL_MESA:
            r_useShadowMapping.SetInteger( 0 );
            break;

        default:
            break;
    }
    // RB end

It runs kinda playable at 1920x1080. With smaller resolutions it will certainly be okay. I don't even know if it is properly enabled, and there are way too many OpenGL errors for this to be an accurate representation of performance anyway.

Here's a short 10-second video of how it looks: https://www.youtube.com/watch?v=MNEx2e12it8 (soft shadows disabled, they really make it slow)

But the "Mesa" distinction is bad in my opinion anyway, when it also catches my Radeon HD 7970M or even stronger GPUs.

Maybe you can make it into a menu configuration option like soft shadows so the user can test and decide.

@RobertBeckebans
Owner

glewExperimental = GL_TRUE should not be used, as it enables all OpenGL extensions no matter whether they are provided or not. If you want to use the OpenGL 3.2 core profile, then use +set r_useOpenGL32 2.
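
For example, launching the Linux binary with that cvar (binary name as shown later in this thread):

    ./RBDoom3BFG +set r_useOpenGL32 2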

@RobertBeckebans
Owner

I tried your changes on Kubuntu 14.04 with the Intel open source driver, and the engine even fails at glGetString( GL_VERSION ). Dunno, but the open source drivers really annoy me.
I extended the renderer to use OpenGL ES 3.0 shaders in case GLSL 1.50 is not supported.
I only tested it on Intel with SDL 1.x; however, it should be the way to go for all Mesa drivers. You only need a compatibility profile for id Tech development tools like the D3Radiant or the wireframe debugging tools.
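
A hypothetical sketch of what that fallback amounts to (the real selection happens inside the renderer's shader generation; only GLDRV_OPENGL_MESA below is an identifier actually quoted in this thread):

    // Hypothetical sketch: emit a GLSL ES 3.00 version header for the
    // Mesa path, and desktop GLSL 1.50 otherwise.
    const char* versionHeader = NULL;
    if( glConfig.driverType == GLDRV_OPENGL_MESA )
    {
        versionHeader = "#version 300 es\n"; // GLSL ES 3.00, a subset of GLSL 1.50
    }
    else
    {
        versionHeader = "#version 150\n";    // desktop GLSL 1.50
    }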

@DanielGibson

On 20.08.2014 at 00:42, Robert Beckebans wrote:

I tried your changes on Kubuntu 14.04 and the intel open source driver and the engine even fails with glGetString( GL_VERSION ). Dunno but the open source drivers really annoy me.

Possibly newer kernels (Ubuntu 14.04 has 3.13) have better drivers - especially the OpenGL 3.x stuff is quite recent, as far as I know.

I think that https://01.org/linuxgraphics/ provides a recent driver for Intel GPUs without upgrading the kernel.

@ChristophHaag

Ok, the documentation said:

which ensures that all extensions with valid entry points will be exposed.

and I don't know this stuff well enough to know whether you can have a valid entry point without the extension being provided at all. I mean, the texture compression extension still failed because one of the things was not provided. Why is it activated for Apple, then? There is another comment saying that Apple only supports the core profile. Did someone put that in as a workaround for the core profile?

Anyway, I have only googled briefly, but from what I have read I don't know whether SDL 1.2 even supports creating a core context. May I ask why not SDL2? From what I have seen, literally everything about it is better...

So, set r_useOpenGL32 2 in autoexec.cfg? Okay, but it would be better to just use the core profile if it is available. Shouldn't it always be available on non-embedded drivers? There is also a bug report at Mesa: https://bugs.freedesktop.org/show_bug.cgi?id=78773
From what I heard, this comment:

The problem is that the game opens a core context but expects to use legacy/compatibility extensions like GL_ARB_multitexture. Same happens with GL_ARB_texture_compression and GL_ARB_vertex_buffer_object which it queries as well.

means that when you have a core context, you should not request these extensions.

If I do not use glewExperimental and comment out all the extension requests where it errors out, I get a segfault in R_CheckPortableExtensions...

    // recheck all the extensions (FIXME: this might be dangerous)
    R_CheckPortableExtensions();

wut, dangerous?? When I comment that out, I get a segfault in idVertexBuffer::AllocBufferObject. I believe this is because without glewExperimental, glew does not set up all the stuff like buffers etc. correctly.
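
That would be consistent with GLEW's function-pointer mechanism; a hypothetical check (not engine code):

    // GLEW exposes entry points as function pointers; one it did not load
    // is NULL, so calling it crashes. A defensive check would look like:
    if( glBindBufferARB == NULL )
    {
        printf( "glBindBufferARB was not loaded by GLEW\n" );
    }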

and the engine even fails with glGetString( GL_VERSION )

I also get Mesa: User error: GL_INVALID_ENUM in glGetString(GL_EXTENSIONS) because of this bug in glew: http://sourceforge.net/p/glew/bugs/120/?page=1
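
For reference, a minimal sketch of the core-profile way to enumerate extensions (plain GL 3.0+ calls, not engine code; assumes a current context and loaded entry points):

    // In a core profile, glGetString( GL_EXTENSIONS ) is invalid; extensions
    // must be enumerated one by one instead (core since OpenGL 3.0).
    GLint numExtensions = 0;
    glGetIntegerv( GL_NUM_EXTENSIONS, &numExtensions );
    for( GLint i = 0; i < numExtensions; i++ )
    {
        printf( "%s\n", glGetStringi( GL_EXTENSIONS, i ) );
    }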

I believe your frustration does not come from the open source drivers but from glew.

And now I believe that my trouble comes from the extension string not working with core contexts, where experimental glew just happens to make glGetString work by magic.

glew has had this issue for 4.5 years now??? WTF?

@ChristophHaag

So it's not a big problem to get glew to work with a core context, but the segfaults in a core context are a problem. Again, I have almost no practical experience and google my way through this stuff. :) I have read this: http://stackoverflow.com/questions/7589974/opengl-glgenbuffer-vs-glgenbuffersarb/7590050#7590050

And if I "de-ARB-ify" the functions, it stops segfaulting.
Like, doing stuff like this:

-       glBindBufferARB( GL_ARRAY_BUFFER_ARB, bufferObject );
+       glBindBuffer( GL_ARRAY_BUFFER, bufferObject );

So glew is not loading the "old" ARB functions but only the new ones, and calling the old ones segfaults. If I interpret that answer on Stack Overflow correctly, this is the expected behavior, isn't it?
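
A few more replacements of the same shape, purely as an illustration of the pattern (the engine's actual call sites differ):

    -   glGenBuffersARB( 1, &bufferObject );
    +   glGenBuffers( 1, &bufferObject );

    -   glBufferDataARB( GL_ARRAY_BUFFER_ARB, size, data, GL_STATIC_DRAW_ARB );
    +   glBufferData( GL_ARRAY_BUFFER, size, data, GL_STATIC_DRAW );

    -   glDeleteBuffersARB( 1, &bufferObject );
    +   glDeleteBuffers( 1, &bufferObject );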

@kortemik

@ChristophHaag here is some work that has been put towards de-ARB-ifying: https://github.com/DeanoC/RBDOOM-3-BFG.git. You might want to take a look at it.

@ChristophHaag

In my fork I replaced a bit by hand and then just put all files through sed to replace a few functions that segfaulted, and now it runs on radeonsi without OpenGL errors. Apparently the old enums work just fine with the new functions.

Intel is for some reason using the GL ES 3 path again, even with set r_useOpenGL32 2 in autoexec.cfg and D3BFGConfig.cfg, but radeonsi is definitely compiling and using the 1.50 shaders successfully.

Of course doing it properly would be better. If there is a list of replacements, this shouldn't be hard to automate.

@RobertBeckebans
Owner

glGetString(GL_EXTENSIONS) is obsolete with OpenGL 3.2 core profiles, and I already patched GLEW 1.10 with a workaround. Even the Nvidia driver would fail on Linux with +set r_useOpenGL32 2. It is true that if we create a core profile, we can simply skip most extension checks like GL_ARB_multitexture, because they are part of the core profile anyway, and we can set them to true in that case.
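
A hypothetical sketch of that idea (the glConfig flag names other than driverType and textureCompressionAvailable are made up for illustration; GLDRV_OPENGL32_CORE_PROFILE is the identifier mentioned later in this thread):

    // Hypothetical sketch: in a 3.2 core context these old extensions are
    // part of core, so their checks can simply be forced to true.
    if( glConfig.driverType == GLDRV_OPENGL32_CORE_PROFILE )
    {
        glConfig.multitextureAvailable       = true; // core since GL 1.3
        glConfig.textureCompressionAvailable = true; // core since GL 1.3
        glConfig.vertexBufferObjectAvailable = true; // core since GL 1.5
    }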


@RobertBeckebans
Owner

GLSL ES 3.00 is used for the Mesa path, which is determined in R_CheckPortableExtensions() by:

    if( idStr::Icmpn( glConfig.renderer_string, "Mesa", 4 ) == 0 || idStr::Icmpn( glConfig.renderer_string, "X.org", 4 ) == 0 )
    {
        glConfig.driverType = GLDRV_OPENGL_MESA;
    }

@RobertBeckebans
Owner

I would suggest extending that check for Gallium and other open source drivers and then trying again. The Intel driver did not support GLSL 1.50 when I tried it. GLSL ES 3.00 was the only way, and it is a subset of GLSL 1.50.
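
A minimal sketch of that suggestion, extending the check quoted above (the "Gallium" renderer string comes from the glxinfo output later in this thread):

    // Sketch: also route Gallium-based drivers (e.g. radeonsi) through
    // the same path as the other Mesa renderer strings.
    if( idStr::Icmpn( glConfig.renderer_string, "Mesa", 4 ) == 0
        || idStr::Icmpn( glConfig.renderer_string, "X.org", 4 ) == 0
        || idStr::Icmpn( glConfig.renderer_string, "Gallium", 7 ) == 0 )
    {
        glConfig.driverType = GLDRV_OPENGL_MESA;
    }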

@BielBdeLuna

It compiles and plays correctly with and without r_useOpenGL32 2.
It displays this on a Sandy Bridge with Mesa:

(...)
----- R_InitOpenGL -----
Initializing OpenGL subsystem
Using 8 color bits, 24 depth, 8 stencil display
Using GLEW 1.10.0
OpenGL Version  : 3.0
OpenGL Vendor   : Intel Open Source Technology Center
OpenGL Renderer : Mesa DRI Intel(R) Sandybridge Mobile 
OpenGL GLSL     : 1.3
   maxTextureAnisotropy: 16.000000
...using GL_EXT_texture_lod_bias
X..GL_GREMEDY_string_marker not found
...using GL_EXT_framebuffer_object
...using GL_EXT_framebuffer_blit
----- Initializing Render Shaders -----
----- Initializing Sound System ------
(...)

It states GLSL 1.3, while OpenGL is 3.0 and GLEW is 1.10.0.

@RobertBeckebans
Owner

It only matters with SDL 2. SDL 1.2 is not able to create a compatibility or core profile so it uses the default OpenGL context. This stuff is really annoying about OpenGL :(
You don't have these problems with OpenGL ES and EGL.

@BielBdeLuna

so with SDL 1.2 it's the same as r_useOpenGL32 0?

@ChristophHaag

With SDL 1.2 you can get OpenGL ES 3 on Intel. The engine enables this based on the keyword "Mesa" in the renderer string.

glxinfo | grep OpenGL
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile 
OpenGL core profile version string: 3.3 (Core Profile) Mesa 10.3.0-devel (git-92ee0ae)
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 10.3.0-devel (git-92ee0ae)
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.0 Mesa 10.3.0-devel (git-92ee0ae)
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.0
OpenGL ES profile extensions:

For radeon the equivalent string is "Gallium":

DRI_PRIME=1 glxinfo | grep OpenGL
OpenGL vendor string: X.Org
OpenGL renderer string: Gallium 0.4 on AMD PITCAIRN
OpenGL core profile version string: 3.3 (Core Profile) Mesa 10.3.0-devel (git-92ee0ae)
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 10.3.0-devel (git-92ee0ae)
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.0 Mesa 10.3.0-devel (git-92ee0ae)
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.0
OpenGL ES profile extensions:

With up-to-date Mesa you get OpenGL 3.3 on at least Intel and radeon.

I believe on Sandy Bridge you only get up to OpenGL 3.1 for now, but 3.2 and probably 3.3 are in the works: http://www.phoronix.com/scan.php?page=news_item&px=MTc2MzA

On my Ivy Bridge it works pretty well with OpenGL 3.2: https://www.youtube.com/watch?v=Goe0JBvHyBI

It's not the best performance on intel with shadow mapping, but it's kind of okay.

For Ubuntu you probably want to use the PPA of the user oibaf: https://launchpad.net/~oibaf/+archive/ubuntu/graphics-drivers
It keeps most of the graphics stack up to date.

@RobertBeckebans
Owner

"so with SDL 1.2 it's the same as r_useOpenGL32 0?" Yes

@BielBdeLuna

I don't understand how Ubuntu works. I have both libsdl1.2 and libsdl2 (libsdl1.2-dev, libsdl1.2debian, libsdl2-dev, libsdl2-2.0-0, libsdl2... all the damn libsdl2 libraries, as well as the libsdl1.2 "recommended" by Ubuntu), yet I still can't play D3BFG with SDL2: it compiles fine, but when I try to start it up I get this:

(...)
----- R_InitOpenGL -----
Initializing OpenGL subsystem
WARNING: SDL_GL_SWAP_CONTROL not supported
Using 8 color bits, 24 depth, 8 stencil display
GLimp_Init() - GLEW could not load OpenGL subsystem: Missing GL version
signal caught: Segmentation fault
si_code 1
Trying to exit gracefully..
(...)

I have libglew-dev version 1.10.0-3 (and all the other libglew packages, which are all the same version).

But as I read on http://glew.sourceforge.net/, this version of libglew already adds the OpenGL 4.5 additions, and I would have to go back to libglew 1.5.4 for something around OpenGL 3.3, so it's clearly not libglew that is failing here.

I think Ubuntu uses the last version of SDL (that is, SDL 1.2) even if you have SDL2 enabled. I curse Ubuntu so much! :D

@ChristophHaag

Have you changed -DSDL2=ON in the neo/cmake-linux*.sh script you run? I had only changed one script in my fork; now I have changed all of them.
Does ldd say it is linked against SDL2?

$ ldd RBDoom3BFG| grep SDL
        libSDL2-2.0.so.0

@BielBdeLuna

I have it set to ON in the sh script, and I get that error message (without SDL2 I can play all right). What does the ldd command do? Where do I run the ldd command?

@ChristophHaag

It comes with the dynamic linker and shows what shared libraries the binary you give it as an argument will use. And if it lists SDL2, then SDL2 will be used. But SDL2 is not the only requirement. My fork, which I linked a few comments back, is certainly not very clean, but it should give you an OpenGL 3 core context and actually run.

@darkhemic

I've been doing some digging around. Are the problems with SDL2 and the GLSL shaders linked to issues when the library is built? I'm looking over my configure output and logs for the set of libraries I'm building for the project I'm working on.

configure:20403: result: yes
configure:20874: checking for OpenGL (GLX) support
configure:20892: gcc -c -g -O2 -Iinclude -I/home/chris/Open-BFG-Tech/lib/SDL2-2.0.3/include -Iinclude -I/home/chris/Open-BFG-Tech/lib/SDL2-2.0.3/include conftest.c >&5
configure:20892: $? = 0
configure:20898: result: yes
configure:20926: checking for EGL support
configure:20943: gcc -c -g -O2 -Iinclude -I/home/chris/Open-BFG-Tech/lib/SDL2-2.0.3/include -Iinclude -I/home/chris/Open-BFG-Tech/lib/SDL2-2.0.3/include conftest.c >&5
conftest.c:133:30: error: EGL/egl.h: No such file or directory

and further down

configure:20957: checking for OpenGL ES v1 headers
configure:20975: gcc -c -g -O2 -Iinclude -I/home/chris/Open-BFG-Tech/lib/SDL2-2.0.3/include -Iinclude -I/home/chris/Open-BFG-Tech/lib/SDL2-2.0.3/include conftest.c >&5
conftest.c:133:30: error: GLES/gl.h: No such file or directory
conftest.c:134:33: error: GLES/glext.h: No such file or directory

and then

configure:20993: checking for OpenGL ES v2 headers
configure:21011: gcc -c -g -O2 -Iinclude -I/home/chris/Open-BFG-Tech/lib/SDL2-2.0.3/include -Iinclude -I/home/chris/Open-BFG-Tech/lib/SDL2-2.0.3/include conftest.c >&5
conftest.c:133:32: error: GLES2/gl2.h: No such file or directory
conftest.c:134:35: error: GLES2/gl2ext.h: No such file or directory

@ChristophHaag

conftest.c:133:32: error: GLES2/gl2.h: No such file or directory

Well, I have /usr/include/GLES2/gl2.h from Mesa. If you are using Mesa, maybe it is not compiled with --enable-gles2? It's off by default; your distribution has to enable it. If you are using proprietary drivers from AMD or Nvidia, I don't know if they support GLES sufficiently. Or maybe they just aren't correctly installed?

Back to the topic of this bug report. Two things.

First:

GLSL ES 3.00 is used for the Mesa path

Can you maybe rename GLDRV_OPENGL_MESA? I find it very counterintuitive, and impossible to guess, that "Mesa" means "OpenGL ES", because Mesa implements both OpenGL and OpenGL ES.

Second:
9147482 made not only classic Mesa but also Mesa/Gallium use OpenGL ES. In the short term this makes it work without changing too much, but it also has the unfortunate side effect that e.g. shadow mapping is disabled, even on a very powerful Radeon GPU that could easily render it, even with today's not completely optimized driver.

Sorry if this sounds pejorative; it is not intended to be. But when the OpenGL core profile is done according to the specification, it works perfectly fine on Mesa. In my fork I replaced a few functions and variables by hand, and a few that kept segfaulting with sed, and it wasn't actually much work. But obviously having deprecated functions that "happen" to work mixed with new functions is a mess, so I completely understand why you didn't just merge it. :)

Is there a plan to eventually update the legacy stuff to OpenGL core and then just drop GLDRV_OPENGL_MESA and use GLDRV_OPENGL32_CORE_PROFILE for it?

@shmerl

shmerl commented Aug 27, 2017

Is there a plan to eventually update the legacy stuff to OpenGL core and then just drop GLDRV_OPENGL_MESA and use GLDRV_OPENGL32_CORE_PROFILE for it?

I'd second that. There is no need for the compatibility profile anymore; it only creates problems. It should just use the core profile straight away.

@myousefi2016

Your suggestions helped me a lot in solving a problem!
