GLSL 1.50 not supported #137
Comments
Mesa drivers can be a problem. I recall that the open source drivers do not support S3TC compression, and Doom 3 BFG ships all textures compressed with various S3TC methods. Please try AMD's drivers.
With the nonfree AMD drivers it works very well, but I was trying to see how fast it runs with the Mesa drivers, and whether it is possible to use Mesa instead of the closed drivers. ;)
Do you mean there are AMD drivers you have to pay for? :P |
You know what I mean ;D |
This issue and issue #107 are symptoms of the same problem: rbdoom runs fine with the AMD closed source drivers, but the Mesa implementation on radeon doesn't meet the engine's OpenGL requirements.
Okay, I have successfully switched to the core profile by
Now Doom 3 starts on Intel and radeonsi with a core context and works.
It renders pretty much okay on radeonsi, but it spews these error messages:
And when starting there is this error:
I think I'll fork the repo and put all the stuff there so you can review the changes and see whether I did everything correctly, because I have almost no experience with this stuff. Edit: here are my commits: https://github.com/ChristophHaag/RBDOOM-3-BFG/commits/master
Great! Christoph, with an Ivy Bridge chip, do the shadow maps work? Have you tried leaving them on? On Sandy Bridge (the generation before Ivy Bridge) rendering them was too costly, and so Robert disabled them for Mesa. But maybe with the newer Ivy Bridge, or with Haswell (the generation after Ivy Bridge), they work OK...
You mean commenting it out like this in RenderSystem.cpp?
It runs kinda playable at 1920x1080; with smaller resolutions it will certainly be okay. I don't even know if it is properly enabled, and there are way too many OpenGL errors for this to be an accurate representation of performance anyway. Here's a short 10 second video of how it looks: https://www.youtube.com/watch?v=MNEx2e12it8 (soft shadows disabled; they really make it slow). But the blanket check for "mesa" is bad in my opinion anyway, when it also catches my Radeon HD 7970M or even stronger GPUs. Maybe you can make it a menu configuration option like soft shadows, so the user can test and decide.
glewExperimental = GL_TRUE should not be used, as it enables all OpenGL extensions whether or not they are actually provided. If you want to use the OpenGL 3.2 core profile, then use +set r_useOpenGL32 2
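For reference, that cvar can be set from the command line when launching. A minimal invocation sketch (the binary name and path assume a default Linux build; adjust to your setup):

```shell
# Ask the engine for an OpenGL 3.2 core profile context instead of
# relying on glewExperimental to paper over missing entry points.
./RBDoom3BFG +set r_useOpenGL32 2
```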
I tried your changes on Kubuntu 14.04 with the Intel open source driver, and the engine even fails with glGetString( GL_VERSION ). Dunno, but the open source drivers really annoy me.
On 20.08.2014 at 00:42, Robert Beckebans wrote:
Possibly newer kernels (Ubuntu 14.04 has 3.13) have better drivers - I think that https://01.org/linuxgraphics/ provides a recent driver for |
Ok, the documentation said:
and I don't know this stuff well enough to know whether you can have a valid entry point with the extension not being provided at all. I mean, the texture compression extension still failed because one of the things was not provided. Why is it activated for Apple then? There is another comment saying that Apple only supports the core profile. Did someone put that in as a workaround for the core profile? Anyway, I have only googled briefly, but from what I have read I don't know if SDL 1.2 even supports creating any core context. May I ask, why not SDL2? As far as I have seen, literally everything about it is better... So
If I do not use glewExperimental and comment out all the extension requests where it errors out, I get a segfault in R_CheckPortableExtensions...
wut, dangerous?? When I comment that out I get a segfault in idVertexBuffer::AllocBufferObject. I believe this is because without glewExperimental, GLEW does not set up all the entry points for things like buffers correctly.
I also get ... I believe your frustration does not come from the open source drivers but from GLEW. And now I believe that my trouble comes from the extension string not working with core contexts, where experimental GLEW just happens to make glGetString work by magic. GLEW has had this issue for 4.5 years now??? WTF?
So it's not a big problem to get GLEW to work with a core context, but the segfaults in a core context are a problem. Again, I have almost no practical experience and google my way through this stuff. :) I have read this: http://stackoverflow.com/questions/7589974/opengl-glgenbuffer-vs-glgenbuffersarb/7590050#7590050 And if I "de-ARB-ify" the functions, it stops segfaulting.
So GLEW is not loading the "old" ARB functions but only the new ones, and calling the old ones segfaults. If I interpret that answer on Stack Overflow correctly, this is the expected behavior, isn't it?
@ChristophHaag https://github.com/DeanoC/RBDOOM-3-BFG.git - here is some work that has been put towards de-ARB-ifying; you might want to take a look at it.
In my fork I replaced a bit by hand and then just put all files through sed to replace a few functions that segfaulted, and now it runs on radeonsi without OpenGL errors. Apparently the old enums work just fine with the new functions. Intel is for some reason using the GL ES 3 path again even with ... Of course doing it properly would be better. If there is a list of replacements, this shouldn't be hard to automate.
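The sed-based renaming could look roughly like this. The three function names are just an illustrative subset, and each rename should be audited by hand, since not every ARB extension is identical to its core-profile equivalent:

```shell
# Strip the ARB suffix from a few buffer-object entry points so the
# code calls the core-profile functions GLEW actually loaded.
# (Illustrative subset only; audit each rename before applying.)
dearbify='s/\(qglGenBuffers\|qglBindBuffer\|qglBufferData\)ARB/\1/g'

# Example: transform one source line.
printf 'qglGenBuffersARB( 1, &bufferObject );\n' | sed "$dearbify"
# -> qglGenBuffers( 1, &bufferObject );

# Then apply to the tree, e.g.: sed -i "$dearbify" neo/renderer/*.cpp
```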
glGetString(GL_EXTENSIONS) is obsolete with OpenGL 3.2 core profiles.
2014-08-20 10:27 GMT+02:00 Mikko Kortelainen [email protected]:
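Since glGetString(GL_EXTENSIONS) is removed in core profiles, extension checks have to go through glGetIntegerv(GL_NUM_EXTENSIONS) and glGetStringi instead. A self-contained sketch of that pattern, with the two GL calls mocked out so it compiles without a GL context (the mock* names and the fake extension list are illustrative, not from the engine):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Mocked stand-ins for glGetIntegerv(GL_NUM_EXTENSIONS, ...) and
// glGetStringi(GL_EXTENSIONS, i), so this sketch is self-contained;
// a real renderer would call the actual GL entry points here.
static const std::vector<std::string> fakeExtensions = {
    "GL_ARB_vertex_buffer_object",
    "GL_EXT_texture_compression_s3tc",
};
static int mockGetNumExtensions() { return (int)fakeExtensions.size(); }
static const char* mockGetStringi(int i) { return fakeExtensions[i].c_str(); }

// Core-profile extension check: instead of substring-searching one big
// extension string, enumerate the extensions one by one.
bool HasExtension(const std::string& name) {
    const int n = mockGetNumExtensions();
    for (int i = 0; i < n; ++i) {
        if (name == mockGetStringi(i)) {
            return true;
        }
    }
    return false;
}
```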
GLSL ES 3.00 is used for the Mesa path which is determined in R_CheckPortableExtensions() by if( idStr::Icmpn( glConfig.renderer_string, "Mesa", 4 ) == 0 || idStr::Icmpn( glConfig.renderer_string, "X.org", 4 ) == 0 ) |
I would suggest extending that check to Gallium and other open source drivers and then trying again. The Intel driver did not support GLSL 1.50 when I tried it; GLSL ES 3.00 was the only way, and it is a subset of GLSL 1.50.
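A sketch of what the extended check could look like, using a plain case-insensitive prefix compare as a stand-in for idStr::Icmpn (the helper names are illustrative; "Gallium" is the prefix the radeon drivers report, as noted elsewhere in this thread):

```cpp
#include <cassert>
#include <cctype>

// Stand-in for idStr::Icmpn: case-insensitive compare of the first n
// characters. Stops early on a mismatch, so it never reads past a
// terminating '\0' in the renderer string.
static bool IcmpnEq(const char* s, const char* prefix, int n) {
    for (int i = 0; i < n; ++i) {
        if (std::tolower((unsigned char)s[i]) !=
            std::tolower((unsigned char)prefix[i])) {
            return false;
        }
    }
    return true;
}

// Extended open-source-driver detection: besides "Mesa" and "X.org",
// also match "Gallium". (Note: the original code compared only the
// first 4 characters of "X.org"; full prefix lengths are used here.)
bool IsMesaDriver(const char* renderer) {
    return IcmpnEq(renderer, "Mesa", 4)
        || IcmpnEq(renderer, "X.org", 5)
        || IcmpnEq(renderer, "Gallium", 7);
}
```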
compiles and plays correctly with and without r_useOpenGL32 2;
it states GLSL 1.3, while OpenGL is 3.0 and GLEW is 1.10.0
It only matters with SDL2. SDL 1.2 is not able to create a compatibility or core profile, so it uses the default OpenGL context. This stuff about OpenGL is really annoying :(
so with SDL 1.2 it's the same as r_useOpenGL32 0? |
With SDL 1.2 you can get OpenGL ES 3 on Intel. The engine enables this via the keyword "Mesa" in the renderer string.
For radeon the equivalent string is "Gallium":
With up-to-date Mesa you get OpenGL 3.3 on at least Intel and Radeon. I believe for Sandy Bridge you only get up to OpenGL 3.1 for now, but 3.2 and probably 3.3 are in the works: http://www.phoronix.com/scan.php?page=news_item&px=MTc2MzA On my Ivy Bridge it works pretty well with OpenGL 3.2: https://www.youtube.com/watch?v=Goe0JBvHyBI It's not the best performance on Intel with shadow mapping, but it's kind of okay. For Ubuntu you probably want to use the PPA of the user oibaf: https://launchpad.net/~oibaf/+archive/ubuntu/graphics-drivers
"so with SDL 1.2 it's the same as r_useOpenGL32 0?" Yes |
I don't understand how Ubuntu works. I have both libsdl1.2 and libsdl2 (libsdl1.2-dev, libsdl1.2debian, libsdl2-dev, libsdl2-2.0-0, libsdl2... all the damn libsdl2 libraries, as well as the libsdl1.2 "recommended" by Ubuntu), yet I still can't play d3bfg with SDL2. It compiles fine, but when I try to start it up I get this:
I have libglew-dev version 1.10.0-3 (and all the other libglew packages, all at the same version), but as I read on http://glew.sourceforge.net/ this version of libglew already adds the OpenGL 4.5 additions, and I would have to go back to libglew version 1.5.4 for something around OpenGL 3.3, so it's clearly not libglew that is failing here. I think Ubuntu uses the last version of SDL (that is, SDL 1.2) even if you have SDL2 enabled. I curse Ubuntu so much! :D
Have you changed DSDL2=ON in the
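For reference, a configure sketch with that option turned on (the out-of-source build layout and the `../neo` source path are assumptions; adjust them to your checkout):

```shell
# Configure the build against SDL2 instead of SDL 1.2, then build:
cmake -DSDL2=ON ../neo
make
```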
I have it ON in the sh script, and I get that error message (without SDL2 I can play all right). What does the ldd command do, and where do I run it?
It's a tool for the dynamic linker that shows what libraries the binary you give it as an argument will use. And if it shows SDL2, then SDL2 will be used. But SDL2 is not the only requirement. My fork that I linked a few comments back is certainly not very clean, but it should give you an OpenGL 3 core context and actually run.
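A quick sketch of that check (the check_sdl_linkage helper name and the ./RBDoom3BFG path are illustrative, not part of the project):

```shell
# Show the shared libraries a binary will load at run time; grep for
# SDL to see whether it links against libSDL-1.2 or libSDL2.
check_sdl_linkage() {
    ldd "$1" | grep -i sdl
}

# e.g.: check_sdl_linkage ./RBDoom3BFG
```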
I've been doing some digging around. Are the problems with SDL2 and the GLSL shaders linked to issues when the library is built? I'm looking over my configure file and logs for this set of libraries I'm building for the project I'm working on:
configure:20403: result: yes
and further down:
configure:20957: checking for OpenGL ES v1 headers
and then:
configure:20993: checking for OpenGL ES v2 headers
Well, I have ... Back to the topic of this bug report. Two things:
Secondly: sorry if this sounds pejorative, it's not intended to be. But when the OpenGL core profile is done to the specification, it works perfectly fine on Mesa. In my fork I replaced a few functions and variables by hand, and a few that kept segfaulting with sed, and it wasn't actually much work; but obviously having deprecated functions that "happen" to work mixed with new functions is a mess, so I completely understand why you didn't just merge it. :) Is there a plan to eventually update the legacy stuff to OpenGL core and then just drop GLDRV_OPENGL_MESA and use GLDRV_OPENGL32_CORE_PROFILE for it?
I'd second that. There is no need for the compatibility profile anymore; it only creates problems. It should just use the core profile straight away.
Your suggestions helped me a lot to solve a problem! |
I'm using the mesa-10.2.2 drivers. When I try to launch the game, the terminal shows the following error:
error: GLSL 1.50 is not supported. Supported versions are: 1.10, 1.20, 1.30, and 1.00 ES
The sound works, but the screen doesn't show anything. Does the game work with Mesa drivers? Is GLSL the problem? How can I change the GLSL version?
Thanks