achieving compatibility with OpenGL
category: code [glöplog]
This is a coder thread.
Everyone who has done an OpenGL demo knows how painful and poorly documented it is to be compatible with all the gfx cards out there. Porting from NVIDIA to ATI in particular brings lots of problems.
What about sharing knowledge about these incompatibilities? I suggest a format for the problems one can encounter:
vendor GFXcard x84524+:
- does not support (advertised) GL_WTF_esoteric_extension
- can't render to elliptic textures
- GLSL: no support for statements
We started hosting a bunch of glinfo2.exe reports here: OpenGL capabilities
With the disappearance of delphi3d.net, there seems to be no place left to find such information. Er... there is this page, but the full database doesn't seem to be available.
I hope it will provide useful tips about what you can expect from a given card, even if glinfo2 becomes less and less informative with the new OpenGL versions. I don't know if GPU Caps Viewer has an export function, but it could be interesting as well.
Starting with pretty old cards:
ATI Radeon 9800:
- cannot render to NPOT texture
- GLSL: no texture2DLod in fragment shader
- GLSL: no constant arrays of matrices
- GLSL: no dynamic array index
ATI x700
- GLSL: no texture2DLod in fragment shader
- GLSL: no switch-case statement
- GLSL: no constant arrays of matrices
- GLSL: no dynamic array index
- you should call glEnable(GL_TEXTURE_xD) and glDisable(GL_TEXTURE_xD) for each texture unit used, or the samplers yield black
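The enable/disable workaround from the list above can be sketched like this (a minimal sketch; `num_units` and `textures` are hypothetical names, and the target must match the sampler type actually used):

```c
/* Workaround for old ATI cards (x700 era): enable the matching texture
   target on every unit a GLSL sampler reads from, otherwise the sampler
   yields black. */
for (int unit = 0; unit < num_units; ++unit) {
    glActiveTexture(GL_TEXTURE0 + unit);
    glEnable(GL_TEXTURE_2D);              /* or GL_TEXTURE_1D/3D/CUBE_MAP */
    glBindTexture(GL_TEXTURE_2D, textures[unit]);
}
/* ... draw with the shader bound ... */
for (int unit = 0; unit < num_units; ++unit) {
    glActiveTexture(GL_TEXTURE0 + unit);
    glDisable(GL_TEXTURE_2D);
}
```

On spec-conforming drivers glEnable(GL_TEXTURE_*) is ignored when a shader is active, so this should be harmless elsewhere.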
Use the highest OpenGL version possible (3.2 at the moment).
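On Windows that means requesting the version explicitly through the WGL_ARB_create_context extension rather than taking whatever wglCreateContext() hands you. A sketch, assuming wglCreateContextAttribsARB has already been loaded via wglGetProcAddress() with a dummy context current (token names are from wglext.h):

```c
/* Ask the driver for a 3.2 core context explicitly. */
const int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 2,
    WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
    0                              /* zero-terminated attribute list */
};
HGLRC rc = wglCreateContextAttribsARB(hdc, NULL, attribs);
if (!rc) {
    /* older drivers: fall back to the legacy path */
    rc = wglCreateContext(hdc);
}
```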
vsync differences:
(swapInterval(1) and vsync enabled in driver settings)
ATI (Radeon Mobility 4570, latest Catalyst drivers):
- SwapBuffers() does not block
- VSync does not work in Win7/x64 (it works in Linux!)
NVidia (9800 GT)
- SwapBuffers() blocks
I had to put a glFinish() before the SwapBuffers() call to get really smooth vsync'd animation on ATI. The alternative would have been to usleep()/Sleep() before calling SwapBuffers().
Does anyone know a better solution?
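For reference, the workaround described above amounts to the following (Win32 sketch; the cost is that glFinish() stalls the CPU until the GPU has drained, so you lose some CPU/GPU overlap):

```c
/* On the ATI drivers described above, SwapBuffers() returns immediately,
   so frame pacing drifts. Forcing the GPU to finish first restores
   smooth vsync'd animation. */
void present_frame(HDC hdc)
{
    glFinish();        /* block until the GPU is done with this frame */
    SwapBuffers(hdc);  /* swap now lands on the next vblank */
}
```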
boobs. everything's better with boobs.
cool link indeed
ATI HD 2400, Catalyst 04/05/2010
- you cannot pass a sampler wrapped in a struct to a function. Pass the sampler separately, or your sampler will yield black.
- a preprocessor directive must end with \n; ending at EOF won't work (the compiler fails with a blank error).
- #version must be the first line of a shader, if used (NVIDIA accepts it anywhere)
- GL_AMD(X)_debug_output works but is nearly useless; don't bother implementing it
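The sampler-in-struct pitfall above looks like this in practice (GLSL 1.10-era sketch with hypothetical names):

```glsl
// Broken on the HD 2400 driver described above: sampler hidden in a struct.
//   struct Material { sampler2D tex; vec4 tint; };
//   vec4 shade(Material m, vec2 uv) { return m.tint * texture2D(m.tex, uv); }

// Working: pass the sampler as its own parameter.
uniform sampler2D u_tex;
uniform vec4 u_tint;

vec4 shade(sampler2D tex, vec4 tint, vec2 uv)
{
    return tint * texture2D(tex, uv);
}

void main()
{
    gl_FragColor = shade(u_tex, u_tint, gl_TexCoord[0].xy);
}
```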
OpenGL still suxx? (altho i feel bad about saying this, and even more about not already using it!)
Quote:
achieving compatibility with OpenGL
... by simply not using OpenGL
that's the easiest way ;)
To be fair, ATI drivers for the HD series are miles better than for the x1000 series. Yet something usually breaks with each release.
i'd love to break up with m$/dX and go oGL completely, but hearing/reading about stuff like this all the time just kills the movement!
I use http://www.kludx.com/. And it lists both Direct3D and OpenGL features.
I suggest to rename this thread to "achieving ATI-compatibility with OpenGL".
Quote:
I use http://www.kludx.com/. And it lists both Direct3D and OpenGL features.
Nice database. I find it funny how Intel claims OpenGL 2.0 support without GL_ARB_multisample or GL_ARB_imaging.
Does anybody have experience with Intel + OpenGL "2.0"?
http://www.opengl.org/registry/doc/GLSLangSpec.4.00.7.pdf
hfr: that's just because #ponce hasn't started on GLSL 1.5 or geometry shaders yet ;) (NVIDIA has turned pretty strict, stricter even than what the standard says, with the newer versions).
Quote:
The #version directive must occur in a shader before anything else, except for comments and white space
Actually, this has been in the spec since GLSL 1.10; debugging bad GLSL is also part of my work.