
What SDL_GL_SetAttributes do you use??



savage
22-10-2006, 06:49 PM
Hi all, when using OpenGL with SDL, which SDL_GL_SetAttribute calls do you use?

I've been using

SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 5 );
SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 5 );
SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 5 );
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 16 );
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );

without really understanding whether these are the best settings to be using.
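For context, here is a minimal sketch of the order these calls have to go in: the attributes must be set after SDL_Init but before SDL_SetVideoMode, and the SDL_OPENGL flag is what makes them take effect. The 640x480 size is just a placeholder.

```c
#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* Attributes are only requests, and must be set BEFORE SDL_SetVideoMode. */
    SDL_GL_SetAttribute(SDL_GL_RED_SIZE,     5);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE,   5);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,    5);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE,   16);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

    /* Passing 0 as bpp means "use the current desktop depth";
       SDL_OPENGL requests a GL context using the attributes above. */
    if (SDL_SetVideoMode(640, 480, 0, SDL_OPENGL) == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    SDL_Quit();
    return 0;
}
```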

technomage
22-10-2006, 07:16 PM
I use


SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 5 );
SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 5 );
SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 5 );
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, SCREEN_DEPTH );
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );


Where SCREEN_DEPTH is the value retrieved from the SDL_VideoInfo structure.

but I sometimes use


SDL_GL_SetAttribute( SDL_GL_STENCIL_SIZE,stencilsize);


as well.

The point is that with SDL_GL_DEPTH_SIZE, OpenGL will use the depth value passed in to SDL_GL_SetAttribute, NOT the value passed in to SDL_SetVideoMode. At least, that is my understanding. I just make sure they are both the same :D

grudzio
22-10-2006, 09:03 PM
Where SCREEN_DEPTH is the value retrieved from the SDL_VideoInfo structure.

SDL_GL_DEPTH_SIZE sets the size of the depth buffer, not the frame buffer. To set the color depth, use SDL_GL_BUFFER_SIZE instead. (This is what the SDL docs say :) ).
And when using OpenGL, it does not matter what color depth you pass to SDL_SetVideoMode.

Actually, in most cases it does not matter what you send to SDL_GL_SetAttribute. It depends on the desktop resolution and the capabilities of the graphics card. I did some tests, and no matter what I pass to OpenGL, when my desktop is at 32bpp I always get a color depth of 32 and color component sizes of 8 (I use SDL_GL_GetAttribute to read the values back). When running at 16bpp, I get a color depth of 16, with a red component of 5 bits, green 6 and blue 5.
As for the depth buffer, I always get 24 bits. It also does not matter whether I am running in a window or fullscreen.
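For anyone who wants to repeat this kind of test, here is a sketch of reading the values back with SDL_GL_GetAttribute. It assumes a successful SDL_SetVideoMode(..., SDL_OPENGL) call has already been made, since the attributes only reflect the real context after that:

```c
#include <stdio.h>
#include "SDL.h"

/* Print what the driver actually gave us, as opposed to what we asked for.
   Call this only after SDL_SetVideoMode(..., SDL_OPENGL) has succeeded. */
static void print_gl_attributes(void)
{
    struct { SDL_GLattr attr; const char *name; } attrs[] = {
        { SDL_GL_RED_SIZE,     "red"     },
        { SDL_GL_GREEN_SIZE,   "green"   },
        { SDL_GL_BLUE_SIZE,    "blue"    },
        { SDL_GL_ALPHA_SIZE,   "alpha"   },
        { SDL_GL_BUFFER_SIZE,  "buffer"  },
        { SDL_GL_DEPTH_SIZE,   "depth"   },
        { SDL_GL_STENCIL_SIZE, "stencil" },
    };
    size_t i;
    for (i = 0; i < sizeof(attrs) / sizeof(attrs[0]); i++) {
        int value = 0;
        if (SDL_GL_GetAttribute(attrs[i].attr, &value) == 0)
            printf("%s size: %d bits\n", attrs[i].name, value);
    }
}
```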

The only settings that change anything are SDL_GL_ALPHA_SIZE and SDL_GL_STENCIL_SIZE. If you don't set them, they default to zero. Otherwise they are set to the best value possible; in my case, 8 for the stencil size and 8 for the alpha size in 32bpp mode. When I tried to set SDL_GL_ALPHA_SIZE to 1 in 16bpp mode, the SDL window didn't show and I got no error. Strange.

I did test it only on Linux, so maybe it is Linux specific.

So, my conclusion is that no matter what you think, SDL knows better how to set up OpenGL for you :D

michalis
23-10-2006, 08:08 AM
I did test it only on Linux, so maybe it is Linux specific.

So, my conclusion is that no matter what you think, SDL knows better how to set up OpenGL for you :D

It's not related to Linux or Windows. Instead, it's graphics-card specific. On some graphics cards you may always get an alpha buffer, even when you didn't request it. Or always get a stencil buffer (although this is rare). And so on. The settings you give in SDL_GL_SetAttribute are minimum requirements, so if your desktop is already at 32bpp, it's no wonder that the resulting context also has 32bpp, even though you didn't request that much.

This is the reason why you should always test your programs with the desktop color depth turned down, and on various graphics cards if possible. It's possible that your program looks bad when the user's desktop color depth is set to 16 bit, and you didn't notice the problem because your desktop is set to 32 bit.
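Since the attribute values are only minimums, one practical way to test this is to read the obtained sizes back (via SDL_GL_GetAttribute) and check them against what the program actually needs. A small sketch of that check; the GLFormat struct and function name here are made up for illustration:

```c
/* Bit sizes of the buffers a program requested, or the sizes it obtained. */
typedef struct {
    int red, green, blue, depth, stencil;
} GLFormat;

/* Returns 1 if every obtained size meets or exceeds the requested minimum.
   SDL_GL_SetAttribute values are minimums, so getting MORE than asked for
   (e.g. an 8-bit component when 5 was requested) is still a success. */
int format_satisfies(const GLFormat *requested, const GLFormat *obtained)
{
    return obtained->red     >= requested->red
        && obtained->green   >= requested->green
        && obtained->blue    >= requested->blue
        && obtained->depth   >= requested->depth
        && obtained->stencil >= requested->stencil;
}
```

For example, a 5-6-5 16bpp context satisfies a 5-5-5 request, but a context with no stencil bits fails a request that asked for a stencil buffer.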

grudzio
23-10-2006, 09:16 AM
It's not related to Linux or Windows. Instead, it's graphics-card specific. On some graphics cards you may always get an alpha buffer, even when you didn't request it. Or always get a stencil buffer (although this is rare). And so on. The settings you give in SDL_GL_SetAttribute are minimum requirements, so if your desktop is already at 32bpp, it's no wonder that the resulting context also has 32bpp, even though you didn't request that much.



So, actually, the calls to SDL_GL_SetAttribute may be omitted (except for setting the alpha and stencil buffers), since it is up to the graphics card what I will get?

michalis
23-10-2006, 09:39 AM
So, actually, the calls to SDL_GL_SetAttribute may be omitted (except for setting the alpha and stencil buffers), since it is up to the graphics card what I will get?

Yes, they can all be omitted, assuming that you're going to be happy with whatever context parameters you get. If you omit them all, then the only thing that is guaranteed is that you will get *some* color buffer.

However, almost all programs will at least need SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
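To show why that one attribute matters, here is a sketch of a typical per-frame sequence. It assumes SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1) and SDL_SetVideoMode(..., SDL_OPENGL) have already succeeded; the clear color is arbitrary:

```c
#include "SDL.h"
#include "SDL_opengl.h"

/* Draw into the back buffer, then swap it to the screen.
   Without a double-buffered context you would watch the scene
   being drawn piece by piece, i.e. flicker. */
static void render_one_frame(void)
{
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* ... draw the scene into the back buffer here ... */

    /* Makes the finished back buffer visible in one step. */
    SDL_GL_SwapBuffers();
}
```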

grudzio
23-10-2006, 02:03 PM
Yes, they all can be omitted. Assuming that you're going to be happy with whatever context parameters you will get. If you omit all, then the only thing that is guaranteed is that you will get *some* color buffer.

However, almost all programs will at least need SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );

It looks like I have to be happy with whatever color buffer I get.
It is a bit frustrating :evil: .