
Thread: What SDL_GL_SetAttributes do you use??

  1. #1

    What SDL_GL_SetAttributes do you use??

    Hi all, when using OpenGL with SDL, what SDL_GL_SetAttributes do you use?

    I've been using
    [pascal]
    SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 5 );
    SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 5 );
    SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 5 );
    SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 16 );
    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
    [/pascal]
    without really understanding whether these are the best settings to use.

  2. #2

    What SDL_GL_SetAttributes do you use??

    I use

    [pascal]
    SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 5 );
    SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 5 );
    SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 5 );
    SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, SCREEN_DEPTH );
    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
    [/pascal]

    Where SCREEN_DEPTH is the value retrieved from the SDL_VideoInfo structure.

    but I sometimes use

    [pascal]
    SDL_GL_SetAttribute( SDL_GL_STENCIL_SIZE,stencilsize);
    [/pascal]

    as well.

    The point is that, for the depth size, OpenGL will use the value you pass to SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, ... ), NOT the value passed to SDL_SetVideoMode; at least that is my understanding. I just make sure they are both the same.
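
    For example, something like this is roughly what I mean (a minimal sketch only; JEDI-SDL style headers assumed, and the program name, window size and SCREEN_DEPTH value are just illustrative):

    [pascal]
    program GLDepthSync;
    uses
      sdl;
    const
      SCREEN_DEPTH = 16; // keep this value the same in both calls below
    begin
      if SDL_Init( SDL_INIT_VIDEO ) < 0 then
        Halt( 1 );

      // depth buffer size requested through SDL_GL_SetAttribute...
      SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, SCREEN_DEPTH );
      SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );

      // ...and the same value passed to SDL_SetVideoMode as the bpp
      if SDL_SetVideoMode( 640, 480, SCREEN_DEPTH, SDL_OPENGL ) = nil then
        Halt( 1 );

      // ... render with OpenGL here ...

      SDL_Quit;
    end.
    [/pascal]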

  3. #3

    What SDL_GL_SetAttributes do you use??

    Quote:
    Where SCREEN_DEPTH is the value retrieved from the SDL_VideoInfo structure.
    SDL_GL_DEPTH_SIZE sets the size of the depth buffer, not the frame buffer. To set the color depth, use SDL_GL_BUFFER_SIZE instead (this is what the SDL docs say).
    And when using OpenGL, it does not matter what bpp you pass to SDL_SetVideoMode.

    Actually, in most cases it does not matter what you send to SDL_GL_SetAttribute. It depends on the desktop settings and the capabilities of the graphics card. I did some tests, and no matter what I request, when my desktop is at 32bpp I always get a color depth of 32 with 8 bits per color component (I use SDL_GL_GetAttribute to read back the values). When running in 16bpp I get a color depth of 16, with a red component of 5 bits, green of 6 and blue of 5.
    As for the depth buffer, I always get 24 bits. It also does not matter whether I run in a window or fullscreen.
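
    A rough sketch of the check I do (JEDI-SDL style headers assumed, where SDL_GL_GetAttribute takes a var parameter; the procedure name is just for illustration):

    [pascal]
    procedure PrintGLAttributes;
    var
      R, G, B, Depth: Integer;
    begin
      // call this after SDL_SetVideoMode( ..., SDL_OPENGL ) has succeeded
      SDL_GL_GetAttribute( SDL_GL_RED_SIZE, R );
      SDL_GL_GetAttribute( SDL_GL_GREEN_SIZE, G );
      SDL_GL_GetAttribute( SDL_GL_BLUE_SIZE, B );
      SDL_GL_GetAttribute( SDL_GL_DEPTH_SIZE, Depth );
      WriteLn( 'R/G/B bits: ', R, '/', G, '/', B );
      WriteLn( 'Depth bits: ', Depth );
    end;
    [/pascal]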

    The only settings that change anything are SDL_GL_ALPHA_SIZE and SDL_GL_STENCIL_SIZE. If you don't set them, they default to zero; otherwise they are set to the best value possible, in my case 8 bits of stencil and 8 bits of alpha in 32bpp mode. When I tried to set SDL_GL_ALPHA_SIZE to 1 in 16bpp mode, the SDL window didn't show up and I got no error. Strange.

    I only tested this on Linux, so maybe it is Linux-specific.

    So my conclusion is that, no matter what you think, SDL knows better how to set up OpenGL for you.

  4. #4

    What SDL_GL_SetAttributes do you use??

    Quote Originally Posted by grudzio
    I only tested this on Linux, so maybe it is Linux-specific.

    So my conclusion is that, no matter what you think, SDL knows better how to set up OpenGL for you.
    It's not related to Linux or Windows. Instead, it's graphics-card specific. On some graphics cards you may always get an alpha buffer even when you didn't request it, or always get a stencil buffer (although this is rare), and so on. The settings you give in SDL_GL_SetAttribute are minimum requirements, so if your desktop is already 32bpp, it's no wonder that the resulting context also has 32bpp, even though you didn't request that much.

    This is the reason why you should always test your programs with the desktop color depth turned down, and on various graphics cards if possible. It's possible that your program looks bad when the user's desktop color depth is set to 16-bit, and you didn't observe the problem because your own desktop is set to 32-bit.

  5. #5

    What SDL_GL_SetAttributes do you use??

    Quote Originally Posted by michalis

    It's not related to Linux or Windows. Instead, it's graphics-card specific. On some graphics cards you may always get an alpha buffer even when you didn't request it, or always get a stencil buffer (although this is rare), and so on. The settings you give in SDL_GL_SetAttribute are minimum requirements, so if your desktop is already 32bpp, it's no wonder that the resulting context also has 32bpp, even though you didn't request that much.
    So the calls to SDL_GL_SetAttribute may actually be omitted (except for setting the alpha and stencil buffers), since it is up to the graphics card what I will get?

  6. #6

    What SDL_GL_SetAttributes do you use??

    Quote Originally Posted by grudzio
    So the calls to SDL_GL_SetAttribute may actually be omitted (except for setting the alpha and stencil buffers), since it is up to the graphics card what I will get?
    Yes, they can all be omitted, assuming that you're going to be happy with whatever context parameters you get. If you omit them all, then the only thing that is guaranteed is that you will get *some* color buffer.

    However, almost all programs will at least need [pascal] SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );[/pascal]
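
    Something like this is the usual shape (a minimal sketch; the Running flag, the window size, and the bpp of 0, which means "use the current display depth", are just illustrative):

    [pascal]
    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
    if SDL_SetVideoMode( 640, 480, 0, SDL_OPENGL ) = nil then
      Halt( 1 );

    while Running do
    begin
      // ... handle events and draw the frame with OpenGL calls ...
      SDL_GL_SwapBuffers; // present the back buffer once per frame
    end;
    [/pascal]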

  7. #7

    What SDL_GL_SetAttributes do you use??

    Quote Originally Posted by michalis

    Yes, they can all be omitted, assuming that you're going to be happy with whatever context parameters you get. If you omit them all, then the only thing that is guaranteed is that you will get *some* color buffer.

    However, almost all programs will at least need [pascal] SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );[/pascal]
    It looks like I have to be happy with whatever color buffer I get.
    It is a bit frustrating :evil:.
