Quote Originally Posted by grudzio
I did test it only on Linux, so maybe it is Linux specific.

So, my conclusion is that no matter what you think SDL knows better how to setup OpenGL for you
It's not related to Linux or Windows. Instead, it's graphics-card specific. On some graphics cards you may always get an alpha buffer, even when you didn't request it, or always get a stencil buffer (although this is rare), and so on. The settings you give in SDL_GL_SetAttribute are minimum requirements --- so if your desktop is already at 32bpp, it's no wonder that the resulting context also has 32bpp, even though you didn't request that much.

This is the reason why you should always test your programs with the desktop color depth turned down, and on various graphics cards if possible. It's possible that your program looks bad when the user's desktop color depth is set to 16bpp, and you never observed the problem because your own desktop is set to 32bpp.