PDA

View Full Version : OpenGL compatibility



code_glitch
11-06-2011, 09:03 PM
And in recent news, my FBO struggle continues and performance gets revamped... Importantly, researching why in the name of ... the FBO crashes constantly brought me to OpenGL extensions and versions, and got me thinking: OpenGL 1.5 and 2.0 are not the 'nicest' platforms. On the other hand, 3.0 and 4.0 have inherent benefits. For the programmer.

So I was wondering, what graphics cards and OpenGL versions is everyone out there running? It'd be interesting to know: there wouldn't be much point writing OpenGL 1.0 code if everyone has 2.0, and the same can be said for 4.1 if no one has it...

Oh, and does OpenAL work the same way? :D Yeah - I really should know this if I write OGL/OAL code, right? :-[

Sascha Willems
12-06-2011, 09:13 AM
This is a hard question, and it depends on your target audience and/or whether you have the time to do several render paths. I can only speak for my own projects, but my user base's graphics cards range from very old integrated ones to the latest around the corner, so I try to support most of them in "Phase 2" of Projekt W via multiple render paths: one for very old cards that don't even have shaders (using the fixed-function pipeline, e.g. texture combiners etc.), and one with all eye candy enabled that uses OpenGL 2.x functionality.

But you should at least settle on OpenGL 1.5. I don't think there are many GPUs out there that won't support that one, and if you plan to release in one or two years you can even base everything on OpenGL 2.x.

As for 3.x and 4.x: yes, in theory these are nice, because the forward-compatible contexts really get rid of a lot of old-fashioned and deprecated functionality. They are the clear future of OpenGL, but if you only used these you'd have a pretty small and limited user base, as you need pretty new cards for those OpenGL versions.

So if you plan to roll out a game and want to reach as many people as possible, you either go for the lowest version your target audience supports or you implement several render paths. But you can make life a bit easier if you, for example, don't use immediate-mode functions (glBegin, glTranslate, glRotate, etc.) anywhere, not even in your OpenGL 1.x render path, but use vertex arrays or VBOs there instead, because those can still be used with the new OpenGL versions.
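A minimal sketch of that VBO path, written against the dglOpenGL header. It assumes an active GL context, and the vertex data and names (VBO, Verts) are made up for illustration:

```pascal
const
  // One triangle, two floats (x, y) per vertex.
  Verts: array[0..5] of GLfloat = (0.0, 0.0, 1.0, 0.0, 0.5, 1.0);
var
  VBO: GLuint;
begin
  glGenBuffers(1, @VBO);
  glBindBuffer(GL_ARRAY_BUFFER, VBO);
  // Upload the vertices once instead of re-sending them every frame
  // with glBegin/glVertex.
  glBufferData(GL_ARRAY_BUFFER, SizeOf(Verts), @Verts, GL_STATIC_DRAW);

  glEnableClientState(GL_VERTEX_ARRAY);
  glVertexPointer(2, GL_FLOAT, 0, nil);   // data comes from the bound VBO
  glDrawArrays(GL_TRIANGLES, 0, 3);
  glDisableClientState(GL_VERTEX_ARRAY);
end;
```

Note that glVertexPointer and the client-state arrays are themselves deprecated in 3.x core profiles; in a forward-compatible context you would swap them for glVertexAttribPointer plus a shader, but the VBO itself carries over unchanged.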

And as for OpenAL: its API is modelled on OpenGL's, but that's about it. OpenAL is still proprietary, as it's owned by Creative Labs. So the only thing the two have in common is a similar-looking API.

Ñuño Martínez
13-06-2011, 07:56 PM
Is there an OpenGL 4.0? :shocked: I thought I wasn't so old... Wait, I'm not so old!

Not sure, but IIRC my Ubuntu has OpenGL 1.2 or so...

Sascha Willems
13-06-2011, 08:02 PM
Actually the current version is 4.1 (http://www.opengl.org/documentation/current_version/), and our header (http://wiki.delphigl.com/index.php/dglOpenGL.pas/en) already supports it. The highest I've used, though, was a 3.2 forward-compatible context, where you have to use VBOs to pass data and shaders to do everything else. It's a big change if you're used to immediate-mode functions and such, but it makes for a sleek API, forcing you to use only well-performing functions to render your scene.

code_glitch
13-06-2011, 08:46 PM
Ahem... VBOs are in 3.x?? Whoops. Now I know why that didn't work when I tried it on a 2.0 card while calling the gl_2_0 init procedure :D

Anyway, if I catered for OpenGL 2.0 and above, would you deem that 'more than most except a minority' or 'all but a small minority' in the next year or so? ???

code_glitch
13-06-2011, 08:50 PM
Ok, bar that idea, I'll post it here first and save a thread...

I'm trying to copy a texture to another texture with


glBindTexture(GL_TEXTURE_2D, Src.Texture);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, Src.GlWidth, Src.GlHeight, GL_RGBA, GL_UNSIGNED_BYTE, Dest);


I think the glBindTexture call works, since I have found no evidence to the contrary... glTexSubImage2D, though, is being a pain. Whenever I run the application from a terminal I get a crash, and whenever I run it from GDB it just hangs. Any ideas? I tried FBOs to 'render' one onto the other, with no success. I don't know whether it's down to my reading of the C/C++ docs (it compiles fine) or to OpenGL extensions (I think the latter is more likely, but I'm not sure).

Any help would be very greatly appreciated. Of course any other fast alternatives to duplicating/copying a texture are welcome.

cheers,
code_glitch

Sascha Willems
13-06-2011, 09:05 PM
Ahem... VBOs are in 3.x?? Whoops. Now I know why that didn't work when I tried it on a 2.0 card while calling the gl_2_0 init procedure :D

Anyway, if I catered for OpenGL 2.0 and above, would you deem that 'more than most except a minority' or 'all but a small minority' in the next year or so? ???

No, VBOs are already present in earlier versions. They were available as an ARB extension before OpenGL 1.5 and have been part of the core since 1.5. It's just that with a forward-compatible context, VBOs are the only way to submit vertices, colors, texcoords, etc. to the GPU in GL 3/4.


Ok, bar that idea, I'll post it here first and save a thread...

I'm trying to copy a texture to another texture with


glBindTexture(GL_TEXTURE_2D, Src.Texture);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, Src.GlWidth, Src.GlHeight, GL_RGBA, GL_UNSIGNED_BYTE, Dest);


I think the glBindTexture call works, since I have found no evidence to the contrary... glTexSubImage2D, though, is being a pain. Whenever I run the application from a terminal I get a crash, and whenever I run it from GDB it just hangs. Any ideas? I tried FBOs to 'render' one onto the other, with no success. I don't know whether it's down to my reading of the C/C++ docs (it compiles fine) or to OpenGL extensions (I think the latter is more likely, but I'm not sure).
glTexSubImage2D is not to blame; the parameters look correct. Are you sure you've allocated enough space for Dest, using the correct size and data type? And have you called glTexImage2D first? The Sub version can only be used if you have at least once created the texture storage with glTexImage2D. Though that shouldn't crash - it would just raise an OpenGL error - so my guess is that your problem is Dest having the wrong size.
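For reference, the ordering and sizing described above might look like this sketch (W, H and Tex are assumed placeholders; an active GL context and the dglOpenGL header are required):

```pascal
var
  Pixels: array of Byte;
begin
  // GL_RGBA / GL_UNSIGNED_BYTE needs 4 bytes per texel.
  SetLength(Pixels, W * H * 4);

  glBindTexture(GL_TEXTURE_2D, Tex);
  // Allocate the texture storage first (nil data is allowed here)...
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, W, H, 0, GL_RGBA,
    GL_UNSIGNED_BYTE, nil);
  // ...only then may glTexSubImage2D update (part of) it, reading
  // W * H * 4 bytes from the pointer you pass in.
  glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, W, H, GL_RGBA,
    GL_UNSIGNED_BYTE, @Pixels[0]);
end;
```

If the buffer behind that last pointer is smaller than W * H * 4 bytes, the driver reads past its end, which matches the crash described above.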

code_glitch
13-06-2011, 09:12 PM
Ok, that's a very valid point, and I've now amended it all to:


glBindTexture(GL_TEXTURE_2D, Src.Texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, Src.GlWidth, Src.GlHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, Dest);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, Src.GlWidth, Src.GlHeight, GL_RGBA, GL_UNSIGNED_BYTE, Dest);
The good news: the OpenGL docs say it's ok.
The bad news: glTexImage2D now crashes the program. Or should glTexImage2D come before I bind the texture? ???

Edit: having glTexImage2D before glBindTexture seems to have no effect - it crashes either way.

Edit2: Where do I put the missing (duh) glCopyTexImage2D? I think that might be the problem, right?

code_glitch
13-06-2011, 09:21 PM
Got some progress; I gave the following a spin:



glBindTexture(GL_TEXTURE_2D, Src.Texture);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, Src.GlWidth, Src.GlHeight, 0);


As a result, I get all white pixels when drawn...

Sascha Willems
14-06-2011, 08:01 AM
So, to be clear: you want to copy one texture to another one? If that's the case, you're doing it wrong. This is how you copy one texture into another using OpenGL:
1. Create a buffer of the correct size: Tex.Width * Tex.Height * size of the data type (for an RGBA texture that's 4*8 bits per texel, i.e. an unsigned int, which is Cardinal in Delphi).
2. Bind your source texture using glBindTexture.
3. Copy its content to your buffer using glGetTexImage.
4. Bind your destination texture using glBindTexture.
5. Upload your buffer to OpenGL via glTexImage2D.
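Put together, the steps above might look like this sketch (SrcTex, DestTex, SrcW and SrcH are assumed placeholders; it needs an active GL context and the dglOpenGL header):

```pascal
var
  Buffer: array of Byte;
begin
  // 1. Buffer big enough for RGBA: width * height * 4 bytes.
  SetLength(Buffer, SrcW * SrcH * 4);

  // 2. + 3. Bind the source texture and read its pixels back.
  glBindTexture(GL_TEXTURE_2D, SrcTex);
  glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, @Buffer[0]);

  // 4. + 5. Bind the destination texture and upload the pixels.
  glBindTexture(GL_TEXTURE_2D, DestTex);
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, SrcW, SrcH, 0, GL_RGBA,
    GL_UNSIGNED_BYTE, @Buffer[0]);
end;
```

Unlike glCopyTexImage2D, which reads from the current framebuffer, this round-trips the pixels through client memory, so it works regardless of what is currently bound for rendering.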

code_glitch
14-06-2011, 03:57 PM
Ah, right... Will check it out. I didn't expect to need two binds etc. Hopefully I'll have more luck this time.

phibermon
15-06-2011, 10:18 AM
As stated in a project post, I'm all for GL 3.2+. Only really using GL 4.x for tessellation atm.

I think with games like Just Cause 2 being DirectX 11 only, we're going to start to see more and more games follow suit. Microsoft want to push people onto later platforms (money in the pocket), and you can guarantee that the next Xbox will use DirectX 11 or greater. The next PlayStation will no doubt follow suit with GL4+ (the PS3 is totally OpenGL).

I think when these platforms hit us, there will be no justification for publishers/developers to spend time and money maintaining/implementing old code paths (I actually prefer your render-path terminology, Sascha). To do so for a subset of PC users will not be economically viable.

In fact, the only reason we've seen slow adoption of DirectX 11 is simply the existence of the Xbox 360, for the same reasons stated above - it's just far cheaper to stick to DirectX 9.

On a side note, thank you Sascha, and everyone at the delphigl.com community, for your excellent GL 4.1-compatible headers! I know that Free Pascal has GL4 headers, but when it comes to GL, I trust you guys more :)

thegilb
08-07-2011, 02:46 PM
I think with games like Just Cause 2 being DirectX 11 only, we're going to start to see more and more games follow suit. Microsoft want to push people onto later platforms (money in the pocket), and you can guarantee that the next Xbox will use DirectX 11 or greater. The next PlayStation will no doubt follow suit with GL4+ (the PS3 is totally OpenGL).

Are you sure Just Cause 2 was DX11 only? Really sure? Microsoft have tried to make newer versions of DirectX an incentive to move players onto new versions of Windows, with varying degrees of success. The tide will change when the majority own DX10 / DX11 capable hardware, though publishers have a mixed relationship with the PC as a platform due to piracy and perceived popularity / ROI.



I think when these platforms hit us, there will be no justification for publishers/developers to spend time and money maintaining/implementing old code paths (I actually prefer your render-path terminology, Sascha). To do so for a subset of PC users will not be economically viable.

That depends on a LOT of factors. Keeping the majority of your potential market by reusing tried-and-tested DX9 technology has been the strategy that has kept the PC market buoyant for several years now.



In fact, the only reason we've seen slow adoption of DirectX 11 is simply the existence of the Xbox 360, for the same reasons stated above - it's just far cheaper to stick to DirectX 9.

I would say the main reason is that most users were or are happy with XP and didn't upgrade to Vista (for obvious reasons). Another good reason is that although technically you can do some new things in DX10 / DX11, you can do everything else in DX9. Finally the biggest reason is that with consoles often leading the revenue share, there is little point in blowing a whole load of money on developing cutting edge technology and graphics for a tiny minority of your customers. Though it is convenient that the XB360 is a DX9 hardware platform, what you gain with one hand is lost with another when you target PS3 and Wii.

phibermon
20-11-2011, 02:19 PM
Hi Gilb,


Are you sure Just Cause 2 was DX11 only? Really sure? Microsoft have tried to make newer versions of DirectX an incentive to move players onto new versions of Windows, with varying degrees of success. The tide will change when the majority own DX10 / DX11 capable hardware, though publishers have a mixed relationship with the PC as a platform due to piracy and perceived popularity / ROI.

Not so sure now, no ;) I'm only going by articles I read at release. Reading up, I see claims of DX10, but alas no DX9 yet (which is odd, considering the engine clearly supports that generation of hardware (360)).


I would say the main reason is that most users were or are happy with XP and didn't upgrade to Vista (for obvious reasons). Another good reason is that although technically you can do some new things in DX10 / DX11, you can do everything else in DX9. Finally the biggest reason is that with consoles often leading the revenue share, there is little point in blowing a whole load of money on developing cutting edge technology and graphics for a tiny minority of your customers. Though it is convenient that the XB360 is a DX9 hardware platform, what you gain with one hand is lost with another when you target PS3 and Wii.

I agree (though perhaps the XP issue is less relevant now, with 7 + DX11, than it was with Vista + DX10). My apologies if I've missed your point or misrepresented mine, but your final point is more or less what I meant. We talk of DX9/11, but what we really discuss is the functionality of the hardware. Modern cross-platform commercial engines (stop me if I'm wrong) pretty much abstract away the differences in the APIs: they offer capabilities that are common to the major platforms, and scale with capabilities that differ, such as available video memory, where it fits the development process (so high-res textures on the PC can always be expected, as the assets from the artists are higher-res still, but models optimized for PN-patch tessellation may very well require additional work/money).

There are other notable additions, but would you agree that the hardware tessellation stage is the big one? Regardless of the API through which the hardware is exposed, I feel we won't see majority adoption of this functionality in PC games until the next generation of consoles.