Quote Originally Posted by Andru View Post
In 2009 I bought my HP laptop, and guess what? There were no official drivers for the Radeon HD 4650 included (installed, I mean), so my system didn't support OpenGL or DXVA for video decoding on the video card.
From the sounds of it, the system didn't support anything until you installed drivers anyway. How is OpenGL a special case in that situation?

Quote Originally Posted by Andru View Post
You think so because you are a programmer; a lot of people use computers without understanding all the "system stuff", and because of that publishers prefer Direct3D. And if your game is not something truly incredible, you will lose a lot of customers because of OpenGL. But all this applies to casual/small games.
I'd have to respectfully disagree. If I had been stuck installing something for OpenGL, sure, my computer knowledge might have played a factor, but as some guy just playing OpenGL-based games I didn't have to install anything. Maybe I'm just lucky and have only ever used computers whose graphics cards came with proper drivers?

That's possible, but I seriously doubt that someone who doesn't know much about computers and didn't have their system set up properly is a reason to blame OpenGL.

The same can be said for DirectX too. I could just as easily blindly accuse Direct3D of being poor for casual games because the user might not have their graphics drivers properly set up. We might as well erase all the crosswalks because some people don't look where they are going.

To that point, I keep hearing about issues with OpenGL support on various cards. If it's an issue with later versions of OpenGL (i.e. the current 4.x), then only use the version of OpenGL that everyone is guaranteed to have. If you're making a casual game, you don't need all those new features to make it look good. Games were quite impressive with 1.2 and 2.1 alone, if I recall correctly.
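To show what I mean in practice, here's a minimal sketch of that approach. I'm assuming GLFW here purely as an example windowing library (nothing in this thread requires it): ask the driver for a plain OpenGL 2.1 context and print what it actually reports, so the game can refuse to start with a clear message instead of crashing on features the card or driver doesn't have.

/* Sketch: request an OpenGL 2.1 context (the lowest version the game
 * needs) and report what the driver actually provides. Assumes GLFW. */
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* Ask only for the baseline version the game is written against. */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 2);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 1);

    GLFWwindow *window = glfwCreateWindow(640, 480, "GL check", NULL, NULL);
    if (!window) {
        /* Even 2.1 isn't available: fail gracefully with a message. */
        fprintf(stderr, "OpenGL 2.1 not available on this system\n");
        glfwTerminate();
        return 1;
    }

    glfwMakeContextCurrent(window);
    printf("GL_VERSION:  %s\n", glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}

If a casual game only ever asks for 2.1 like this, it runs on pretty much any card and driver combination people are likely to have, which is exactly the point about not needing the 4.x features.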