No, because you can render in 2D just fine with OpenGL - pixel perfect, in fact, if you offset for the center of pixels. You've got to use something, so why *not* OpenGL? No matter what you use, you're still teaching API specifics if you want to include source with your examples.
That's a good idea; have fun porting my OpenGL tessellation tutorials to VESA under DOS.
You mean those games that ran at 320x240 or 640x480 using sprites?
Yes, I know - that was my point, which you've just proven. "Every modern graphics card processes all of the graphics in 3D" - while not strictly true, it's true enough that teaching something like OpenGL is *essential*. Sure, use an existing engine that in turn uses OpenGL if you like - different versions of the tutorials for different engines would be ideal - but even with an existing engine you'll still find yourself talking about the hardware and how to use it, unless you want somebody to end up replicating immediate mode on top of a buffered API because they don't know any better.
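The "replicating immediate mode" trap is uploading and drawing one sprite at a time; the buffered approach is to accumulate a frame's quads in one CPU-side array, then do a single upload and a single draw call. A sketch of that batching pattern, with invented names (`sprite_batch`, `batch_push`, `batch_flush`) standing in for whatever an engine would actually call them:

```c
#include <string.h>

/* Batch sprite quads into one vertex array so the whole frame goes to
 * the GPU in a single upload + draw, instead of one draw per sprite
 * (immediate mode reinvented on top of a buffered API). Sketch only. */

#define MAX_SPRITES 1024
#define FLOATS_PER_SPRITE 12   /* 6 vertices * (x, y), two triangles */

typedef struct {
    float verts[MAX_SPRITES * FLOATS_PER_SPRITE];
    int   count;               /* sprites queued this frame */
} sprite_batch;

static void batch_push(sprite_batch *b, float x, float y, float w, float h)
{
    float *v = b->verts + b->count * FLOATS_PER_SPRITE;
    /* Two triangles covering the sprite's rectangle. */
    float quad[FLOATS_PER_SPRITE] = {
        x,     y,      x + w, y,      x,     y + h,
        x + w, y,      x + w, y + h,  x,     y + h
    };
    memcpy(v, quad, sizeof quad);
    b->count++;
}

/* In real code this is where one glBufferSubData and one glDrawArrays
 * would go for the entire batch; here we just report the call count. */
static int batch_flush(sprite_batch *b)
{
    int draw_calls = (b->count > 0) ? 1 : 0;
    b->count = 0;
    return draw_calls;
}
```

A thousand sprites pushed this way still costs one draw call; a thousand sprites drawn "immediate style" costs a thousand, and that per-call overhead is exactly what the tutorials need to explain.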
Following your logic, we shouldn't assume they're on a modern computer at all; we should write all our tutorials for a hypothetical Turing machine and describe hypothetical ways that graphics might one day be printed out.
Well, I disagree with that - the scaling and vast number of sprites used in games such as Terraria or Starbound would be unfeasible to process in software. The reason those games don't suffer from 'graphics cards not keeping up' is *because* they use hardware acceleration. Switch to software rendering and the overwhelming bottleneck will be graphics - because then it's not the graphics card keeping up, it's the CPU doing everything except moving the final pixels to video memory.