Quote Originally Posted by deathshadow
Since I'm planning on it being OpenGL aware, 32 bit floating point is likely the best minimum once rotations, gravity, momentum and drag are in the equation, much less the notion of resolution independent rendering. I considered single precision since that's what GLfloat is guaranteed to be.
OK, I understand that when you are dealing with graphics and physics, having 32 bit or 64 bit numbers can come in handy. But what about other cases? For instance, if you are making a shooter game, you will presumably define your units' health as an integer. So what will the max health of your units be? Several million hit points? I don't think so. You will probably define your units' max health in the range of a few hundred hit points. So why do you need a 64 bit integer for that?
I know that having several different integer types can be quite confusing for beginners. In fact, the concept of an integer itself can be confusing.
What I had in mind was support for 8 bit, 16 bit, 32 bit, etc. numbers, but with the programming language itself taking care of which of these gets used (self optimization).
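
To make that self optimization idea a little more concrete, here is a minimal C sketch (the pick_width helper is hypothetical, purely for illustration) of how a compiler could choose the narrowest integer type from the range of values a variable needs to hold:

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical helper: given the range of values a variable must
       hold, return the number of bits the compiler would allocate. */
    static int pick_width(int64_t min, int64_t max)
    {
        if (min >= INT8_MIN  && max <= INT8_MAX)  return 8;
        if (min >= INT16_MIN && max <= INT16_MAX) return 16;
        if (min >= INT32_MIN && max <= INT32_MAX) return 32;
        return 64;
    }

    int main(void)
    {
        /* Health capped at a few hundred fits comfortably in 16 bits. */
        printf("health 0..500 -> %d bits\n", pick_width(0, 500));
        /* Only genuinely large ranges would get the full 64 bits. */
        printf("0..10^10      -> %d bits\n", pick_width(0, 10000000000LL));
        return 0;
    }

So the unit-health example from above would quietly land in 16 bits, and the programmer never has to think about it.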

Quote Originally Posted by deathshadow
If the 'slowdown' of 64 bit numbers when you have a math coprocessor DESIGNED to handle numbers that size is an issue, you're probably doing something wrong.
I was referring to using 64 bit integers on 32 bit processors (many platforms today still use 32 bit processing). Of course you can still do 64 bit math on a 32 bit processor, but it takes two calculations in place of one: first you calculate the low 32 bits, then the high 32 bits, and finally you join the two partial results into the final 64 bit result. So 64 bit calculations on a 32 bit processor take at least twice as long.
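
To make the "two calculations for one" point concrete, here is a rough C sketch of how a 64 bit addition has to be emulated with 32 bit halves (add the low words, then carry into the high words); the add64 function is mine, purely illustrative:

    #include <inttypes.h>
    #include <stdio.h>

    /* Emulate one 64 bit add using only 32 bit arithmetic: add the low
       halves, then the high halves plus the carry out of the low add. */
    static void add64(uint32_t alo, uint32_t ahi,
                      uint32_t blo, uint32_t bhi,
                      uint32_t *rlo, uint32_t *rhi)
    {
        *rlo = alo + blo;                       /* wraps modulo 2^32  */
        uint32_t carry = (*rlo < alo) ? 1 : 0;  /* wraparound = carry */
        *rhi = ahi + bhi + carry;
    }

    int main(void)
    {
        uint32_t lo, hi;
        /* 0x00000001FFFFFFFF + 1 should give 0x0000000200000000 */
        add64(0xFFFFFFFFu, 0x00000001u, 1u, 0u, &lo, &hi);
        printf("%08" PRIX32 "%08" PRIX32 "\n", hi, lo);
        return 0;
    }

That is two adds plus a carry test where native 64 bit hardware needs a single instruction, which is exactly the overhead I mean.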

Quote Originally Posted by deathshadow
I'm arguing with myself over that because to be honest, I hate complex character sets; they are a needlessly complex mess that until recently (relatively speaking) weren't even involved on computers. I often feel that if we just restricted ourselves to 7 bit ASCII we wouldn't have a lot of the headaches that crop up on websites... and since my bread and butter the past decade has been websites; it's only further made me HATE languages or even normal text that requires anything more complex than that.
I do understand your point of view. More complex charsets do cause much more complexity. But how would you feel if you had a development tool or programming language which doesn't allow you to display some of the specific characters used in your language?

Quote Originally Posted by deathshadow
I'd be far, far more worried about the overhead of string processing UTF-8 into a raster based font system like my GLKernedFont method; which will be what it's going to run since being cross platform I can't rely on any specific engine being present, and freetype looks like ass and kerns text like a sweetly retarded crack addict.
I don't see why rendering TrueType fonts would be all that difficult. I will find out shortly, because I intend to write functions that let me do exactly that in the Aspyre graphics engine. If it works well, I intend to share the source code.
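
For what it's worth, the UTF-8 decoding half of that isn't too scary either. Here is a minimal C sketch (no overlong-sequence or error validation, so not production quality) of turning a UTF-8 byte stream into code points, which is all a raster font renderer needs before it can look up a glyph:

    #include <inttypes.h>
    #include <stdio.h>

    /* Decode one UTF-8 sequence starting at s; store the code point
       in *cp and return the number of bytes consumed (0 on bad input). */
    static int utf8_decode(const unsigned char *s, uint32_t *cp)
    {
        if (s[0] < 0x80) { *cp = s[0]; return 1; }              /* ASCII   */
        if ((s[0] & 0xE0) == 0xC0) {                            /* 2 bytes */
            *cp = ((uint32_t)(s[0] & 0x1F) << 6) | (s[1] & 0x3F);
            return 2;
        }
        if ((s[0] & 0xF0) == 0xE0) {                            /* 3 bytes */
            *cp = ((uint32_t)(s[0] & 0x0F) << 12)
                | ((uint32_t)(s[1] & 0x3F) << 6) | (s[2] & 0x3F);
            return 3;
        }
        if ((s[0] & 0xF8) == 0xF0) {                            /* 4 bytes */
            *cp = ((uint32_t)(s[0] & 0x07) << 18)
                | ((uint32_t)(s[1] & 0x3F) << 12)
                | ((uint32_t)(s[2] & 0x3F) << 6) | (s[3] & 0x3F);
            return 4;
        }
        return 0; /* invalid lead byte */
    }

    int main(void)
    {
        const unsigned char text[] = "\xC4\x8D";   /* 'c caron', U+010D */
        uint32_t cp;
        int n = utf8_decode(text, &cp);
        printf("consumed %d bytes, code point U+%04" PRIX32 "\n", n, cp);
        return 0;
    }

Once you have the code point, mapping it to a glyph in a kerned raster font is just a table lookup.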

Quote Originally Posted by deathshadow
Well, that and I have some really weird ideas on how an interpreter should work - and want to test said ideas without getting led off-track.
Testing weird ideas isn't bad at all. In fact, every invention started out as a weird idea.