Quote Originally Posted by SilverWarior
I meant how often do you use numbers greater than a 32-bit Integer in your games?
Since I'm planning on it being OpenGL aware, 32-bit floating point is likely the bare minimum once rotations, gravity, momentum and drag enter the equation, to say nothing of resolution-independent rendering. I considered single precision, since that's what GLfloat is guaranteed to be, but how quickly it runs out of precision on non-integer values at larger magnitudes worried me, and, well... it plays into your second question:

Quote Originally Posted by SilverWarior
Also, won't using 64-bit integers considerably slow down all mathematical calculations when programs are running on 32-bit operating systems?
The speed concern did worry me, but given that interpreted languages were fast enough by the 386 era to make simple sprite-based games, I'm really not all that worried about the speed of floats when the minimum target is a 700 MHz ARM8 or a multi-GHz machine with SSE available.

... and remember, even an 8087 math coprocessor could handle 80-bit "extended" at a decent speed at clocks below 8 MHz... I should know, I have one in my Tandy 1000SX next to the NEC V20 running at 7.16 MHz.

Then again, that could just be because I'm used to targeting 4.77 MHz on a 16-bit processor with an 8-bit data path and no hardware sprites; I may be overestimating the capabilities of a 32-bit processor at almost 150 times that speed with blitting offloaded to the GPU.

I don't want typecasting to get in the way of actually using the language, and if that means a bit of slowdown, so be it. That sort of thing pissed me off back on Apple Integer BASIC, where you didn't even have fractions... With fourth graders as the target audience, the language is complex enough without confusing them over the difference between ordinal and real types, much less a dozen integer widths or a half-dozen real types. By the fourth grade they should be learning long division, so explaining why 5/2=2 is something to avoid until they put on the big-boy pants and move to something like C, Pascal or Python.
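Just to spell out the surprise I mean (a quick illustration in plain Free Pascal; nothing here is from my interpreter, it's just standard Pascal behaviour):

[code]
program divsurprise;
{$mode objfpc}
begin
  { With separate ordinal and real types, the "same" expression gives
    two different answers depending on which operator you reach for. }
  writeln(5 div 2);        // 2    -- integer division, remainder thrown away
  writeln(5 / 2 : 0 : 1);  // 2.5  -- real division keeps the fraction
end.
[/code]

With a single numeric type in the teaching language, that whole conversation never has to happen.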

BASIC on the CoCo only had 'numbers' -- floating point with a 32-bit mantissa and an 8-bit exponent, on an 8-bit processor (admittedly the semi-16-bit monster that is the 6809) -- and it handled things just fine. If the 'slowdown' of 64-bit numbers is an issue when you have a math coprocessor DESIGNED to handle numbers that size, you're probably doing something wrong.

Or am I really off-base with that type of thinking?

I'm still thinking I might switch to 80-bit extended, though I'm going to test 32-bit single precision too, which is why I've defined my own "tReal" type so I can change it program-wide as needed.
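Something along these lines is what I mean by that -- a minimal sketch in Free Pascal syntax (the tReal name is the one I'm actually using; the conditional define names are just placeholders):

[code]
unit realtype;

{$mode objfpc}

interface

type
  { Change this one alias to retarget every calculation in the interpreter.
    Enable at most one of the defines below when compiling. }
  {$IFDEF USE_SINGLE}
  tReal = Single;     // 32-bit IEEE single -- matches GLfloat
  {$ELSE}
    {$IFDEF USE_EXTENDED}
  tReal = Extended;   // 80-bit x87 extended (FPC maps this to Double off x86)
    {$ELSE}
  tReal = Double;     // 64-bit IEEE double -- the current default
    {$ENDIF}
  {$ENDIF}

implementation

end.
[/code]

Worth remembering while testing the Single build: a 32-bit single only carries a 24-bit significand, so above roughly 16.7 million it can't even represent every whole number, never mind the fractions between them -- which is exactly the worry I mentioned above.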

Quote Originally Posted by SilverWarior
Also, which type of string would you use (ANSI, UTF-8, Unicode)? Having only ANSI support will make your programming language less interesting for any programmer coming from Eastern Europe, Asia, or any other country which uses additional characters that are not present in the ANSI charset.
I'm arguing with myself over that, because to be honest I hate complex character sets; they're a needlessly complex mess that, until relatively recently, wasn't even a factor on computers. I often feel that if we had just restricted ourselves to 7-bit ASCII we wouldn't have a lot of the headaches that crop up on websites... and since my bread and butter for the past decade has been websites, it's only made me HATE languages, or even normal text, that require anything more complex than that.

Honestly, I'm tempted to restrict it to the character set of an Apple IIe.

BUT -- you are right, such an ethnocentric view could limit the potential audience, something I'd like to avoid. At the same time, I'd be far, far more worried about the overhead of pushing UTF-8 strings through a raster-based font system like my GLKernedFont method, which is what it's going to use since, being cross-platform, I can't rely on any specific font engine being present -- and FreeType's output looks terrible and its kerning is a mess.

Also, converting even one complete Unicode font to raster glyphs and building a kerning table for it doesn't rank all that high on my to-do list; maybe if I get my auto-generator for the kerning table finished... THOUGH...

I could at least start adding the code hooks for extended codepage support; that way, people who want it could add that support themselves once I go public with the first release and codebase. That might be the best approach, since I'm not planning on this being a one-man operation forever... just until I hit beta.
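For what it's worth, the hook could be as simple as a replaceable codepoint-to-glyph mapper -- a rough sketch only, not actual code from the project, and every name in it is made up for illustration:

[code]
unit glyphhooks;

{$mode objfpc}

interface

type
  tGlyphIndex = Word;

  { Maps a Unicode codepoint to a glyph index in the current raster font. }
  tCodepointMapper = function(codepoint: Cardinal): tGlyphIndex;

var
  { Text drawing calls through this pointer; swapping it in user code is
    all it takes to wire in an extended codepage or a new glyph atlas. }
  mapCodepoint: tCodepointMapper;

function defaultMapper(codepoint: Cardinal): tGlyphIndex;

implementation

function defaultMapper(codepoint: Cardinal): tGlyphIndex;
begin
  if (codepoint >= 32) and (codepoint < 127) then
    Result := tGlyphIndex(codepoint - 32)  // printable ASCII, glyph 0 = space
  else
    Result := 0;                           // fallback "missing" glyph
end;

initialization
  mapCodepoint := @defaultMapper;

end.
[/code]

Someone wanting, say, Latin-2 support would just supply their own mapper and glyph sheet and assign it at startup; the core stays ASCII-only until then.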

Part of this is about proving to myself that I can still do this sort of thing. For the past six years I've been retired due to failing health, slowly weaning my clients off support... I'm starting to feel like the mind is slipping, and this project isn't just about helping others; it's also about proving something to myself.

Well, that, and I have some really weird ideas on how an interpreter should work - and I want to test those ideas without getting led off track.