
View Full Version : Gumberoo - Making a new learning language interpreter using FPC



deathshadow
15-04-2012, 07:06 PM
Posting this across several forums... This is a bit lengthy a post, but it's because I'm going to outline my entire reasoning for starting this project.

For some time I've been of the opinion that education on the whole is failing to... well, educate. Between teacher salaries more than double the average citizen's bankrupting our schools, poor standardized test performance, social promotion, and kids graduating high school with what in my day would have been considered a 4th grade reading level, it is painfully apparent to me that the entire "institutionalized" educational system is failing to deliver. There's a reason we're seeing budget cuts, program cuts, and even mass layoffs or outright firings. Just ask Providence, Rhode Island about that one.

There are gaps in what's being taught, too -- apparently physics is a single class at the local junior high, one that isn't even offered to standards-level students?!? That's frightening to me.

My ex-fiancée was/is a math teacher, now studying for her masters after a year of subbing, and everything she's told me reinforces this -- the kids coming into her high school class know a fraction of what we were expected to know 30 years ago at the 4th grade level... never mind concepts like "social promotion" and the rapid decline in standardized test results leading to the inexorable conclusion that modern school is little more than glorified daycare!

NOT that my time in the system roughly two or three decades ago delivered anything I use today past the 6th grade. My schools didn't even have computer classes; though certainly some of them had computers that were locked away for lack of qualified staff, or at best used as typewriter replacements for the ONE typing course. (great when there were no printers)

I think back to when I was learning -- what did I REALLY learn to program on? I discount the Cosmac Elf because it was a limited toy more about learning the hardware side of things; really it was the TRS-80 Model 1 and Color Computer, the Atari 800, and the Commodore VIC-20 where I started actually being able to write software; software that left my math teacher's jaw hanging open in wonder, all before I even got to junior high.

I look at the state of computing today, and... where are the equivalents to ROM BASIC that a ten year old kid could sit down with something simple like say... "Getting Started with Extended Color Basic" and by the end of the book be able to write something simple like pong all on their own? Something they can sit down, turn the machine on and just start using?

On the hardware side we have new devices like the Raspberry Pi -- cheap, simple... but on the software side they have it booting into Debian or Fedora and expect the people 'learning' to alternate between gEdit and Bash? That "for education" leaves me thinking they're aiming for college age -- in which case they're almost pricing themselves UNDER the market.

Near as I can tell (and I'm three weeks into this) there is NO suitable language / development environment I would be comfortable handing a ten year old, the way I would a C=64, to go with the Pi's "for education" concept. Sure, there's talk of doing things like making it boot to a BBC emulator or porting BBC BASIC to it; but that's ridiculously crippled if authentic, a bit complex thanks to choices made for hardware restrictions that no longer exist -- and if you were to modernize it, why not at that point just make a new programming language?!?

If nothing else, seriously, what is a child from today going to care about BBC Basic in terms of what they can do with it?

More so, since the goal is to make kids interested in learning: what made me, and the rest of us who got into programming back then, interested in the first place?

Games? I know I wrote my share, typed my share of them out of magazines, and contributed my own to the various rags of the era.

Music? All of you who spent hours on your C=64 making SID files or later on the Amiga MOD scene...

Graphics? I remember being all impressed when I made my Coco draw Garfield on the screen using the DRAW statement...

Combine graphics and music, and this is what the demoscene was all about! Democoders were some of the most hardcore programmers ever.

Also, when it comes to the Pi, the low price would make it a great entry-level machine for people who can't afford a modern PC or even a netbook/tablet... and if we're talking the poor, why leave our dumpster-diving friends with TVs out of the equation? The Pi has composite out, yet all the current software is pretty much MEANT for HDMI resolutions -- why not an 'old school' style 40x25 UI, since that's about the upper limit you can push over composite anyways? (with 320x240 graphics underneath)

... and let's not leave out people who already have hardware; cross platform is easy these days what with SDL and OpenGL.

So... I'm doing it. I'm making a new interpreted language with an immediate mode, built-in editing tools, and 40x25 text with 320x240 graphics as its primary targets (with support for higher resolutions).

It will be called... Gumberoo; with its mascot Menlo, the friendly Gumberoo. (as opposed to his evil cousin Anur, the baby-eating Gumberoo)

For those who don't know, the Gumberoo is a North American legend of a hairless bear with a thick, black, bullet-resistant hide and a long, almost alligator-like mouth full of teeth.

Gumberoo was chosen because few people use the name, it's not trademarked to anything, the domain was available, and it sounds cute and friendly, even if the actual mythological creature was generally considered a monster. Taking a monster and making it friendly is nothing new -- Just look at Nessie or Ferdinand the Bull.

I'm writing this entire project in Free Pascal, and I may switch to Object Pascal syntax just to make it easier when it comes time to deal with Xcode for iOS/OSX. I chose Pascal not just because it's the language I'm most comfortable with, but for code clarity, and because FPC can now target a wide variety of platforms and ships with working SDL and OpenGL-on-SDL units... That it can now target ARM, x86, AMD64/EM64T, PowerPC and even SPARC is really the icing on the cake. Lands sake, it might be possible to build this for the Wii or DS... or at least make the interpreter work there.

Besides, I'd sooner put a bullet in my head than deal with GCC. I'd sooner hand compile 8k of Z80 machine language than deal with 50 lines of C code.

So far I'm working up several lists, broken into subsections; "Development" means targets I can likely achieve unassisted, and when complete will mean it's reached Beta status. "Final" is what will be required for it to be considered 1.0.

--------------------
Target Audience
--------------------
4th graders and up. I want this to be interesting enough for making games to hook kids, yet advanced enough to still entertain adults.

--------------------
Hardware/OS
--------------------

OS Platforms
Development: Linux, Windows
Final: OSX, iOS, Android

CPU's
Development: x86, AMD64/EM64T
Final: ARM

Minimum of a 700MHz ARM11. (what's in the Pi)

Memory
Should be able to run full-featured in less than 32 megs of free space. (basically half of what's left over after a 'massive' Fedora install on the Pi)

--------------------
Language
--------------------

variable types -- there will only be three types of variables: "number", "string" and "array". Number will be either 64 bit double precision or 80 bit extended. With modern hardware there's no reason to confuse people new to programming with the difference between byte, word, dword, integer, longint, etc... If you add a number to a string it will auto-convert to a string; if you try to add a string to a number it will throw an error during tokenization.
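To make the asymmetric rule concrete, here's a sketch in Python (purely illustrative -- the real implementation is FPC, and the helper names here are mine, not Gumberoo's):

```python
def format_number(n):
    # print whole numbers without a trailing ".0", the way line-numbered BASICs did
    return str(int(n)) if n == int(n) else str(n)

def add(left, right):
    """Sketch of the proposed '+' coercion: number + string auto-converts the
    number; string + number is rejected (in Gumberoo, at tokenize time)."""
    if isinstance(left, float) and isinstance(right, str):
        return format_number(left) + right        # auto-convert number to string
    if isinstance(left, str) and isinstance(right, float):
        raise SyntaxError("cannot add a string to a number")
    return left + right                           # number+number or string+string
```

So `add(5.0, " lives")` yields `"5 lives"`, while `add("score ", 5.0)` is an error -- the ordering of the operands decides whether coercion is allowed.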

tokenizing editor -- per-line editing much akin to the old line-numbered BASICs, but it will also let you scroll through the listing rather than using immediate mode commands like LIST. When you hit 'enter' to finish editing a line, the tokenizer will compile the line to bytecode.

break/continue -- one thing interpreted BASIC used to provide was actually a very robust debugging tool: the ability to break and continue code. You could ^C, or use a command like 'break' to halt execution, change or inspect the values of variables, type "continue" and it would... continue with your changed values. You don't get that from compiled languages...

Immediate mode -- there are lots of new names for doing this, but the result is the same... you type in a line of code and it runs immediately, as opposed to 'deferred' where it runs later. Since I'm going to have the tokenizer compile by line, performing immediate mode operations should be simple.

No Scope -- Scope is a very complex concept that I've repeatedly over the years seen confuse people new to programming. Let's get logic flow into their heads BEFORE we confuse them with scope.

No userland objects -- objects are hard enough in a real language, much less in one without scope. It's a complex concept better left for better languages. Besides, user-created objects in an interpreted language are almost always a bad idea, and don't do objects proper justice. (see PHP)

System Objects -- however, introducing the notion of system objects having properties and methods is simple, and can be considered a baby-step into the world of objects.

Basically, provide objects, just don't confuse them by letting them try to make their own.

No line numbers -- I might be patterning this on ROM BASIC, but let's at least TRY to make this a real language.

Labels -- since we don't have line numbers, we'll need something to point at. Rather than getting things all fancy with functions I'd like to keep it simple as line numbered basic was, so we'll have labels. Probably start them with a colon like in some assemblers.

No user function calls -- we'll have subroutines, but to keep the tokenizer (and expression evaluator) clean and simple, we'll not introduce that concept.

No return values -- since we lack scope and user made functions, there'll be no mechanism for RETURN to actually pass values.

Pointers -- uhm, no...

No Peek/Poke -- multi-platform from the start, there's no reason (and in some cases no way) to let people blindly access memory.

Useful state-based input handlers -- INKEY$ sucked, readchar sucked, INPUT really blows chunks if you're writing something realtime like a game... so I'm thinking an 'ifKeyDown' language construct to read whether a key is pressed, and an 'inputMap' construct letting you quickly map out what to do or call when a key is down during a logic loop. This is one of the things we used to have to peek into memory for in the first place.
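The state-based idea above can be sketched in a few lines (Python, purely for illustration -- the names are mine, not Gumberoo's): the runtime tracks which keys are currently held, 'if_key_down' tests that state directly, and an input map dispatches a handler for each held key once per logic-loop pass.

```python
keys_down = set()                      # updated by key-press/key-release events

def if_key_down(key):
    # state query, not a blocking read -- usable inside a realtime loop
    return key in keys_down

def run_input_map(input_map):
    # one pass: fire the handler for every mapped key currently held
    for keys, handler in input_map:
        if any(k in keys_down for k in keys):
            handler()

# usage: the left/right bindings from the sample program further down
momentum = {"x": 0}

def push_left():  momentum["x"] -= 1
def push_right(): momentum["x"] += 1

input_map = [(("a", "arrowLeft"),  push_left),
             (("d", "arrowRight"), push_right)]

keys_down.add("a")                     # player is holding 'a'
run_input_map(input_map)               # one pass of the logic loop
```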

Sprite and animation engine -- the ability to assign a variable as a sprite, load textures from a PNG file, probably incorporate a simple sprite editor into it. This is something we'd have killed for back in the day, as even getImage/putImage and its various equivalents across platforms were slow, painful and not that useful... I'm thinking of having these rendered using OpenGL and/or OpenGL ES so rotation, scaling and depth-sorting can be easily done. It should also incorporate collision detection.

"world mapping" -- the ability to place tiles into a large 'world map' that can be scrolled around in the background.

Backbuffered video -- a requirement for game animations that don't suck. Since I'm thinking OpenGL this is easy to do on the back end.

Rudimentary automatic physics -- mono-directional gravity (Mario), radial gravity (SpaceWar), drag, possibly having automatic reactions for collisions like bounce rates.
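Roughly what that per-frame integration could look like, sketched in Python (my own assumptions about the update order -- none of this is the actual engine code):

```python
import math

def step(pos, vel, gravity=(0.0, 0.1), attractor=None, pull=0.0, drag=0.99):
    """One physics frame: mono-directional gravity (Mario), optional radial
    gravity toward 'attractor' (SpaceWar), then drag, then move."""
    vx, vy = vel[0] + gravity[0], vel[1] + gravity[1]   # constant-direction pull
    if attractor is not None:                           # radial pull toward a point
        dx, dy = attractor[0] - pos[0], attractor[1] - pos[1]
        dist = math.hypot(dx, dy) or 1.0                # avoid divide-by-zero
        vx += pull * dx / dist
        vy += pull * dy / dist
    vx *= drag                                          # drag bleeds off momentum
    vy *= drag
    return (pos[0] + vx, pos[1] + vy), (vx, vy)
```

Bounce-on-collision would then just be a sign flip on the relevant velocity component, scaled by a bounce rate.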

Audio playback -- likely WAV, Ogg or MP3.

Simple multivoice composer -- piano-roll type tool for making your own music and being able to call it from inside your program.

Programmable synthesizer for making custom sound effects... I'm thinking multi-operator, and introducing the concepts of ADSR, but in a simple way kids can grasp; kind of like what was built into a Casio VL-Tone, except having it sound good. A soft-synth would allow games to sound the same regardless of platform without relying on existing (or not existing) MIDI capabilities. If I could figure out the VL-Tone at age 8, I figure a 10 to 12 year old can probably handle it today.
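For anyone unfamiliar, the ADSR idea boils down to one amplitude function; here's a linear-segment sketch in Python (illustrative only -- the actual synth will be FPC, and the linear ramps are my simplification):

```python
def adsr(t, attack, decay, sustain, release, note_off):
    """Amplitude (0..1) of a linear ADSR envelope at time t (seconds).
    'sustain' is a level; attack/decay/release are durations; note_off is
    when the key is released."""
    if t < attack:                          # A: ramp 0 -> 1
        return t / attack
    if t < attack + decay:                  # D: ramp 1 -> sustain level
        return 1.0 - (1.0 - sustain) * (t - attack) / decay
    if t < note_off:                        # S: hold the sustain level
        return sustain
    rel = t - note_off                      # R: ramp sustain -> 0
    return max(0.0, sustain * (1.0 - rel / release))
```

Multiply that envelope against an oscillator, give each operator its own envelope, and you have the core of a multi-operator soft-synth.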

I'm still playing with the syntax, but so far I'm aiming for a simple program to look something like this:



bufferVideo
makeSprite player 32x32
player.loadTiles "playerSprites.png"
loadSound "thrust.wav",thrustSound

:menu
player.hide
clear
stopSounds
at 0,16
writeCentered "Simple Game Demo"
write
writeCentered "Press <SPACE> to Start"
writeCentered "or <Q> to quit"
inputMode buffered
flushKeyBuffer
renderFrame

:menuKeyLoop
inputMap
  = ' '
    jump gameStart
  = 'q','Q'
    jump exitGame
endInputMap
jump menuKeyLoop

:gameStart
clear
player.moveTo 0,0
player.setMomentum 0,0
player.setAngularGravity 180,10
player.setDrag 1
player.show

:mainLoop
renderFrame
inputMap
  = "q"
    jump menu
  = "a",arrowLeft,numberPad4,digitalLeft
    player.addMomentumX -1
  = "d",arrowRight,numberPad6,digitalRight
    player.addMomentumX 1
  = "w",arrowUp,numberPad8,digitalUp
    player.addMomentumY -1
    player.setAnimationRow 2
    call thrust
  = "s",arrowDown,numberPad2,digitalDown
    player.addMomentumY 1
    player.setAnimationRow 2
    call thrust
endInputMap
jump mainLoop

:thrust
if thrustSound.stopped
  thrustSound.play
else
  thrustSound.sustain
endIf
return

:exitGame
stopSounds
clear
writeCentered "Thanks for Playing"


Very rough example of what I have in mind for language syntax and grammar.

... to be continued

deathshadow
15-04-2012, 07:07 PM
... continued

In terms of status, right now I'm still struggling with getting my expression evaluation engine in place; I figured I'd tackle one of the toughest parts first. The tokenizer for expressions is working stellar, but I'm having trouble with function calls...

3+(-2) returns 1, correct
3+abs(-2) returns 5, correct
(-2)+3 returns 1, correct
abs(-2)+3 returns -5... WRONG.

It's literally ignoring the closing parenthesis... I'm probably overlooking the obvious. Even with this minor bug it handles complex formulas:

5*(a-3)+2^8+6*b

flawlessly. Function calls also run quickly because the tokenizer is compiling to state-based lookups, meaning when the interpreter gets to the token for a function, the next byte is an array index into the function list -- so I can call it thus:

functionList[currentData^]^(expression);

Where currentData^ is the current byte stored in the code, and expression is the function to evaluate what's inside the function call.
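The dispatch itself is just an indexed call; roughly this, sketched in Python (the table contents here are made up for illustration -- the real table lives in the FPC code):

```python
import math

# the tokenizer emits a function token followed by one byte that indexes
# into a function table, so dispatch is an array lookup, not a name search
function_list = [abs, math.sin, math.cos, math.sqrt]

def call_function(bytecode, pos, evaluate_argument):
    index = bytecode[pos]                         # the byte after the function token
    return function_list[index](evaluate_argument())

# usage: byte 0 selects abs; the argument expression evaluates to -2
result = call_function(bytes([0]), 0, lambda: -2)
```

The `evaluate_argument` callback stands in for the `expression` routine the post describes, which recursively evaluates whatever sits inside the call's parentheses.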

The expression engine is a simple three stage expression/factor/value system... you call expression, it calls factor, which first calls value to, well... get a value to work with. Value checks to see if we're pulling an immediate, a function, or a variable, and returns it... factor then checks for a 'second stage' operator like multiply/divide/exponent; if one is present it pulls another value and applies it, if not it returns to expression, which checks for first stage operations (addition/subtraction) and does those, again calling factor for their values.

It's how I did it all those ages ago when I wrote a Clipper interpreter in DIBOL (I should track down the code for that)... a tried and true method. I think it's even how Prof. Wirth did it in his p-code interpreter. (as opposed to C with its 17-stage nightmare)

Means a wee bit of nesting, but nothing FPC can't handle.
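For the curious, the expression/factor/value scheme can be sketched like so (Python, purely illustrative -- the real engine tokenizes to bytecode first and is written in FPC; the text tokenizer and function table here are my own stand-ins). Note how a function call consumes its closing parenthesis inside `value`, which is exactly the spot where the `abs(-2)+3` bug described above can hide:

```python
import math, re

FUNCTIONS = {"abs": abs, "sin": math.sin}   # illustrative subset

def evaluate(src):
    tokens = re.findall(r"\d+\.?\d*|[A-Za-z]+|[-+*/^()]", src)
    pos = [0]

    def peek():
        return tokens[pos[0]] if pos[0] < len(tokens) else None

    def take():
        tok = peek(); pos[0] += 1; return tok

    def value():
        tok = take()
        if tok == "-":                       # unary minus
            return -value()
        if tok == "(":                       # parenthesised sub-expression
            result = expression()
            assert take() == ")", "Closing Bracket Missing"
            return result
        if tok in FUNCTIONS:                 # function call: ( expr ) consumed here
            assert take() == "(", "Opening Bracket Missing"
            arg = expression()
            assert take() == ")", "Closing Bracket Missing"
            return FUNCTIONS[tok](arg)
        return float(tok)                    # numeric immediate

    def factor():                            # second stage: * / ^
        result = value()
        while peek() in ("*", "/", "^"):
            op = take()
            rhs = value()
            result = result * rhs if op == "*" else result / rhs if op == "/" else result ** rhs
        return result

    def expression():                        # first stage: + -
        result = factor()
        while peek() in ("+", "-"):
            op = take()
            rhs = factor()
            result = result + rhs if op == "+" else result - rhs
        return result

    return expression()
```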

So the code is coming slowly. I'm working up a website for the project and hope to get that live soon... but I've not really set a time limit on getting this into the beta stage. I may release some binary demos before it goes beta, but for now I'm keeping the code private as, well... during development I don't work well with others. Once I have it to the point where text mode and input handling are complete, I'll be opening up the code to the public.

I have had suggestions to make a Kickstarter page for it. I'd like to get something to show for it first -- an actual working tokenizer (read: "caching bytecode compiler" in modern-speak) and interpreter.

Was also thinking on making the IDE web-aware, so kids could easily share their games online with others.

So... good idea, crazy idea, who cares? Any ideas/suggestions welcome... well -- apart from "oh just write a book about python" or "why don't you make a UML implementation", in which case bugger off! Dealt with that already in two different places, and to me that's a sign that said individuals don't get the concept.

Sorry for the long post... Wait, no I'm not; The TLDR crowd can kiss right off too! :D

WILL
15-04-2012, 07:42 PM
An interesting proposal. I think that anything that helps kids learn to program is great.

There was a programming compiler/interpreter called Turing (http://en.wikipedia.org/wiki/Turing_(programming_language)), named after Alan Turing, created at the University of Toronto in Canada. It was used by many of the high schools in Toronto, including mine, to teach young people how to program. It took the medium-to-low level coding of the day and brought it up to a high level of code writing. It also took away much of the low level work you needed to do to add sound and graphics.

I think that's important! Having an all-in-one solution means new programmers don't have to learn "rocket science" before they learn how to simply "put a nail into a wooden board" at the start of everything they have to learn. That sort of barrier kills the path to learning this great art form called programming.

deathshadow
15-04-2012, 07:42 PM
Update...

Expression engine complete. It was a stupid mistake where I was treating the function as an expression start while still tokenizing the opening (

So the problem WAS in the tokenizer after all. Finally, I can move on to actual line processing/execution, and variables.

Though I also just belted out a couple test constants. PI and Deg2Rad.

Deg2Rad = PI/180. You multiply degrees by that to turn them into radians... though since I'm NOT going to make SIN/COS/ARCTAN use radians, I guess that one's kinda pointless.

Still, it's really cool to see it finally evaluate:



Enter expression or blank line to quit >64*sin(45*pi/180)+160
result: 205.25483400


From here it should be smooth sailing; I actually consider that the hardest part of writing an interpreter. It even has error handling:


Enter expression or blank line to quit >160+64*sin(45*pi/180
processing: 160+64*sin(45*pi/180
Error in Expression: Closing Bracket Missing
160+64*sin(45*pi/180
^


That's caught at the tokenizer, so the actual interpreter would never even get that code to try and run it. In the final version I plan for the EDITOR to not allow you to add such a line without fixing the error first.

Cybermonkey
15-04-2012, 08:50 PM
That's a rather interesting project. But speaking frankly: labels and jumps are considered bad programming, right? Should people nowadays start programming in the "old" style (GOTO)?
I also think that functions are elementary to a teaching language. Even Pascal was invented to teach programming; see the Wikipedia article on Pascal:
Initially, Pascal was largely, but not exclusively, intended to teach students structured programming.
What about Lua? ;)
Just my two cents.

paul_nicholls
15-04-2012, 09:13 PM
Sounds like a very interesting project! Looking forward to more :)

deathshadow
15-04-2012, 09:46 PM
That's a rather interesting project. But speaking frankly: labels and jumps are considered bad programming, right?
Well, keep in mind I consider Assembly/Machine language to be simpler than C... and in ASM you only have jump or call. The difference between CALL and a function is... uhm... yeah. At most the difference being passing a result in a register or on the stack. Just be glad I rejected the idea I had of making conditionals work like machine language.

While I've loved Pascal from the day I first learned it on a DEC Rainbow, a lot of its concepts and methods of working -- like those of a number of more modern languages -- I feel require more time investment to do anything useful. To that end there's Python, but that has its own host of issues and isn't entirely suited to what I want the project's focus to be.

A lot of this comes from my recent putterings on the Apple II and VIC 20... Pascal had a number of conceptual hurdles I never had to deal with in line numbered BASIC... hurdles I didn't really make it past until I was a teenager. I'm trying to find a middle-ground between the two, without treading into the noodle-doodle land that Python does with classes.

Again, I'm hoping to make it simple enough that, with a decent manual, a ten year old/4th grader could use it without adult intervention; which is why I reject Python or even Pascal. They make assumptions of knowledge and require a certain amount of programming theory that, honestly, I glazed over at that age and gave up on before grasping... Hell, I know adults today who glaze over at endless pages of theory going "where's the beef?"

It's a big bun. A big fluffy bun. It's a very big bun...
WHERE'S THE BEEF?

In a lot of ways, I want to make what LOGO wanted to be but never was.

paul_nicholls
15-04-2012, 10:39 PM
...the noodle-doodle land that Python does with classes.

hahaha!! nice description :D

SilverWarior
15-04-2012, 11:19 PM
There is something that I can't seem to comprehend. Why would you limit numbers to 64 bit? Isn't this a waste of memory? I mean, how often do you use numbers greater than a 32 bit integer in your games? Also, won't using 64 bit integers considerably slow down all mathematical calculations when programs are running on 32 bit operating systems?
Also, which type of string would you use (ANSI, UTF-8, Unicode)? Having only ANSI support will make your programming language less interesting for any programmer coming from Eastern Europe, Asia, or any other country which uses additional characters which are not present in the ANSI charset.

deathshadow
16-04-2012, 01:24 AM
I mean, how often do you use numbers greater than a 32 bit integer in your games?
Since I'm planning on it being OpenGL aware, 32 bit floating point is likely the best minimum once rotations, gravity, momentum and drag are in the equation, much less the notion of resolution independent rendering. I considered single precision, since that's what glFloat is guaranteed to be, but the max value limitation when dealing with non-integer values worried me and, well... it plays into your second question:


Also, won't using 64 bit integers considerably slow down all mathematical calculations when programs are running on 32 bit operating systems?
The speed concern did worry me, but given that interpreted languages were fast enough by the 386 era to make simple sprite-based games, I'm really not all that worried about the speed of floats when the minimum target is a 700MHz ARM11 or a multi-GHz machine with SSE operations available.

... and remember, even an 8087 math-co could handle 80 bit "extended" at a decent speed at clocks below 8MHz... I should know, I have one in my Tandy 1000SX next to the NEC V20 running at 7.16MHz.

Though that could just be that I got used to targeting 4.77MHz on a 16 bit processor with an 8 bit data path that doesn't even have hardware sprites (http://www.deathshadow.com/pakuPaku); I may be overestimating the capabilities of a 32 bit processor at almost 150 times that speed with blitting offloaded to the GPU.

I don't want typecasting to get in the way of actually using the language, and if that means a bit of slowdown, so be it. It's something that pissed me off back on things like Apple Integer BASIC, where you didn't even have fractions... With 4th graders as the target, it's complex enough without confusing them with the difference between ordinal and real... much less the dozen or so integer widths, or half dozen real types. By the fourth grade they should be learning long division, so explaining why they get 5/2=2 is something to avoid until they put on the big boy pants and move to something like C, Pascal or Python.

BASIC on the CoCo only had 'numbers' -- 40 bit floats with a 32 bit mantissa and 8 bit exponent on an 8 bit processor (admittedly the semi-16-bit monster that was the 6809) -- and handled things just fine. If the 'slowdown' of 64 bit numbers when you have a math coprocessor DESIGNED to handle numbers that size is an issue, you're probably doing something wrong.

Or am I really off-base with that type of thinking?

I'm still thinking I might switch to 80 bit extended, though I'm going to test 32 bit single precision too; which is why I've defined my own "tReal" type so I can change it program-wide as needed.


Also, which type of string would you use (ANSI, UTF-8, Unicode)? Having only ANSI support will make your programming language less interesting for any programmer coming from Eastern Europe, Asia, or any other country which uses additional characters which are not present in the ANSI charset.
I'm arguing with myself over that, because to be honest, I hate complex character sets; they are a needlessly complex mess that until recently (relatively speaking) wasn't even an issue on computers. I often feel that if we just restricted ourselves to 7 bit ASCII we wouldn't have a lot of the headaches that crop up on websites... and since my bread and butter the past decade has been websites, it's only further made me HATE languages, or even normal text, that require anything more complex than that.

Honestly, I'm tempted to restrict it to the character set used by an Apple IIe.

BUT -- you are right, such ethnocentric views could limit the potential audience; something I'd like to avoid. At the same time, I'd be far, far more worried about the overhead of string-processing UTF-8 into a raster-based font system like my GLKernedFont method; which is what it's going to run, since being cross platform I can't rely on any specific engine being present, and freetype looks like ass and kerns text horribly.

Also, converting even one complete UTF-8 font set to raster and making a kerning table for it doesn't rank all that high on my to-do list either; maybe if I got my auto-generator for the kerning table completed... THOUGH...

I could at least start adding the code hooks for extended codepage support; that way people who want it elsewhere could add that support themselves once I go public with the first release and codebase. That might be the best approach, since I'm not planning on this being a one-man operation forever... just until I hit Beta.

Partly it's to prove to myself I can still do this sort of thing. The past six years I've been retired due to failing health, slowly weaning my clients off support... I'm starting to feel like the mind is slipping, and this project isn't just about helping others, but also about proving something to myself.

Well, that and I have some really weird ideas on how an interpreter should work - and want to test said ideas without getting led off-track.

SilverWarior
16-04-2012, 07:00 AM
Since I'm planning on it being openGL aware, 32 bit floating point is likely the best minimum once rotations, gravity, momentum and drag are in the equation, much less the notion of resolution independent rendering. I considered single precision since that's what glFloat is guaranteed to be

OK, I understand that when you are dealing with graphics and physics, having 32 bit or 64 bit numbers could come in handy. But what about other cases? For instance, if you are making some shooter game you will define your units' health with an integer, I presume. So what will be the max health for your units? Several million hitpoints? I don't think so. You will probably define your units' max health in the range of a few hundred hitpoints. So why do you need a 64 bit integer for this again?
I know that having several different integer types can be quite confusing for beginners. In fact the concept of the integer itself could be confusing.
What I had in mind was support for 8 bit, 16 bit, 32 bit, etc. numbers, but with the programming language itself taking care of which of these is being used (self-optimization).


If the 'slowdown' of 64 bit numbers when you have a math coprocessor DESIGNED to handle numbers that size is an issue, you're probably doing something wrong.

I was referring to using 64 bit integers on 32 bit processors (many platforms today still use 32 bit processing). Of course you can still do 64 bit math on a 32 bit processor, but for that you need to do two math calculations instead of one (first you calculate the low 32 bits, then you calculate the next 32 bits, and finally you join the results from these two operations into the final 64 bit result). So making 64 bit calculations on a 32 bit processor takes at least twice as long.
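The two-step arithmetic being described can be sketched like so (Python used purely for illustration; a real 32 bit CPU does this with an add / add-with-carry instruction pair, e.g. ADDS/ADC on ARM):

```python
MASK32 = 0xFFFFFFFF

def add64_on_32bit(a, b):
    """A 64 bit add performed as two 32 bit adds plus carry propagation,
    the way a 32 bit CPU without 64 bit registers has to do it."""
    a_lo, a_hi = a & MASK32, (a >> 32) & MASK32
    b_lo, b_hi = b & MASK32, (b >> 32) & MASK32
    lo = a_lo + b_lo                      # first 32 bit add
    carry = lo >> 32                      # did the low half overflow?
    hi = (a_hi + b_hi + carry) & MASK32   # second 32 bit add, carry in
    return (hi << 32) | (lo & MASK32)
```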


I'm arguing with myself over that because to be honest, I hate complex character sets; they are a needlessly complex mess that until recently (relatively speaking) weren't even involved on computers. I often feel that if we just restricted ourselves to 7 bit ASCII we wouldn't have a lot of the headaches that crop up on websites... and since my bread and butter the past decade has been websites; it's only further made me HATE languages or even normal text that requires anything more complex than that.

I do understand your point of view. More complex charsets do cause much more complexity. But how would you feel if you had a development tool or programming language which doesn't allow you to show some specific characters used in your language?


I'd be far, far more worried about the overhead of string processing UTF-8 into a raster based font system like my GLKernedFont method; which will be what it's going to run since being cross platform I can't rely on any specific engine being present, and freetype looks like ass and kerns text like a sweetly retarded crack addict.

I don't see how it would be difficult to render TrueType fonts. I will learn shortly, because I intend to make functions which will allow me to do that in the Aspyre graphics engine. If it works well, I do intend on sharing the source code.


Well, that and I have some really weird ideas on how an interpreter should work - and want to test said ideas without getting led off-track.

Testing weird ideas isn't bad at all. In fact, every invention was at first thought of as a weird idea.

deathshadow
16-04-2012, 09:43 AM
OK, I understand that when you are dealing with graphics and physics, having 32 bit or 64 bit numbers could come in handy. But what about other cases? For instance, if you are making some shooter game you will define your units' health with an integer, I presume. So what will be the max health for your units? Several million hitpoints? I don't think so. You will probably define your units' max health in the range of a few hundred hitpoints. So why do you need a 64 bit integer for this again?
Remember this is an interpreter, not a compiler; handling multiple data types could add as much if not more overhead as well... PHP comes to mind, with its total lack of strict typecasting but at the same time internally having types; all those data conversions on every access/assignment can end up taking just as long as calling the FPU... be it ARM's VFP, legacy x87, or SSE.

Spending interpreter time -- even after tokenizing -- on automatically selecting the optimal type for a value or range of values can end up just as big and slow as operating on a single fixed type. All those branches, calls and conditionals add up quick... certainly faster than the 2 to 8 clock difference between a 32 bit integer operation on the CPU and a 64 bit one on the FPU. (at least on x86... still learning ARM)


I was referring to using 64 bit integers on 32 bit processors (many platforms today still use 32 bit processing). Of course you can still do 64 bit math on a 32 bit processor, but for that you need to do two math calculations instead of one (first you calculate the low 32 bits, then you calculate the next 32 bits, and finally you join the results from these two operations into the final 64 bit result). So making 64 bit calculations on a 32 bit processor takes at least twice as long.
I get that, and in a way it's part of why I'm not bothering even having integer types.

By going straight to the math co-processor: on x87 that's simply an FLD, the operation (FMUL, FADD, FSUB, FDIV), then an FSTP -- not really any more or less code than mov eax,mem; mov ebx,mem; mul ebx; mov mem,eax

At most, an FPU double multiply (for example) on anything x87 Pentium or newer is 12 bus clocks of memory access, 12 bus clocks of code fetch, and 6 CPU clocks of execution (including setting up the FPU memory pointer)... A 32-bit integer multiply on the same chip might be only 6 bus clocks of memory access, but it's 20 bus clocks of code fetch and 4 CPU clocks... so they may look like they take the same amount of time, but remember the bus isn't as fast as the CPU; as such, on modern computers it is often FASTER to do a 64-bit floating-point multiply than a 32-bit integer one -- just because 386 instructions are that extra byte in length, meaning a longer wait on the code fetch.

Of course, if you can optimize the assembly to put everything into proper registers you can shift that back around, but that's more the type of thing for a compiler to do, not an interpreter.

... and while that's the wintel world of doing things, you also have to remember that ARM lacks an integer divide, while the various VFP/SIMD/NEON extensions they optionally include this week do tend to provide one. Of course, there is the issue of not being able to rely on which FPU extensions are even available (if any) on ARM, and whether FPC even bothers trying to emit code for them -- that is a concern I'm going to have to play with in QEMU. I know the Cortex-A8 provides NEON, which uses 64-bit registers despite being hooked to a 32-bit CPU.

After all, that's why SIMD and its kin exist, and why the x87 was a big deal back in the day... since an 8087 was basically a memory-oriented 80-bit FPU sitting next to a 16-bit processor.

It is a good point though that I should 'check it'... I'll probably toss together a synthetic benchmark tomorrow to gauge the speed differences, if any... though constant looping will likely trigger the various caches, so I'll probably have to make a version that puts a few hundred K of NOPs in place to flush the cache between operations.

I'm also used to thinking x86, where the 'integer optimization' advice hasn't really held true since the Pentium dropped... I've really got a lot of studying of ARM to do -- and of the code FPC makes for ARM. I mean, does it even try to use SIMD/VFP if available?


I don't see why it would be difficult to render TrueType fonts. I will learn shortly, because I intend to make functions which will allow me to do that in the Aspyre graphics engine. If it works well, I do intend to share the source code.
Wait until you try using the train wreck known as freetype -- it's rubbish, pure and simple... There's a reason so many SDL and OpenGL programs don't even bother and use raster fonts instead... The rendering is ugly, inconsistent, painfully slow and the code interfaces are the worst type of tripe this side of trying to write a device driver for linux.

I was thinking I could use monospaced and/or mono-kerned fonts instead of true kerning; that would make it simpler/faster, and since it's going to have an editor, it will have a monospaced font anyway. Vector fonts are a high-resolution luxury that I don't think translates well to composite-scale resolutions in the first place; see the old vector fonts from the BGI at CGA resolutions.

I may also keep the editor strictly SDL, leaving OpenGL for when a program is running in the interpreter. Still need to play with the idea. Nowhere near working on that part of it yet, as my first order of business is getting the tokenizer and bytecode interpreter complete to where it can at least run a console program. THEN I'll worry about the IDE, graphics, fonts, etc...

SilverWarior
16-04-2012, 10:17 AM
Remember this is an interpreter, not a compiler

Pardon me. It seems I misunderstood your intention. I got the feeling that you intended to make a completely new programming language from scratch.

BTW, how come you have decided not to include support for pointers? I understand that for a beginner they might seem very complicated, but once you learn to work with them you realize they can be very powerful if used correctly.

deathshadow
16-04-2012, 12:14 PM
Pardon me. It seems I misunderstood your intention. I got the feeling that you intended to make a completely new programming language from scratch.
It is... but it's an interpreted one like PHP, Python, Perl, ROM Basic, Javascript (and, if you don't buy into this virtual machine BS, Java), etc. -- instead of a compiled one. At this point being 'truly original' is a joke (given 90% of programming languages are just C rehashed -- see Java and PHP), but there are some ways to make things simpler.

A lot of the choices come from Python and Ruby, without the parts that to me are either needlessly complex, pointless, or annoying. (Annoying? Elif much?!?)


BTW, how come you have decided not to include support for pointers? I understand that for a beginner they might seem very complicated, but once you learn to work with them you realize they can be very powerful if used correctly.
First, as you said, it is a complicated concept; one I really wasn't able to grasp until junior high, and I was pretty quick on the uptake for this sort of stuff (given I'd written my first business app in DIBOL at 13).

But more importantly, given the other stuff I've omitted, what purpose would they serve? There's no direct memory access, no complex data structures like records or userland objects, and arrays are (hopefully) going to be automatically dynamic (since I'll be using "array of" with SetLength on the back end)... Pointers for the sake of having pointers is... pointless.

Plenty of real world deployment languages get along just fine without them. PHP doesn't have pointers... Java doesn't have them; they have handles which will work much akin to my 'createxxxxx' methods...

They're an advanced concept best left to an intermediate or higher language, not an elementary one.

Though I've been beating myself up over including or not including all sorts of things like pointers for the past week and a half, since I started this project on a cocktail napkin during a "brainstorming luncheon". I keep having to remind myself: "don't make it any more complicated than a 1980's ROM Basic, just let it do more".

Which means no scope, no pointers, no user functions, no typecasting of numerics, no complex data structures.

paul_nicholls
16-04-2012, 12:57 PM
Don't beat yourself up, stick to your guns! :)

code_glitch
16-04-2012, 03:24 PM
I have to be honest though. Pointers are somewhat of a double-edged sword: used correctly they can create some absolutely marvelous and innovative code, especially in an OOP context. However, the downside is that they can also create an absolute hell of debugging. In my case, I often find myself referring to 'runlevels' -- in other words, an array of procedures, sometimes 2D, which the program can modify on the fly to change its behaviour when conditions are met. However, due to the generic context of many of these procedures, the error at a memory address is not so useful: it's not the actual procedure that was at fault, as the code there was working just a minute ago, but rather a pointer has moved to the wrong procedure/variable and fed the wrong array into the wrong procedure or something.

I guess what I'm getting at is that although GDB can point to your code being faulty, pointers are more of a higher-level 'logic' problem. I can definitely see why it's a good idea to keep people who are new to code away from such nasties; however, I would recommend something similar to pointers, as when they eventually become more adept at programming they will inevitably want to create more complex programs and experiment with that concept -- something all too present in the real world. Perhaps implementing something similar to pointers, but in a more structured context, would be a good idea in that case?
Just my 2 cents... Great work anyhow.

deathshadow
16-04-2012, 07:55 PM
I guess what I'm getting at is that although GDB can point to your code being faulty, pointers are more of a higher-level 'logic' problem.
Which is why with my primary target being pre-teens down to third or fourth grade, they have no place.


I can definitely see why it's a good idea to keep people who are new to code away from such nasties; however, I would recommend something similar to pointers, as when they eventually become more adept at programming they will inevitably want to create more complex programs and experiment with that concept -- something all too present in the real world. Perhaps implementing something similar to pointers, but in a more structured context, would be a good idea in that case?
To me that's time for them to move on to a real programming language -- like pascal or C.

A lot of people seem to like the training wheels analogy; but usually you give them a tricycle or a big wheel (http://en.wikipedia.org/wiki/Big_Wheel_(tricycle)) long before you give them a real bike, even one with training wheels. I don't see a lot of adults or even teenagers riding around on big wheels. You might have alphabet blocks as a toddler; that doesn't mean you're still using them in elementary school.

I'm thinking Duplo, not Lego Technic. I'm thinking Erector Set, not arc welder. I'm thinking alphabet blocks, not Merriam-Webster's.

Oddly, across the various places I'm discussing this, that seems to be the hardest point to drive home and/or for people to understand (this forum has so far been the most accepting of the concepts). Hoping that will get better when I have a real website written up, with the various points and information broken into pieces and dumbed down -- as opposed to forum rants that the illiterates who make up modern society piss and moan about. Oh noes, wall of text -- AAAH!!! God forbid people be expected to read anything anymore.

WILL
16-04-2012, 08:02 PM
I had a thought. You said that pointers were something you didn't get until junior high, right? Well, why bother putting them in for the first version? When I first learned to program I started with only the functions I needed to do some of the interesting things; then eventually I learned more to become a better programmer. So why not take that approach?

Start with the basic concepts and lessons, then make a second version OR a "Novice" or "Advanced" edition to teach the second set of concepts that you would learn later on. These editions would complement each other, as people learning the advanced stuff would still be familiar with the language, so moving on to more complex concepts would be smoother that way.

If this is a project that you wish to put up as a teaching tool of sorts, it's best to start at the beginning and then move upward into the next step, one at a time. You could even do a 3rd edition that went on to teach the basics of OOP. ;)

code_glitch
16-04-2012, 08:14 PM
That's an interesting way of putting it, WILL. My only potential worry about that is code maintenance -- you'd have to fix bugs across each version... unless you fixed it so you had a line like:
set language mode advanced
at the start, kind of like {$Mode ObjFpc}; the interpreter would know to switch out of 'easy' mode, with more functions, pointers, etc...

Having said that, though, deathshadow -- what's your take on Scratch? Personally I actively persecute it... but that's just me :)

deathshadow
17-04-2012, 01:53 AM
Having said that, though, deathshadow -- what's your take on Scratch? Personally I actively persecute it... but that's just me :)
It's actually something I looked at during my whirlwind tour of "wow, these suck". It's too gimmicky, their website seems to be thrown together any old way -- to the point I couldn't even figure out where the actual instructions for using it or doing anything in it are -- and I didn't see any results built in it that were particularly blowing my skirt up.

Though much of that is just the technology it's based on. Flash can be very powerful, but Scratch basically compiles to ActionScript... well... the net result leaves a lot to be desired... crappy user input handling, oddball flickers of simple things like sprite movement... and that's before we talk about the "IDE".

Which I naturally hate for the goofy illegible color contrasts -- but to be fair, I can't stand the illegible acid trip of color syntax highlighting either; it's one of the first things I disable in any editor (part of why I use Flo's Notepad2 as my primary). I didn't like it when Borland added it to TP4, and I've not seen it improve in usefulness since; giant colored boxes with text that doesn't line up in any clear manner, plus all sorts of pointless extra wording? I'll pass.

In a lot of ways Scratch reminds me of Logo -- you can't actually seem to make anything useful with it; and when the results are NOT as good as ROM Basic programs were on a TI-99/4a -- on a multi-GHz computer, using an API that DOES have hardware acceleration -- you've got problems.

Super Vegeta
18-04-2012, 08:34 AM
Didn't have enough time to read this whole wall of text, but so far it seems very nice. I laughed a bit, since I hadn't visited the forums for a few weeks and began writing my own interpreter... and now, bah! someone else is doing it too. Keep up your work and good luck.

igmac
18-04-2012, 06:48 PM
On the idea of teaching programming basics in a fun way, have you looked at http://www.spacechemthegame.com/education

Also see http://zachtronicsindustries.com/SpaceChem%20-%20A%20Guide%20for%20Educators.pdf for the PDF in depth education view.

It teaches some basic and also rather advanced fundamentals in a very novel way.

Regards,

Ian.

deathshadow
01-05-2012, 09:48 PM
Tossed together a website for the project.



http://www.gumberoo.org



I also now own the .com and have it set up as a redirect.

yes, it's very gaudy... are the fonts too big?

igmac
02-05-2012, 03:41 PM
yes, it's very gaudy... are the fonts too big?
I wouldn't say so, deathshadow. It immediately said 'Kiddie Fun' to me when I looked at it.

I would use the words Bright & Lively, not the word Gaudy, to describe it :-)

LP
02-05-2012, 07:53 PM
When I worked on an outsourced project using VBScript in Lotus Notes, I promised myself never to go down that path again.

Even the original StarCraft (if you ever used its map editor) had scripting facilities that were based on conditions and edited pseudo-visually.

Seriously, in these technological days, when many kids have access to a PC and popular tools like Microsoft Office, OpenOffice and LibreOffice, among much other visually driven software, when did you last see a kid hacking low-level, instruction-style code in Notepad? It is not only difficult to read and understand, but also requires significant effort to achieve anything beyond basic functionality.

I really applaud the effort to help with kids' education, but when making a programming tool for kids, I'd suggest a more visual, high-level diagram approach, perhaps something based on top of UML (http://en.wikipedia.org/wiki/Unified_Modeling_Language), instead of an assembly-like, instruction-driven language. Something where you take visual elements and put them together to create a working module, similar to how in LEGO you can construct objects, machines and even electronic/mechanical parts. For example, this video (http://www.youtube.com/watch?v=rWd3vgLaA_M) was posted somewhere here on PGD; something analogous to that example could be made for programming too.

igmac
03-05-2012, 03:28 PM
When I first met computers, barely out of school and two years of National Service, it was on a Sinclair ZX81. I played with it for two or three days, learning BASIC in the process. And then I had the fortune to run into an older programmer who gave me excellent advice: stop learning any high-level language and learn to program Z80 assembler; if I did that, he told me, I would really understand how a computer worked. After that, he said, I would find any computer language easy to learn, easy to understand, easy to recognise its strengths and weaknesses, and easy to milk every ounce of performance out of.

He was right.

I have employed many programmers from all avenues: some while still studying, some from technical colleges, some from universities. All able (in varying degrees, naturally) in their particular languages of study. But it was a rarity to find that occasional programmer who understood hardware, understood memory, understood optimisation, and knew and leveraged the strengths of several computer languages. Without exception, that programmer had learned assembler.

As a result, I think yes, there is a place for the clicky, visual, drag-and-drop, abstracted educational system for budding programmers. They may well one day progress to the dizzy heights of Microsoft Access programmers. They will never become a Niklaus Wirth, a Dennis Ritchie, a Ken Thompson, a Brian Kernighan, a Linus Torvalds or a John Carmack. You get the drift :-)

Let them get down and dirty with the assembly-like, instruction-driven language right at the start, and I believe you will give them a solid foundation to build on. (PS: And the interactive-mode ability promotes iterative experimentation in learning -- a major plus in my mind.)

Quote from the original post: So... good idea, crazy idea, who cares? Any ideas/suggestions welcome... well -- apart from "oh just write a book about python" or "why don't you make a UML implementation", in which case bugger off!

phibermon
28-07-2012, 08:31 PM
I'm going to agree with igmac. Learning some form of machine code before any other form of programming gives you an understanding that many programmers lack. It shows you the 'why' of higher-level languages, after which the 'how' becomes far easier.

pitfiend
29-07-2012, 01:37 AM
Computer science is hard to understand without the right foundation. Indeed, you need to start from the basics. But what are the basics? Some can argue that you need to understand data flow, others can say you need to know about data structures, and maybe some logic too.

Just having another programming language is not enough, if you ask me. We already have many: start with our beloved Pascal, which was conceived as a learning language. Then you can put Logo, Lisp, or even BASIC on the list.

ASM is powerful, but not good for someone who wants to learn the basics. Nor is C an option if you are just learning.

From my experience, I can tell that younger programmers avoid anything that seems extremely complicated. They like easier things, like visual programming. Unfortunately, many of them learn popular, easy, business-oriented languages. At least, this is the common path here in my country: new programmers learn Visual Basic or VB.NET and some SQL.

What I find disturbing is the lack of hacking (in the good sense of wanting to know more and experiment with new stuff). They see computers as black boxes with an OS, where you double-click an icon and a program starts. There's nothing under the OS hood that motivates the need for understanding, the pursuit of a never-ending quest for knowledge. They are what we could call keystrokers, dumb as rocks, repeating the same patterns they learned without knowing or understanding why it works or not. They barely know about algorithms and see them as arcane formulas, written by some obscure wizard whose name is lost in the mists of time.

In my opinion, if you want to start something great like teaching good foundations to new programmers, bring them that motivation in the first place: inspire them. Then teach them some of the "magic" of programming science, and also teach them to set reachable goals and to work as a team.

LP
29-07-2012, 11:07 PM
What I find disturbing is the lack of hacking (in the good sense of wanting to know more and experiment with new stuff). They see computers as black boxes with an OS, where you double-click an icon and a program starts. There's nothing under the OS hood that motivates the need for understanding, the pursuit of a never-ending quest for knowledge. They are what we could call keystrokers, dumb as rocks, repeating the same patterns they learned without knowing or understanding why it works or not. They barely know about algorithms and see them as arcane formulas, written by some obscure wizard whose name is lost in the mists of time.

QFT. There isn't a better way to explain it than you have. I also find it particularly disturbing that new "development trends" focus on simply finding some combination that solves the problem, without understanding the solution.

For some reason this has become a popular approach in the education sector, where little effort is made to teach students to understand how the world works (starting from basic sciences such as math and physics, then computer hardware, and only then computer software and development); instead they are typically taught whatever is thought to be "popular", such as Java for programming and some Adobe products for drawing, with the rest focused on the business side. What you get are students who know how to count money (which they won't have), limited if any programming skills in Java, web development using FrontPage, and graphic design in Illustrator. This "professional" profile fits 80% of popular short-term jobs, which are low paid and have tight competition to get into.


In my opinion, if you want to start something great like teaching good foundations for new programmers, bring them that motivation in the first place, inspire them. Then teach them some "magic" about programming science, also teach them to set reachable goals and teamwork.
I agree that motivation is the most important factor, so new developers need to have a passion for learning how stuff works and love figuring out what's "behind the scenes".

Although, having said the above, I don't share the opinion that you need to learn assembly first. This could be a typical scenario many years ago, and how many of us learned, but today with mobile devices there are cases where you have no access to assembly and/or machine code at all. However, the passion is what's important, and this is why I respect deathshadow for doing this project. It is a pity that there are so few new projects like this lately -- previously we had a lot of new development tools, libraries and frameworks coming out, but now only a few die-hards are left.

Eric
31-07-2012, 07:58 AM
They see computers as black boxes with an OS, where you double-click an icon and a program starts.
This was kinda inevitable IMHO; the same thing happened a few decades ago with... cars. Back then every "hacker" was a mechanic, able to disassemble a moped or a car engine, tweak it, etc. These days you just scratch the surface, and most car mechanics only know how to replace spare parts, change oil or tyres.

Computers are just being commoditized, like cars, planes, electricity, tap water, phone, etc. And a myriad of things of our everyday life that used to be hacker territory in their prime.

As computers became more complex and standardized, one just couldn't grasp or match industrially made stuff. There used to be a time when you could assemble a decent 8- or 16-bit machine from scratch, write its OS, and have it be technologically competitive with, or better than, off-the-shelf products. Heck, you could even design your own silicon and your own 4- or 8-bit CPU from the transistors up if you were motivated enough.

For modern hardware or software, you can at best only scratch the surface or learn specific techniques, and these days people are using tools built upon many layers of other tools. Even ASM isn't the simple environment it was: you could know not just all the opcodes but also the common binary encodings for a Z80 CPU, f.i., while merely knowing all the instructions that exist in x86-64 (even without AVX and other extensions) is already a challenge.

So IMHO, it was kinda inevitable that computers were to become black boxes.

Not knowing algorithms and data structures is more disturbing, but hardware is fast enough that most developers can get away with crappy algorithms throughout most of their programming life, and fall back on experts for the few cases where that's not enough (just like you'll call the car mechanic, the plumber, etc. from time to time).

pitfiend
31-07-2012, 04:56 PM
So IMHO, it was kinda inevitable that computers were to become black boxes.

Not knowing algorithms and data structures is more disturbing, but hardware is fast enough that most developers can get away with crappy algorithms throughout most of their programming life, and fall back on experts for the few cases where that's not enough (just like you'll call the car mechanic, the plumber, etc. from time to time).

Sure, but nowadays many are the ones who learn nothing from experimentation; they just sit and learn a popular language, and then call themselves programmers. Programming is an art and a science. If you want to become a great programmer, you need to go beyond the books and try and fail and try again, in a never-ending cycle. Learning the basics of a language doesn't mean you know what you really need to master that language.

If students don't get the right challenges, they will just be one of many. Many students are exposed to exercises so simple that they don't squeeze out any brain juice; it's just another boring exercise, ruining any potential programmer. What is needed is something that makes them think, analyze, and in the end enjoy doing it. I remember when I was teaching data structures, I proposed a quiz where students had to build a data structure to accommodate a Rubik's Cube. It was the only task on a two-hour test. The academic supervisor was so pissed about it that he asked me to change the quiz to something easier. That was my last time teaching.

I believe in the teaching-by-example method, with progressive difficulty. If you want better developers, give them the right tools and knowledge and make them think. I like games like The Incredible Machine or SpaceChem, precisely because they have more than one valid solution and because they push your creativity and the limits of your thinking.

To think is the key, and being motivated too!

Eric
01-08-2012, 07:21 AM
they just sit and learn a popular language, and then call themselves programmers.
The same goes for just about every profession out there.

Not all of programming is art and science, arguably most of the programming jobs out there have more in common with repetitive factory jobs than with high tech, research and experimentation. This may be regrettable, but it's true, and that's true even in the game industry.

Just look at any AAA game title: you'll have maybe a handful of devs working on the hard parts of the game engine, on the core AI tech, while you'll have dozens working on scripts for the game levels, the game missions, the support tools for all the game assets and the back-office (not game related, but HR, sales, web servers etc.). Same goes for the artistic side btw, you'll have a few lead artists, and dozens that'll be working from specs with much less creative freedom.


It was the only task on a two-hour test.
It's a bit problematic to have a single task, IME; depending on previous experience or the mood of the students, some can get blocked for various reasons (f.i. having never really manipulated a Rubik's Cube in real life).


To think is the key, and being motivated too!
Indeed, but these days the demand for programming jobs is so high that there are hordes of people who pick programming as their job like they could have picked any other, and you can't dismiss them, since they're needed. Never forget that there are many programming jobs which no programmer who considers programming an art and a science would willingly take (you included), but those jobs still need to be done by someone...

I guess the issue of education is not really about educating the motivated and brilliant ones (those will always find a way, if you let them), but about the masses who don't really care one way or another, and for whom their job is just a boring requirement of life (happy are the ones who can combine work and passion!).

pitfiend
01-08-2012, 04:49 PM
The same goes for just about every profession out there.

Not all of programming is art and science, arguably most of the programming jobs out there have more in common with repetitive factory jobs than with high tech, research and experimentation. This may be regrettable, but it's true, and that's true even in the game industry.

Just look at any AAA game title: you'll have maybe a handful of devs working on the hard parts of the game engine, on the core AI tech, while you'll have dozens working on scripts for the game levels, the game missions, the support tools for all the game assets and the back-office (not game related, but HR, sales, web servers etc.). Same goes for the artistic side btw, you'll have a few lead artists, and dozens that'll be working from specs with much less creative freedom.

Sure, but the proposed project tries to bring more fun to teaching kids about programming and to keep their interest. I remember when I was 10 and had my first contact with a computer, the legendary Commodore 64; all the wonders I learned back then made me a wannabe programmer, and guess what? Now it's my profession and my hobby, and I enjoy it so much that I can't imagine doing anything else. In the beginning I learned everything about BASIC, then I discovered the immense power of assembler and a whole new universe of possibilities. Later I joined a scene group here in Peru and we made some game intros and SID players; we even fixed some PAL games to run on NTSC computers, and we also made a magdisk with 6 issues. None of that would have been possible without an inquisitive mind and teamwork, something rare nowadays. It's amazing when we look back on our old methods and realize that we used to apply stuff like scrum development or task management that today has fancy names, but is fundamentally the same as what we did 30 years ago as kids.


It's a bit problematic to have a single task, IME; depending on previous experience or the mood of the students, some can get blocked for various reasons (f.i. having never really manipulated a Rubik's Cube in real life).

I was aware of that at the time. We were three teachers, and we made a common quiz for all our students (it was a hard decision to make, but we agreed in the end), because we had shown them similar examples and we knew they were ready for something like that; it was the final test.


Indeed, but these days the demand for programming jobs is so high that there are hordes of people who pick programming as their job like they could have picked any other, and you can't dismiss them, since they're needed. Never forget that there are many programming jobs which no programmer who considers programming an art and a science would willingly take (you included), but those jobs still need to be done by someone...

I guess the issue of education is not really about educating the motivated and brilliant ones (those will always find a way, if you let them), but about the masses who don't really care one way or another, and for whom their job is just a boring requirement of life (happy are the ones who can combine work and passion!).

I know perfectly what you mean; my personal experience started as a business programmer. It's boring, and that's why I decided to leave anything business-oriented in the past and try something new, like game development. In this new enterprise I'm learning again. I wanted to show a game for the competition, but I'm still developing it; it's taking me more time than expected, but in the end I don't regret missing the deadline. Eventually the game will be completed, and I will post it here for your enjoyment and comments.