Whiteboard Theory - or - More Power = Lazy is OK



jdarling
19-05-2010, 01:46 PM
Typically I save this type of rant for backyard bitch sessions, but for once I thought I'd bitch in a public forum and see how many people say I'm completely nuts.

Here is my new theory "Whiteboard Theory":
No matter how large or small the whiteboard in question, people will only use it to display the same general amount of information (i.e. 10 lines of text on a 4'x8' board, or 10 lines on a 2'x2').

I've had this theory for a very long time; in fact, my father tells me I noticed it when I was very small on the job sites. I noticed (when I was younger) that no matter how large the whiteboards were in the trailers (yes, I'm a construction brat), the foremen, supers, etc. would really only use them to display the same general amount of information.

For me, at the time, a larger board meant a larger picture. For them, a larger board meant writing in LARGER LETTERS (yes, they would typically use all caps on a larger board; odd, isn't it?). I've also noticed this holds true to this day, where large wall boards are used to display the same information that appears on a small cubicle wall board.


Anyways, my point as it pertains to development. I've been interested in some side work and pursued a few options to see where they would go. Recently I lost (or at least I think I lost) one of these opportunities to another developer whose work I've had to clean up at not 1 but 3 companies (that part doesn't matter at all, by the way). When you Google him in relation to Delphi or Pascal, you get back 2 pages of content mainly stating things like: "In the present days of dual core 3ghz the only service you provide by taking steps like this [he is discussing optimization, by the way] is a disservice to your employer."

Since when, as a developer, has it become acceptable to be complacent due to our hardware? Every day I'm amazed at how much more I can do with modern hardware, but that doesn't stop me from trying to get every last ounce of power out of it. Yes, there is optimizing to the extreme (something I've been accused of, and admit to, at times), but there is also common-sense optimization. If you can get a bump in speed, and retain readability, by dropping to "old school" methods, why not?

In fact, I'd say that by taking the approach of More Power = Less Responsibility you're doing the biggest disservice to your employer. When and WHY did it become acceptable to apply Whiteboard Theory to development???

I may be the old guy on the porch screaming "Get off my LAWN!", but I for one still think we should try to get more from our hardware, not less. I'm not saying we should push graphical limits in business applications, but 100 more records a second is 100 more records a second. Add that up over time and any MBA will agree with me (LOL).

- Jeremy

PS: I've already told the company that when he gets done, I'll be happy to make the app usable again for only twice my typical rate; it's my "We didn't do our research" discount.

User137
19-05-2010, 03:07 PM
If I understood what you mean, then I agree. To put it simply:
Program A: Works OK and looks cool.
Program B: Works fast and very reliably but doesn't look as good.

If I had to choose from the 2, I'd pick B.

The same could be applied to a few games these days, where realistic 3D graphics take away from the importance of smooth gameplay. Many websites could be much lighter than they are as well, starting with the mere .htm format, which is very bloated compared to what it could be: simple for a human but slow for a computer.

LP
21-05-2010, 04:43 AM
Since when, as a developer, has it become acceptable to be complacent due to our hardware? Every day I'm amazed at how much more I can do with modern hardware, but that doesn't stop me from trying to get every last ounce of power out of it. Yes, there is optimizing to the extreme (something I've been accused of, and admit to, at times), but there is also common-sense optimization. If you can get a bump in speed, and retain readability, by dropping to "old school" methods, why not?

Although this may be true in the particular case you described, I've actually found otherwise. There are many techniques available now (at least in software development) for certain problems that are highly efficient compared to what we were used to some 10-20 years ago.

Also, there is a limit on how much power you can reasonably waste in favor of laziness. For instance, if you need to look up an entry in a database, there is a worst-case cost that stays the same no matter how powerful your computer or mainframe is. In the case of the whiteboards you mentioned, you can take a huge board to write on, but the larger area will require more ink and effort, so it has a practical limit as well.
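
To put that lookup point in concrete terms, here is a made-up Free Pascal sketch where a sorted array stands in for an indexed table. The worst-case bound comes from the algorithm, not from the hardware:

program LookupSketch;
{$mode objfpc}

// Made-up illustration: no amount of raw CPU changes a linear scan's
// O(n) bound, while a sorted index gives O(log n).

function LinearFind(const A: array of Integer; Key: Integer): Integer;
var
  i: Integer;
begin
  Result := -1;
  for i := 0 to High(A) do
    if A[i] = Key then
      Exit(i);
end;

function BinaryFind(const A: array of Integer; Key: Integer): Integer;
var
  Lo, Hi, Mid: Integer;
begin
  Result := -1;
  Lo := 0;
  Hi := High(A);
  while Lo <= Hi do
  begin
    Mid := (Lo + Hi) div 2;
    if A[Mid] = Key then
      Exit(Mid);
    if A[Mid] < Key then
      Lo := Mid + 1
    else
      Hi := Mid - 1;
  end;
end;

var
  Data: array[0..6] of Integer = (2, 3, 5, 7, 11, 13, 17);
begin
  WriteLn('linear search: ', LinearFind(Data, 13)); // up to n comparisons
  WriteLn('binary search: ', BinaryFind(Data, 13)); // at most log2(n)+1
end.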

DarkBow
21-05-2010, 07:01 AM
Since when, as a developer, has it become acceptable to be complacent due to our hardware? Every day I'm amazed at how much more I can do with modern hardware, but that doesn't stop me from trying to get every last ounce of power out of it. Yes, there is optimizing to the extreme (something I've been accused of, and admit to, at times), but there is also common-sense optimization. If you can get a bump in speed, and retain readability, by dropping to "old school" methods, why not?

In fact, I'd say that by taking the approach of More Power = Less Responsibility you're doing the biggest disservice to your employer. When and WHY did it become acceptable to apply Whiteboard Theory to development???


Unfortunately I agree. In the 2 big apps (500k and 1.25M LOC) I have worked on, there was little to no optimization in a lot of places. People just tend to get things "done" without stress testing their work. :(

chronozphere
21-05-2010, 07:51 AM
I agree with you. :) It's better to think about optimization than to just get it done. On the other hand, you have to weigh the effort of optimizing against the performance that is gained. You should only optimize when it's worth it.
Also, focus on optimization in your overall design instead of spending a lot of time optimizing small low-level pieces of code. Yeah, I know that the low-level optimizations would give you a bigger boost. However, if your design is not optimal, you have to do a lot of refactoring, whereas if some low-level routine is not optimal, you can just replace it with a faster one. That can be done at a later stage.
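
As a tiny made-up Free Pascal sketch of that last point: if the slow routine hides behind a stable signature, swapping in a faster implementation later touches nothing else in the design:

program SwapRoutineSketch;
{$mode objfpc}

type
  // Stable signature: any summing routine with this shape can be
  // plugged in without touching the calling code.
  TSumFunc = function(const A: array of Integer): Int64;

function SumSimple(const A: array of Integer): Int64;
var
  i: Integer;
begin
  Result := 0;
  for i := 0 to High(A) do
    Result := Result + A[i];
end;

var
  // Swap in a faster implementation here later; nothing else changes.
  Sum: TSumFunc = @SumSimple;
  Data: array[0..3] of Integer = (1, 2, 3, 4);
begin
  WriteLn(Sum(Data)); // prints 10
end.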

This is also the reason why I'm not a big fan of .NET. Those apps have a lot of dependencies; they require a relatively big framework in order to run. You can make the same application, a LOT smaller in size, with Pascal or C/C++. I know .NET is for rapid development and has some other advantages too, like portable code. However, the same can still be achieved with real compiled languages.

jdarling
21-05-2010, 01:48 PM
Holy cow, shoot off a post from the hip, forget to check for a few days, and it goes boom! :)

Quick hits and responses (in no particular order):
@Lifepower
Please expand more on those cases. I can think of specific algorithms that exist today that didn't exist then, but that doesn't mean optimized implementations of them are unnecessary.

I do have to admit that there are limits to hardware, this is true, but properly matched hardware and software (e.g. in a database setup, a record size that divides evenly into block size * number of block reads, matched to the data block read size) will give you the absolute best results. Just saying "f-it, I'll spin up a copy of <insert database name here>, let it go at it with a few terabytes of data, and see if it fails" isn't an answer that ANY corporation would accept. Yet they accept it when someone implements a slow desktop app.
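
To make the alignment point concrete (the numbers here are completely made up):

program BlockAlignSketch;
{$mode objfpc}

const
  BlockSize  = 8192; // bytes per physical block read (made up)
  RecordSize = 128;  // bytes per stored record (made up)
begin
  // 64 records pack exactly into one block...
  WriteLn('records per block: ', BlockSize div RecordSize);
  // ...and nothing is left over, so no record straddles two reads.
  WriteLn('leftover bytes: ', BlockSize mod RecordSize);
end.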

Perfect example: I recently helped a friend re-write some very basic prefetch loops. The old loops looked similar to this (obviously not the real code):


get master recordset
for each master record
  fetch child recordset
  for each child record
    perform some math
  end
end

Now, knowing about pre-fetching, knowing that local drive access is faster than networked drive access, and knowing that the MINIMUM drive space available on client systems was 250GB, lets us change this to:


if not cached > get master recordset > cache it
if not cached > get child recordsets within range > cache them
perform the math in one pass

The difference: what used to take hours now takes seconds (literally, 2 hours down to under 1 min).
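
For the curious, here is a minimal Free Pascal sketch of the same one-pass idea. The TChild record and the literal rows are made-up stand-ins for the real recordsets; the point is one bulk fetch per table instead of one child query per master row:

program PrefetchSketch;
{$mode objfpc}

type
  TChild = record
    MasterID: Integer; // foreign key back to the master record
    Value: Double;
  end;

const
  MasterCount = 3;

var
  Cache: array of TChild;                  // local cache of prefetched rows
  Totals: array[1..MasterCount] of Double; // one running total per master
  i: Integer;
begin
  // One bulk fetch stands in for "get child recordsets within range";
  // in real code this would be a single ranged query over the network.
  SetLength(Cache, 4);
  Cache[0].MasterID := 1; Cache[0].Value := 10.5;
  Cache[1].MasterID := 1; Cache[1].Value := 2.25;
  Cache[2].MasterID := 2; Cache[2].Value := 7.0;
  Cache[3].MasterID := 3; Cache[3].Value := 1.5;

  for i := 1 to MasterCount do
    Totals[i] := 0.0;

  // One pass over the cached rows replaces the per-master child queries.
  for i := 0 to High(Cache) do
    Totals[Cache[i].MasterID] := Totals[Cache[i].MasterID] + Cache[i].Value;

  for i := 1 to MasterCount do
    WriteLn('master ', i, ' total = ', Totals[i]:0:2);
end.

The real win is that the round trips drop from one per master row to one per table, which is where the hours-to-seconds difference comes from.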

@User137
Yes and no. I don't believe that we (as a corporate entity) should ever accept a poor UI, but we (again, as the corporate entity) have specialists for this (UXP). The developer should focus on making it run like a striped-arse ape (what they do best), not on how it looks :).

And yes, if we (the gaming industry) would place as much focus on the stories behind our games as we (again, the industry) do on the next generation of 3D, I think games would (maybe only could) be better. Honestly, I prefer my old text-based games to the modern 3D things, as they provide a real story for me to live out (then again, I'm damned near blind, color blind, and get motion sick, LOL). Oddly enough, considering SpiderWeb games still makes a profit, I'm not the only one.

@DarkBow
Exactly my point: a few minor tweaks to the source and the speed would probably go up dramatically. At times performance = $$ saved (see the later example).

@chronozphere
Balance is key when talking optimization. Sure, given the proper hardware, software, knowledge, and unlimited time, along with the right team of experts you can put out the fastest thing this side of the river. Of course, you will never make a profit.

My experience as a BSA and with Business in general tells me there are key differences between optimizing for fun and optimizing for profit.

Another example:
At one of the teleconferencing companies I worked for, we treated every second (actually, every millisecond) spent with the app spinning as time lost. And it was: time spent waiting on the system was time people weren't getting into their calls. We optimized our lookups and worked out ways to offload processing to background systems (yes, we did TRUE polymorphic code shifting and RPC over the wire in DELPHI). This was a key objective for the business model, though.

At my last company, where we did marketing for colleges and universities, I NEVER would have taken that approach; it would have been total overkill. But I did look at optimizing the end user's experience and pre-calculating the client reports as much as possible without breaking the bank.

@All
Again, this was more of a rant than anything, but the feedback is interesting :). Honestly, when you think of what can be done on modern hardware and then realize what we are actually doing with it, it's a bit disheartening.

- Jeremy