VOICE Home Page: http://www.os2voice.org

June 2000
editor@os2voice.org

Better Hardware Means... Worse Software?

By Tom Nadeau © June 2000

               Tom's Homepage: http://www.os2hq.com/

Back in the "good old days" that really were not so good, hardware was expensive. Yes, software was expensive, too, but hardware cannot be pirated. <wink>

Back when RAM cost a couple of hundred dollars a megabyte, programmers had to work carefully to ensure that no resources were wasted. Programs needed a very small RAM footprint, and they had to avoid glitzy interfaces and inefficient algorithms. Similarly, a 20 MB hard drive used to cost a couple of hundred dollars, so a program's size on disk had to stay small, too. The idea of "build it fast and rush it out the door" was simply impossible in an era when the average PC was a 286 or 386 clone with less than 4 MB of RAM available.

There was a hidden benefit to the relatively long, drawn-out software development process of that time: programmers had to spend a lot of time poring over their code to find shortcuts and innovative ways of increasing a program's efficiency. While they were doing that, they also spotted a lot of bugs, and they had the time to fix them. Programs got fixed as a natural part of the search for efficiency and tightness in the code design. Even companies that did not care about the reliability of their products got reliability anyway, as a "side effect" of the need for efficiency. This is the main reason why programs written ten or twenty years ago tended to be more reliable than those written today.
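
To make that concrete, here is a contrived C sketch of my own (not taken from any real product): hoisting a repeated strlen() call out of a loop is a classic efficiency fix, and the same line-by-line reading that finds it is exactly where a latent bug, the missing string terminator, gets noticed and fixed.

    #include <stdio.h>
    #include <string.h>

    /* Sloppy version: strlen() is re-evaluated on every pass through
       the loop, turning a linear copy into a quadratic one, and the
       loop also forgets to copy the terminating NUL. */
    void copy_sloppy(char *dst, const char *src)
    {
        size_t i;
        for (i = 0; i < strlen(src); i++)  /* O(n) test inside an O(n) loop */
            dst[i] = src[i];
        /* bug: dst is never NUL-terminated */
    }

    /* Tightened version: computing the length once fixes the quadratic
       behavior, and the review that hoisted it is where the missing
       terminator was spotted and fixed. */
    void copy_tight(char *dst, const char *src)
    {
        size_t len = strlen(src);          /* computed once */
        memcpy(dst, src, len);
        dst[len] = '\0';                   /* fixed during the efficiency pass */
    }

    int main(void)
    {
        char buf[64];
        copy_tight(buf, "tight code tends to be correct code");
        printf("%s\n", buf);
        return 0;
    }

The point is not this particular fix; it is that hunting for wasted cycles and bytes forces the kind of close attention that catches bugs for free.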

Yes, companies today may rave about "time to market" and the need to keep the version churn going so that revenue keeps streaming in. But building code faster is almost certain to reduce reliability, which angers customers and risks lawsuits over lost data. In a competitive software environment, some companies can and should take the route of quality over quantity, or in this case quality ahead of release cycle time. Part of the problem in implementing this strategy is that the software marketplace continues to suffer from reduced competition (because of you-know-who); another part is that faster hardware seems to make people more "forgiving" of bad software. A fast program that crashes simply doesn't bother people the way a slow program that crashes does.

So the next time you buy a piece of software, think about the development mentality behind it. If you choose a product from a company that cares little about tight, efficient code, don't be surprised if that company also lets a lot of unfixed bugs slip through its development process.


Tom Nadeau is the author of Are You Ready for SEVEN LEAN YEARS? (http://www.bmtmicro.com/catalog/sevenleanyears.html). His web site is OS/2 Headquarters: http://www.os2hq.com/
