Software as Prosthesis

Sometimes I think the software world went horribly wrong at some point.

There is so much written on hackers, the hacker ethic, the joy of programming, the love of computing, the joy of creativity, etc. Much of this writing is accompanied by lamentations of how most professional software development isn't like that.

The cause is generally attributed to suits, to process, to history, to life itself or to some combination of the above. I have a different theory: I blame it on the early adopters of computers.

Not the early early adopters (i.e., the military); I mean the early adopters in the business world. Back in those days computing was very expensive. To purchase a room-sized computer with less processing power (if a lot more storage and reliability) than a 386 and still make a profit, you had to eliminate a lot of cost somewhere else in your organization. Back then (as today) that meant labor. Which meant people... which meant jobs.

So, computer programs were intended less to help people do their jobs than to replace them. Sure, someone would still be using the computer to do the work, but the main emphasis was replacement, not augmentation.

This was fundamentally different from how the military used computers. To cite just two examples:

At Bletchley Park, Alan Turing and the other mathematicians didn't simply chant incantations at a magic box -- they understood the mathematics and could, in theory (if not in practice), perform the same calculations as Colossus. The result greatly enhanced their ability to break codes.

WWII submarines used analog computers to assist the crews in setting the gyroscopes on torpedoes. The TDC Mark III used in American submarines also calculated the target's position; this data was compared against the crew's own calculations to reduce error. The result greatly augmented the crew's existing ability to target their weapons.

The difference is that the military wasn't directly concerned with cost, but with results.

Working on my little anagram calculator the other day, I was struck by how different the process and the results were from the type of software I have generally written professionally. Initially I attributed this to the fact that I was writing it for myself and not someone else, but the more I think about it, the more I come to believe that isn't it.

The difference is the use I put the software to. True, I didn't write that software so that someone else could play the anagram game, but more importantly, I didn't write it to replace my playing of the game -- I wrote it to augment it. I was still a key piece of the solution: I knew what words needed to be in the dictionary, I knew the speed requirements, I even had to sync the two programs by typing the game's information into the calculator. I haven't tried it, but I am willing to bet that if I had it simply list the words and I typed in the ones that were likely to be correct (thus eliminating the largest remaining bottleneck, the false positives), we'd do even better. A further tweak might be for me to identify the words to send and have the program type them. And so on. The point being, the goal is to augment, not replace.
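To make that loop concrete, here is a rough sketch of the idea in Python (not the actual calculator -- the word list file, the prompts, and the I/O are all hypothetical): the program proposes every dictionary word it can spell from the game's letters, and the human weeds out the false positives and decides what gets sent.

```python
from collections import Counter
from pathlib import Path

# Hypothetical word list, one word per line; the real calculator's
# dictionary, language, and game integration are not shown here.
WORDS = [w.strip().lower() for w in Path("words.txt").read_text().splitlines() if w.strip()]

def candidates(letters):
    """Return every dictionary word that can be spelled from the given letters."""
    pool = Counter(letters.lower())
    return [w for w in WORDS if not (Counter(w) - pool)]

if __name__ == "__main__":
    letters = input("Letters from the game: ")
    for word in candidates(letters):
        # The human stays in the loop: false positives are filtered by hand,
        # and only confirmed words are "sent" (here, just printed).
        if input(f"send '{word}'? [y/N] ").strip().lower() == "y":
            print(word)
```

The interesting part isn't the word matching; it's that the person, not the program, makes the final call -- which is the augmentation, not the replacement.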

Contrast this with your typical enterprise CRUD application (let's say a Customer Service application). Its designers (and gold donors) are often much less concerned with user augmentation than they are with user reduction. For these people, Cost/Benefit analysis means "Do more with fewer people" instead of "Make our people able to do it better." If they can reduce the number of people in the service center, then the project is a success. The goal is to replace, not to augment.

The difference is subtle, but I think this is at the heart of the systems that succeed: they do so because they allow the users to do what they would do anyway, better -- faster, more easily, more accurately, or all of the above. This doesn't necessarily mean fewer people, and so it doesn't necessarily make the project a success from the cost standpoint. The ironic part, however, is that the only way to build software that could reduce headcount is to build something that truly augments the workers. This dichotomy is at the heart of why Customer focus is important for building software that works -- and why it's so hard to achieve in most environments.

Why do I blame the early adopters? Because the technology, being costly and difficult to use, required strong central control, which led to specialization. Programmers wrote software for others to use. In those domains (e.g. Engineering) where the users learned enough programming to augment their own jobs, this happened less (although the quality of that software is deemed terrible by people who list "programmer" as their job title).

It's still visible today. The largest organizations have (stereotyping alert) the least technically savvy users, the least interesting programming jobs, and the most spectacular software failures. Enterprises (e.g. Trading) where software development is tightly integrated into the workflow of the business often have people who are at least semi-competent programmers doing the work of the business.

The fact that computing is cheap now, and that programming languages (and the knowledge of how to use them) are more accessible than ever before, is dwarfed by the inertia of how things used to be.

It can be different. What if every user learned a high-level language like Ruby, was instructed in how to write simple scripts to augment their job, and focused on solving the problem of their job instead of on using software? What if each group of 5 - 10 customer service reps included a couple of programmers to pair with them while they worked? What if we truly democratized computing? What would happen to big IT? Would it even exist? It's not that far-fetched -- the use (and abuse) of Excel as a user programming environment attests to that. Imagine if users instead had a tool that was actually easy to program well.

I don't know the answers, but I like to imagine the results. Someday maybe somebody will try it.

Until then, hackers everywhere will continue to write software to supplement their natural abilities, get a thrill of creativity and power as they do things they couldn't do before writing their little script, and leave everyone else scratching their heads as to why these guys think computers are anything but a necessary evil in their lives.

And to all those out there trying to figure out how to save money with computers? Well, you're half right. Labor is still your biggest cost. The difference now is that the labor cost is in the IT department. I say, don't offshore your IT department -- eliminate it. Take all that money you spend training your users to act like monkeys using someone else's software and teach them how to solve their own problems. Integrate good programmers into your people's lives and create human/computer symbiotic systems that solve business problems.

If nothing else, it's a solution guaranteed to piss off everyone.