[Note: Another old draft finding the light of day]
(Why organizations that are worried about performance should be worried about productivity instead)
No wonder we programmers often can't relate to our customers. To them, 'efficient' means human productivity. To us, it refers to computer resources -- it's time to alter our perspective.
A while back [Note: this said 'recently' when I drafted this] Alan Francis linked to an illustrative example of how an algorithm in Java looks when implemented in Ruby. The case for Ruby productivity is pretty compelling (at least in that example). Such examples typically launch 'discussions' on why this may or may not be true. I want to talk about why it isn't more compelling.
I showed the example to someone who is not a programmer (but has a decent understanding of programming) and she said: "The Ruby code looks a lot more efficient." My pedantic response was that it was likely less efficient, but more productive. She insisted that efficient was the better term.
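The linked example isn't reproduced here, but a hypothetical stand-in gives the flavor of what she was reacting to. Counting word frequencies -- the kind of task that takes a loop, an explicit map type, and a comparator class in older Java -- is a few lines of idiomatic Ruby (the text and task are my own illustration, not the original example):

```ruby
# Hypothetical illustration, not the example Alan Francis linked:
# count word frequencies in a string and print them, most frequent first.
text = "the quick brown fox jumps over the lazy dog the fox"

counts = Hash.new(0)                 # default value 0 saves the "if key missing" dance
text.split.each { |word| counts[word] += 1 }

counts.sort_by { |_, n| -n }.each { |word, n| puts "#{word}: #{n}" }
```

Whether this is 'more efficient' than the Java version in clock ticks is beside the point she was making: it is more efficient in reader and writer effort.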
On reflection, I believe her definition of efficiency is better than mine. I never did much C programming (and haven't used C++ much in the past couple of years), yet I have a lingering sense of the "efficiency is king" mindset: a nagging concern that there is something inherently bad about being inefficient (from a programming perspective) that compels me to apologize "but it's OK if we're more productive" whenever I suggest a language like Ruby.
This seems a common sentiment among programmers advocating such languages, but it's not a compelling argument for some. They don't hear "It's a joy to use because you are so productive!"; they hear: "You should try Ruby, because it trades computer efficiency for human efficiency." Many of us don't want to hear that because we have a long and glorious history of defending our favorite languages on precisely this ground. And IT managers are even less likely to want to hear it, because performance issues are a common problem that gets them called on the carpet.
But here's the thing: we really can't trade performance for productivity anyway.
Sure, if we compare two algorithmic implementations we can certainly see the tradeoff: if we measured the time it takes to implement each one, compared their respective clock ticks, memory footprint, ability to crunch numbers, and so forth, we'd end up with a ratio (seconds saved programming over clock ticks). This is the kind of experiment that could easily be crafted in a Computer Science classroom, published, used to smack down attempts to use Ruby et al. -- and yet be totally misleading.
I believe this calculation is done unconsciously by those programmers and IT managers who hear "we're trading efficiency for productivity." It's why they are reluctant to take a serious look at higher-level languages. A one-time productivity hit to get faster run-time performance certainly seems like a good trade-off, but the flaw in the argument is that productivity measurement is not reset with each task. It's cumulative. Unlike a programming assignment ("Implement Quick Sort, please"), productivity is measured across an entire solution (whether a build script or a trading system) -- and not just the first writing of the code, but throughout its useful lifetime (the vast majority of coding time is spent changing, or maintaining, existing code). In the real world, it's not "write once, run forever"; it's "write a bit, run a bit, change a bit, run a bit", and so on. I am not saying that run-time efficiency isn't important. It is. What I am saying is that the best way to compare run-time efficiency and programmer productivity is not at the micro level, but at the macro level.
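The cumulative effect is easy to see with back-of-envelope arithmetic. The numbers below are pure assumptions chosen for illustration (not measurements of any real project or language); the point is only that the per-change cost, multiplied over a system's lifetime, dwarfs the one-time cost of the first implementation:

```ruby
# All figures are made-up assumptions for illustration, not data.
initial_hours    = { low_level: 400, high_level: 200 }  # first writing of the code
change_requests  = 50                                   # changes over the useful lifetime
hours_per_change = { low_level: 8, high_level: 4 }      # assumed effort per change

# Lifetime programmer-hours = initial effort + (changes * effort per change)
total = initial_hours.keys.map do |lang|
  [lang, initial_hours[lang] + change_requests * hours_per_change[lang]]
end.to_h
# low_level:  400 + 50 * 8 = 800 hours
# high_level: 200 + 50 * 4 = 400 hours
```

Under these assumed numbers, the maintenance term (400 vs. 200 hours) is as large as the entire initial implementation -- which is why measuring productivity only on the first writing of the code is misleading.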
In the typical multi-client, data-driven CRUD application found in typical IT shops there isn't going to be much choke-point code relative to the entire system, and yet notions of micro-level efficiency drive many such shops to adopt Java, .NET -- even C++ and C -- in an effort to get "performance." More than once, I've heard organizations say "Java sucks! It's too slow! We're switching to C++!" only to find that the real problem is that their programmers are not proficient in Java and/or their development practices put a premium on working code (instead of correct code). Unless they happen to be proficient C++ programmers, they are even less likely to be able to use C++ effectively, but they make the switch anyway, because "C++ is more efficient!" In short, they seem to conclude: "We'll just 'Make It Work' (using this 'Fast' language) because there won't be time to go back and tune." Unfortunately for their customers, 'Make It Right' seems to get lost in the shuffle.
Of course, simply substituting Ruby for C++, Java or C# in such an environment would be a disaster. The culture equates 'done' with 'working', and so good programmers accustomed to working in such environments tune everything, knowing they'll never get a chance to go back once they say it's working. They're under constant pressure to get it working because the schedule's been compressed so badly. The schedule is compressed (in part) because they are using a low-level language to implement API wrappers and data-munging. A vicious cycle if there ever was one.
The irony is that the end-user (the one who needs it to be fast) is unwittingly trading their functional requirements for performance ("Well, it's buggy, but at least I get an answer quickly!"). Too often, they believe this is how it must be, that it is inherent to the software development process. And we programmers sell them on this idea when we tell them we have to use some low-level language to build a web application! This is inefficiency in business terms -- what ultimately matters to nearly all professional development efforts. We programmers need to get away from the idea that efficiency should be measured in clock ticks, and start thinking about efficiency like our customers: in time and money.
In summary, I am not suggesting everyone switch all their development to Ruby (e.g. I don't expect (or encourage) my team to stop using C# any time soon). I am merely suggesting that our first thoughts of efficiency should be about people effort, not computer effort. It was once true that the computer's time was more valuable than people time -- those days are over. People time (both the programmer's and the end-user's) is the most expensive commodity today. Optimize the programmer's time and they can build a system that optimizes the end-user's time more quickly and more efficiently -- leaving plenty of time for tuning the computer's efficiency when it needs it.