Prometheus Blogs

The Most Expensive Mistake In Software Development

02/17/2015

Many homeowners have found out the hard way that it’s a lot more expensive to fix the damage resulting from a plumbing leak than it is to fix the leak itself.

When it comes to software requirements, a gram of prevention is usually worth a lot more than a kilogram of cure. Too bad most organizations don't realize they are paying for the cure. The COCOMO II (COnstructive COst MOdel) study of 161 projects found that unskilled analysts double the effort it takes to deliver software [1].

[Figure: bar graph of effort multipliers found by COCOMO II. Each bar shows how much an unfavorable value for that factor increases software development effort, compared to a favorable value.]

Few organizations would deliberately double their development effort, yet it's very common to delegate requirements capture to domain experts who have little training in it. And given the limited sample size of the COCOMO study, plus the fat-tailed (non-Gaussian) statistics typical of project overruns, the real price can be far larger than a factor of two.
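To make the arithmetic concrete, here is a minimal sketch (in Python) of the COCOMO II post-architecture formula, effort = A × size^E × (product of effort multipliers), plugged with the published ACAP (analyst capability) multipliers. The calibration constant A, the scale exponent E, and the 100-KSLOC project size are illustrative assumptions, not figures from this article:

    # A minimal sketch of the COCOMO II post-architecture effort model:
    #   effort (person-months) = A * size^E * product(effort multipliers)
    # A = 2.94 and E = 1.10 (nominal scale factors) follow the published
    # model; the 100-KSLOC project size is a made-up example.

    A = 2.94           # COCOMO II calibration constant
    E = 1.10           # scale exponent, assuming nominal scale factors
    SIZE_KSLOC = 100   # hypothetical project: 100,000 source lines of code

    # ACAP (analyst capability) effort multipliers from COCOMO II
    ACAP = {
        "90th-percentile analysts": 0.71,
        "55th-percentile analysts": 1.00,
        "15th-percentile analysts": 1.42,
    }

    for label, multiplier in ACAP.items():
        effort = A * SIZE_KSLOC**E * multiplier
        print(f"{label}: {effort:,.0f} person-months")

    # Relative to the best analysts, the weakest double the effort:
    print(round(1.42 / 0.71, 2))  # -> 2.0

On this hypothetical project, the swing from the best analysts to the worst is roughly 330 versus 660 person-months, which is the doubling that the footnote below works out.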

The IEEE postmortem of $100 million wasted on the FBI’s Virtual Case File system placed much of the blame on “a deeply flawed 800-page set of system requirements that doomed the project before a line of code was written”.

It gets worse. Most of the $170 million spent on healthcare.gov was wasted, according to an article at cio.com:

“Healthcare.gov could have been a very simple process… Type in zip code, click Submit, select age and family factors, get quote and get number to call to purchase insurance.

We know this smaller scope could have worked because it does work. A tiny company called Opscost took that spreadsheet and implemented a website doing just that in fewer than two person-weeks. The site is thehealthsherpa.com”

That $170 million lost is a bargain compared to the one billion dollars squandered by the Air Force’s Expeditionary Combat Support System. The New York Times quotes a Defense Department officer as saying “We started with a Big Bang approach and put every possible requirement into the program, which made it very large and very complex”.

Then there is the British National Health Service's scrapped £12 billion effort to create a computerized patient record system. The Guardian quotes a Department of Health source as saying “The problem is, it didn't deliver… It was too ambitious, the technology kept changing”. This example, and the two before it, may sound like scope problems rather than analysis failures, but I'd like to think that more careful consideration of the goals, benefits, and risks at the outset would have led to scope changes.

There are plenty of additional examples: the Denver airport baggage-handling system, Mariner 1, Ariane 5 flight 501, the Mars Climate Orbiter, and the Mars Polar Lander all stand out. An article in Computerworld says “Of 3,555 projects from 2003 to 2012 that had labor costs of at least $10 million, only 6.4% were successful.”

The above fiascos can also be seen in another light: with better risk management, the analysis failures would not have been so disastrous. So which is the root cause, bad requirements or bad management? That seems a lot like speculating about chickens and eggs: better management would have meant more attention to analysis as well as better risk mitigation.

One final example. Humanity nearly died on Sept. 26, 1983. An article in the Washington Post describes how the Soviet Union's early-warning system falsely reported that the United States had launched five Minuteman intercontinental ballistic missiles; the system failed to filter out the sun's reflection off high-altitude clouds, which its satellites mistook for missile launches. Fortunately an unsung hero by the name of Stanislav Petrov decided it was a false alarm. The minimum cost would probably have been the gross domestic product of both the US and the Soviet Union (at least that's the idea behind “mutually assured destruction”). Perhaps humanity would be extinct, the Earth ruled by giant mutant cockroaches. I wonder whether they'd do a better job of learning from history.


  1. COCOMO II found that analysts in the 90th percentile of ability reduce software development effort (they multiply it by 0.71, compared to a multiplier of 1.0 for analysts at the 55th percentile), while analysts in the 15th percentile increase effort by a multiplier of 1.42. Analyst skill was the single largest factor considered, not counting the size of the project itself. Looking at these multipliers from a different perspective: if you regard the more skilled analysts as the target to aim for (rather than the 55th percentile, with a multiplier of 1), and therefore normalize effort by dividing all of the effort multipliers by 0.71 (so that the 90th-percentile multiplier becomes 1), then using analysts at the 15th percentile doubles your development effort (1.42/0.71 = 2).
