
How not to present an evaluation report

What you say in an evaluation report is important. So is the way you present it.

The Australian Civil Aviation Authority tried to procure an air traffic control system in the early 1990s. Things went so badly wrong that the Federal Government commissioned a review of the procurement process. [The Honourable Ian Macphee, AO, Independent review of the Civil Aviation Authority’s tender evaluation process for The Australian Advanced Air Traffic System, Australian Government Publishing Service, 1992] Macphee’s report was published in 1992. That’s more than 20 years ago, but the report remains fascinating, because it goes into detail on matters that are normally confidential.
An evaluation report was submitted to the Board of the Civil Aviation Authority in December 1991. It recommended Hughes Aircraft Systems International as preferred contractor ahead of the second-ranked bidder, Thomson Radar Australia Corporation. The report was accompanied by a note from the Chief Executive, which supported the recommendation but asked the Board to note the Executive’s concerns about the risks of the Hughes proposal, in particular in relation to software development. The note didn’t point out that the evaluation team had taken those risks into account but still recommended Hughes.
That was all it took. The Board was completely spooked. The hare was off and running: the Hughes proposal was risky and the Thomson proposal was not. The Board rejected the recommendation and sent management off to have another go.
A management evaluation team undertook an inspection trip covering both Thomson and Hughes. The team did not include anyone with expertise in software development, but they asked more questions about software development than about other areas of risk.
They got very excited about information they had collected on the relative numbers of Source Lines of Code (SLOC) in each software development proposal. Thomson had a lot more SLOC than Hughes, which the team interpreted as meaning… the Hughes proposal was more risky!
When they wrote up the report of the trip, they emphasised what they saw as the negative aspects of the Hughes bid in that area and were less critical of Thomson, even where they had identified Thomson’s risk in other areas as greater than Hughes’s.
The evaluation team did get a software expert to look at the information when they got back to Canberra. The expert advised that the differences in the numbers could be because Hughes would be developing the software from scratch with modern methods while Thomson would be reusing existing software modules: customisation of existing modules uses more SLOC. In other words, the difference in SLOC didn’t necessarily say anything at all about the relative risk. The Chief Executive ignored the expert’s report.
Management decided – at a meeting where no minutes were taken – to change its recommendation to the Board… because the Hughes proposal was high risk.
At the Board meeting, appropriately enough on a Friday the 13th, the Board got three things: the original summary of the evaluation results, which still showed Hughes in front on an analysis that had included an assessment of software development risk; a table comparing Hughes and Thomson on the basis of the SLOC differential, with no accompanying expert report; and a paper that recommended Thomson on the basis that the management team had assessed the Hughes proposal as high risk. Unsurprisingly, the Board, none of whom knew much about software, decided in favour of Thomson.
Various things then went wrong, and before a contract could be signed the Macphee Review was established. As a result of its recommendations, the competition was re-run with just Hughes and Thomson. The authority made a mess of that as well and ended up in court, but that’s another story.
Two lessons:
Be very careful what you say to an approving body. Try not to set any hares running.
And if fresh evidence turns up, don’t consider the new information in isolation. Re-visit the evaluation to make sure the new information is reviewed by the experts in context and given its proper weight, and re-write the evaluation report to demonstrate you’ve done that.