Lessons Learned - 'Have they never heard of Port Arthur?'* by Stephen Jenner, MSt, MBA, BA, FAPM, FCMA, CGMA
According to Sir Peter Gershon, projects “do not fail for novel reasons, they fail for the same boringly repetitive reasons”[i]. The importance of post-implementation review therefore cannot be overestimated – as a basis for organizational learning and continuous improvement in portfolio, programme and project management. Indeed, number one in the top ten differentiating practices between higher- and lower-performing organizations in one study[ii] was, “Transferral of lessons learned”.
Unfortunately, it appears that these reviews are often far from effective. In one survey, 80% of respondents reported that the review and evaluation of completed projects was inadequate[iii]. Why might this be? Well, admitting failure can be perceived as career limiting – in many organisations, confidence is confused with competence. Better then to exude an air of expertise, with little commitment to regular learning, and blame any undesired outcome on bad luck before moving on to the next project as quickly as possible. Beyond this, part of the explanation lies in a series of cognitive biases that affect our ability to learn from experience, the most powerful of which is simple over-confidence. The result is what is termed the planning fallacy, or the tendency to believe that our project will proceed as planned, even when knowing that the vast majority of other projects have run late, cost more, and failed to realise the forecast benefits. Other cognitive biases that limit our ability to learn from experience include:
- Self-serving bias – the belief that our successes are due to our efforts and abilities, whereas failures are due to external events or bad luck. As the saying (often attributed to John F. Kennedy) goes, "Victory has a thousand fathers; defeat is an orphan."
- Hindsight bias – the tendency to see past events as being more predictable than they actually were. So the outcome is seen as inevitable.
- Outcome bias – the belief that the outcome determines whether a decision was correct, rather than whether it was the right decision given the information available at the time.
- The Texas Sharpshooter fallacy – the tendency to judge a project's success (and the appropriateness of the original investment decision) by the actual outcome, rather than asking whether the benefits the initiative was designed to deliver were actually realized and whether the outcome represents value for money. In effect, we draw the target after performance has been measured and call that success.
- Confirmation bias – the tendency to seek evidence supportive of our beliefs, and to ignore evidence that contradicts them. This in turn reinforces the tendency to over-confidence referred to above, since we only recognise evidence that supports our pre-conceived position.
The failure to undertake effective post-implementation reviews and learn from experience is also an example of what Pfeffer and Sutton[iv] refer to as the 'knowing–doing' gap, i.e. the paradox that in many areas of management good practice is known but rarely applied. Closing this knowing–doing gap and countering the cognitive biases outlined above requires six key actions that lay the basis for effective learning.
1. Accept that learning doesn't happen by accident. Rather, it requires conscious and systematic effort. This means we need to build learning into our business-as-usual processes rather than tacking it on as part of some software-based knowledge management initiative. For example, Openreach, the infrastructure division of the British telecommunications company BT Group, has instigated a process whereby, as part of submitting a business case to the Board, the writer must obtain a template with a unique reference number. To get this template, they must first visit a database in which lessons learned are categorized to aid identification of the most relevant ones. The business case writer is also expected to name the lessons learned that have been taken into account in the business case submission.
2. Expect learning and monitor it. Even if the management adage 'what gets measured gets done' is only partly true, it is crucial that organisations measure learning, for example by tracking performance at portfolio level in terms of:
- Project delivery on time and within budget, and benefits realisation compared to forecast – what is our organisation’s track record in project delivery and is it improving?
- Improvements to the project and portfolio management processes – in what ways are they more efficient and effective than 12 months ago?
If you were about to have an operation you might legitimately ask, "What is this hospital's track record in performing this operation and how does it compare with other hospitals?" If the answer was "we don't know," you might be a little perturbed – and yet that is the position in many organisations, where tangible data on project delivery and benefits realization performance isn't collected. What is needed is an evidence base of what actually works. Not only would this inform future decisions by supporting more reliable forecasting, it would also contribute to improved performance: if everyone knows that performance will be measured and reported against transparent and unambiguous success criteria, there is a greater incentive today to focus on ensuring success.
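The portfolio-level track record described above can be made concrete with a small calculation. The sketch below is illustrative only – the field names, figures and thresholds are assumptions, not a prescribed data model – but it shows the kind of evidence base an organisation could maintain: the proportion of projects delivered on time and within budget, and aggregate benefits realised against forecast.

```python
from dataclasses import dataclass

@dataclass
class ProjectRecord:
    # Hypothetical fields; a real portfolio data model will differ.
    name: str
    planned_days: int
    actual_days: int
    budget: float
    actual_cost: float
    forecast_benefits: float
    realised_benefits: float

def portfolio_track_record(projects):
    """Summarise delivery and benefits performance across the portfolio."""
    n = len(projects)
    on_time = sum(p.actual_days <= p.planned_days for p in projects)
    on_budget = sum(p.actual_cost <= p.budget for p in projects)
    # Aggregate realised benefits as a fraction of forecast benefits.
    realisation = (sum(p.realised_benefits for p in projects)
                   / sum(p.forecast_benefits for p in projects))
    return {
        "on_time_rate": on_time / n,
        "on_budget_rate": on_budget / n,
        "benefits_realisation_ratio": realisation,
    }

# Illustrative data only.
projects = [
    ProjectRecord("CRM upgrade", 120, 150, 1.0e6, 1.3e6, 2.0e6, 1.4e6),
    ProjectRecord("Billing migration", 90, 85, 0.5e6, 0.45e6, 1.0e6, 1.1e6),
]
print(portfolio_track_record(projects))
```

Tracked period on period, figures like these answer the questions posed above: what is our track record, and is it improving?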
3. Actively seek dis-confirming evidence. To help overcome over-confidence and confirmation bias, deliberately seek out evidence that challenges our beliefs. This is aided by the use of a devil's advocate, or what the Nobel laureate Daniel Kahneman refers to as 'adversarial collaboration'[v]. To help overcome 'groupthink', distinguish our beliefs and assumptions from the facts and evidence-based practice.
4. Robust post-implementation review. This should include both 'summative' (backward-looking) and 'formative' (forward-looking) aspects:
- Summative – did the project deliver as intended?
- Formative – what can be learnt about the way we make our investment decisions, manage projects and realise benefits?
We also need to distinguish decision outcomes from the process used to make the decision – was success the result of sound judgement, or just 'dumb luck'[vi]?
5. Don't wait for the post-implementation review to learn. Whilst post-implementation reviews are important, we don't need to wait until a project is finished to start the learning process. For example:
- Apply modular approaches to project design and development and link progress to successful completion of regular stage or phase gate reviews. In this way a policy of ‘staged release of funding’ can be achieved based on ‘informed’ confidence in delivery.
- Use what Gary Klein[vii] calls ‘pre-mortems’, and engage stakeholders in imagining the project has failed and exploring solutions before things go wrong.
6. Ensure lessons learned are applied. It is crucially important that we go beyond capturing and recording the lessons that have been learned and move towards actually applying them. Pfeffer and Sutton note that at Hewlett Packard, those responsible for collecting lessons are also responsible for disseminating them. At BP, dissemination of learning was encouraged by a range of strategies including: peer assists – whereby staff were loaned temporarily to another business unit to help resolve an issue; peer groups – to help other groups facing similar problems; federal groups – established to review common issues that cut across the peer groups; and knowledge management coaches – who spent the bulk of their time working with people to find ways of working more efficiently and effectively.
Hoffman reports[viii] that NASA’s Knowledge Sharing Initiative gathers and shares knowledge and emphasizes informal first-person storytelling by NASA project managers. It promotes cross-centre collaboration across the agency via annual conferences; an online magazine; forums; publications; and a multimedia library of presentations providing managers, scientists and engineers with concrete examples of obstacles overcome, successes achieved and failures analysed. By employing the power of storytelling to gather, store and share insights, and telling anecdotes from experienced programme, project and engineering professionals, it continues to create and maintain communities of practice across NASA’s decentralized project-based organization.
Key themes from the above include:
- A focus on applying lessons learned widely via networks of people, rather than merely storing them, and establishing mechanisms that facilitate collaborative working by bringing people together.
- The use of multimedia solutions to share and disseminate lessons learned.
- The application of dialogue and conversational formats and storytelling to more effectively communicate key messages.
Addressing these six key actions helps lay the basis for effective and on-going organizational learning. The importance of this is emphasised by George Santayana who said, “Those who cannot remember the past are condemned to repeat it.”
* The title of the article refers to the response of Commander Mitsuo Fuchida after the surprise attack on Pearl Harbor – the Japanese had adopted a similar tactic in launching a pre-emptive strike on the Russian Pacific Fleet at Port Arthur prior to the Russo-Japanese War of 1904-05.
About the Author
Steve Jenner, MSt, MBA, BA (Hons), FAPM, FCMA, CGMA is co-author of, and Chief Examiner for, the APMG’s ‘Management of Portfolios’ and was previously Director of CJIT where the approach adopted to portfolio & benefits management won the 2007 Civil Service Financial Management Award. He is a regular speaker at international conferences, trainer and writer on the subjects of portfolio and benefits management - he is the author of several books in the field, and is a professionally qualified management accountant and a Fellow of the APM. Steve holds an MBA and Masters of Studies degree from Cambridge University.
Steve is also the author of, and Chief Examiner for, the APMG's 'Managing Benefits', published in September 2012. This article is based on extracts from the new Guide. There is a free-to-join 'Community of Interest' at http://www.linkedin.com/groups/Managing-Benefits-449350
[i] Project, Nov/Dec 2010, Issue 233, page 4.
[ii] Ward, J., de Hertog, S. & Viaene, S. (2007) Managing Benefits from IS/IT Investments: An Empirical Investigation into Current Practice. Proceedings of the 40th Hawaii International Conference on Systems Science, Hawaii, 2007.
[iii] Ward, J., Daniel, E. & Peppard, J. (2008) Building Better Business Cases for IT Investments, MIS Quarterly Executive, 7 (1).
[iv] Pfeffer, J. & Sutton, R.I. (2000) The Knowing-Doing Gap, Harvard Business School Press.
[v] Kahneman, D. (2011) See: http://skepticalblog.wordpress.com/2011/06/20/adversarial-collaboration/.
[vi] Russo, J.E. & Schoemaker, P.H. (1990) Decision Traps – How to Make the Right Decision First Time, Simon & Schuster.
[vii] Klein, G. (1998) Sources of Power, MIT Press.
[viii] From ICCPM (2011) Complex Project Management Global Perspectives and the Strategic Agenda to 2025 – The Task Force Report and Compendium of Working Papers. Available at: http://www.iccpm.com.