Tag Archives: Risk Management

Baked In Optimism – Why so many projects fail

This webinar, presented as part of the free PGCS 2023 Webinar Series, looked at two processes that are ‘baked into’ standard project management estimating and control, and showed how the recommended good practices are still optimistically biased.

  • When preparing an estimate, good practice recommends using Monte Carlo analysis to determine an appropriate contingency and the level of risk to accept. However, the typical range distributions used are biased – they ignore the ‘long tail’ (see the sketch after this list).
  • When reporting progress, the estimating bias should be identified and rectified to offer a realistic projection of the project outcome. Standard cost and schedule processes typically fail to deal adequately with this challenge, meaning the final time and cost overruns are not predicted until late in the project.
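
To make the first point concrete, here is a minimal sketch in Python (the cost figures, ranges, and distributions are all invented for illustration) comparing the contingency implied by a typical triangular range with a long-tailed lognormal alternative that has the same most-likely cost – the two agree near the middle of the range but diverge badly at the P80/P90 levels where contingency is usually set:

```python
# Same most-likely cost (100), with and without a 'long tail'
# (all figures invented for illustration).
import math
import random

random.seed(1)
N = 20_000

# A typical triangular range: -10% / +20% around the most-likely cost.
tri = sorted(random.triangular(90, 120, 100) for _ in range(N))

# Long-tailed alternative: a lognormal scaled so its mode is also ~100.
sigma = 0.35
mu = math.log(100) + sigma ** 2   # mode of a lognormal = exp(mu - sigma^2)
logn = sorted(random.lognormvariate(mu, sigma) for _ in range(N))

for p in (0.50, 0.80, 0.90):
    i = int(p * N) - 1
    print(f"P{int(p * 100)}: triangular {tri[i]:6.1f}   lognormal {logn[i]:6.1f}")
```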

This webinar highlighted at least some of the causes of these problems. Solving the cultural and management issues is for another time. Download the PDF of the slides, or view the webinar at: https://mosaicprojects.com.au/PMKI-PBK-046.php#Process2

Risk mitigation requires courage – How Cockcroft’s Folly saved hundreds of lives!

One of the speakers at PGCS 2023 is Alex Walsh, whose presentation Managing wicked program delivery looks at the UK nuclear program to decommission the Sellafield complex, one of the most complex high-hazard nuclear facilities in the world, which operated from the 1940s through to 2022. For more on this presentation and the PGCS program see: https://www.pgcsymposium.org.au/.

As part of my work on preparing the PGCS program, I had a virtual look at this project and came across this fascinating risk mitigation story where the courage of two managers probably saved hundreds of lives in the North of England.

The site

Sellafield, formerly known as Windscale, is a large multi-function nuclear site close to Seascale on the coast of Cumbria, in NW England. As of August 2022, primary activities are nuclear waste processing and storage and nuclear decommissioning. Former activities included plutonium production for nuclear weapons, nuclear power generation from 1956 to 2003, and nuclear fuel reprocessing from 1952 to 2022.

After the war ended, the Special Relationship between Britain and the United States “became very much less special”. The British government saw this as a resurgence of United States isolationism which raised the possibility that Britain might have to fight an aggressor alone. It also feared that Britain might lose its great power status, and therefore its influence in world affairs, so in July 1946, the Chiefs of Staff Committee recommended that Britain acquire nuclear weapons.

Two reactors (called ‘piles’ at the time) were constructed to irradiate uranium, producing plutonium and other isotopes. The designers of these reactors wanted a passively safe cooling system. In place of water, they used air cooling driven by convection through a 400-foot (120 m) tall chimney, which could create enough airflow to cool the reactor under normal operating conditions. The chimney was arranged so it pulled air through the channels in the reactor core, and huge fans were positioned in front of the core to greatly increase the airflow rate.

The risk

During construction, physicist Terence Price considered the possibility of a fuel cartridge splitting open, causing the hot uranium to catch fire, resulting in fine uranium oxide dust being blown up the chimney and escaping into the environment.

Raising the issue at a meeting, he suggested filters be added to the chimneys, but his concerns were dismissed as too difficult and too expensive to deal with. However, Sir John Cockcroft, leading the project team, was sufficiently alarmed to order the filters.

The filters could not be installed at the base because construction of the chimneys had already begun, so they were built on the ground and then winched into position at the top once the chimneys were complete. They became known as Cockcroft’s Folly because many regarded the delay they caused, and their great expense, as a needless waste.

This all changed after the Windscale fire of 10th October 1957. This fire was the worst nuclear accident in the United Kingdom’s history, and one of the worst in the world. The fire was in Unit 1 of the two-pile Windscale site and burned for three days, releasing radioactive fallout that spread across the UK and the rest of Europe[1].

But the filters trapped about 95% of the radioactive dust and arguably saved much of northern England from becoming a nuclear wasteland. With typical British understatement, Terence Price said “the word folly did not seem appropriate after the accident”.

The UK government under Harold Macmillan ordered the original reports into the fire to be heavily censored and information about the incident to be kept largely secret. It later came to light that small but significant amounts of the highly dangerous radioactive isotope polonium-210 were released during the fire, but the presence of the chimney scrubbers at Windscale was credited with minimising the radioactive content of the smoke.

Both ‘piles’ were shut down after the fire, but a large quantity of radioactive material remains inside the sealed #1 pile; this is one of the challenges for the decommissioning program Alex will be speaking about at PGCS in a couple of weeks’ time.

More relevant to this post though is the moral courage exhibited by Sir John Cockcroft in doing the right thing rather than the easy thing to guard against an accident that ‘could not happen’, but did! Thinking through this dilemma puts a whole new perspective on risk assessment and mitigation – in the right circumstances ‘black swans’ can kill.

For more on risk management see: https://mosaicprojects.com.au/PMKI-PBK-045.php


[1] For more on the fire see: https://en.wikipedia.org/wiki/Windscale_fire

Risk Management Update

Mosaic’s risk management pages have been reorganized and updated. All of the papers are available for download and use free of charge. There are also free samples of a couple of useful spreadsheets for assessing risks and planning the management of important risks.

The risk section of our website is now in two parts:

Risk Management covers the processes involved in the identification and management of risk within a project or program to achieve and maintain a risk profile acceptable to the key stakeholders: https://mosaicprojects.com.au/PMKI-PBK-045.php 

Risk Assessment covers the techniques and tools used to calculate and assess the risk exposure of a project or program: https://mosaicprojects.com.au/PMKI-PBK-046.php

The Planning Paradox – How much detail is too much?

Traditional views tend to favour a management approach built on the assumption that more detail is better – and, to a point, this is undoubtedly correct. Insufficient detail in a plan of any type is a sure way to fail; ‘just do it’ at the overall project level does not help. But finessing project plans to present useful information at the right level of detail is not easy – decisions have to be made!

Balancing the factors shown in the diagram helps make the right decision. The risk environment is influenced by the size and significance of the identified risks, and by the overall degree of uncertainty associated with the work; as either (or both) of these factors increases, the project controls need to be more rigorous.

The two factors that influence the degree of rigour in the controls system are the amount of detail included (granularity) and the frequency of monitoring, reviewing, and updating the plans. But, as suggested above, too much detail will increase costs and reduce efficiency and effectiveness.

There’s no right answer to this paradox, but our latest article The Planning Paradox – How much detail is too much? offers some useful guidelines to consider (download the article).

For more on Schedule Strategy, Planning, & Design, see: https://mosaicprojects.com.au/PMKI-SCH-011.php  

CPM Anomalies Invalidate Monte Carlo

A couple of weeks ago I posted on some of the anomalies in CPM logic that will cause unexpected results: CPM Scheduling – the logical way to error #1. A comment on the post by Santosh Bhat started me thinking about the effect of these logical constructs on risk analysis.

The various arrangements of activities and links shown in CPM Scheduling – the logical way to error #1 (with the addition of a few more non-controlling links) follow all of the scheduling rules tested by DCMA and other assessments. The problem is that when you change the duration of a critical activity, there is either no effect, or the reverse effect, on the overall schedule duration.

In this example, the change in the overall project duration is the exact opposite of the change in the duration of Activity B (read the previous post for a more detailed explanation). For this discussion, it is sufficient to know that an increase of 2 weeks in the duration of ‘B’ results in a reduction in the overall project duration of 2 weeks (and vice-versa).
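
To see how this reversal can occur, here is a bare-bones forward pass over a simplified, hypothetical fragment (not the exact network from the earlier post): B’s finish is driven by a finish-to-finish link from A, and B’s start drives C through a start-to-start link, so lengthening B pulls its start – and therefore C – earlier:

```python
# A simplified forward pass over a hypothetical three-activity fragment.
# A --FF--> B : B's finish is fixed to A's finish (zero lag)
# B --SS--> C : C starts when B starts (zero lag)
def project_duration(dur_b):
    dur_a, dur_c = 10, 10
    a_start = 0
    a_finish = a_start + dur_a
    b_finish = a_finish           # FF link: B must finish with A
    b_start = b_finish - dur_b    # so a LONGER B starts EARLIER
    c_start = b_start             # SS link: C starts with B
    c_finish = c_start + dur_c
    return max(a_finish, b_finish, c_finish)

for d in (3, 5, 7):
    print(f"duration of B = {d} -> project duration = {project_duration(d)}")
# Output: 17, 15, 13 - every week added to B removes a week from the
# project (and vice-versa), yet simple logic checks see nothing wrong.
```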

The effect of these anomalies on the validity of a Monte Carlo analysis is significant. The essence of Monte Carlo is to analyze a schedule hundreds of times using different activity durations selected from a pre-determined range that represents the uncertainty associated with each of the identified risks in the schedule. If the risk event occurs, or is more serious, the affected activity duration is increased appropriately (see more on Monte Carlo).

In addition to calculating the probability of completing by any particular date, most Monte Carlo tools also generate tornado charts showing the comparative significance of each risk included in the analysis and its effect on the overall calculation.  For example, listing the risks that have the strongest correlation between the event occurring and the project being delayed.  

Tornado charts help the project’s management to focus on mitigating the most significant risks.
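
As a rough sketch of how such a ranking is typically derived (the three-activity chain and the two risk events below are invented for illustration), each iteration records which risks occurred and the resulting project duration, then correlates the two:

```python
# A rough sketch of a tornado-style risk ranking; real tools do far more.
import random

random.seed(7)
risks = {"R1: late approvals": (0.3, 4),   # (probability, delay in weeks)
         "R2: rework":         (0.2, 6)}

durations, fired = [], {r: [] for r in risks}
for _ in range(5_000):
    total = sum(random.triangular(8, 14, 10) for _ in range(3))  # A, B, C in series
    for r, (p, delay) in risks.items():
        hit = random.random() < p
        fired[r].append(1 if hit else 0)
        if hit:
            total += delay
    durations.append(total)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

for r in risks:  # the stronger the correlation, the higher on the tornado chart
    print(f"{r}: correlation with project finish = {pearson(fired[r], durations):.2f}")
```

An anomalous activity like the one sketched earlier weakens (or reverses) exactly this correlation, pushing a genuinely significant risk down the chart.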

When a risk is associated with an activity that causes one of the anomalies outlined in CPM Scheduling – the logical way to error #1, the consequence is a reduction in the accuracy of the overall probability assessment and, more importantly, a reduction in the significance of that risk in the tornado charts. The outcome of the anomalous modelling is to undermine the fundamental basis of Monte Carlo. More examples of similar logical inconsistencies that will devalue a Monte Carlo analysis are included in Section 3.5 of Easy CPM.

Easy CPM is designed for schedulers who know how to operate the tools efficiently and are looking to lift their skills to the next level. The book is available for preview, purchase (price $35), and immediate download, from: https://mosaicprojects.com.au/shop-easy-cpm.php

Murphy’s Law is not an excuse, it is a call to action!

To apply Murphy’s Law proactively, you need to think through everything before you start work and ask yourself: if this part fails, does the system still work? This article looks at the historical origins of Murphy’s Law and how to use the concept to avoid problems.
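
As a toy illustration of that discipline (the two-pump system modelled below is invented), you can walk through every component, fail it, and check whether the system survives:

```python
# A toy single-point-of-failure walk-through: fail each part in turn and
# ask "does the system still work?" (invented two-pump example).
system = {
    "pump A":     "pump B",   # redundant with pump B
    "pump B":     "pump A",   # redundant with pump A
    "controller": None,       # no backup
}

for part, backup in system.items():
    if backup:
        print(f"if {part} fails: system survives (backup: {backup})")
    else:
        print(f"if {part} fails: SINGLE POINT OF FAILURE - fix before starting")
```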

Download the article: https://mosaicprojects.com.au/Mag_Articles/AA014_Murphys_Law.pdf

For more on risk management see: https://mosaicprojects.com.au/PMKI-PBK-045.php#General

Radical Uncertainty – Project controls for an unknowable future

A new book suggests that a paradigm shift in the way project controls are used is needed on major projects, combining the discipline required for major engineering works with the flexibility to deal with an uncertain future – getting the balance right could be very profitable. This article outlines the challenges and shortcomings of existing control processes: https://mosaicprojects.com.au/Mag_Articles/AA009_Radical_Uncertainty.pdf

For more papers on risk and uncertainty see: https://mosaicprojects.com.au/PMKI-SCH-045.php

For more papers on complexity see: https://mosaicprojects.com.au/PMKI-ORG-040.php

Probability -v- luck. Should we give up our day-job?

Based on a successful day at the races, 5 winners and one place from 8 bets, this article looks at the balance between luck and process in achieving the result.  Our conclusion is that you should not confuse luck with skill. Good processes will help build success, persistence will generate more opportunities for you to be lucky, and skill or capability will shift the odds in your favour, but randomness rules!
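
As a rough check on just how lucky that day was (assuming, purely for illustration, that skill gives a 20% chance of picking the winner of any race), the binomial arithmetic looks like this:

```python
# Chance of 5 or more winners from 8 bets, assuming an (invented) 20%
# win probability per race.
from math import comb

p, n = 0.20, 8
prob = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(5, n + 1))
print(f"P(5+ winners) = {prob:.2%}")   # about 1% - a genuinely lucky day
```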

To quote Coleman Cox: “I am a great believer in Luck. The harder I work, the more of it I seem to have.”

Click to download the PDF.

For more papers on risk and probability see: https://mosaicprojects.com.au/PMKI-SCH-045.php#Process1

New paper on Technical Debt

Technical debt is more than a technical issue – its effect on major projects can be catastrophic, as demonstrated by the major blowouts in cost and time on the £17.4 billion London Crossrail project! Two papers looking at this insidious problem are now available to download:

A brief article at: https://mosaicprojects.com.au/Mag_Articles/P044-Technical_Debt.pdf

My presentation from ProjectChat 2019 at: https://mosaicprojects.com.au/PDF_Papers/P204-Technical_Debt.pdf

The reference case for management reserves

Risk management and Earned Value practitioners, and a range of standards, advocate the inclusion of contingencies in the project baseline to compensate for defined risk events. The contingency may (should) include an appropriate allowance for variability in the estimates, modelled using Monte Carlo or similar; these are the ‘known unknowns’. They also advocate creating a management reserve that is held outside of the project baseline, but within the overall budget, to protect the performing organisation from the effects of ‘unknown unknowns’. Following these guidelines, the components of a typical project budget are shown below.

PMBOK® Guide Figure 7-8
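
As a minimal sketch of that build-up (the figures and percentages below are invented; in practice the contingency should come from the risk assessment, not a flat markup), the arithmetic looks like this:

```python
# The budget components described above, assembled bottom-up
# (all figures are invented for illustration).
work_package_estimates = [4_200_000, 3_100_000, 2_700_000]

point_estimate = sum(work_package_estimates)
contingency = point_estimate * 0.08           # allowance for 'known unknowns'
cost_baseline = point_estimate + contingency  # managed by the project team

management_reserve = cost_baseline * 0.10     # held for 'unknown unknowns'
project_budget = cost_baseline + management_reserve  # held by the organisation

print(f"cost baseline  : {cost_baseline:>12,.0f}")
print(f"project budget : {project_budget:>12,.0f}")
```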

The calculation of contingency reserves should be incorporated into an effective estimating process to determine an appropriate cost estimate for the project[1]. The application of appropriate tools and techniques, supported by skilled judgement, can arrive at a predictable cost estimate, which in turn becomes the cost baseline once the project is approved. The included contingencies are held within the project and are accessed by the project management team through normal risk management processes. In summary, good cost estimating[2] is a well understood (if not always well executed) practice that combines art and science, and includes the calculation of appropriate contingencies. Setting an appropriate management reserve is an altogether different problem.

Setting a realistic management reserve

Management reserves are an amount of money held outside of the project baseline to protect the performing organisation against unexpected cost overruns. The reserves should be designed to compensate for two primary factors: the first is genuine ‘black swans’; the other is estimating errors (including underestimating the levels of contingency needed).

A ‘black swan’ is a significant unpredicted and unpredictable event[3]. In his book of the same name, N.N. Taleb defines ‘black swans’ as having three distinct characteristics: they are unexpected and unpredictable outliers, they have extreme impacts, and they appear obvious after they have happened. The primary defence against ‘black swans’ is organisational resilience rather than budget allowances, but there is nothing wrong with including an allowance for these impacts.

Estimating errors leading to a low-cost baseline, on the other hand, are both normal and predictable; there are several drivers for this phenomenon, most innate to the human condition. The factors leading to the routine underestimating of costs and delivery times, and the overestimating of benefits to be realised, can be explained in terms of optimism bias and strategic misrepresentation. The resulting inaccurate estimates of project costs, benefits, and other impacts are a major source of uncertainty in project management – the occurrence is predictable and normal; the degree of error is the unknown variable leading to risk.

The way to manage this component of the management reserves is through the application of reference class forecasting, which enhances the accuracy of budget estimates by basing forecasts on the actual performance of a reference class of comparable projects. This approach bypasses both optimism bias and strategic misrepresentation.

Reference class forecasting is based on theories of decision-making under uncertainty and promises more accuracy in forecasts by taking an ‘outside view’ of the project being estimated. Conventional estimating takes an ‘inside view’ – the project team assesses the elements that make up the project and determines a cost. This ‘inside’ process is essential but, on its own, insufficient to achieve a realistic budget. The ‘outside’ view adds to the base estimate, based on knowledge of the actual performance of a reference class of comparable projects, and resolves to a percentage markup to be added to the estimated price to arrive at a realistic budget. This addition should be used to assess the value of the project (with a corresponding discounting of benefits) during the selection/investment decision-making processes[4], and logically should be held in management reserves.

Overcoming bias by simply hoping for an improvement in estimating practice is not an effective strategy! Prof. Bent Flyvbjerg’s 2006 paper ‘From Nobel Prize to Project Management: Getting Risks Right’[5] looked at 70 years of data. He found: “Forecasts of cost, demand, and other impacts of planned projects have remained constantly and remarkably inaccurate for decades. No improvement in forecasting accuracy seems to have taken place, despite all claims of improved forecasting models, better data, etc. For transportation infrastructure projects, inaccuracy in cost forecasts in constant prices is on average 44.7% for rail, 33.8% for bridges and tunnels, and 20.4% for roads.”

The consistency of the error, and the bias towards significant underestimating of costs (and a corresponding overestimating of benefits), suggest the root causes of the inaccuracies are psychological and political rather than technical – technical errors should average towards zero (pluses balancing out minuses) and should improve over time as the industry becomes more capable, whereas there is no imperative for psychological or political factors to change:

  • Psychological explanations can account for inaccuracy in terms of optimism bias; that is, a cognitive predisposition found with most people to judge future events in a more positive light than is warranted by actual experience[6].
  • Political factors can explain inaccuracy in terms of strategic misrepresentation. When forecasting the outcomes of projects, managers deliberately and strategically overestimate benefits and underestimate costs in order to increase the likelihood that their project will gain approval and funding either ahead of competitors in a portfolio assessment process or by avoiding being perceived as ‘too expensive’ in a public forum – this tendency particularly affects mega-projects such as bids for hosting Olympic Games.

Optimism Bias

Reference class forecasting was originally developed to compensate for the type of cognitive bias that Kahneman and Tversky found in their work on decision-making under uncertainty, which won Kahneman the 2002 Nobel Prize in economics[7]. They demonstrated that:

  • Errors of judgment are often systematic and predictable rather than random.
  • Many errors of judgment are shared by experts and laypeople alike.
  • The errors remain compelling even when one is fully aware of their nature.

Because awareness of a perceptual or cognitive bias does not by itself produce a more accurate perception of reality, any corrective process needs to allow for this.

Strategic Misrepresentation

When strategic misrepresentation is the main cause of inaccuracy, differences between estimated and actual costs and benefits are created by political and organisational pressures, typically to have a business case approved, a project accepted, or to get on top of issues in the 24-hour news cycle. The Grattan Institute (Australia) has reported that in the last 15 years Australian governments spent $28 billion more than taxpayers had been led to expect on transport infrastructure projects. A key ‘political driver’ for these cost overruns was announcing the project (to feed the 24-hour news cycle) before the project team had properly assessed its costs. While ‘only’ 32% of the projects were announced early, these projects accounted for 74% of the value of the cost overruns.

Reference class forecasting will still improve accuracy in these circumstances, but the managers and estimators may not be interested in this outcome because the inaccuracy is deliberate. Biased forecasts serve their strategic purpose, which overrides their commitment to accuracy and truth; consequently, the application of reference class forecasting needs strong support from the organisation’s overall governance functions.

Applying Reference Class Forecasting

Reference class forecasting does not try to forecast specific uncertain events that will affect a particular project, but instead places the project in a statistical distribution of outcomes from the class of reference projects.  For any particular project it requires the following three steps:

  1. Identification of a relevant reference class of past, similar projects. The reference class must be broad enough to be statistically meaningful, but narrow enough to be truly comparable with the specific project – good data is essential.
  2. Establishing a probability distribution for the selected reference class. This requires access to credible, empirical data for a sufficient number of projects within the reference class to make statistically meaningful conclusions.
  3. Comparing the specific project with the reference class distribution, in order to establish the most likely outcome for the specific project – a minimal sketch of this step follows the list.
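
Here is a minimal sketch of step 3, assuming the reference class resolves to a distribution of percentage cost overruns (the figures below are invented):

```python
# Empirical uplift from an (invented) reference class of percentage
# cost overruns on comparable past projects.
reference_overruns = [0.05, 0.12, 0.18, 0.25, 0.33, 0.41, 0.47, 0.60, 0.85, 1.20]

def uplift(acceptable_chance_of_overrun):
    """Markup needed so the chance of exceeding the budget is no more
    than the stated level (simple empirical percentile)."""
    ranked = sorted(reference_overruns)
    k = round((1 - acceptable_chance_of_overrun) * (len(ranked) - 1))
    return ranked[k]

base_estimate = 100_000_000   # the project team's 'inside view' estimate
markup = uplift(0.20)         # accept a 20% chance of overrun (a P80 budget)
print(f"P80 uplift {markup:.0%} -> budget {base_estimate * (1 + markup):,.0f}")
```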

The UK Treasury was an early user of reference class forecasting and continues the practice. A 2002 study by Mott MacDonald for the Treasury found that, over the previous 20 years of government projects, works duration was underestimated by 17% on average, CAPEX by 47%, and OPEX by 41%. There was also a small shortfall in the benefits realised.

This study fed into the update of the Treasury’s ‘Green Book’ in 2003, which is still the standard reference in this area. The Treasury’s Supplementary Green Book Guidance: Optimism Bias[8] provides the recommended range of markups, with a requirement for the ‘upper bound’ to be used in the first instance by project or program assessors.

These are very large markups to shift from an estimate to a likely cost, and they relate to the UK government’s estimates (i.e., the client’s view), not the final contractors’ estimates – errors of this size would bankrupt most contractors. However, Gartner and most other authorities routinely report that projects and programs overrun their cost and time estimates (particularly internal projects and programs), and the reported ‘failure rates’ and overruns have remained relatively stable over extended periods.

Conclusion

Organisations can choose to treat each of their project failures as a ‘unique one-off’ occurrence (another manifestation of optimism bias), or learn from the past and develop their own framework for reference class forecasting. The markups don’t need to be included in the cost baseline (the project’s estimates are its estimates, and the team should attempt to deliver as promised), but they should be included in the assessment process for approving projects, and the management reserves should be held outside of the baseline to protect the organisation from the effects of both optimism bias and strategic misrepresentation. As systems, and particularly business cases, improve, the reference class adjustments should reduce, but they are never likely to reduce to zero; optimism is an innate characteristic of most people, and political pressures are a normal part of business.

If this post has sparked your interest, I recommend exploring the UK information to develop a process that works in your organisation: http://www.gov.uk/government/publications/the-green-book-appraisal-and-evaluation-in-central-governent

______________________

[1] For more on risk assessment see: http://www.mosaicprojects.com.au/WhitePapers/WP1015_Risk_Assessment.pdf

[2] For more on cost estimating see: http://www.mosaicprojects.com.au/WhitePapers/WP1051_Cost_Estimating.pdf

[3] For more on ‘black swans’ see: /2011/02/11/black-swan-risks/

[4] For more on portfolio management see: http://www.mosaicprojects.com.au/WhitePapers/WP1017_Portfolios.pdf

[5] Project Management Journal, August 2006.

[6] For more on the effects of bias see: http://www.mosaicprojects.com.au/WhitePapers/WP1069_Bias.pdf

[7] Kahneman, D. (1994). New challenges to the rationality assumption. Journal of Institutional and Theoretical Economics, 150, 18–36.

[8] Green Book documents can be downloaded from: http://www.gov.uk/government/publications/the-green-book-appraisal-and-evaluation-in-central-governent