
Project Risk Management – how reliable is old data?

One of the key underpinnings of risk management is reliable data on which to base probabilistic estimates of what may happen in the future.  The importance of understanding the reliability of the data being used is emphasised in PMBOK® Guide 11.3.2.3 Risk Data Quality Assessment and in virtually every other risk standard.

One of the tenets underpinning risk management in all of its forms, from gambling to insurance, is the assumption that reliable data about the past is a good indicator of what will happen in the future. There is no certainty in this process, but there is a degree of probability that future outcomes will be similar to past outcomes if the circumstances are similar. ‘Punters’ know this from their ‘form guides’, insurance companies rely on it to calculate premiums, and almost every prediction of a future outcome relies on an analogous interpretation of similar past events. Project estimating and risk management are no different.

Every time or cost estimate is based on an understanding of past events of a similar nature; in fact the element that differentiates an estimate from a guess is having a basis for the estimate! See:
–  Duration Estimating
–  Cost Estimating

The skill in estimating both normal activities and risk events is understanding the available data and being able to adapt the historical information to the current circumstances. This adaptation requires understanding the differences between the old work and the current work, as well as the reliability and stability of the information being used. Range estimates (three-point estimates) can be used to frame this information and allow a probabilistic assessment of the event; alternatively, a simple ‘allowance’ can be made. For example, in my home state we ‘know’ three weeks a year are lost to inclement weather if the work is exposed to the elements.  Similarly, office-based projects in the city ‘know’ they can largely ignore the risk of power outages – they are extremely rare occurrences. But how reliable is this ‘knowledge’, gained over decades and based on weather records dating back 180 years?
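
To make the range-estimate idea concrete, here is a minimal sketch of how a three-point estimate can be framed probabilistically. The duration figures are hypothetical and chosen purely for illustration; the PERT weighting and the triangular Monte Carlo sampling shown are simply two common ways of expressing the same range, not a method prescribed by the PMBOK® Guide.

```python
# Minimal sketch: turning a three-point (range) estimate into an expected
# value and a confidence-level figure. All numbers are hypothetical.
import random
import statistics

optimistic, most_likely, pessimistic = 40.0, 50.0, 70.0   # working days

# PERT (beta) expected value - the common weighted-average formulation.
pert_expected = (optimistic + 4 * most_likely + pessimistic) / 6

# Simple Monte Carlo over a triangular distribution to express the same
# range probabilistically and read off a contingency level (e.g. P80).
samples = [random.triangular(optimistic, pessimistic, most_likely)
           for _ in range(10_000)]
p80 = statistics.quantiles(samples, n=100)[79]   # 80th percentile cut point

print(f"PERT expected duration:     {pert_expected:.1f} days")
print(f"P80 duration (Monte Carlo): {p80:.1f} days")
print(f"Contingency at P80:         {p80 - pert_expected:.1f} days")
```

Read this way, a contingency is simply the gap between the expected value and whatever confidence level (P80, P90, …) the stakeholders agree to fund – which is exactly where the reliability of the underlying data matters.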

Last year was the hottest year on record (by a significant margin), as was 2014 before it. Increasing global temperatures increase the number of extreme weather events of all types, and exceptionally hot days place major strains on electrical distribution grids, increasing the likelihood of blackouts.  What we don’t know, because there is no reliable data, is the consequences.  The risks of people not being able to get to work, of blackouts and of inclement weather events are now different – but we don’t know how different.

Dealing with this uncertainty requires a different approach to risk management and a careful assessment of your stakeholders. Ideally, some additional contingency will be added to projects and additional mitigation action taken, such as backing up during the day as well as at night – electrical storms tend to be a late afternoon / evening event. But these cost time and money…

Getting stakeholder buy-in is more difficult:

  • A small but significant number of people (including some in senior roles) flatly refuse to accept there is a problem. Despite the science, they believe, based on ‘personal observations’, that the climate is not changing…
  • A much larger number will not sanction any action that costs money without a cast-iron assessment based on valid data. But there is no valid data: the consequences can be predicted from modelling, but there are no ‘facts’ based on historical events…
  • Most of the rest will agree some action is needed but require an expert assessment of the likely effect and the value proposition for creating contingencies and implementing mitigation activities.

If it ain’t broke, don’t fix it???? 

The challenge facing everyone in management is deciding what to do:

  • Do nothing and respond heroically if needed?
  • Think through the risks and potential responses to be prepared (but wait to see what actually occurs)??
  • Take proactive action and incur the costs, but never being sure if they are needed???

There is no ‘right answer’ to this conundrum; we certainly cannot provide a recommendation because we ‘don’t know’ either.  But at least we know we don’t know!

I would suggest that discussing what you don’t know about the consequences of climate change for your organisation is a serious conversation that needs to be started within your team and your wider stakeholder community.

Doing nothing may feel like a good option – ‘wait and see’ (i.e. procrastination) can be very attractive to a whole range of innate biases. But can you afford to do nothing?  Hoping for the best is not a viable strategy, even if the inertia in your stakeholder community is intense. This challenge is a real opportunity to display leadership, communication and negotiation skills to facilitate a useful conversation.

5 responses to “Project Risk Management – how reliable is old data?”

  1. I find the emphasis on hard data doesn’t fit my experience. Clearly, the past is the foundation of our expectations about the future but, unless one enters into the sort of data gathering and normalising process that IPA Global use for their parametric models, it’s the condensed experience of senior people that lies at the heart of most risk assessment and I expect it will stay that way, especially as the pace of change and complexity of projects accelerates.

  2. Agree with your proposition, Steve, but how reliable is ‘experience’ if the circumstances are now different? I know from ‘experience’ that you need to allow 2 weeks per annum for inclement weather in Brisbane and 3 weeks in Melbourne (construction projects), but is that knowledge still valid?

    • On the one hand, I don’t see what alternative we have to using experience and judgement. There are too many diverse information requirements to be able to satisfy them all from a database with clean checked data.

      On the other hand, or in addition, one of the reasons there is no master book of risk information is that each job is different to the last. The work of IPA and people like John Hollman manages to find consistent themes that can be used to exploit historical data but only within well defined sectors in which change is gradual, such as materials processing. At the other end of the spectrum, consider the task of assessing uncertainty in a customer facing IT systems development for a bank. Anything more than a couple of years old is ancient history.

      The fact that we can make progress even in the face of such challenges testifies to the capacity of human beings to assimilate and interpret information that is related to but not the same as the work we are about to carry out.

      In my experience, we usually have several sources for an estimate (standard rates, benchmarks, actuals from similar jobs …) and test the uncertainty in the estimate by questioning the validity of the data and the differences between the work on which it is based and the work we are doing. That all takes place in the minds of a team and in the interactions between them as they discuss the matter.

      I might have misinterpreted the initial post but any suggestion that we should try to build and rely on large amounts of historical data seems to be at odds with the dominant trend of the moment, which is towards agility and embracing rapid change in a complex environment. The natural approach for complex environments is not more intense analysis but incremental development with fast moving responsive decision making.

