Tag Archives: uncertainty

Fine Tune your detectors

The quality of any decision you make is determined by the quality of the information and advice you receive. Good information does not necessarily mean a good decision, but bad information will almost certainly lead to a bad decision.

The decision making process and the types of decision a project manager, and almost anybody else, has to make are discussed in WP1053 Decision Making.  The closely aligned process of problem solving is discussed in WP1013.  Good information and advice are essential inputs to both of these processes.

The right information has the potential to reduce or remove the uncertainty at the centre of every decision. If you are lucky and the information or advice removes all of the uncertainty, then there is nothing left to decide! Usually even with good advice, there is still some uncertainty and you still have to make the decision.

In reality, we rarely if ever have enough information. The challenge is to get as much information as is sensible in the circumstances and then make a timely decision, accepting there will inevitably be gaps in your knowledge that may lead to suboptimal outcomes.

However, simply collecting vast quantities of information does not help (unless you are using data mining). Generally information has no value, unless it has the potential to change your decision! The critical thing in decision making is having the key elements of information available when needed, in a useful form, which improves your awareness of the situation and your ability to decide.

But no information or advice is perfect. Before making use of any information, the decision maker has to evaluate its reliability and accuracy, and look for any vested interests or bias on the part of the people developing the information or proposing the advice. Good decision makers usually have very finely tuned ‘bull s**t’ detectors.  And whilst this skill often seems to be innate, many of its elements can be learned.

Some of the elements to consider when weighing up information are:

  1. As a starting point, everyone is biased and most people have vested interests.
    The antidote to bias and vested interests are to consider what effect these influences may have. The more effort someone has committed to developing a set of information, the greater their vested stake in the work. See more on Biases.
  2. Beware of factoids!
    You will be pleased to know that you are one of the 1635 people who have read this post, and as a consequence are now aware of factoids. How do we know this? We don’t. I just made it up; but you can’t call me wrong, because you don’t know either. A factoid is something that looks like a very precise fact.  The antidote to factoids is source information. Good source information in the statement above would be ‘our web counter shows that you are visitor 1635 to this page’. Start worrying if the source is nebulous: ‘our webmaster advises’ or ‘based on a sophisticated time-related algorithm…’.
  3. Beware of false precision.
    Almost everything that affects project decisions is a guess, assessment or estimate (the terms are largely synonymous) about something that may occur in the future. But no one has precise information about the future! False precision damages credibility (see: Is what you heard what I meant?) and is generally less than useful.  The antidote to false precision is to ask for ranges and the basis of the range statement.
  4. Lies, damned lies and statistics 1.
    Some statistics result from the counting of real things. If you trust the people who do the counting, the math and the reporting, the data is as good as you are going to get. However, most statistics are estimates for a large population, derived from the extrapolation of the results from a small sample. Professional statisticians and pollsters attach a calculated margin of error to their work – this margin is important!  The antidote to false statistics is to ignore any that do not come with a statement of the margin for error and how this was derived.
  5. Lies, damned lies and statistics 2.
    Understand the basis for comparison – it is very easy to distort information. Project A will increase the profit on the sale of widgets by 50%, whereas project B will only increase the profit on our training business by 10%; if both projects involve a similar cost outlay, which one is best? You need to know the basis for comparison to answer the question: a 50% increase in profits from a base of $100,000 = $50,000, which is half the value of a 10% increase in profits from a base of $1 million.  The antidote to statistical distortion is to largely ignore percentage changes and statements such as ‘fastest growing’, ‘biggest increase’, etc.  It is always easier to be the ‘biggest’ if your starting point is the smallest.
  6. The ‘one-in-a-million’ problem
    As discussed in The role of ‘sentinels’, many ‘one-off’ problems are symptoms of a much deeper issue. Our entire working life is less than 20,000 days, so the chance of encountering a genuine ‘one-in-a-million’ event just once in your working life is about 2%. Other phrases that should trigger concern include ‘she’ll be right’, ‘no problems’, ‘it’s easy’, etc.  The antidote to these types of expression is simply to reverse the statement:
    – one-off / one-in-a-million = there’s probably a structural cause to be discovered;
    – she’ll be right = I have no idea how to fix it (and it’s definitely not OK);
    – no-problems = this is a major problem for me;
    – it’s easy = this will be very difficult (unless the ‘easy’ is followed by an explanation of how it is easy).
  7. The false prophet
    False prophecies are allegations and unsubstantiated statements made with the expectation that the ‘expertise’ of the person the statement is attributed to will cover the statement with absolute credibility. If the statement is improbable, it is improbable regardless of the alleged source.  The antidote to false prophets being quoted in the ‘third party’; eg, “Einstein said controlled nuclear fusion was easy”; is simply to seek authentication from the source. If the ‘prophet’ is present, ask them for more information.  Real experts know both the upside and the downside of any course of action they are proposing – they understand the uncertainty. Wannabe experts pretend there is no downside or uncertainty.
  8. Well known facts
    Remember, most ‘well known facts’ are in fact commonly held misconceptions (this statement is itself a factoid, but a useful one).  The antidote to ‘well known facts’ is to dig deeper and gather actual facts.
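The distortion described in item 5 is easy to verify with a few lines of arithmetic, using the figures from the example above:

```python
# Percentage changes mean nothing without the base they apply to.
base_a = 100_000      # Project A: profit base for widget sales
base_b = 1_000_000    # Project B: profit base for the training business

increase_a = base_a * 0.50   # a 50% increase sounds impressive...
increase_b = base_b * 0.10   # ...but 10% of a much larger base is worth more

print(increase_a)  # 50000.0
print(increase_b)  # 100000.0
```

Despite the headline percentages, Project A delivers only half the absolute profit increase of Project B.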

These are just a few of the ways bad advice and information can be introduced into a decision making process. Taking a few minutes to verify the quality of the advice you are being given, ditching the unsound advice and information, and then using what’s left to inform the decision will enhance the probability of making the best decision in the circumstances.  This is not easy to do (but good decisions are rarely ‘easy’); the consolation is that once you develop a reputation for having a good ‘bull s**t’ detector, most sensible people will stop trying to use it on you. Then all you need to do is make the right decision.

The illusion of control: dancing with chance

In a new book called ‘Dance with Chance: Making Luck Work for You’, authors Spyros Makridakis, Robin Hogarth and Anil Gaba suggest that people tend to assume they can control much more than they actually can and, as a consequence, underestimate the role of chance.

One of the key ways of dealing with risk is accepting that there are things that you simply can’t control, and one of those things is the future. Underestimating uncertainty has very serious implications for risk management, and project managers should pay special attention to what can be predicted and what we can’t predict. The authors pinpoint two kinds of risks: subways and coconuts. You can plan for the subways, but it is difficult to plan for the coconuts.

  • You can do research and be relatively sure that the subway will be predictable most of the time (but never all of the time!).
  • On the other hand, you know that coconuts fall from trees, but you can’t predict when they will fall or where they will land.

They argue that we have to accept that there are some things that we simply can’t predict and as a consequence, the idea that project managers can control risk is an illusion. “Just accepting that is a huge step. It doesn’t come easily for most people, but it is absolutely the first critical step” Gaba says.

Acceptance is the first part of what the authors call the ‘Triple A’ strategy of accept, assess and augment:

  • First you accept that there are things you can’t control.
  • Then you try to assess the uncertainty and finally
  • Augment your project plans to make sure you manage risk more effectively.

This means using models, independent opinions, internal and external advice and any other means to assess the unknown risks and to make your team nimble and open to change when the unexpected does happen.

In this context, the project schedule and cost plan are two models that can help in the assessment but they neither control the future nor eliminate risk. What these plans should do though is provide a good foundation for implementing a nimble response when the unexpected does happen.

For more on the book see: http://dancewithchance.com/index.html (and any men of a ‘certain age’ should read the blog ‘Testing, testing, testing… is it necessary?’)

To understand the role of schedules and other plans in a 21st century project see: Project Controls in the C21 – What works / What’s fiction

See also:

The Probability of Chance

I have just returned from a trip to Singapore, where I was facilitating a workshop to set up the initial risk register and risk management plan for a $1 billion package in a multi-billion-dollar oil development. The beginning of November is also the Spring Racing Carnival in my home state, featuring the Melbourne Cup – the race that literally stops the nation. The combination of these two events and many hours sitting in aeroplanes started me thinking about the difference between project risk and the more widely understood actuarial risks managed by insurance companies and the like.

I have already posted on some of the challenges faced by project risk managers dealing with a single occurrence, the project, using theories based on constrained probability distributions in large populations (see: A Long Tail); and written a number of papers on risk management, see: http://www.mosaicprojects.com.au/Resources_Papers.html#Risk. This post looks at the challenges from a different perspective, how people in project teams perceive and understand probability.

The Singapore workshop started with the consideration of range statements for two sets of parameters, the likely impact of a risk event and the probability of it occurring. The outcomes were quite straightforward:

  • >$20 million was seen as a very high impact risk through to <$500,000 for a very low impact risk.
  • >70% probability was seen as a very high probability through to <5% for a very low probability.

The valuation of a ‘very high impact’ was based on a percentage of the project’s anticipated profit. Interestingly, the project manager for the overall project (some $20 billion investment) thought the monetary values were on the high side but accepted the views of the engineering company I was working with.
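As a rough sketch, the workshop’s range statements can be captured in a pair of helper functions. Only the ‘very high’ and ‘very low’ thresholds come from the workshop; the function names and the catch-all ‘intermediate’ band are my own simplification (the workshop defined intermediate bands too, which aren’t listed in this post):

```python
def impact_rating(dollars: float) -> str:
    """Classify a risk impact using the workshop's outer thresholds."""
    if dollars > 20_000_000:
        return "very high"
    if dollars < 500_000:
        return "very low"
    return "intermediate"  # stand-in for the unlisted middle bands

def probability_rating(p: float) -> str:
    """Classify a probability (0.0 to 1.0) the same way."""
    if p > 0.70:
        return "very high"
    if p < 0.05:
        return "very low"
    return "intermediate"

print(impact_rating(25_000_000))   # very high
print(probability_rating(0.02))   # very low
```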

The focus of this post is on the difficulty of assessing probability based on limited data for a one off event such as a project. The following simple scenario illustrates the problem:

There are 3 sealed envelopes – one contains $100.

As a starting point, most people would agree there is a 33.33% chance any one of the envelopes will contain the money.

If we open one envelope and it is empty, there is now a 50:50 chance either of the remaining envelopes has the money. One does, one does not.

Now to make the situation interesting…….

I give you one envelope and keep two for myself.

As a starting point, you have a 33.33% chance of having the money and I have a 66.66% chance – the odds in my favour are 2 envelopes to your 1 envelope.

Now I open one of my envelopes and we see it is empty. What does this do to the probabilities?

One perspective says there is now a 50:50 chance the money is in your envelope and a 50:50 chance it is in my envelope – we know it has to be in one or the other, and it has not moved.

On the other hand, nothing has changed the original starting scenario: the odds in my favour were 2:1, and at least one of my envelopes had to be empty, so on this basis there is still twice the probability my remaining envelope has the money compared to yours. We have done nothing to improve your chances; you still only have one of the three original envelopes!

Which scenario best represents the situation, and why?

Now to make the situation even more interesting….

If I was to offer you $40 for your envelope, would taking the money be a good or a bad bet?

If the scenario suggesting a 50:50 chance is true, the Expected Monetary Value (EMV) of your envelope is $100 x 50% = $50

If nothing has changed the starting scenario the EMV of the envelope is $100 x 33.33% = $33.33.

Which option is correct?

Peter de Jager posed a similar question to the PMI Melbourne chapter and favours the 2:1 option remaining true; many of the chapter’s members disagreed.
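One way to test the two perspectives is to simulate the game. The sketch below is my own construction, not from the original discussion, and it suggests the answer depends on how the empty envelope was revealed: if I know the contents and deliberately open an empty one of my two, your odds stay at 1-in-3; if I open one of mine at random and it merely happens to be empty, the odds really do become 50:50.

```python
import random

def play(trials: int, informed: bool, seed: int = 1) -> float:
    """Return P(your envelope has the money), given I revealed an empty one.

    informed=True:  I know the contents and always open an empty envelope.
    informed=False: I open one of my two at random; trials where I
                    accidentally reveal the money are discarded.
    """
    rng = random.Random(seed)
    wins = counted = 0
    for _ in range(trials):
        money = rng.randrange(3)      # envelope 0 is yours; 1 and 2 are mine
        if informed:
            # I can always find an empty envelope among my two to open.
            counted += 1
            wins += (money == 0)
        else:
            opened = rng.choice([1, 2])
            if opened == money:
                continue              # money revealed - condition not met
            counted += 1
            wins += (money == 0)
    return wins / counted

print(round(play(100_000, informed=True), 2))   # ~0.33 - the 2:1 odds survive
print(round(play(100_000, informed=False), 2))  # ~0.50 - the odds are now even
```

On this reading, the $40 offer is a good bet if the empty envelope was opened knowingly (EMV $33.33) but a bad one if it was opened at random (EMV $50).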

Any thoughts would be appreciated.

Projects aren’t Projects

Project management is not a one-size-fits-all process or discipline. The PMBOK® Guide makes this clear in Chapter 1. There are at least 4 dimensions to a project:

  • its inherent size usually measured in terms of value;
  • the degree of technical difficulty (complication) involved in the work;
  • the degree of uncertainty involved in defining its objectives; and
  • the complexity of the relationships surrounding the project.

Project Size
The size of the project will impact the degree of difficulty in achieving its objectives, but large projects are not necessarily technically complicated or complex. There are projects in Australia to shift millions of cubic metres of overburden from mine sites, with expenditures rising to several million dollars per day, but the work is inherently simple (excavating, trucking and dumping dirt), and the relationships in and around the project are relatively straightforward. The management challenges are essentially in the area of logistics.

Technical Difficulty (degree of complication)
Complicated high tech projects are inherently more difficult to manage than simple projects. The nature of the technical difficulties and the degree of certainty largely depend on how well understood the work is. The important thing to remember with complicated work though is that systems can be developed and people trained to manage the complications. The work may require highly skilled people and sophisticated processes but it is understandable and solvable.

Uncertainty
The degree of uncertainty associated with the desired output from the team’s endeavours has a major impact on the management of the project. The less certain the client is of its requirements, the greater the uncertainty associated with delivering a successful project and the greater the effort required from the project team to work with the client to evolve a clear understanding of what’s required for success. This is not an issue as long as all of the project stakeholders appreciate they are on a journey to initially determine what success looks like, and then deliver the required outputs. Budgets and timeframes are expected to change to achieve the optimum benefits for the client; and the project is set up with an appropriately high level of contingencies to deal with the uncertainty. Problems occur if the expectations around the project are couched in terms of achieving an ‘on time, on budget’ delivery when the output is not defined and the expected benefits are unclear. Managing uncertainty is closely associated with and influences the complexity of the relationships discussed below.

Complexity = The People
Complexity Theory has become a broad platform for the investigation of complex interdisciplinary situations and helps understand the social behaviours of teams and the networks of people involved in and around a project. These ideas apply equally to small in-house projects as to large complicated programs. In this regard, complexity is not a synonym for complicated or large. It focuses on the inherent unpredictability of people’s actions and reactions to ideas and information within the network of relationships that form in and around the project team.

Discussion
Size is straightforward and most organisations have processes for assigning more experienced project managers to larger projects. What’s missing is consideration of the other three aspects.

The last item, complexity, is very much an emerging area of thought and discussion. For a brief overview see: A Simple View of ‘Complexity’ in Project Management, and for some practical considerations of the impact of complexity theory on scheduling see: Scheduling in the Age of Complexity. However, I expect it will be some years before ‘complexity theory’ and project management sit comfortably together.

Of more immediate interest is the interaction of uncertainty and technical difficulty. Knowing both ‘what to do’ and ‘how to do it’; or more importantly knowing how much you know about these two elements is critically important in establishing a framework to manage a project. Some ideas on this topic will be the subject of my next post.