Monday

Risky Business

I came across the following whilst finishing my paper Scheduling in the Age of Complexity.

Statistical and applied probabilistic knowledge is the core of knowledge; statistics is what tells you if something is true, false, or merely anecdotal; it is the “logic of science”; it is the instrument of risk-taking; it is the applied tools of epistemology; you can’t be a modern intellectual and not think probabilistically—but… let’s not be suckers. The problem is much more complicated than it seems to the casual, mechanistic user who picked it up in graduate school. Statistics can fool you. In fact it is fooling your government right now. It can even bankrupt the system (let’s face it: use of probabilistic methods for the estimation of risks did just blow up the banking system).

The quote is from Nassim Nicholas Taleb, author of The Black Swan and Distinguished Professor of Risk Engineering at New York University’s Polytechnic Institute.

Read his full essay, THE FOURTH QUADRANT: A MAP OF THE LIMITS OF STATISTICS, at http://www.edge.org/3rd_culture/taleb08/taleb08_index.html and you will start to understand the current financial crisis.

7 responses to “Risky Business”

  1. Pat,
    Your statement…
    “To take one example Monte Carlo analysis can be used to develop an appreciation of the degree of uncertainty in a project. However, the basic assumption underlying the methodology is the presumption of a predictable distribution for the duration of a task (Triangular, Beta, Normal, etc) and as a consequence, the range of outcomes for the whole project may be assessed with a degree of certainty. This fundamental assumption is based on a false premise.”
    is not actually correct. That is not the purpose of the probability distribution in a Monte Carlo simulator.
    The probability distribution describes the frequency of the values drawn from that distribution by the Latin Hypercube engine powering nearly all MCS products (Risk+, @Risk for Project, and Crystal Ball are 3 I use nearly every day).
    The range of outcomes is defined (depending on the tool) by the upper and lower bounds of the distribution: 0/100 for Risk+ and 10/90 for @Risk for Project. From these bounds and the Most Likely value, the random number sampler draws a value.
    You might start with http://www.sceaonline.net/Files/RR-08%20Price_Implementation%20of%20Lurie%20Goldberg.ppt as a backgrounder on the issues associated with sampling from under the distribution curve.
    Next, when you say…
    “A project is not part of a large ‘data set’ it is a unique entity.” It is not the “project” that is being modeled in Monte Carlo, but the possible task distributions. The probability distribution function (PDF) provides the sampling space for the range of possible task durations and the frequency with which they occur. BTW, the sampling process described in the above PPT makes sure these samples are uncorrelated. With a sample drawn, the scheduling tool sets the task duration, and does this for all tasks. Then the F9 key in MSFT Project is “pushed”, new completion dates are calculated and recorded, and another sample is taken for each task (a minimal sketch of this loop appears below this comment).
    Each task can have its own separate PDF, and some risk variances can be modeled separately.
    This “small sample” approach is not related to your “law of large numbers” example of traffic accidents.
    Finally…
    “Similar issues arise with adding additional layers of detail and complexity to the CPM model; the ‘extras’ do not prima facie improve the value of the CPM model.”
    I doubt prima facie improvements result from any simple application. But the layered schedule (CPM is a method, not a “thing”) is mandated by the Integrated Master Plan / Integrated Master Schedule approach found in most nations’ defense businesses. A good starting point for this approach (layered descriptions of physical progress to plan and increasing maturity, along with the Critical Path) can be found at http://www.acq.osd.mil/sse/docs/IMP_IMS_Guide_v9.pdf
    The CP is one of several “indicators” of the credibility of the plan. DID 81650 requires a Monte Carlo simulation of the Integrated Master Schedule, and along with that comes the “probabilistic critical path,” showing not just near-critical-path tasks, but which tasks “might” appear on the critical path given the statistical nature of other tasks in the network.
    When I hear the term “radically different,” I usually get nervous when the basis of this radical new approach comes from a potentially weak understanding of the current approaches to probabilistic risk analysis.
    Finally, Taleb has many problems. If you Google “Taleb critique criticism” you’ll be introduced to some.
    I see from your outline for the paper that Monte Carlo is suggested. But care needs to be taken to separate the approaches of the past from the application of those approaches. PERT has a statistical problem of underestimating completion dates (the sketch below this comment illustrates the effect); this problem can be found via Google, but here’s a starting point:
    “PERT Completion Times Revisited,” Fred E. Williams, School of Management, University of Michigan–Flint, July 2005, http://som.umflint.edu/yener/PERT%20Completion%20Times%20Revisited.htm
    An advanced guide to “estimating” for both cost and schedule can be found in “Cost Assessment Guide: Best Practices for Estimating and Managing Program Costs,” Exposure Draft, GAO-07-1134SP. Google will take you there with the GAO-07-1134SP phrase.

    Glen B. Alleman
    VP, Program Planning and Controls
    Aerospace and Defense
    Denver, Colorado, USA
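
A minimal sketch, in Python, of the sample/recalculate/record loop Glen describes: draw a duration for each task from its distribution, compute the project completion, and repeat. It uses simple random draws from triangular distributions rather than the Latin Hypercube engines of the commercial tools he names, and the three-task network, its durations, and the iteration count are invented for illustration only. The output also shows how a single-point PERT estimate can understate the simulated completion when parallel paths merge (the effect noted in the Williams paper above), and counts how often one of the parallel tasks drives the merge, a crude version of the “probabilistic critical path.”

import random

# Illustrative three-task network: A and B run in parallel, then C follows both.
# (Optimistic, Most Likely, Pessimistic) durations in days; values are invented.
TASKS = {
    "A": (8, 10, 18),
    "B": (8, 10, 18),
    "C": (4, 5, 9),
}

def pert_mean(o, m, p):
    # Classic PERT expected duration: (O + 4M + P) / 6.
    return (o + 4 * m + p) / 6

def simulate_once():
    # Draw one duration per task from its triangular PDF, compute the network
    # completion time, and note which parallel task drove the merge point.
    d = {name: random.triangular(o, p, m) for name, (o, m, p) in TASKS.items()}
    driver = "A" if d["A"] >= d["B"] else "B"
    return max(d["A"], d["B"]) + d["C"], driver

N = 10_000
results = [simulate_once() for _ in range(N)]
completions = sorted(c for c, _ in results)

# Deterministic estimate built from the PERT mean of each task (no sampling).
deterministic = max(pert_mean(*TASKS["A"]), pert_mean(*TASKS["B"])) + pert_mean(*TASKS["C"])

print(f"PERT single-point estimate : {deterministic:.1f} days")
print(f"Simulated mean completion  : {sum(completions) / N:.1f} days")
print(f"Simulated 80th percentile  : {completions[int(0.8 * N)]:.1f} days")
print(f"Task A drives the merge in : {sum(d == 'A' for _, d in results) / N:.0%} of runs")

The commercial tools mentioned above run this same sample/recalculate/record loop inside the scheduling engine, with Latin Hypercube sampling in place of the naive draws used here.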

  2. Thanks for the feedback, Glen, and the valuable links (I have repaired the PERT link as per your advice).

    However, I think you have largely missed the point of this post, the paper I will be presenting at the PMI COS conference in Boston, and Taleb’s work. In project space there is no normally occurring data that will allow any valid statistical analysis. All of the data used in a schedule, with or without Monte Carlo, is an informed guess about what might happen at some time in the future.

    The message I am trying to convey is that assuming a certainty in scheduling that does not exist is a recipe for disaster. It is only by management recognising that all of the information is wrong that processes can be implemented to manage projects effectively. Certainly, well-informed guesses are better than wild assumptions, so we still need skilled people in ‘project controls’, but none of us can actually control the future.

    As Prof. George E. P. Box said, all models are wrong but some are useful; the practical question is how wrong do they have to be to not be useful?

  3. Pat,
    You are correct if there is no historical data. That is when you employ the “ordinal” assessment process for bounding the probability distributions. This is common here in space and defense. The best descriptions of this approach are found in
    Effective Risk Management: Some Keys to Success, 2nd Edition, Edmund Conrow, AIAA Press.
    I have worked with Ed on several manned space flight programs where we had little or no history of work durations or costs.
    The Ordinal approach uses a geometric scale for ranking the risk and assigning boundaries (a sketch of such a mapping appears below this comment).
    Assuming certainty is not allowed in the US DoD. That is established. DID 81650 (Google will find it for you) requires Monte Carlo. It needs to be more widely established outside of the US DoD. But all the information is NOT wrong; that is a misconception. Conrow is one of the authors of the DoD Risk Management Guide and a frequent contributor to several defense journals on risk and program management.
    David Hillson has similar approaches.
    Clearly projects can be managed effectively. I personally work on two (one a $12B manned space flight program and one a $7B US Air Force communications systems rebuild). There are many, many more “working” programs than failed ones. Look at the FAA examples in the GAO library, for example, or Wayne Abba’s materials, or Paul Solomon’s Performance Based Earned Value approaches. We use Earned Schedule on another orbiting vehicle to great benefit, with probabilistic estimates for “discovery design” work processes. The ES approach even has DCMA concurrence for use in parallel with the standard EV System Description.
    The notion of “controlling” the future is a red herring, worthy of the best “agile-speak.”
    Managing in the presence of uncertainty is the process used in defense, space, and large construction.
    Look for papers by Michael T. Pich, Christoph H. Loch, and Arnoud De Meyer at INSEAD for guidance in this area of uncertainty.
    The world you paint is much too black and white.
    Lastly, there IS normalized data in the project space; it depends on your domain. Your statement is much too broad and without context. We in NASA and DoD land have 1,000s of projects, technical estimates, and statistical databases for nearly everything that has ever been built. In my previous role as a PMO for the US Department of Energy, we had extensive databases for cost and schedule estimating for large construction projects in the National Nuclear Weapons Complex.
    Context and Domain need to be the preface to any broad statement about nearly anything in the PM world.
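
As a rough illustration of the ordinal idea described above (not Conrow’s published calibration), the sketch below maps five ordinal risk rankings to probability-of-occurrence bands whose upper bounds grow roughly geometrically. The level names and numbers are invented for illustration only.

# Hypothetical ordinal-to-probability calibration; the levels and numbers are
# invented and are NOT the published Conrow values. Upper bounds grow by
# roughly a factor of 2.5, which gives the "geometric scale" flavor.
ORDINAL_SCALE = {
    "E (very low)":  (0.00, 0.02),
    "D (low)":       (0.02, 0.05),
    "C (moderate)":  (0.05, 0.12),
    "B (high)":      (0.12, 0.30),
    "A (very high)": (0.30, 0.75),
}

def bounds_for(level):
    # Probability-of-occurrence band assigned to an ordinal risk ranking.
    return ORDINAL_SCALE[level]

# A risk ranked "C (moderate)" would be modeled with an occurrence probability
# somewhere in the 5%-12% band, which in turn bounds the distributions fed to
# the Monte Carlo model.
low, high = bounds_for("C (moderate)")
print(f"Moderate risk modeled between {low:.0%} and {high:.0%}")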

  4. Just passing by. Btw, your website has great content!

  5. Pingback: A Long Tail « Aavssitedev’s Blog

  6. Pingback: The illusion of control: dancing with chance « Aavssitedev’s Blog

  7. Pingback: Black Swan Risks | Aavssitedev’s Blog
