
Black Swan Risks

A black swan???

Black swans are becoming popular in far too many places! Not many people would confuse Daffy Duck with a black swan, but when it comes to risks, too many people seem prepared to accept that anything with black feathers is a black swan.

I have just finished a really interesting discussion with Bob Prieto on the subject. Bob's article in January's PM World Today discussing 'black swan risks', and my letter to the editor in the February edition, set the framework (the PMWT website has since closed).

The key definition of a 'black swan' proposed by N.N. Taleb is that the event was unpredicted and unpredictable, yet in hindsight it appears that it should have been foreseeable. Birds have a range of different plumages, so there was no reason to presume swans could not come in other colours; equally, in the 18th century there was no reasonable basis to assume swans would be anything but 'white' (view links to further discussion).

Another black swan...???

Following on from the debate, the challenge I see facing management is in two parts. The first is providing sufficient rigour in the assessment of risks to encapsulate most of the reasonably foreseeable ones, and then making appropriate decisions based on a proper understanding of the issues. The key issue in the 2000 Ericsson semiconductor factory fire highlighted in Bob's article was not the fire; it was the single source of supply, which created a critical single point of failure. Many events (fire, earthquake, industrial action, environmental incidents and dozens of other causes) could have shut down the factory. Add all of the individually low-probability occurrences together and the likelihood of the factory being out of business starts to increase. This seems to be exactly the scenario that played out in the BP Deepwater Horizon disaster: the compounding effect of multiple individual decisions caused the catastrophe. Any one decision on its own was probably OK; the combination was not. Understanding the whole cannot be based on rigid rules, because the interactions are far too complex; practical wisdom is needed.
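The compounding effect described above is easy to demonstrate with basic probability: if several independent events could each shut the factory down, the chance of at least one occurring is one minus the product of each event not occurring. A minimal sketch, using purely hypothetical annual probabilities (not data from the Ericsson case):

```python
# Hypothetical annual probabilities of events that could shut the factory.
# P(at least one) = 1 - product of (1 - p_i), assuming independence.
annual_probabilities = {
    "fire": 0.01,
    "earthquake": 0.002,
    "industrial_action": 0.03,
    "environmental_incident": 0.015,
    "supplier_failure": 0.02,
}

p_no_event = 1.0
for p in annual_probabilities.values():
    p_no_event *= 1.0 - p  # chance this particular event does NOT occur

p_any_event = 1.0 - p_no_event
print(f"Largest single risk:        {max(annual_probabilities.values()):.1%}")
print(f"At least one shutdown risk: {p_any_event:.1%}")
```

With these illustrative numbers, no single risk exceeds 3%, yet the combined chance of some shutdown event approaches 7.5% a year. Each risk looks acceptable in isolation; the aggregate exposure does not.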

The second is building adequate resilience into an organisation to cope with both the accepted high-impact, low-probability risks and the unknowable unknowns that are genuine 'black swans'. This involves having some spare capacity and some rehearsed disaster-management processes. This may not be 100% cost-effective, but the damage caused by having no capacity to deal with major problems is far worse.

Real black swans

The final thought is ensuring you have an effective system for watching the environment, to identify emerging problems as early as possible. As Josh Billings said, "It ain't so much the things we don't know that get us into trouble. It's the things we know that just ain't so!" The idea of black swans is a valuable concept that warns us to expect the unexpected: the only certainty is uncertainty, and even the most effective risk management strategies will not stop us being surprised. For more ideas and resources, visit our Practical Risk Management page.

If it can go wrong……

One derivative of Murphy's Law is: if it can go wrong, it will go wrong, usually at the most inconvenient moment!

Planning the assault

This post may be old news to many Europeans, but in November 2009 the 27-kilometre (16.8-mile) Large Hadron Collider (LHC), buried under fields on the French/Swiss border, suffered serious overheating in several sections after a small piece of baguette landed in a piece of equipment above the accelerator ring. Dr Mike Lamont, the LHC's Machine Coordinator, said that "a bit of baguette", believed to have been dropped by a passing bird (other sources suggest a malicious pigeon), caused the superconducting magnets to heat up from 1.9 Kelvin (-271.3C) to around 8 Kelvin (-265.2C), close to the level where they stop superconducting.

In theory, had the LHC been fully operational, this could have caused a catastrophic breakdown similar to the one that occurred shortly after it was first switched on. Fortunately, the machine has several fail-safes that would have shut it down before the temperature rose too high.

Part of the LHC

The total cost to build and commission the accelerator is of the order of 4.6 billion Swiss francs (approx. $4400M, €3100M, or £2800M as of Jan 2010), with an overall budget of 9 billion US dollars (approx. €6300M or £5600M as of Jan 2010), making the LHC the most expensive scientific experiment in human history. Politicians are probably asking how a bungling bird could target a critical part of the machine with a small piece of bread and shut the whole system down.

A more realistic question for project practitioners is how design engineers and risk managers could be expected to foresee this type of problem. Failure Mode and Effects Analysis (FMEA) may help, but I can just see the reaction to someone in a risk workshop hypothesising that a bird would fly over the machine and drop its dinner precisely on target to cause maximum damage. "Theoretically possible, but hardly plausible" would be a polite reaction... until after it happened.
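To make the FMEA point concrete: a common scoring approach ranks each failure mode by a Risk Priority Number (RPN), the product of severity, occurrence and detection ratings (each typically 1 to 10). A minimal sketch with entirely invented failure modes and ratings, not anything from the LHC's actual risk register:

```python
# Hypothetical FMEA-style ranking. RPN = severity x occurrence x detection,
# where a high "detection" rating means the failure is HARD to detect.
failure_modes = [
    # (description, severity, occurrence, detection)
    ("magnet quench from cooling fault", 9, 4, 3),
    ("power supply failure",             7, 5, 2),
    ("foreign object dropped by bird",   8, 1, 9),
]

ranked = sorted(
    ((sev * occ, sev * occ * det, desc) for desc, sev, occ, det in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for _, rpn, desc in ranked:
    print(f"RPN {rpn:3d}: {desc}")
```

Notice the design choice this exposes: the bird scenario would score a rock-bottom occurrence rating (who would rate it higher?), so it only climbs the ranking because it is so hard to detect in advance. That is precisely why formal scoring alone tends to miss implausible-but-possible events.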

One of the messages from books like 'The Black Swan', and from complexity theory, is that the future is inherently unpredictable. This is probably as good an example of a 'black swan' as any I've heard of.

For more on the LHC see: http://en.wikipedia.org/wiki/Large_Hadron_Collider