Practical Decision Making

Project managers seem to be becoming too focused on risk-averse decision making and scientific ‘studies’ to determine the right answer in situations where there is probably no right answer (or the ‘scientific knowledge’ is not available). Better information should lead to better decisions, but there are usually competing issues of urgency and the cost of deciding.

This is where experience cuts in. Medieval stonemasons knew the proportions needed to construct Gothic cathedrals centuries before the science of engineering design evolved. These ancient structures were designed according to semi-empirical rules based on a few simplified mechanical principles; nevertheless, their structural performance has been rather good in most cases. The ‘rules’ of proportion evolved over centuries and were passed on through the guilds. Similar sets of knowledge informed shipbuilding, and most other technical endeavours, before the 19th century.

This accumulation of context-specific knowledge is the genesis of the engineering method: solving problems using heuristics (rules of thumb) that create the best chance of a workable solution, using the available resources, in a poorly understood situation.

The engineering method can be used to solve practical problems before full scientific knowledge is available, and it can be particularly useful in most aspects of project management – no one knows the future. But you also need to be prepared to discover you have made a mistake.

To paraphrase René Descartes, Discourse on the Method: ‘Situations in life often permit no delay, and when we cannot determine the course which is certainly best, we must follow the one which is probably the best. … This frame of mind also freed me from the repentance and remorse commonly felt by those vacillating individuals who are always seeking worthwhile things that they later judge as bad.’[1]

For more on practical decision making see: https://mosaicprojects.com.au/PMKI-TPI-010.php#Decisions


[1] The full quote is: ‘And thus since often enough in the actions of life no delay is permissible, it is very certain that, when it is beyond our power to discern the opinions which carry most truth, we should follow the most probable; and even although we notice no greater probability in the one opinion than in the other, we at least should make up our minds to follow a particular one and afterwards consider it as no longer doubtful in its relationship to practice, but as very true and very certain, inasmuch as the reason which caused us to determine upon it is known to be so. And henceforward this principle was sufficient to deliver me from all the penitence and remorse which usually affect the mind and agitate the conscience of those weak and vacillating creatures who allow themselves to keep changing their procedure, and practice as good, things which they afterwards judge to be evil’.

Fine Tune your detectors

The quality of any decision you make is determined by the quality of the information and advice you receive. Good information does not necessarily mean a good decision, but bad information will almost certainly lead to a bad decision.

The decision making process and the types of decision a project manager, and almost anybody else, has to make are discussed in WP1053 Decision Making. The closely aligned process of problem solving is discussed in WP1013. Good information and advice are an essential input to both of these processes.

The right information has the potential to reduce or remove the uncertainty at the centre of every decision. If you are lucky and the information or advice removes all of the uncertainty, then there is nothing left to decide! Usually even with good advice, there is still some uncertainty and you still have to make the decision.

In reality, we rarely, if ever, have enough information; the challenge is to get as much information as is sensible in the circumstances and then make a timely decision, accepting that there will inevitably be gaps in your knowledge which may lead to suboptimal outcomes.

However, simply collecting vast quantities of information does not help (unless you are using data mining). Generally, information has no value unless it has the potential to change your decision! The critical thing in decision making is having the key elements of information available when needed, in a useful form, so that they improve your awareness of the situation and your ability to decide.

But no information or advice is perfect. Before making use of any information, the decision maker has to evaluate the reliability and accuracy of the information or advice, and look for any vested interests or bias on the part of the people developing the information or proposing the advice. Good decision makers usually have very finely tuned ‘bull s**t’ detectors. And whilst this skill often seems to be innate, many of its elements can be learned.

Some of the elements to consider when weighing up information are:

  1. As a starting point, everyone is biased and most people have vested interests.
    The antidote to bias and vested interests is to consider what effect these influences may have. The more effort someone has committed to developing a set of information, the greater their vested stake in the work. See more on Biases.
  2. Beware of factoids!
    You will be pleased to know you are one of the 1635 people who have read this post, and as a consequence are now aware of factoids. How do we know this? We don’t. I just made it up; but you can’t call me wrong, because you don’t know either. A factoid is something that looks like a very precise fact. The antidote to factoids is source information. Good source information in the statement above would be ‘our web counter shows that you are visitor 1635 to this page’. Start worrying if the source is nebulous: ‘our webmaster advises’ or ‘based on a sophisticated time related algorithm…’.
  3. Beware of false precision.
    Almost everything that affects project decisions is a guess, assessment or estimate (the terms are largely synonymous) about something that may occur in the future. But no one has precise information about the future! False precision damages credibility (see: Is what you heard what I meant?) and is generally less than useful. The antidote to false precision is to ask for ranges and the basis of the range statement.
  4. Lies, damned lies and statistics 1.
    Some statistics result from the counting of real things. If you trust the people who do the counting, the math and the reporting, the data is as good as you are going to get. However, most statistics are estimates for a large population, derived from the extrapolation of results from a small sample. Professional statisticians and pollsters attach a calculated margin of error to their work – this margin is important! The antidote to false statistics is to ignore any that do not come with a statement of the margin of error and how it was derived.
  5. Lies, damned lies and statistics 2.
    Understand the basis for comparison – it is very easy to distort information. Project A will increase the profit on the sale of widgets by 50%, whereas project B will only increase the profit on our training business by 10%; if both projects involve a similar cost outlay, which one is best? You need to know the basis for comparison to answer the question: a 50% increase in profits from a base of $100,000 = $50,000, which is half the value of a 10% increase in profits from a base of $1 million. The antidote to statistical distortion is to largely ignore percentage changes and statements such as ‘fastest growing’, ‘biggest increase’, etc. It is always easier to be the ‘biggest’ if your starting point is the smallest.
  6. The ‘one-in-a-million’ problem
    As discussed in The role of ‘sentinels’, many ‘one-off’ problems are symptoms of a much deeper issue. Our entire working life is less than 20,000 days, so the chance of you encountering a genuine ‘one-in-a-million’ event just once in your working life is about 2%. Other phrases that should trigger concern include ‘she’ll be right’, ‘no problems’, ‘it’s easy’, etc. The antidote to these types of expression is to simply reverse the statement:
    – one-off / one-in-a-million = there’s probably a structural cause to be discovered;
    – she’ll be right = I have no idea how to fix it (and it’s definitely not OK);
    – no-problems = this is a major problem for me;
    – it’s easy = this will be very difficult (unless the ‘easy’ is followed by an explanation of how it is easy).
  7. The false prophet
    False prophecies are allegations and unsubstantiated statements made with the expectation that the ‘expertise’ of the person the statement is attributed to will cover the statement with absolute credibility. If the statement is improbable, it is improbable regardless of the alleged source. The antidote to false prophets being quoted in the third party (eg, “Einstein said controlled nuclear fusion was easy”) is simply to seek authentication from the source. If the ‘prophet’ is present, ask them for more information. Real experts know both the upside and the downside of any course of action they are proposing – they understand the uncertainty. Wannabe experts pretend there is no downside or uncertainty.
  8. Well known facts
    Remember, most ‘well known facts’ are in fact commonly held misconceptions (this statement is a factoid, but also a useful one). The antidote to ‘well known facts’ is to dig deeper and gather actual facts.
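The numbers behind points 4, 5 and 6 can be checked with a few lines of arithmetic. The sketch below (plain Python; the poll figures of 52% from a sample of 1,000 are illustrative assumptions, not real data) reproduces the profit comparison, a typical pollster's margin of error, and the 'one-in-a-million' odds over a working life:

```python
import math

# Point 5: basis of comparison -- a percentage gain means nothing without the base.
profit_a = 0.50 * 100_000    # 50% increase on a $100,000 base
profit_b = 0.10 * 1_000_000  # 10% increase on a $1,000,000 base
print(profit_a, profit_b)    # A's 'bigger' 50% is worth half of B's 10%

# Point 4: margin of error for a sample proportion at 95% confidence
# (illustrative poll: 52% support measured in a sample of 1,000 people).
p, n = 0.52, 1000
margin = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"Poll result: {p:.0%} +/- {margin:.1%}")  # roughly +/- 3.1%

# Point 6: chance of seeing one genuine one-in-a-million event at least once
# in a ~20,000-day working life (treating each day as an independent trial).
p_event = 1 / 1_000_000
working_days = 20_000
chance = 1 - (1 - p_event) ** working_days
print(f"Lifetime chance: {chance:.1%}")  # about 2%
```

The margin-of-error line is why point 4 insists on it: a '52% vs 48%' headline with a ±3% margin is actually a statistical dead heat.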

These are just a few of the ways bad advice and information can be introduced into a decision making process. Taking a few minutes to verify the quality of the advice you are being given, ditching the unsound advice and information, and then using what’s left to inform the decision will enhance the probability of making the best decision in the circumstances. This is not easy to do (but good decisions are rarely ‘easy’); the consolation is that once you develop a reputation for having a good ‘bull s**t’ detector, most sensible people will stop trying to use it on you. Then all you need to do is make the right decision.