
Category Archives: Governance

The governance of project management and organisations

Organizational Governance and Project Controls

For the last 13 years I’ve been part of the team developing and delivering the Project Governance and Controls Symposium in Canberra.  In a couple of weeks, from the 22nd to 24th of August, we will be celebrating the 10th Anniversary symposium at the Canberra Rex Hotel. For more on this see: https://www.pgcsymposium.org.au/

The concept of linking project controls and governance may have seemed unusual a decade ago, but as an article in this month’s Australian Institute of Company Directors magazine shows, the topic is (or should be) of significant interest to both senior managers and directors.  In every organization there are a number of projects that are central to the organization’s ability to respond to change and deliver its strategy. These projects affect the performance of the organization, and therefore the performance of each project has legal implications for the organization and its directors and officers. There have been successful prosecutions of numerous organizations that failed to manage project issues effectively.

Directors’ responsibilities

Each director has a core responsibility to be involved in the management of the company and to take all reasonable steps to be in a position to guide and monitor management[1].  This requires information, and under the Corporations Act, the director can rely on information provided by the company’s officers and employees, provided the director has reasonable grounds to believe they are reliable and competent people, and the director has made an independent assessment of the information.  The legislation also includes a positive due diligence obligation in respect to a number of key business activities including financial reporting, OH&S, and the management of ‘mission-critical risks’. 

This means that where a project or program has been established to deliver a critical capability, the directors need to be across the project and understand its status and predicted outcomes. This, of course, needs information!

Management’s responsibilities

While directors have been subject to legally imposed obligations for decades, management has largely been able to avoid legal liability. This is changing and the legal obligations of company officers and employees who provide information to the board are steadily increasing. The law requires information provided to the board to be complete and accurate.

The officers of the company will typically include most members of the ‘C-suite’ and may extend to other senior management roles. As an officer, each person is subject to a general statutory duty of care and diligence that applies to all aspects of their role, including briefing the board[2].

This was extended in 2019 when the Corporations Act was amended by the Treasury Laws Amendment (Strengthening Corporate and Financial Sector Penalties) Act to create new civil penalties for both corporate officers and employees who mislead the board by providing incorrect information, or by omitting information. This applies to any employee who ‘makes available or gives information, or authorizes or permits the making available or giving of information’ to a director that relates to company affairs. The provision applies to information in any form; all that is required is for the information to be materially misleading, which includes ‘half-truths’. If the information has been provided without the person taking reasonable steps to ensure it is not misleading, they have contravened section 1309(12) of the Act.

The reasonable steps include the person being able to show they made all reasonable enquiries under the circumstances and having done so believed the information was reliable, accurate, and not misleading. If these duties are breached ASIC can run civil penalty proceedings against the individuals concerned without having to show they knew the information was materially misleading or intended to mislead. 

What this means for project controls

Within an organization where the delivery of the benefits or capabilities created by projects is a core part of the organization’s business strategy, information on changes in the expected delivery date and/or cost to complete important projects is likely to be seen as material to the affairs of the company, with a particular focus on its continuous disclosure obligations.  Failure to comply with the Corporations Act has consequences for the company[3]. But where the directors were acting reasonably on the information provided to them, liability may well flow down to the officers and employees who provided inaccurate or incomplete information to the board.

The solution is simple, set up governance and controls systems that provide ACCURATE information[4].

But achieving this is not easy; success requires the right culture, management support, and capable staff. Even with these factors in place, providing correct information is difficult. One of the major challenges is predicting the likely completion date for both ‘Agile’ and ‘distributed’ projects, where traditional CPM simply does not work! And without knowing the overall timeframe, any cost predictions are questionable. Using EVM and ES is one solution, ideal for larger projects; a simpler, more pragmatic option for most normal projects is to use WPM to calculate the current status and projected end date. For more on WPM see: https://mosaicprojects.com.au/PMKI-SCH-041.php#WPM

This is just a brief overview, there are two ways to find out more:

  1. Attend PGCS on the 22nd to 23rd August, in-person or virtually: https://www.pgcsymposium.org.au/
     
  2. Make use of the free information on Governing the organization’s Projects, Programs and Portfolios at:  https://mosaicprojects.com.au/PMKI-ORG-005.php#Process3

[1] See the ‘Centro case’: ASIC -v- Healey [2011] FCA 717.

[2] In ASIC -v- Lindberg [2012] VSC 322, the former CEO of the Australian Wheat Board admitted to failing to inform the board of key issues.

[3] In 2006 Veterinary pharmaceuticals company Chemeq Ltd paid a $500,000 fine, in part for failing to keep the market informed of cost overruns and delays on a project to construct its manufacturing facility [Re Chemeq [2006] FCA 936].

[4] For a definition of ACCURATE Information see: https://mosaicprojects.com.au/Mag_Articles/SA1055_ACCURATE_Information.pdf  

Risk mitigation requires courage – How Cockcroft’s Folly saved 100s of lives!

One of the speakers at PGCS 2023 is Alex Walsh; his presentation, Managing wicked program delivery, looks at the UK nuclear program to decommission the Sellafield complex, one of the most complex high-hazard nuclear facilities in the world, which operated from the 1940s through to 2022. For more on this presentation and the PGCS program see: https://www.pgcsymposium.org.au/.

As part of my work on preparing the PGCS program, I had a virtual look at this project and came across this fascinating risk mitigation story where the courage of two managers probably saved hundreds of lives in the North of England.

The site

Sellafield, formerly known as Windscale, is a large multi-function nuclear site close to Seascale on the coast of Cumbria, in NW England. As of August 2022, primary activities are nuclear waste processing and storage and nuclear decommissioning. Former activities included plutonium production for nuclear weapons, nuclear power generation from 1956 to 2003, and nuclear fuel reprocessing from 1952 to 2022.

After the war ended, the Special Relationship between Britain and the United States “became very much less special”. The British government saw this as a resurgence of United States isolationism which raised the possibility that Britain might have to fight an aggressor alone. It also feared that Britain might lose its great power status, and therefore its influence in world affairs, so in July 1946, the Chiefs of Staff Committee recommended that Britain acquire nuclear weapons.

Two reactors (called ‘piles’ at the time) were constructed to irradiate uranium to create plutonium and other isotopes. The designers of these reactors wanted a passively safe cooling system. In place of water, they used air cooling driven by convection through a 400-foot (120 m) tall chimney, which could create enough airflow to cool the reactor under normal operating conditions. The chimney was arranged so it pulled air through the channels in the reactor core, and huge fans were positioned in front of the core to greatly increase the airflow rate.

The risk

During construction, physicist Terence Price considered the possibility of a fuel cartridge splitting open, causing the hot uranium to catch fire, resulting in fine uranium oxide dust being blown up the chimney and escaping into the environment.

Raising the issue at a meeting, he suggested filters be added to the chimneys, but his concerns were dismissed as too difficult and too expensive to deal with. However, Sir John Cockcroft, leading the project team, was sufficiently alarmed to order the filters.

Because construction of the chimneys had already begun, the filters could not be installed at the base; instead they were constructed on the ground and winched into position at the top once the chimneys were complete. They became known as Cockcroft’s Folly, as many regarded the delay they caused and their great expense as a needless waste.

This all changed after the Windscale fire of 10th October 1957, the worst nuclear accident in the United Kingdom’s history and one of the worst in the world. The fire was in Unit 1 of the two-pile Windscale site and burned for three days, releasing radioactive fallout which spread across the UK and the rest of Europe[1].

But the filters trapped about 95% of the radioactive dust and arguably saved much of northern England from becoming a nuclear wasteland. With typical British understatement, Terence Price said “the word folly did not seem appropriate after the accident”.

The UK government under Harold Macmillan ordered original reports into the fire to be heavily censored and information about the incident to be kept largely secret. It later came to light that small but significant amounts of the highly dangerous radioactive isotope polonium-210 were released during the fire. But the presence of the chimney scrubbers at Windscale was credited with minimising the radioactive content of the smoke.

Both ‘piles’ were shut down after the fire, but a large quantity of radioactive material is still inside the sealed #1 pile; this is one of the challenges for the decommissioning program Alex will be speaking about at PGCS in a couple of weeks’ time.

More relevant to this post though is the moral courage exhibited by Sir John Cockcroft in doing the right thing rather than the easy thing to guard against an accident that ‘could not happen’, but did! Thinking through this dilemma puts a whole new perspective on risk assessment and mitigation – in the right circumstances ‘black swans’ can kill.

For more on risk management see: https://mosaicprojects.com.au/PMKI-PBK-045.php


[1] For more on the fire see: https://en.wikipedia.org/wiki/Windscale_fire

Predicting project outcomes is important!

The recent cancellation of the 2026 Commonwealth Games by the Victorian Government is a dramatic example of using predicted project outcomes to minimize damage to an organisation. The escalation in the predicted cost of delivery from $2.6 billion to above $6 billion suggests the original bid was wildly optimistic and discovering how such an error occurred should be worthy of enquiry, but that is not the focus of this post.

Once the predicted costs moved to a point where there was no benefit in continuing with the project, it was terminated. While the cancellation could have been done earlier and far more elegantly, the fact remains that cancelling the project was by far the best decision.

However, to be able to make this type of call, management need information they can rely on. Even then the decision is not simple.

Cost considerations

A decision to cancel a project has to balance: sunk costs, the cost to complete, the expected benefits, and the cost of not completing the project.  The three elements that matter are the cost to complete vs the benefits (sunk costs are lost either way), and the costs of not completing the project.

For more on sunk costs see: https://mosaicprojects.com.au/Mag_Articles/P022_Sunk_Costs.pdf
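The cancel-or-continue balance described above can be sketched as a simple comparison. This is a hypothetical illustration only: the function name and the dollar figures are mine, not the actual Victorian Government numbers.

```python
def should_cancel(cost_to_complete, expected_benefits, cost_of_cancelling):
    """Sunk costs are excluded - they are lost whichever way the decision goes.

    Cancel when the net cost of continuing (cost still to be spent, less the
    benefits completion would deliver) exceeds the cost of walking away.
    """
    net_cost_of_continuing = cost_to_complete - expected_benefits
    return net_cost_of_continuing > cost_of_cancelling

# Illustrative figures in $billions (invented for this sketch):
# $4.5bn still to spend, $2.0bn of expected benefits, $0.6bn to cancel.
print(should_cancel(4.5, 2.0, 0.6))   # True: cancellation is the better option
```

The point of the sketch is what is absent: the money already spent appears nowhere in the calculation.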

Time considerations

Time is usually a secondary consideration but can be vital – the Commonwealth Games facilities would need to be open before the games start!  For more normal projects, knowing the current projected completion date and the variance at completion is still important for two reasons.

First, the cost of time matters and needs to be included in the cost to complete estimate. Delayed completion may also impact benefits.

Second, and more important, time issues tend to emerge as a problem well before cost issues show up. A project that is losing time and is expected to finish late will almost inevitably show negative cost variances sooner or later.  Conversely, fixing the root cause of the time issues will often have a positive effect on the overall costs as well.

The problem is that most projects do not run systems capable of producing a reliable prediction of the expected completion date. Our recent paper Calculating Completion looked at seven different methodologies for managing a project; only two gave reliable predictions of the completion date: Earned Schedule and Work Performance Management.

Earned Schedule (ES) was the best option, but implementing ES requires a significant investment in skills and systems. 

Work Performance Management (WPM) achieved similar results to ES without the overhead. All that is required to use the WPM spreadsheet is three pieces of information.  To set up the WPM baseline you need the amount of work to be accomplished and the time allowed; any metric can be used provided it is applied consistently. This baseline gives you the amount of work expected to be accomplished by any given date. The third piece of information is the actual amount of work achieved by the status date.  From this data the predicted completion date is calculated.

The assumption built into WPM is that work will continue at the current rate. If the result is not acceptable, management needs to do something to change the rate of working. If this is not feasible, then the viability of the project needs to be considered and/or the baseline reset to what is achievable. For more on WPM see: https://mosaicprojects.com.au/PMKI-SCH-041.php#Overview
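The WPM calculation outlined above can be illustrated in a few lines of Python. This is a minimal sketch of the idea, not Mosaic’s spreadsheet; the function name and the figures are illustrative.

```python
def wpm_forecast(total_work, planned_duration, status_time, work_done):
    """Predict the completion date assuming work continues at the current rate.

    'Work' can be any metric (units, stories, drawings), provided it is
    applied consistently - the stated WPM requirement.
    """
    if work_done <= 0:
        raise ValueError("no work performed yet - the rate is undefined")
    actual_rate = work_done / status_time          # work per time period so far
    predicted_duration = total_work / actual_rate  # time to finish at that rate
    variance_at_completion = predicted_duration - planned_duration
    return predicted_duration, variance_at_completion

# Example: 200 units of work planned over 20 weeks; at week 8 only 70 units done.
pred, var = wpm_forecast(total_work=200, planned_duration=20,
                         status_time=8, work_done=70)
# pred is about 22.9 weeks: roughly 2.9 weeks late if nothing changes
```

The linear extrapolation is exactly the WPM assumption stated above: work continues at the current rate unless management acts to change it.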

Built to last

This is a cross section of a Roman Road some 1900 years after it was built. The Fosse Way linked Isca Dumnoniorum (Exeter) in the southwest of England and Lindum Colonia (Lincoln) to the northeast, via Lindinis (Ilchester), Aquae Sulis (Bath), Corinium (Cirencester), and Ratae Corieltauvorum (Leicester). Built in the first and second centuries, this cross section excavated in the early 1900s demonstrates the standard road building techniques of the time. The surface ruts show the road was heavily trafficked for an extended period*, and the road was clearly built to last. 

Compared to many modern road construction techniques that seem to need continuous maintenance, was the high level of initial investment repaid by well over a millennium of use? The concepts of a good substrate, good drainage, and a long-lasting wearing course are the same today as in Roman times, so why are we building roads designed to fail? Are the cost horizons too short?

A similar concept is Roman cement. The Roman recipe, a mix of volcanic ash, lime (calcium oxide), seawater and lumps of volcanic rock, held together piers, breakwaters and harbours (as well as structures such as the Pantheon) for centuries. In contrast to modern materials, these ancient structures became stronger over time. 

The chemical processes involved in the cement are known[1]: what we consider corrosion processes can produce extremely beneficial mineral cements and lead to continued resilience over time. The study of Roman cement offers clues for a concrete recipe that does not rely on the high temperatures and carbon dioxide production of modern cement, while providing a blueprint for a durable construction material, particularly for use in marine environments.

Everyone is talking about ESG, and we are seeing ‘green buildings’ with bits of timber bolted onto the exterior to get a ‘green star’ – with a life span of 50 years if you are lucky…

The 30-year-old timber facade at Melbourne Central.

Is it time to start thinking about long term durability and building for 500 to 1000 years with a view to repurposing rather than recycling?

For more on Green Building see: https://mosaicprojects.com.au/PMKI-TPI-005.php#GB

* Note: Despite this photograph proving Roman roads developed cartwheel ruts, there is no support for the common myth that these ruts are linked to the creation of standard gauge railways (correlation is not causation!), see: https://mosaicprojects.com.au/Mag_Articles/AA016_The_Origins_of_Standard_Gauge_Railways.pdf


[1] See: Jackson, Marie D., Mulcahy, Sean R., Chen, Heng, Li, Yao, Li, Qinfei, Cappelletti, Piergiulio and Wenk, Hans-Rudolf. “Phillipsite and Al-tobermorite mineral cements produced through low-temperature water-rock reactions in Roman marine concrete” American Mineralogist, vol. 102, no. 7, 2017, pp. 1435-1450. https://doi.org/10.2138/am-2017-5993CCBY

The 19th century Spanish Prisoner Swindle!

Every improvement in technology leads to a new way of parting people from their money, and it appears gullible victims can still be found after at least 150 years of swindling.

One of the first technological advances that allowed direct communication with individuals occurred in the 19th century.  In the 1840s Britain introduced a pre-paid national postal service and the ‘Penny Black’ postage stamp: you could post a letter to anyone and expect it to be delivered. Shortly thereafter, this new service was being used to scam unsuspecting victims, and as similar postal services were established in other countries, the scam spread.

The ‘Spanish Prisoner’, as the name suggests, was operated by criminals based in Spain. Using trade directories to obtain names and addresses, they sent out hundreds of letters across Britain spinning a tale of a person held in a Spanish prison. In other parts of the world, similarly close but difficult-to-access places became the location of the ‘prison’.

Generally, the story was that a former military officer was being held in a Spanish prison. He wrote that his father or grandfather was English, but he had entered the military service of Spain and had been wrongly accused of stealing money. He was now seriously ill and in fear of death.

He would announce that he had a daughter who needed looking after, and in his will had appointed the recipient of the letter to be her guardian. Furthermore, he had a large sum of money hidden away which needed to be recovered from a secret location. Once the victim responded sympathetically to what they thought was a sincere and truthful story, money would be requested to pay for the daughter’s travel expenses, etc. All totally fictitious of course.

As with modern scams, there is no doubt that victims were found. There are numerous newspaper articles starting around 1876 which refer to the fraud and how victims had been taken in, and the occasional article detailing a successful police response.  

It appears some things never change:

–  Developers of new technologies rarely think about potential abuses

–  Criminals are always early adopters

–  People get caught out.

This post is outside our normal range focused on the history of projects and allied disciplines, but the subject does raise questions about the ethical responsibility of people developing new processes and technologies. For more on the evolution of ethics see: https://mosaicprojects.com.au/PMKI-ZSY-015.php

New Articles posted to the Web #91

We have been busy beavers updating the PM Knowledge Index on our website with papers and articles.   Some of the more interesting items uploaded during the last couple of weeks include:

You are welcome to download and use this information under our free Creative Commons licence.

Visit our PMKI Library for free access to many more papers and articles: https://mosaicprojects.com.au/PMKI.php

Do 80% of organizations average a project failure rate of 80%?

The answer to this question depends on how you perceive success and failure.  Our latest article published in the May PM World Journal offers several possible alternatives.

However, reflecting on data in the article shows a worrying trend:

  1. Using traditional measures, 80% of organizations do appear to have project failure rates averaging around 80%, but this is not the perception of most managers in those organizations.
  2. Organizations that manage projects successfully achieve a significant cost-benefit over those that do not. Poor project delivery is directly linked to higher project costs.

Therefore, the long sought after answer to Cobb’s Paradox can at last be unveiled:

If 75% of the managers in poorly performing organizations believe their projects are being delivered successfully, they have no reason to invest in improving project delivery capability. Outsiders may see project failure and know how to improve the organization’s systems to prevent future failures, but the majority of the managers in the organization cannot see, or will not see, that there is a problem needing fixing.  The answer to Cobb’s paradox, ‘We know why projects fail, we know how to prevent their failure — so why do they still fail?’, is that the responsible managers do not perceive their projects as failing and therefore will not invest in solving a problem they cannot, or will not, acknowledge. Changing this flawed perception is a major governance challenge.

Download Do 80% of organizations average a project failure rate of 80%?

For more papers on project governance see: https://mosaicprojects.com.au/PMKI-ORG-005.php#PPP-Success

Measuring Project Success

Can a project be years late, £billions over budget, and a success?  It appears the answer to this question depends on your perspective.

In a recent post looking at project success, we identified a significant anomaly between the percentage of projects classed as failing and the perception of executives.  In round terms some 70% of projects are classified as failing, but over 70% of executives think their organization delivers projects successfully. You can read the updated version of this post at: Do 80% of organizations average a project failure rate of 80%?

A number of people providing feedback on the original post suggested this anomaly could be caused by different perspectives of project success. This concept has been identified by many people over the years as the difference between project management success (on time and on budget) versus project success (the delivery of value to stakeholders).  These different types of project success are briefly discussed in Achieving Real Project Success.

Measuring project management success

However, the concept of project management success, typically measured as delivering the project on time and on budget, raises its own set of challenges. 
Consider the following:

Two different organizations own the same commercial software, and decide to implement the latest upgrade, essentially two identical projects to deliver similar benefits to two separate organizations.

  • Organization A estimates their project will cost $110,000 and the work is approved.
  • Organization B estimates their project will cost $80,000 and the work is approved.

Both projects complete the work successfully and on time. The final project costs are compiled and the close-out reports completed:

  • Organization A has a final cost of $100,000 – $10,000 under budget, and the project is declared a success!
  • Organization B has a final cost of $90,000 – $10,000 over budget, and the project is declared a failure! 

But Organization B has achieved the organizational benefits of the upgrade for $10,000 less than Organization A – which project was really successful?

The above example is simplistic but clearly shows the need for better processes to define project success and failure. Comparing the estimated time and cost at the start with the achieved outcomes can be very misleading. On time and on budget may be valid measures for a contractor delivering a project under a commercial contract that defines a fixed time and cost for completion, but for everyone else the question of success is more nuanced. 

For example, consider the Crossrail Project in London, initiated in 2008 to deliver the new Elizabeth Line: it is years late and £billions over budget. But since its partial opening in May 2022, more than 100 million journeys have been made on the Elizabeth Line; currently around 600,000 journeys are made every day. This patronage is above forecast levels and the project is on track to break even by the end of the 2023/24 financial year. Even the British tabloid press are declaring the Elizabeth Line a success, despite the final upgrade to achieve 100% of the planned service frequencies not happening until 21st May 2023 (for our thoughts on Crossrail over the years see: https://mosaicprojects.com.au/PMKI-ITC-012.php#Crossrail).

The May 2023 upgrade will mark the successful completion of the Crossrail project and its final transition to operations, some 4½ years after the original planned opening date in December 2018 and £4+ billion over budget – so who gets to declare it a success, and what is the basis for measuring this? When we looked at this question in Success and Stakeholders, our conclusion was that success is gifted to you by your stakeholders: you have to earn the gift by delivering the project, but there is no way of knowing for sure if it will be considered successful. The Elizabeth Line has achieved the accolade of successful from its stakeholders, but this is hardly a scientific measure, or an effective KPI for general use. Which poses the question: how do you realistically measure project success? Asking the question is easy; finding a generally applicable answer is not – any ideas?
For more on defining project success see: https://mosaicprojects.com.au/PMKI-ORG-055.php#Success 

Project Governance Challenges – Delusions or Data Errors

Note: This post has been updated and augmented, for the latest version see: Do 80% of organizations average a project failure rate of 80%?

This post is not intended to provide precise numbers, rather to highlight an intriguing anomaly that could benefit from some structured research.  Over many years, and many different reports, based on different survey methods, we regularly see the following data presented:

  1. Far more projects fail than succeed; the ratio is typically around 30% success to 70% failure.
     
  2. There are some organizations that routinely achieve project success; these are slowly increasing as an overall percentage and currently sit at around 20% of the organizations that ‘do projects’.

  3. The vast majority of executives surveyed think their organization manages its projects successfully.  The percentage of executives with this view seems to sit comfortably above 80%.

But, unless there is a major distortion in one or more of the data sets, these data are mutually incompatible!

If the 20% of organizations that ‘do projects’ get most of their projects delivered successfully, this group has to account for at least half of the 30% of successes. That leaves the remaining 80% of organizations with a ratio of 15:65, i.e. 19% success vs 81% failure.

In round numbers, 80% of the organizations doing projects have a failure rate of around 80%.
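The arithmetic behind that round number can be checked directly, using the assumptions stated above (the variable names are mine):

```python
overall_success = 0.30   # ~30% of all projects succeed
good_org_share = 0.20    # ~20% of organizations routinely succeed
# Assume the good organizations do ~20% of the projects and account for
# at least half of the successes: 15 points of the 30.
good_org_success = 0.15

rest_share = 1 - good_org_share                     # 80% of projects/organizations
rest_success = overall_success - good_org_success   # 15 points of success left
rest_fail = rest_share - rest_success               # 65 points of failure

print(round(rest_success / rest_share, 2))  # 0.19 -> ~19% success for the rest
print(round(rest_fail / rest_share, 2))     # 0.81 -> ~81% failure for the rest
```

As the notes below acknowledge, this is an indicative calculation, not a statistical analysis, but the conclusion is not sensitive to small changes in the assumed shares.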

But if more than 80% of executives feel their organizations deliver projects successfully, this data suggests that some 60% of these executives are seriously misinformed. So my question is: why do some 75% of executives in the 80% of organizations that routinely fail to deliver projects successfully appear to believe the opposite?  The answer probably sits in the complex area of communication failures caused by organizational culture and governance issues; for more on this see: https://mosaicprojects.com.au/PMKI-ORG.php

This assessment also helps explain why so many organizations simply do not invest in systems to improve project delivery. There is no point in spending money to fix a problem the executives cannot acknowledge. So where to from here?

The answer will not be easy. To quote from the 2018 PMI Pulse of the Profession survey: “There is a powerful connection between effective project management and financial performance. Organizations that are ineffective with project management waste 21 times more money than those with the highest performing project management capabilities. But the good news is that by leveraging some proven practices, there is huge potential for organizations to course correct and enhance financial performance.”  But it appears that while the people setting their organization’s strategy, culture, and governance systems may be aware of this, a large percentage do not believe it applies to them – their projects are managed appropriately, even if 80% of them fail.

Changing the culture to implement effective project governance and controls needs executive support! For more on the strategic management of projects and programs see: https://mosaicprojects.com.au/PMKI-ORG-015.php#Process1

Notes:

  1. First, I am fully aware of the ‘Flaw of Averages’, and the resulting problems in the way the calculations in this post have been made. But in the absence of an integrated data set for proper statistical analysis, I believe the trends highlighted above are valid indicators of a problem. What is needed to test these indicators is a proper survey that contrasts executive opinions against project success rates across a large sample of organizations.

  2. The second issue is the sample of executives surveyed. Most of the data I have seen comes from ‘opt-in’ surveys, which are likely to bias the sample towards executives who consider projects important.

Estimating Updates

Over the last couple of weeks, we have been updating the estimating pages on our website, partly in response to the #NoEstimating idiocy.

There is no way an organization that intends to survive will undertake future work without an idea of the required resources, time, and cost needed to achieve the objective, and an understanding of the anticipated benefits – this is an elementary aspect of governance. It requires estimating! BUT there are two distinctly different approaches to estimating software development and maintenance:

1.  Where the objective is to maintain and enhance an existing capability the estimate is part of the forward budgeting cycle and focuses on the size of the team needed to keep the system functioning appropriately.  Management’s objective is to create a stable team that ‘owns’ the application. Methodologies such as Scrum and Kanban work well, and the validity of the estimate is measured by metrics such as trends in the size of the backlog.  For more on this download De-Projectizing IT Maintenance from: https://mosaicprojects.com.au/PMKI-ITC-040.php#Process1

2.  Where the objective is to create a new capability, project management cuts in.  Projects need an approved scope and budget, which requires an estimate! The degree of detail in the estimate needs to be based on the level of detail in the scope documents. If the scope, or objectives, are only defined at the overall level, there is no point in trying to second-guess future developments and create an artificially detailed estimate. But with appropriate data, high-level estimates can be remarkably useful. Then, once the project is approved, normal PM processes cut in and work well. Some of the sources of useful benchmarking data are included in our updated estimating software list at: https://mosaicprojects.com.au/PMKI-SCH-030.php#Cost

The #NoEstimating fallacies include:

The fantasy that software is ‘different’ – it’s not! All projects have a degree of uncertainty, which creates risk. Some classes of project may be less certain than others, but using reliable benchmarking data will tell you what the risks and the range of outcomes are likely to be.

Estimates should be accurate – this is simply WRONG (but is a widely held myth in the wider management and general community)! Every estimate of a future outcome will be incorrect to some degree.  The purpose of the estimate is to document what you thought should occur, which provides a baseline for comparison with what is actually occurring. This comparison highlights the difference (variance) between planned and actual to create management information. This information is invaluable for directing attention towards understanding why the variance is occurring and adjusting future management actions (or budget allowances) to optimize outcomes.
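The planned-versus-actual comparison described above can be sketched in a few lines. This is a deliberately simple illustration; the function name and figures are mine, and real systems use richer measures such as EVM.

```python
def variance_report(budget, actual_cost, percent_complete):
    """Turn an estimate into management information by measuring variance.

    The estimate is not expected to be 'accurate' - its value is as a
    baseline against which actual performance can be compared.
    """
    planned_cost = budget * percent_complete      # what the baseline implied
    cost_variance = planned_cost - actual_cost    # negative = overspent
    # Simple extrapolation: estimate at completion assuming current efficiency
    estimate_at_completion = actual_cost / percent_complete
    return cost_variance, estimate_at_completion

# A $100,000 budget, half the work done, $60,000 already spent:
cv, eac = variance_report(budget=100_000, actual_cost=60_000, percent_complete=0.5)
print(cv, eac)   # -10000.0 120000.0: the overspend is flagged early enough to act on
```

The variance itself is not the point; the management questions it triggers (why is this happening, what should change?) are.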

Conclusion

The fundamental flaw in #NoEstimating is its idiotic assumption that an organization that commits funding and resources to doing something without any concept of how long it is going to take, or what it will cost, will survive.  Good governance requires the organizational leadership to manage the organization’s assets for the benefit of the organization’s stakeholders. This does not preclude risk taking (in many industries risk taking is essential). But effective risk taking requires a framework to determine when a current objective is no longer viable, so the work can be closed down and the resources redeployed to more beneficial objectives. For more on portfolio management and governance see: https://mosaicprojects.com.au/PMKI-ORG.php

In summary, #NoEstimating is stupid, but trying to produce a fully detailed estimate based on limited information is nearly as bad.  Prudent estimating requires a balance between what is known about the project at the time, a proper assessment of risk, and the effective use of historical benchmarking data to produce a usable estimate that can be improved and updated as better information becomes available.  For more on cost estimating see: https://mosaicprojects.com.au/PMKI-PBK-025.php#Process1