
Author Archives: Pat Weaver

Waterfall is Dead

The PMI 2024 Pulse of the Profession has introduced a framework for categorizing projects based on the management approach being used: Predictive – Hybrid – Agile.  If generally adopted, this framework will at long last kill off the notion of waterfall as a project delivery methodology.

As shown in our historical research The History of Agile, Lean, and Allied Concepts, the idea of waterfall as a project delivery methodology was a mistake, and its value as a software development approach was limited.

The PMI framework has some problems, but the predictive project delivery paradigm is described as focused on schedule, scope, and budget. These projects tend to use a phase-based approach and are plan driven.  This describes most hard projects and many soft projects that are not using an unconstrained agile approach.

For a detailed review of the PMI 2024 Pulse of the Profession report, and how the classification system works, see How should the different types of project management be described?, available from: https://mosaicprojects.com.au/Mag_Articles/AA026_How_should_different_types_of_PM_be_described.pdf

For more on project classification see: https://mosaicprojects.com.au/PMKI-ORG-035.php#Class

WPM for Lean & Distributed Projects

The core concept underlying the Critical Path Method (CPM) is that there is one best way to undertake the work of the project, and this can be accurately modelled in the CPM schedule. This premise does not hold for either distributed projects or projects applying Lean Construction management. These two types of projects differ: lean is a management choice, whereas distribution is a physical fact:
–  Distributed projects are ones where the physical distribution of the elements to be constructed means significant amounts of the work can be done in any sequence and changing the sequence when needed is relatively easy.
–  Lean construction is a project delivery process that uses Lean methods to maximize stakeholder value and reduce waste by emphasizing collaboration between everyone involved in a project. To achieve this the work is planned and re-planned as needed by the project team focusing on optimising production.

In both cases, the flexibility in the way the detailed work is performed and the relative ease with which the sequence can be changed means CPM is ineffective as a predictor of status and completion.

Our latest article WPM for Lean & Distributed Projects looks at how Work Performance Management (WPM) can be used to assess both the current status and the projected completion for these types of project, regardless of the number of sequence changes made to the overall plan.

Download WPM for Lean & Distributed Projects from: https://mosaicprojects.com.au/PMKI-SCH-041.php#WPM-Dist

See more on WPM as a valuable tool to add to your project controls system: https://mosaicprojects.com.au/PMKI-SCH-041.php#Overview

Ethics and Governance in Action


Even the best governed organizations will have ethical failures, and sometimes criminal activities, occurring from time to time. When an organization employs thousands of people there will always be some who make mistakes or choose to do the wrong thing.  The difference between a well governed organization with a strong ethical framework and the others is how they deal with the issues.

The Bad

Over the last few months there has been a lot of commentary on major ethical failures by some of the ‘big 4’ accountancy firms (see: The major news story everyone missed: KPMG hit with record fine for their role in the Carillion Collapse), with a common theme being attempts by the partners running these organizations to minimize their responsibility and deflect blame. As a consequence, there have been record fines imposed on KPMG and massive long-term reputational damage caused to PWC by the Australian Tax Office scandal.

The Good

The contrast with the way the Jacobs Group (Australia) Pty Ltd (Jacobs Group) has managed an equally damaging occurrence could not be starker! Jacobs Group pleaded guilty to three counts of conspiring to cause bribes to be offered to foreign public officials, contrary to provisions of the Criminal Code Act 1995 (Cth), but the exemplary way this issue has been managed is an example for all.

Offering bribes to foreign public officials has been a criminal offence in Australia since 1995, and the Crimes Legislation Amendment (Combatting Foreign Bribery) Bill 2023 has just passed into law significantly increasing penalties.

Despite this, between 2000 and 2012, Sinclair Knight Merz (SKM) was involved in two conspiracies in the Philippines and Vietnam. Both conspiracies involved employees of SKM’s overseas development assistance businesses (the SODA business unit) paying bribes to foreign public officials in order to facilitate the awarding of public infrastructure project contracts to SKM. SKM employees embarked on a complex scheme to conceal the bribes by making payments to third party companies, and receiving fake invoices for services which were not in fact rendered. The conduct was known to and approved by senior persons at SKM, although concealed from the company more widely.

Jacobs Group acquired SKM in 2013, after the conduct had ceased. During the vendor due diligence processes, the conduct came to the attention of persons outside those involved in the offending, and the company’s external lawyers.

Despite the lawyers’ findings being subject to legal privilege, and the very remote possibility of the Australian authorities discovering the crime, the non-conflicted directors unanimously voted to self-report the findings to the Australian Federal Police (AFP), to waive legal privilege in the draft report, and to make it available to the AFP. The company also reported the findings of its investigation to a number of other authorities, including the World Bank, Asian Development Bank, AusAid, and ASIC.

The company and a number of individuals were charged in 2018, and Jacobs pleaded guilty to three counts of conspiring to cause bribes to be offered to foreign public officials. The matter only came to our attention because of a recent High Court ruling dealing with technical issues around the calculation of the fine to be paid by Jacobs.

Justice Adamson in the New South Wales Supreme Court sentenced the company on 9 June 2021. She found that while each of the offences committed fell within the mid-range of objective seriousness for an offence, this was mitigated by the fact that the company had self-reported the offending to authorities, and that the self-reporting was motivated by remorse and contrition rather than fear of discovery. The sentencing judge also found that the conduct was not widespread, and effectively limited to the SODA business unit. She accepted evidence from the AFP that it was unlikely to have become aware of the conduct absent the company’s self-reporting, and that the company’s post offence conduct was “best practice” and “of the highest quality”.

Based on these findings the amount of the fine to be paid by Jacobs is likely to be in the region of $3 million – a massive discount from the potential maximum that, based on the High Court decision, is likely to exceed $32 million.

Lessons on Governance and Ethics

The approach taken by Jacobs Group, following the identification of potential criminal conduct, is a useful guide as to how an ethical organization works:

  1. The prompt retention of independent external lawyers to investigate suspected instances of criminal misconduct.
  2. The decisions of the board of directors to self-report the conduct to authorities and provide ongoing assistance and cooperation to law enforcement and prosecutorial authorities, notwithstanding the risk of criminal sanction.
  3. Committing to remediation steps to address the conduct (and seeking to prevent any repeat of it), including by overhauling relevant policies and procedures and making appropriate operational changes including:
  • suspending and then terminating relevant individual employees who had participated in the conduct;
  • operational changes to management and oversight of the SODA business unit that had been involved in the conduct, and changing approval processes for all payments by that unit;
  • introducing a new Code of Conduct which explicitly prohibited the offering of inducements to public officials;
  • introducing a requirement for the completion of a bribery and corruption risk assessment before committing to new projects;
  • upgrading various internal policies, including the company’s whistleblower, donations and gifts and entertainment policies. It also introduced new policies which discouraged the use of agents, and required the screening of all new suppliers and sub-consultants for bribery and corruption risk. The company also engaged an independent monitor to review the changes made to its policies;
  • updating and expanding existing bribery and corruption training programs for staff; and
  • modifying internal audit practices to more closely scrutinize non-financial risks, such as bribery and corruption.

One definition of ethical behaviour is doing the right thing when no one is looking. The contrast between Jacobs and KPMG’s outcomes is a lesson worth remembering.

For more on governance and organizational ethics see: https://mosaicprojects.com.au/PMKI-ORG-010.php#Overview

White Constructions v PBS Holdings Revisited

White Constructions Pty Ltd v PBS Holdings Pty Ltd [2019] NSWSC 1166, involved a claim for delay and costs arising out of a contract to design a sewerage system for a subdivision and submit it for approval. The alleged breach was the failure to create and submit a sewer design acceptable to the approval authority which had the effect of delaying completion of the subdivision, giving rise to a claim for damages by White.

White and PBS both appointed experts to undertake a schedule analysis, and they did agree an ‘as-built’ program of the works but disagreed on almost everything else including the delay analysis method to use, the correct application of the methods, and the extent of the overall project delay caused by the delays in approving the sewer design.

The Judge found:

[Clause 18]      Plainly, both experts are adept at their art. But both cannot be right. It is not inevitable that one of them is right.
[Note: This approach is consistent with the UK court decision of Akenhead J in Walter Lilly & Company Ltd v Mckay [2012] EWHC 1773 (TCC) at [377], “the court is not compelled to choose only between the rival approaches and analyses of the experts. Ultimately it must be for the court to decide as a matter of fact what delayed the works and for how long”. This precedent has been followed on a number of occasions[1].]

[Clause 22]      The expert reports are complex. To the unschooled, they are impenetrable. It was apparent to me that I would need significant assistance to be put in a position to critically evaluate their opinions and conclusions.

[Clause 25]      Under UCPR r 31.54, the Court obtained the assistance of Mr Ian McIntyre (on whose appointment the parties agreed).

[Clause 137]   The major components of the works were:
       • earthworks,
       • roadworks and kerbing,
       • sewerage,
       • electrical and National Broadband Network (NBN) installation,
       • footpaths, and
       • landscaping.

[Clause 138]   The electrical and NBN installation was contracted to and carried out by an organisation called Transelect. Landscaping was contracted to RK Evans Landscaping Pty Ltd. The as-built program is not in dispute.
[Note: the rest of the work was undertaken by other contractors]

[Clause 184]   White bears the onus of establishing that it suffered loss and the quantum of it.

[Clause 185]   White’s damages are based on delay to the whole project, said to be attributable to the late (underbore) sewer design. This is not the type of subject upon which precise evidence cannot be adduced. [Therefore] It is not a subject which involves the Court having to make an estimation or engage in some degree of guesswork.

[Clause 188]   The descriptions of the methods adopted by Shahady and Senogles respectively are evidently derived from the publication of the United Kingdom Society of Construction Law, the Delay and Disruption Protocol….

[Clause 191]   Mr McIntyre’s opinion, upon which I propose to act, is that for the purpose of any particular case, the fact that a method appears in the Protocol does not give it any standing, and the fact that a method, which is otherwise logical or rational, but does not appear in the Protocol, does not deny it standing.
[Note: this is the same wording as an express statement contained in the Delay and Disruption Protocol]

[Clause 195]   Mr McIntyre’s opinion, upon which I propose to act, is that neither method [used by the parties experts] is appropriate to be adopted in this case.

[Clause 196]   Mr McIntyre’s opinion, upon which I propose to act, is that close consideration and examination of the actual evidence of what was happening on the ground will reveal if the delay in approving the sewerage design actually played a role in delaying the project and, if so, how and by how much. In effect, he advised that the Court should apply the common law common sense approach to causation referred to by the High Court in March v E & MH Stramare Pty Ltd (1991) 171 CLR 506.

[Clause 197]   The Court is concerned with common law notions of causation. The only appropriate method is to determine the matter by paying close attention to the facts, and assessing whether White has proved, on the probabilities, that delay in the underboring solution delayed the project as a whole and, if so, by how much.

[Clause 198]   This requires it to establish that:
• the whole project would have been completed by 15 July 2016,
• the final sewer approval delay delayed sewer works,
• the sewer works delay prevented non-sewer works from otherwise proceeding, that is, that the programme could not reasonably have been varied to accommodate the consequences of late approval, and
• other works could not have been done to fill downtimes so as to save time later.

[Clause 199]   ……… White has failed to discharge this burden.

Summary

The factors required to prove a delay, outlined by the Judge at Clause 198, can be generalised as follows:

  1. The completion date for the project before the delay event occurred has to be known with some certainty.
  2. The delay event has to be shown to cause a delay which flowed through to extend the overall project completion date.
  3. There were not reasonable alternative ways of working that could mitigate the effect of the delay on project completion.

More significantly, none of these steps needs a CPM schedule.  The project status and the effect of the disruption on project completion can be assessed based on its effect on the productivity of key resources. This is discussed in Assessing Delays in Agile & Distributed Projects: https://mosaicprojects.com.au/PDF_Papers/P215_Assessing_Delays_In_Agile_+_Distributed_Projects.pdf


[1]     This approach by the courts is discussed in Delivering Expert Evidence is Becoming Harder: https://mosaicprojects.com.au/Mag_Articles/AA028_Delivering_Expert_Evidence.pdf

The Artificial Intelligence Ecosystem

We have posted a number of times discussing aspects of Artificial Intelligence (AI) in project management, but what exactly is AI?  This post looks at the components in the AI ecosystem and briefly outlines what the various terms mean.

𝗔𝗿𝘁𝗶𝗳𝗶𝗰𝗶𝗮𝗹 𝗜𝗻𝘁𝗲𝗹𝗹𝗶𝗴𝗲𝗻𝗰𝗲: a range of computer algorithms and functions that enable computers and machines to mimic the problem-solving and decision-making capabilities of the human mind.

Automatic Programming: is a technology that enables computers to generate code or write programs with minimal human intervention.

Knowledge Representation: is concerned with representing information about the real world in a way that a computer can understand, so it can utilize this knowledge and behave intelligently.

Expert System: is a computer system emulating the decision-making ability of a human expert. A system typically includes: a knowledge base, an inference engine that applies logical rules to the knowledge base to deduce new information, an explanation facility, a knowledge acquisition facility, and a user interface.

Planning and Scheduling: an automated process that achieves the realization of strategies or action sequences that are complex and must be discovered and optimized in multidimensional space, typically for execution by intelligent agents, autonomous robots, and unmanned vehicles.

Speech Recognition: the ability of devices to respond to spoken commands. Speech recognition enables hands-free control of various devices, provides input to automatic translation, and creates print-ready dictation.

Intelligent Robotics: robots that function as intelligent machines and can be programmed to take actions or make choices based on input from sensors.

Visual Perception: enables machines to derive information from, and understand, images and visual data in a way similar to humans.

Natural Language Processing (NLP): gives computers the ability to understand text and spoken words in much the same way human beings can.

Problem Solving & Search Strategies: Involves the use of algorithms to find solutions to complex problems by exploring possible paths and evaluating the outcomes. A search algorithm takes a problem as input and returns a solution in the form of an action sequence.

𝗠𝗮𝗰𝗵𝗶𝗻𝗲 𝗟𝗲𝗮𝗿𝗻𝗶𝗻𝗴: is concerned with the development and study of statistical algorithms that allow a machine to be trained so it can learn from the training data and then generalize to unseen data, to perform tasks without explicit instructions. There are three basic machine learning paradigms, supervised learning, unsupervised learning, and reinforcement learning.

• Supervised learning: is when algorithms learn to make decisions based on past known outcomes. The data set containing past known outcomes and other related variables used in the learning process is known as training data.

• Unsupervised learning: is a type of machine learning that learns from data without human supervision. Unlike supervised learning, unsupervised machine learning models are given unlabelled data and allowed to discover patterns and insights without any explicit guidance or instruction.

• Reinforcement learning (RL): is an interdisciplinary area of machine learning concerned with how an intelligent agent ought to take actions in a dynamic environment to maximize the cumulative reward.
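Of the three paradigms, supervised learning is the easiest to illustrate. The sketch below is a toy one-nearest-neighbour classifier in Python; the data, function name, and labels are all hypothetical and chosen purely for illustration:

```python
def nearest_neighbour_predict(training_data, x):
    """Predict a label for x from labelled examples (1-nearest neighbour).

    A minimal sketch of supervised learning: the training data pairs
    an input variable with a past known outcome (the label).
    """
    # Find the training example whose input is closest to x ...
    nearest = min(training_data, key=lambda pair: abs(pair[0] - x))
    # ... and reuse its known outcome as the prediction.
    return nearest[1]

# Past known outcomes: small values were 'low' risk, large were 'high'.
training = [(1.0, "low"), (2.0, "low"), (8.0, "high"), (9.0, "high")]
print(nearest_neighbour_predict(training, 1.5))  # → low
print(nearest_neighbour_predict(training, 8.5))  # → high
```

The algorithm never receives explicit rules; it generalizes from the labelled examples alone, which is the defining feature of supervised learning.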

Classification: a process where AI systems are trained to categorize data into predefined classes or labels.

K-Means Clustering: cluster analysis is an analytical technique used in data mining and machine learning to group similar objects into related clusters.
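The idea can be sketched in a few lines of Python. This is a minimal one-dimensional version of Lloyd's algorithm with made-up data; real applications would use a library implementation such as scikit-learn's KMeans:

```python
import random

def k_means(points, k, iterations=20, seed=42):
    """Group 1-D data points into k clusters (minimal Lloyd's algorithm)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iterations):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two obvious groups, around 1 and around 10.
data = [0.9, 1.0, 1.1, 9.8, 10.0, 10.2]
print(k_means(data, k=2))  # centroids near [1.0, 10.0]
```

The alternation between assigning points to the nearest centroid and recomputing the centroids is the essence of the technique; everything else in production systems (multiple restarts, choosing k, higher dimensions) is refinement.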

Principal Component Analysis (PCA): is a dimensionality reduction method used to reduce the dimensionality of large data sets, by transforming a large set of variables into a smaller one that still contains most of the information in the large set.

Automatic Reasoning: attempts to provide assurance about what a system or program will do or will never do based on mathematical proof.

Decision Trees:  flow charts created by a computer algorithm to make decisions or numeric predictions based on information in a digital data set.

Random Forest: is an algorithm that combines the output of multiple decision trees to reach a single result. It handles both classification and regression problems.

Ensemble Methods: are techniques that aim at improving the accuracy of results in models by combining multiple models instead of using a single model. The combined models increase the accuracy of the results significantly.

Naive Bayes: is a statistical classification technique based on Bayes Theorem. It is one of the simplest supervised learning algorithms.

Anomaly Detection: the identification of rare events, items, or observations which are suspicious because they differ significantly from standard behaviours or patterns.
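A minimal sketch of the idea, using a simple z-score test in Python (the data is hypothetical, and real detectors are considerably more sophisticated):

```python
from statistics import mean, stdev

def find_anomalies(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mu = mean(values)
    sigma = stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Daily transaction counts with one suspicious spike.
counts = [101, 99, 102, 98, 100, 103, 97, 500]
print(find_anomalies(counts, threshold=2.0))  # → [500]
```

The principle is the same at any scale: define "standard behaviour" statistically, then flag observations that differ from it significantly.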

𝗡𝗲𝘂𝗿𝗮𝗹 𝗡𝗲𝘁𝘄𝗼𝗿𝗸𝘀: are machine learning (ML) models designed to mimic the function and structure of the human brain, helping computers gather insights and meaning from text, data, and documents by being trained to recognise patterns and sequences.

Large Language Model (LLM): is a type of neural network called a transformer program that can recognize and generate text, answer questions, and generate high-quality, contextually appropriate responses in natural language. LLMs are trained on huge sets of data.

Radial Basis Function Networks: are a type of neural network used for function approximation problems. They are distinguished from other neural networks due to their universal approximation and faster learning speed.

Recurrent Neural Networks (RNN): are a type of neural network where the output from the previous step is used as input to the current step. In traditional neural networks, all the inputs and outputs are independent of each other; but to predict the next word of a sentence, for example, the previous words are required, hence the need to remember them.

Autoencoders: are a type of neural network used to learn efficient codings of unlabelled data (unsupervised learning). An autoencoder learns two functions: an encoding function that transforms the input data into a code, and a decoding function that recreates the input data from the encoded representation.

Hopfield Networks: are recurrent neural networks with a synaptic connection pattern such that there is an underlying Lyapunov function (a measure of stability) for the activity dynamics. Started in any initial state, the state of the system evolves to a final state that is a (local) minimum of the Lyapunov function.

Modular Neural Networks: are characterized by a series of independent neural networks moderated by some intermediary to allow for more complex management processes.

Adaptive Resonance Theory (ART): is a theory developed to address the stability-plasticity dilemma. The terms adaptive and resonance mean that the network can adapt to new learning (adaptive) without losing previous information (resonance).

Deep Learning:  is a method in artificial intelligence (AI) that teaches computers to process data in a way that is inspired by the human brain. Deep learning models can recognize complex patterns in pictures, text, sounds, and other data to produce accurate insights and predictions. The adjective deep refers to the use of multiple layers in the network.

Transformer Model:  is a neural network that learns context and thus meaning by tracking relationships in sequential data by applying an evolving set of mathematical techniques to detect subtle ways even distant data elements in a series influence and depend on each other.

Convolutional Neural Networks (CNN): is a regularized type of feed-forward neural network that learns feature engineering by itself via filters or kernel optimization.

Long Short-Term Memory Networks (LSTM): is a recurrent neural network (RNN), aimed to deal with the vanishing gradient problem present in traditional RNNs.

Deep Reinforcement Learning: is a subfield of machine learning that combines reinforcement learning (RL) and deep learning.

Generative Adversarial Networks (GAN): is a class of machine learning frameworks for approaching generative AI. Two neural networks contest with each other in the form of a zero-sum game, where one agent’s gain is another agent’s loss.  Given a training set, this technique learns to generate new data with the same statistics as the training set. A GAN trained on photographs can generate new photographs that look at least superficially authentic.

Deep Belief Networks (DBN): are a type of neural network that is composed of several layers of shallow neural networks (RBMs) that can be trained using unsupervised learning. The output of the RBMs is then used as input to the next layer of the network, until the final layer is reached. The final layer of the DBN is typically a classifier that is trained using supervised learning. DBNs are effective in applications, such as image recognition, speech recognition, and natural language processing.

For more discussion on the use of AI in project management see:
https://mosaicprojects.com.au/PMKI-SCH-033.php#AI-Discussion

One Defence Data – Another ‘Big Consultant’ issue?

Hidden in the pre-Christmas holiday fun, the ABC[1] published an ‘investigations exclusive’ by Linton Besser and defence correspondent Andrew Greene[2] that needs more attention.

It appears project ICT2284 (One Defence Data), a $515 million project to unify and exploit the data resources held by the Department of Defence, is in trouble due to hasty (pun intended) decisions made before the last election. Within this overall project, a $100 million One Defence Data “systems integrator” contract was awarded to KPMG Australia Technologies Solutions on the eve of the last federal election, and the then assistant minister for defence, Andrew Hastie, announced KPMG’s contract, promising it would “deliver secure and resilient information systems”.

This award was made after KPMG had been paid $93 million between 2016 and 2022 for consulting work on a range of strategic advice, which included the development of ICT2284 and its failed forerunner, known as Enterprise Information Management, or EIM.

Unsurprisingly, the review by Anchoram Consulting highlighted both governance and procedural issues including:

  • The project has been plagued by a “lack of accountability” and conflicts of interest.
  • The documents suggest there is profound confusion inside Defence about who is in charge and what is actually being delivered.
  • Core governance documents have not been signed off and key requirements of KPMG’s contract have been diluted from “mandatory” to “desirable”, sometimes in consultation with KPMG itself.
  • The project had been “retrospectively” designed to justify a $100 million contract that was issued to KPMG Australia Technologies Solutions, or KTech, exposing the department to “significant risk”.

The heart of ICT2284’s problem appears to be the project’s fundamental design work had “not been done due to … the rush to meet deadlines tied to the Cabinet submission and related procurement activities”, with “no understood and agreed, desired end-state”.

Predictably both the area of Defence running the project, known as CIOG, or the Chief Information Officer Group, and KPMG reject the report findings.

The full ABC report is at: https://amp.abc.net.au/article/103247476

From a governance perspective the biggest on-going issue appears to be the lack of capability within CIOG and government generally to manage this type of complex project. The downsizing and deskilling of the public service has been on-going for decades (under both parties). This means the outsourcing of policy development to the big consultancies is inevitable, and their advice will be unavoidably biased towards benefitting them.

The actions by the current government to reverse this trend are admirable but will take years to be effective. In the meantime, we watch.

For more on governance failures see: https://mosaicprojects.com.au/PMKI-ORG-005.php#Process4

For good governance practice see: https://mosaicprojects.com.au/PMKI-ORG-005.php#Process3


[1] Australian Broadcasting Corporation

[2] Posted Tue 19 Dec 2023 at 6:41pm:

A Brief History of Agile

The history of agile software development is not what most people think, and is nothing like the story pushed by most Agile Evangelists.

Our latest publication A Brief History of Agile shows that from the beginning of large system software development the people managing the software engineering understood the need for prototyping and iterative and incremental development. This approach has always been part of the way good software is developed.

The environment in which the authors of the early papers referenced and linked in the article were operating – satellite software and ‘cold-war’ control systems – plus the limitations of the computers they were working on, did require a focus on testing and documentation: it’s too late for a bug-fix once WW3 has started.  But this is no different to modern-day control systems development where people’s lives are at stake. Otherwise, nothing much has changed; good software is built incrementally and tested progressively.

The side-track into ‘waterfall’ seems to have been started by people with a focus on requirements management and configuration management, both approached from a document-heavy, bureaucratic perspective. Add the desire of middle-management for the illusion of control, and you get waterfall imposed on software developers by people who knew little about the development of large software systems. As predicted in 1970, ‘doing waterfall’ doubles the cost of software development. The fact that waterfall survives in some organisations through to the present time is a function of culture and the desire for control, even if that control is an illusion.

The message from history, echoed in the Agile Manifesto, is you need to tailor the documentation, discipline, and control processes, to meet the requirements of the project. Developing a simple website with easy access to fix issues is very different to developing the control systems for a satellite that is intended to work for years, millions of miles from earth.

To read the full article and access many of the referenced papers and third-party analysis see: https://mosaicprojects.com.au/PMKI-ZSY-010.php#Agile

The evolution of the DCMA 14 Point Schedule Assessment

The DCMA 14-Point schedule assessment process was part of a suite of systems developed by the Defense Contract Management Agency (DCMA) starting in 2005. Its purpose was to standardize the assessment by DCMA staff of contractor developed Integrated Master Schedules.  

The DCMA is an agency of the United States federal government reporting to the Under Secretary of Defense for Acquisition and Sustainment, responsible for administering contracts for the Department of Defense (DoD), and other authorized federal agencies. It was created in February 1990 as the Defense Contract Management Command (DCMC) within the Defense Logistics Agency (DLA), then in March 2000, the DCMC was renamed as the Defense Contract Management Agency (DCMA) and made an independent agency.

The DCMA works directly with defense suppliers and contractors to oversee their time, cost, and technical performance, by monitoring the contractors’ performance and management systems to ensure that cost, product performance, and delivery schedules comply with the terms and conditions of the contracts.

The DOD had published specifications for an Integrated Master Schedule since 1991. In March 2005, the US Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) issued a memo mandating the use of an Integrated Master Schedule (IMS) for contracts greater than $20 million and requiring the DCMA to establish guidelines and procedures to monitor and evaluate these schedules.

In response, the DCMA developed their 14-Point Assessment Checks as a protocol to be used for CPM schedule reviews made by their staff. The initial documentation describing this protocol appears to have consisted of an internal training course provided by the DCMA. The training was deployed as an on-line course in late 2007 and updated and re-issued on 21st November 2009 (none of these training materials appear to be available – there may have been other changes).

The final and current update to the 14 Point Assessment was included in Section 4 of the Earned Value Management System (EVMS) Program Analysis Pamphlet (PAP) DCMA-EA PAM 200.1 from October 2012.  This PAP appears to still be current as at the start of 2024 with DCMA training available to assessors. This version is the basis of our guide: DCMA 14-Point Assessment Metrics.

However, the usefulness of this approach to assessing schedule quality is questionable!

The objectives of the DCMA 14-Point Assessment are frequently misunderstood; this was discussed in DCMA 14 Point Schedule Assessment – Updated.

The validity of some of the ‘checks’ is questionable; they do not conform to the standards set by various scheduling authorities.

And the relevance of the DCMA 14-Points is also questionable. The publication of the U.S. Government Accountability Office (GAO) Schedule Assessment Guide in December 2015 appears to provide a more realistic basis for assessment.

Based on the above, one has to ask if this old baseline for assessing schedule quality is still relevant. For more on schedule quality assessment see: https://mosaicprojects.com.au/PMKI-SCH-020.php#Overview

Agile’s Hidden Secret!

The two fundamental questions standard agile metrics cannot answer consistently are:

1.  How far ahead or behind schedule are we currently?

2.  When are we expected to finish?

Most of the tools and techniques used to manage Agile projects are good at defining the work (done, in-progress, or not started) and can indicate if the work is ahead of or behind a nominated planned rate of production. However, there is no direct calculation of the time the work is currently ahead of or behind the required production rate, or what this is likely to mean for the completion of the project. A full discussion of this topic is in Calculating Completion. Most project sponsors and clients need to know when the project they are funding will actually finish; other people are depending on the project’s outputs to achieve their objectives. At present, all Agile can offer is an educated assessment based on the project team’s understanding of the work.

Work Performance Management (WPM) has been designed to solve this challenge by providing answers to these questions based on consistent, repeatable, and defensible calculations.

WPM is a simple, practical tool that uses project metrics already being collected for other purposes within the project to assess progress and calculate a predicted completion date. It compares the amount of work achieved at a point in time with the amount of work that should have been achieved by that time. From this data, WPM calculates the project status and the expected completion date, assuming the rate of progress remains constant.
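The underlying arithmetic is straightforward. The sketch below is an illustrative, simplified model only — it is not the WPM tool itself, and the function and parameter names are invented for this example. It assumes a uniform planned rate of production, reports how many days the work is ahead of or behind that rate, and projects a completion date from the rate actually achieved to date:

```python
from datetime import date, timedelta

def wpm_style_status(start, planned_finish, status_date, work_done, total_work):
    """Illustrative sketch: compare the work achieved with the work that
    should have been achieved, assuming a uniform planned rate of production."""
    planned_days = (planned_finish - start).days
    elapsed_days = (status_date - start).days
    planned_rate = total_work / planned_days     # units of work per day, planned
    achieved_rate = work_done / elapsed_days     # units of work per day, actual

    # Days ahead (+) or behind (-): when *should* the current amount of
    # work have been complete, compared with the time actually elapsed?
    earned_days = work_done / planned_rate
    time_variance = earned_days - elapsed_days

    # Expected finish, assuming the achieved rate of progress continues.
    remaining_work = total_work - work_done
    expected_finish = status_date + timedelta(days=remaining_work / achieved_rate)
    return time_variance, expected_finish

variance, finish = wpm_style_status(
    start=date(2024, 1, 1), planned_finish=date(2024, 5, 10),
    status_date=date(2024, 3, 1), work_done=36, total_work=100)
```

With 36 of 100 units of work done after 60 of the 130 planned days, this sketch reports the work as roughly 13 days behind and projects a mid-June finish instead of 10 May.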

Our latest article, WPM for Agile Projects, identifies the cause of this information gap in Agile project management, explains why current tools cannot accurately predict completion, and demonstrates how WPM closes this critical gap.
Download WPM for Agile Projects: https://mosaicprojects.com.au/Mag_Articles/AA040_-_WPM_for_Agile_Projects.pdf

For more on the practical use of WPM, free sample files, and access to the tool see: https://mosaicprojects.com.au/PMKI-SCH-041.php  

The Port of Melbourne is not what it seems

View of the Port of Melbourne looking East.

The Port of Melbourne is the largest port for containerized and general cargo in Australia. Anyone visiting the port, or Melbourne generally, would think the Yarra River must have been a useful harbour at the time of settlement, and that the various docks were built to enhance this natural asset. Modern maps, and the view out to Port Phillip Bay, reinforce this impression.

Aerial view of the Port of Melbourne
View to the South West of the port and bay.

The truth is very different: almost everything shown in the pictures above is man-made.

Settlement

The settlement of Melbourne started in 1835. To put this date in context, it is 20 years after the Battle of Waterloo and 2 years before the coronation of Queen Victoria[1].  The original settlement was located in the area of the current day CBD. This site was chosen for its access to fresh water from the stream running through the site, rather than its potential as a port.  

The full Once As It Was map showing the lands of the Boon Wurrung people can be obtained from:
https://www.ecocentre.com/programs/community-programs/indigenous/

Early problems

An underwater sand bar at the entrance of the Yarra River ruled out the entry of vessels drawing more than about nine feet of water and access up river was blocked at ‘The Falls’, a rock bar running across the river which was used as a crossing point by the local Aboriginal peoples.

This limited access meant ships arriving from overseas had to drop anchor in Hobson’s Bay, or moor at the Sandridge (Port Melbourne) Pier. Passengers and goods then had to walk, use carts, or be transshipped up the river in smaller vessels or ‘lighters’ as they were called. Costs were excessive! It has been recorded that it cost 30 shillings per ton (half the entire freight costs for the voyage from England) to have goods taken the eight miles from sea to city.

The discovery of gold in 1851 exacerbated the problems of the port. In just one week in 1853 nearly 4000 passengers from 138 ships arrived in Hobson’s Bay, and in 1858 the average delay moving goods from the port into the city was three weeks.

The initial solution to this problem was the construction in 1854 of the first steam railway in Australia, running from piers built at Sandridge (now the suburb of Port Melbourne) to the city. This railway is discussed in The First Steam Powered Railway in Australia, but a better solution was needed for cargo.

Developing the Port of Melbourne – 1839 to 1877

The Yarra River was progressively improved to facilitate trade. Jetties were built along the banks of the river from 1839 onwards, funded by wharfage charges. This 1857 photograph shows the wharves downstream from ‘The Falls’.

1857 – Yarra River downstream from ‘The Falls’

The 1864 map of Melbourne shows these limitations were still hampering the development of Melbourne despite the massive influx of money from the Victorian gold rush. The Falls effectively dammed the river, causing major flooding, and the swamps, bends, and shoals in the Yarra restricted trade.

See a full version of this 1864 map at:
https://mosaicprojects.com.au/PDF-Gen/Melbourne_1864.jpg

Developing the Port of Melbourne – 1877 to 1902

To overcome the problems, the Melbourne Harbor Trust was formed in 1877 and engaged English engineer, Sir John Coode to recommend solutions.

Starting in 1879, Sir John Coode made three key recommendations:
– the development of a canal to improve access for ships,
– the demolition of The Falls to reduce flooding, and
– the deepening of the narrow entrance to Port Phillip Bay from the ocean.

Based on Sir John’s recommendations, the course of the lower Yarra was significantly altered. This visionary feat of engineering involved 2,000 workers for 20 years. The work not only significantly shortened travel time up the river for ships, but also created Victoria Harbour and Victoria Dock.

Plan for Coode Canal and Victoria Dock
(the dock was changed to a single body of water later)

The original wide loop in the river was eliminated through the construction of the 1.5 km Coode Canal which opened in 1886, and West Melbourne Dock (now Victoria Dock) opened to shipping in 1893. The canal created Coode Island and caused the shallow, narrow and winding Fishermans Bend to be cut off along with other sections of the river including Humbug Reach and the original junction with the Maribyrnong River. The Coode Canal was deepened to 25 feet and widened from 100 to 145 feet in 1902.

The plan to remove ‘The Falls’ involved clearing the reef to a uniform depth of 15 feet 6 inches, at an estimated cost of £20,000.  The demolition was complete by 1883, funded by a combination of the Victorian Government and the Harbour Trust. The reduction in flooding caused by the improved river flows converted flood plains and swamps into dry land, encouraging the development of South Melbourne, discussed in The evolution of South Melbourne. The lake in Albert Park is all that remains of the freshwater lagoons and seasonal swamps south of the Yarra.

The Rip at Port Phillip Heads was also deepened in 1883 using explosives.

Developing the Port of Melbourne – 1902 onward.

The piers at Sandridge (Port Melbourne) continued to be important, but mainly for passengers. A new pier built to the west of Railway Pier opened in 1916, called the New Railway Pier; it was renamed Princes Pier in 1921. These piers were important locations for the departure and return of troops in the Boer War in South Africa, WW1, and WW2, as well as the arrival of many thousands of migrants after WW2.

The Piers at Port Melbourne c1950.

The current Station Pier, which replaced the original Railway Pier, was built between 1922 and 1930 and remains the primary passenger arrival point in Melbourne, with cruise ships visiting throughout summer. Princes Pier has been demolished.

Meanwhile, most cargo shipments were handled by the Victoria Dock, and by 1908 it was handling ninety per cent of Victoria’s imports. In 1914 its capacity was enlarged by the addition of a central pier and in 1925 the entrance was widened. But with rapidly increasing imports and exports further renovation and development was needed. Also, as ships increased in size there was a need for larger wharves and deeper berths to accommodate them.

The growth of the city also encroached on the eastern end of the docks. The construction of the Spencer Street Bridge in 1927-28 meant that all port traffic had to be handled further downstream, foreshadowing the need for even more docks and the expansion of the port towards the west and the bay.

Construction of Spencer St. Bridge
Port development at Coode Island in 1958

To overcome these challenges, the docks spread to the west and now cover all of the land between Moonee Ponds Creek and the Maribyrnong River, totally absorbing Coode Island.

The original Victoria Dock and the adjacent North Wharf on the river continued to play a vital role, handling up to half of the Port of Melbourne’s trade, until the shift to containerisation and then the construction of the Bolte Bridge in 1999 made the old port facilities redundant. From 2000, the Victoria Docks became Docklands and were revitalised as commercial and residential areas, while the Port of Melbourne continues to expand downstream.

The last major expansion of the port was the construction of Webb Dock at the mouth of the Yarra River in 1960.

Webb dock at the mouth of the Yarra

Improvements in both wharf-side and land-side facilities continue, but despite all of these improvements the Port of Melbourne is approaching capacity. The next developments are not far off, but need political decisions on the location, either in Port Phillip Bay or Westernport Bay, to allow the next transformation to start.

Footnote on the names

The rock bar called ‘The Falls’ in this post was known as Yarro-Yarro, meaning “waterfall or ever flowing”, by the local peoples. The river was known as the Birrarung by the Wurundjeri people. The first settlers confused the names and called the river Yarra.

The Yarro-Yarro falls were important to the local Aboriginal tribes, the Woiwurrung and the Boonerwrung, who used them as a crossing point between their lands in order to negotiate trade and marriages. This was the only means of crossing the Yarra River until ferries and punts began operating c. 1838. The first bridge was constructed c. 1845 a little further upstream, very near what is now known as Princes Bridge.

For more papers on the history of the construction industry see:
https://mosaicprojects.com.au/PMKI-ZSY-005.php#Bld


[1] A historical timeline can be viewed at: https://mosaicprojects.com.au/PDF_Papers/P212_Historical_Timeline.pdf