
DCMA 14 Point Schedule Assessment – Updated

There seems to be a lot of misunderstanding around the intention and use of the DCMA 14 Point Schedule Assessment. Following on from several discussions over the last month or so, we have updated our White Paper, DCMA 14-Point Assessment Metrics, and uploaded the last published version of the metrics: Earned Value Management System (EVMS) Program Analysis Pamphlet (PAP), DCMA-EA PAM 200.1, October 2012.

The EVMS-PAP is designed for use in performing an integrated baseline review of a major program using EVM, but EVM relies on a competent schedule, and Section 4 of DCMA-EA PAM 200.1 defines the last published version of the DCMA 14 Point Schedule Metrics. As can be seen from the date of publication, the DCMA 14 Points are quite old, and they changed in the years before 2012 (for more on the evolution of the DCMA 14 Points see: The evolution of the DCMA 14 Point Schedule Assessment). This leads to two significant problems:

The first is that many people misunderstand the objective of the assessment. The objective is stated explicitly in the document:

The DCMA 14 Point Schedule Metrics were developed to identify potential problem areas with a contractor’s IMS… These metrics provide the analyst with a framework for asking educated questions and performing follow-up research. The identification of a “red” metric is not in and of itself synonymous with failure but rather an indicator or a catalyst to dig deeper in the analysis for understanding the reason for the situation. Consequently, correction of that metric is not necessarily required, but it should be understood.

Earlier versions talked about pass/fail; this concept has been dropped (and was never a good idea).

The second issue is the implementation of the assessment. The implementation of the DCMA 14-Point Assessment in the various software tools is not certified by the DCMA or any other body, and it varies between the tools! The biggest issue is counting the number of tasks to be considered: the 2012 version stated that the Total Tasks should exclude completed tasks, LOE tasks, subprojects (called summary tasks in MS Project), and milestones (zero-duration tasks). This differs from the 2009 update, which in turn changed from earlier versions.
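To illustrate why the counting rule matters, the 2012 exclusions can be sketched as a simple filter. This is a minimal, illustrative example only: the `Task` fields and names below are assumptions for the sketch, not the API of any particular scheduling tool, and real tools apply their own (varying) interpretations.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    duration: int          # working days; 0 indicates a milestone
    percent_complete: int  # 100 indicates a completed task
    is_loe: bool           # Level of Effort task
    is_summary: bool       # summary / subproject task

def countable_tasks(tasks):
    """Apply the 2012 (DCMA-EA PAM 200.1) exclusions to derive Total Tasks:
    drop completed tasks, LOE tasks, summary tasks, and zero-duration milestones."""
    return [
        t for t in tasks
        if t.percent_complete < 100  # exclude completed tasks
        and not t.is_loe             # exclude LOE tasks
        and not t.is_summary         # exclude summary tasks / subprojects
        and t.duration > 0           # exclude milestones
    ]

# Hypothetical five-task schedule: only one task survives the exclusions.
schedule = [
    Task("Design", 10, 100, False, False),   # completed - excluded
    Task("Build", 20, 50, False, False),     # counted
    Task("PM support", 60, 0, True, False),  # LOE - excluded
    Task("Phase 1", 30, 0, False, True),     # summary - excluded
    Task("Go-live", 0, 0, False, False),     # milestone - excluded
]
print(len(countable_tasks(schedule)))  # -> 1
```

Because several of the 14 metrics are expressed as percentages of Total Tasks, a tool that includes (say) completed tasks in the denominator will report different percentages for the same schedule, which is exactly the divergence described above.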

There is an established correlation between a competently prepared schedule and project success – successful projects tend to have an effective controls system and a ‘good’ schedule, but the key measure of a good schedule is that it is useful and is used. The purpose of the DCMA checks is to identify issues that need to be understood.

For more on schedule quality see: https://mosaicprojects.com.au/PMKI-SCH-020.php#Overview

Assessing Delay and Disruption

In preparation for the IAMA National conference later this week, I have just finished developing and updating a short series of papers focused on schedule delay and disruption.

  • Assessing Delay and Disruption – an overview of the accepted methods of forensic schedule analysis [ view the paper ]
  • Prolongation, Disruption and Acceleration Costs – an overview of the options for calculating costs associated with approved delays and acceleration [ view the paper ]
  • The complexities around concurrent and parallel delays are discussed in Mosaic’s White Paper WP1064, Concurrent and Parallel Delays

Any comments are welcome.