Summary and recommendations

Background

1. Defence materiel sustainment is about the maintenance and support of Defence’s fleets of specialist military equipment: the provision of in-service support for naval, military and air platforms, fleets and systems operated by Defence. Effective sustainment of naval, military and air assets is essential to the preparedness of the Australian Defence Force (ADF) and to enable Defence to conduct operations. Defence spends similar amounts each year on sustainment and the acquisition of new equipment. In 2015–16, Defence spent $6.3 billion—21 per cent of its total departmental expenditure—on the sustainment of specialist military equipment.

2. Parliamentary committees have frequently stated an interest in Defence’s reporting of its sustainment performance and, in particular, obtaining greater insight into that performance. During the course of this audit the Joint Committee of Public Accounts and Audit (JCPAA) announced that it had commenced an inquiry into Defence sustainment expenditure.

3. Part of the context for this audit is to inform the Parliament regarding Defence’s approach to managing sustainment expenditure, including in the context of the current JCPAA inquiry, and to inform the ANAO’s future program in this area.

Audit objective and criteria

4. The objective of the audit was to assess whether Defence has a fit-for-purpose framework for the management of materiel sustainment. To form a conclusion against the audit objective, the ANAO adopted the following high-level criteria:

  • Defence has established an appropriate governance and operational framework for the management of materiel sustainment;
  • Defence has established and implemented a high quality performance framework to support the management and external scrutiny of materiel sustainment; and
  • Defence has achieved key outcomes expected from the Smart Sustainment reforms and has progressed its implementation of the reforms to sustainment flowing from the First Principles Review.

Conclusion

5. The fundamentals of Defence’s governance and organisational framework for the management of materiel sustainment are fit-for-purpose. However, Defence continues to address specific operational shortcomings and there remains scope for Defence to improve its performance monitoring, reporting and evaluation activities to better support the management and external scrutiny of materiel sustainment.

6. Defence has clear and long-standing governance and organisational arrangements for managing the sustainment of specialist military equipment.

7. Research and reviews conducted for Defence have revealed a range of specific operational problems that are detracting from the efficient and effective sustainment of Defence capability, including the functioning of Systems Program Offices. Defence has initiated a reform project as part of its First Principles Review implementation.

8. The development of an effective sustainment monitoring system remains a work in progress, and the effectiveness of Defence’s internal reporting system for sustainment could be improved in several areas. Opportunities also remain to increase the completeness and transparency of publicly reported information regarding materiel sustainment.

9. The 2015 First Principles Review was preceded by earlier major reform initiatives, notably the Strategic Reform Program (SRP) begun in 2009. Defence records show that the department made substantial efforts to keep track of the large number of diverse initiatives identified across the department under the ‘smart sustainment’ reforms associated with the SRP, including internal reporting to management. However, Defence did not adequately assess the outcomes from the ‘smart sustainment’ reforms.

10. Reforms to the management of sustainment flowing from the First Principles Review remain at an early stage, and this stream of activity is likely to take much longer than the expected two years. For example, the Systems Program Office reviews are not yet complete and Defence has provided no evidence that decisions have been taken on changes to their structure and functioning.

11. Defence has engaged industry expertise to guide and assist its implementation of the First Principles reforms relating to acquisition and sustainment, including contracts valued at some $107 million with a single company that are not performance-based. Reform is expected to lead to greater outsourcing of functions currently performed in-house by Defence’s Systems Program Offices.

Supporting findings

Defence’s governance and operational framework for the management of materiel sustainment

12. Defence has clear and long-standing arrangements for managing the sustainment of specialist military equipment. Key elements of the governance and organisational framework include: a specialist organisational unit—currently the Capability Acquisition and Sustainment Group—staffed by a mix of civilian and military personnel; contract-like arrangements for sustainment between that unit and the Chiefs of Navy, Army and Air Force (Defence’s Capability Managers); and day-to-day responsibility for most sustainment activities falling to Systems Program Offices, which carry out that work using a mix of in-sourced and out-sourced service provision. Military units also undertake some operational-level sustainment activities.

13. Nevertheless, an internal review conducted by Defence and research conducted for Defence following the 2015 First Principles Review have identified a range of operational problems that detract from the efficient and effective sustainment of Defence capability, including: adherence to procurement principles; staff capabilities; duplication of effort; and transparency of internal costs. A project to reform and consolidate Systems Program Offices is currently underway, with 24 out of 64 Systems Program Offices (37.5 per cent) reviewed as at February 2017.

Defence’s performance framework for materiel sustainment

14. With the introduction of its Sustainment Performance Management System (SPMS), Defence continues to develop a basis for an effective monitoring system for sustainment. Once fully implemented this system should be capable of systematically reporting against a suite of performance indicators settled in agreement with Capability Managers.

15. There remains potential to improve some core key performance indicators used within the system—for example, to more usefully determine the total cost of the capability to Defence. The SPMS was not fully implemented during audit fieldwork but is expected to be fully operational by the end of June 2017. In the longer term the system is to be expanded to cover acquisition.

16. The effectiveness of Defence’s internal reporting system for sustainment could be improved in several areas. The Quarterly Performance Report is the primary means by which Defence provides information to government and senior Defence personnel about the status of major acquisition and sustainment activities. However, based on the ANAO’s review of a Quarterly Performance Report produced during the audit, its contents are neither complete nor reliable, it takes two months to produce, and it is sometimes difficult to understand. The ANAO’s analysis found that the report may omit information available to Defence that is critical to the reader’s ability to understand the status of significant military platforms. It provides only a partial account of materiel sustainment within Defence and is potentially at odds with the ‘One Defence’ model promoted by the First Principles Review. The ANAO has recommended that Defence institute a risk-based quality assurance process for information included in the Quarterly Performance Report.

17. Defence conducts reviews of sustainment performance through sustainment gate reviews that help Defence to obtain insight into a project or product’s progress and status. The effectiveness of these reviews could be increased if the lessons obtained from gate reviews were routinely incorporated into management reporting on sustainment and if gate reviews were extended to contribute to the proposed quality assurance mechanism for Quarterly Performance Reports.

18. Defence has not implemented measures of efficiency and productivity for all sustainment products. Reviews have consistently emphasised the need for Defence to improve the efficiency of its operations, including in sustainment. Most recently, the First Principles Review recommended immediate implementation of measures of productivity, a related concept. The ANAO found that, 18 months after implementation commenced, there had been limited progress.

19. Defence has recently improved its whole-of-life costing of proposals to acquire major capital equipment but remains unable to measure or report reliably the total cost of ownership. It is now planning to implement a new model, which seeks to capture the full cost of ownership throughout the life of an asset, with implementation planned for completion in July 2018. The First Principles Review has pointed out that Defence has treated Systems Program Office staff costs at the project level as a free good, reducing the transparency of the cost of sustainment work and providing inaccurate price signals and a distorted incentive structure to Capability Managers.

20. Opportunities remain to improve the quality and transparency of Defence’s publicly reported information regarding materiel sustainment, while being sensitive to national security concerns. Areas for improvement include:

  • achieving a clearer line of sight between planning and reporting documents—for both expenditure and descriptive information at the corporate and project levels; and
  • the use of consistent time series data and analysis of sustainment expenditure.

21. Defence’s second corporate plan prepared under the Public Governance, Performance and Accountability Act 2013 (PGPA Act) identifies sustainment more clearly than in the first corporate plan. Under the PGPA Act, Defence will need to report meaningfully against this plan in its annual performance statement.

22. Defence’s 2015–16 public reporting of sustainment activity included expenditure information and other descriptive material. However, it was not: complete; consolidated in one easy-to-locate area; prepared in a manner that permitted the comparison of actual expenditure against estimates; or consistent in its presentation of clear reasons for full-year variances. Further, performance summaries were highly variable and inconsistent between public planning documents—Defence’s Portfolio Budget Statements and Portfolio Additional Estimates Statements—and the Annual Report.

23. Defence has not published program level expenditure data on a consistent basis over time, or time series analysis, to assist with external scrutiny of its sustainment expenditure.

Smart Sustainment reforms

24. There is no record of whether the intended savings to sustainment costs were achieved from the Defence Materiel Organisation’s (DMO) 2008 initiative to identify efficiencies. A target expenditure cut of five per cent of sustainment costs over a financial year was set by the Chief Executive Officer of the DMO in February 2008. Defence expected savings from: reducing spares; making more use of performance-based contracting; discouraging unnecessary end-of-year spending; travel efficiencies; and improved maintenance philosophies.

25. Defence has not kept a systematic record of the outcomes of the Smart Sustainment initiative associated with the 2009 Strategic Reform Program. In 2011, a ‘health check’ by external consultants found some early successes with cost reductions and changes in practice. Nevertheless, the consultants considered the program was failing because of shortcomings in governance, program management, and Defence’s approach to reform described by a major vendor as ‘minor reform, driven by piecemeal, top down budget pressure’.

26. Smart Sustainment had a ten-year savings target of $5.5 billion. In its 2014–15 Annual Report, Defence claimed to have achieved $2 billion of savings from the initiative in its first five years. Defence has not been able to provide the ANAO with adequate evidence to support this claim, nor an account of how $360 million allocated as ‘seed funding’ for Smart Sustainment initiatives was used.

Materiel sustainment reform—First Principles Review

27. Defence has drawn heavily on contracted industry expertise to support its implementation of the program of organisational change relating to acquisition and sustainment that has followed the First Principles Review. This represents a major investment of at least $120 million, including contracts for services from the principal provider (Bechtel) valued at some $107 million. Contracts with the principal provider are not performance-based.

28. Systems Program Office consolidation and reform is one of the largest reforms to Capability Acquisition and Sustainment Group deriving from the 2015 First Principles Review. This stream of activity is likely to take much longer than the two years expected for implementation. In June 2017, Defence advised the ANAO that implementation plans for significant reforms to Systems Program Offices would be developed in the second half of 2017.

29. Other reforms flowing from the First Principles Review remain underway:

  • Introducing performance-based contracting into Defence sustainment has been underway for over a decade. Defence does not yet have a complete register of its acquisition and sustainment contracts, though since January 2016 it has had a facility in place and has commenced populating it.
  • Initial establishment of ‘centres of expertise’ in Capability Acquisition and Sustainment Group is underway. Defence expects full implementation to take a further two years. Similarly, the new Capability Acquisition and Sustainment Group Business Framework is expected to take ‘many years’ to fully implement.
  • Defence has developed its own meaning for the term ‘smart buyer’, which does not clearly articulate the intent of the First Principles Review recommendation. This introduces risks related to ensuring that Systems Program Office staff working on outsourcing have the necessary skills and competencies.

30. Defence has also been developing its approach to asset management over many years, having obtained substantial advice from internal and external sources to adopt a Defence-wide asset management strategy to underpin a sustainment business model for specialist military equipment. It is now not clear whether Defence will continue with this work.

31. Defence has not put in place plans to evaluate either the reforms themselves or its implementation of them. There is a risk that insights into a very substantial reform process could be lost. This was the case with the earlier Smart Sustainment reforms. The ANAO has recommended that Defence develop and implement an evaluation plan.

Parliamentary interest

32. In light of the Parliamentary interest in Defence sustainment (see paragraph 1.4 onwards) the ANAO has considered, on the basis of the current audit’s findings, the issue of enhanced scrutiny of sustainment through a process similar to the existing Major Projects Report (MPR).1 The MPR has been prepared annually over the last decade to provide independent assurance of the status of selected Defence major acquisition projects. The MPR has added value through the review process and the transparency of information, providing increased assurance to the Parliament and the Government. This has been achieved while managing risks to national security.2

33. A key step in the MPR process is the preparation of a Project Data Summary Sheet for each project being reviewed. It is apparent from this audit that Defence has developed information systems that could support the preparation of similar information for its sustainment work. Defence has undertaken work on performance measures for sustainment and has developed the infrastructure to collect and report on its sustainment work (see Chapter 3).

34. Three principal issues remain for the conduct of a process for sustainment parallel to the MPR for acquisition:

  • First, the effective management of risks to national security, which can arise with the exposure of details of the readiness and availability of Defence capability. This issue could be managed provided the scope of any sustainment review is appropriately selected. For example, a small number of products could be selected, with rotation or other variation from year to year, limiting the risks that may flow from time-series analysis and the release of other material whose aggregation could add risk.3
  • Second, the material to be produced for scrutiny of sustainment performance would need to reflect a ‘One Defence’ (that is, whole of portfolio) view. As noted in this audit report, the focus of some current arrangements requires clarification as it may reflect only the performance of Capability Acquisition and Sustainment Group.
  • Third, resourcing. In setting the scope and methodology of any review of sustainment performance, the costs and benefits of the proposed program of work and its relationship to existing scrutiny arrangements would need to be considered, both for Defence and for the ANAO.

35. The ANAO has also observed in Chapter 3 of this audit report that there is scope to improve the quality and transparency of public reporting on sustainment in existing Defence reports—the Portfolio Budget Statements, Portfolio Additional Estimates Statements and Defence Annual Report.

Recommendations

Recommendation no.1

Paragraph 3.27

The ANAO recommends that Defence institute a risk-based quality assurance process for the information included in the Defence Quarterly Performance Report.

Defence response: Agreed

Recommendation no.2

Paragraph 5.66

The ANAO recommends that Defence develop and implement an evaluation plan to assess the implementation of the recommendations of the First Principles Review.

Defence response: Agreed

Entity response

The Department of Defence welcomes the Australian National Audit Office (ANAO) performance audit on Defence’s management of materiel sustainment. Defence’s comments, and suggested editorial amendments, have been provided to the ANAO.

Defence notes the findings of the ANAO, and agrees to both recommendations.

Defence concurs with the ANAO’s conclusion that the fundamentals of the governance and organisational framework for the management of sustainment are clear, and fit for purpose.

Defence notes that the audit was conducted at a strategic level, and acknowledges the opportunities for improvement identified by the ANAO. Defence continues to strive towards efficiencies in the delivery of sustainment outcomes for Defence capability and to report on outcomes, noting the need to balance transparency and accountability to the Australian public and Parliament with the national security interests of the Commonwealth.

Defence’s implementation of the recommendations of the First Principles Review has also introduced a single end-to-end capability development function, the Capability Life Cycle, which will reduce previous delineations between the management of acquisition and sustainment activity.

1. Background

1.1 Defence materiel sustainment is about the maintenance and support of Defence’s fleets of specialist military equipment. Defence defines sustainment as involving the provision of in-service support for specialist military equipment, including platforms, fleets and systems operated by Defence. Typically, sustainment entails repair and maintenance, engineering, supply support and disposal of equipment and supporting inventory.4 Effective sustainment of naval, military and air assets is essential to the preparedness of the ADF and to enable Defence to conduct operations in accordance with the Government’s requirements.5

1.2 Defence has commented that sustainment management is not a technical discipline: ‘it is an over-arching business-oriented management function focused on meeting the outcomes required of Capability Acquisition and Sustainment Group customers’.6

1.3 In recent years, Defence has spent similar amounts each year on sustainment and the acquisition of new equipment (Figure 1.1, below). In 2015–16, Defence spent about $6.3 billion—21 per cent of its total departmental expenditure—on sustainment of specialist military equipment.

Figure 1.1: Defence acquisition and sustainment expenditure

Note: For 2005–06 to 2014–15, expenditure was made through the Defence Materiel Organisation. Figures for 2015–16 are based on Defence advice, with one-third of Capability Acquisition and Sustainment Group staff costs allocated to acquisition and two-thirds to sustainment.

Source: Defence Annual Reports 2005–06 to 2014–15, and Defence advice.

Parliamentary interest

1.4 Parliamentary committees over several years have stated an interest in Defence’s reporting of its sustainment performance and, in particular, obtaining greater insight into that performance.7 In May 2014, the JCPAA recommended that the DMO prepare a suitable methodology for reporting sustainment activity and expenditure, within six months.8 The Government disagreed on the basis that current arrangements balanced Parliamentary scrutiny of sustainment expenditure with the protection of classified information on the military capability, readiness and availability associated with that sustainment.9 In September 2014, the Committee sought a sustainment options paper from the ANAO, which was made public by the Committee. Defence subsequently provided the Committee with an in camera briefing (November 2015), consistent with an option in the paper put forward by the ANAO. The Committee expressed the view, in May 2015, that:

Sustainment expenditure is currently at approximately $5 billion per annum and predicted to increase significantly over time. The Committee considers sustainment expenditure to be an area requiring further parliamentary scrutiny on the adequacy and performance of Defence involving billions of dollars in the future.10

1.5 The Committee noted, however, that the final structure for sustainment reporting was as yet undecided. During the course of this audit (9 January 2017), the Committee announced that it had commenced an inquiry into Defence Sustainment Expenditure. Part of the context for this audit is to inform the Parliament as to whether Defence has developed its management reporting sufficiently to facilitate a program of ANAO assurance reviews of selected sustainment activities.

1.6 The ANAO’s observations on this aspect of the audit are reflected in the Summary and Recommendations section of this report (paragraph 32 onwards).

Audit approach

1.7 The objective of the audit was to assess whether Defence has a fit-for-purpose framework for the management of materiel sustainment. To form a conclusion against the audit objective, the ANAO adopted the following high-level criteria:

  • Defence has established an appropriate governance and operational framework for the management of materiel sustainment;
  • Defence has established and implemented a high quality performance framework to support the management and external scrutiny of materiel sustainment; and
  • Defence has achieved key outcomes expected from the Smart Sustainment reforms and has progressed its implementation of the reforms to sustainment flowing from the First Principles Review.

1.8 The scope of this performance audit is the management of sustainment of specialist military equipment. It addresses sustainment management at a high level and is not focused on the sustainment of individual platforms or the technical aspects of their maintenance. Defence’s broader sustainment program also includes the maintenance and support of corporate information technology and estate and infrastructure assets. These activities are outside the scope of the audit.

1.9 The audit was conducted in accordance with ANAO Auditing Standards at a cost to the ANAO of approximately $422 000.

1.10 The team members for this audit were Dr David Rowlands, Kim Murray and David Brunoro.

2. Defence’s governance and operational framework for the management of materiel sustainment

Areas examined

This chapter examines Defence’s governance and operational arrangements for managing the sustainment of specialist military equipment.

Conclusion

Defence has clear and long-standing governance and organisational arrangements for managing the sustainment of specialist military equipment.

Research and reviews conducted for Defence have revealed a range of specific operational problems that are detracting from the efficient and effective sustainment of Defence capability, including the functioning of Systems Program Offices. Defence has initiated a reform project as part of its First Principles Review implementation.

Does Defence have a clear governance and operational framework for the management of materiel sustainment?

Defence has clear and long-standing arrangements for managing the sustainment of specialist military equipment. Key elements of the governance and organisational framework include: a specialist organisational unit—currently the Capability Acquisition and Sustainment Group—staffed by a mix of civilian and military personnel; contract-like arrangements for sustainment between that unit and the Chiefs of Navy, Army and Air Force (Defence’s Capability Managers); and day-to-day responsibility for most sustainment activities falling to Systems Program Offices, which carry out that work using a mix of in-sourced and out-sourced service provision. Military units also undertake some operational-level sustainment activities.

Nevertheless, an internal review conducted by Defence and research conducted for Defence following the 2015 First Principles Review have identified a range of operational problems that detract from the efficient and effective sustainment of Defence capability, including: adherence to procurement principles; staff capabilities; duplication of effort; and transparency of internal costs. A project to reform and consolidate Systems Program Offices is currently underway, with 24 out of 64 Systems Program Offices (37.5 per cent) reviewed as at February 2017.

2.1 The ANAO examined Defence’s governance and operational arrangements for managing the sustainment of specialist military equipment, with a particular focus on the clarity of:

  • roles and responsibilities; and
  • processes for documenting capability managers’ requirements around sustainment.

Governance and operational arrangements

2.2 Defence has clear and long-standing arrangements for the management of materiel sustainment for specialist military equipment. Key elements are:

  • since 2000, a specialist acquisition and sustainment unit—previously the Defence Materiel Organisation (DMO) and most recently the Capability Acquisition and Sustainment Group—staffed by a mix of civilian and military personnel;
  • maintenance by the specialist unit of physically dispersed sub-units—known as ‘Systems Program Offices’—with day-to-day responsibility for Capability Acquisition and Sustainment Group sustainment activities11;
  • since 2005, contract-like arrangements known as ‘Materiel Sustainment Agreements’ between the specialist unit and the Australian Defence Force’s capability managers—primarily the Chiefs of Navy, Army and Air Force;
  • a mix of in-sourced and out-sourced service provision for sustainment activity; and
  • reliance on a mix of public and private equipment and facilities for sustainment activity.

2.3 While a number of reviews have focused on the optimal distance of the specialist entity from the Department of Defence12—with changes over time in its degree of separation—and have made recommendations on the unit’s operations and practices, the fundamentals of the sustainment management framework have been stable for over a decade.

Capability Acquisition and Sustainment Group

2.4 The sustainment of specialist military equipment was formerly a core responsibility of the DMO. Following government acceptance of a recommendation of a review of Defence—the First Principles Review, which reported in April 2015—DMO was delisted from 1 July 2015.13 DMO’s sustainment role then passed immediately to Defence’s Capability Acquisition and Sustainment Group.14 As at November 2016, Defence employed about 5500 people in that Group.15

2.5 Defence describes the accountabilities of the Group as follows:

The Capability Acquisition and Sustainment Group (CASG) purchases and maintains military equipment and supplies in the quantities and to the service levels that are required by Defence and approved by Government.16

We are accountable to:

  • The Australian Government through the Defence Ministers (our owner).
  • The Secretary of Defence and Chief of the Defence Force.17
  • The women and men of the ADF through the capability managers (our customers).
  • Defence industry (our partners).

Systems Program Offices

2.6 As at February 2017, Defence had some 64 Systems Program Offices, which manage the acquisition, sustainment and disposal of specialist military equipment. They do this through a combination of internal work and commercial contracts. In January 2017, Defence’s Systems Program Offices managed the sustainment of 112 fleets of equipment and services. Systems Program Offices are located in different parts of the country but are organisationally all part of the Capability Acquisition and Sustainment Group within Defence. They may also be involved in acquisition projects, but the bulk of their work involves sustainment.

2.7 In most cases, a major platform—such as an aircraft type or class of ship—is managed by a single Systems Program Office.18 Systems Program Offices maintain a relationship with industry bodies, particularly those providing maintenance services, spares, engineering and other support, and with the representative of the relevant capability manager.19 Each Systems Program Office has contracts with industry suppliers of services, parts and consumables.

Box 1: The origin of Systems Program Offices

Reviews of Defence over the last two decades have identified many opportunities for improvement in Defence sustainment practice. Defence established Support Command Australia in August 1997 to coordinate materiel support to Navy, Army and Air Force, ‘and to help Defence “do more with less”’. In 2000, when the expected benefits were not being realised, Defence reviewed Support Command Australia and the Defence Acquisition Organisation, finding that serious problems persisted in capital acquisition and whole-of-life support. To address these problems, Support Command Australia, the Defence Acquisition Organisation and part of National Support were merged to form the DMO. The DMO was to be accountable for whole-of-life materiel management, with Systems Program Offices comprising multi-functional teams to undertake the work.

The Defence White Paper, Defence 2000—Our Future Defence Force, stated that ‘the DMO will adopt commercial best practice as its norm and assess its performance against industry benchmarks’. The subsequent Defence Procurement Review (‘Kinnaird Review’) in 2003 also stressed the need for high-quality, highly skilled sustainment managers in delivering modern military capability in a ‘business-like manner’.

2.8 As a consequence of the 2015 First Principles Review and the decision to consolidate and reform Systems Program Offices, Defence has adopted a new definition of ‘Systems Program Office’. A number of acquisition projects had been managed in project offices which are now counted as Systems Program Offices.20 Systems Program Offices do not endure indefinitely: for example, after the withdrawal from service of an asset and its disposal the relevant Systems Program Office may be dissolved or amalgamated with another. Similarly, a new Systems Program Office may be created for a major acquisition project. A list of Systems Program Offices (including staff numbers, assets under sustainment and contracts) is at Appendix 1.

Box 2: Sustainment work in Defence’s Capability Acquisition and Sustainment Group

Sustainment can be considered as falling into three categories—platforms, products and commodities—together with the provision of services such as test and measurement. Platforms are large and complex capabilities such as frigates, tanks or aircraft. Over 40 platform fleets are being supported, and they represent around 60 per cent of total sustainment expenditure in any given year. These are usually long-life items, with little fleet replacement over years to decades, and their budgets are dominated by maintenance. Most of the sustainment budget is for the maintenance of major platforms, which, in turn, depends on their age and use. Thus, for example, slippage in new acquisition projects can increase sustainment costs for ageing platforms.

Products deliver a capability based on smaller but more numerous systems, often involving regular replacement of all, or components of, the capability. Examples include B Vehicles (Land Rovers, G-Wagons), Direct Fire Support Weapons, Aeronautical Life Support Equipment and Command and Support Systems—Maritime. Maintenance is still a large cost component for sustainment of these systems, but regular replacement is more significant than it is for platforms. Support of products represents about 20–25 per cent of sustainment expenditure.

For commodities, the maintenance component of the budget is comparatively small with replacement or replenishment being the dominant activity. Fuel and lubricants, combat clothing and explosive ordnance fall into this category, which makes up about 20 per cent of sustainment expenditure.

Source: Abstracted from Defence, ‘Reform Issues’, April 2011.

Internal Agreements

2.9 Agreements, internal to Defence, between the Capability Acquisition and Sustainment Group and each capability manager, set out the level of performance and support required by the capability manager.21 These are ‘Materiel Sustainment Agreements’ (MSAs). MSAs include an agreed ‘price’ for the sustainment work and performance indicators by which Capability Acquisition and Sustainment Group internal service delivery is measured and reported.22

2.10 Defence intends to replace MSAs (and the associated Materiel Acquisition Agreements) progressively with Product Delivery Agreements. The Product Delivery Agreement will cover both acquisition and sustainment of a capability system over its life, through to disposal.

2.11 A simplified representation of the essential relationship among elements responsible for the sustainment of specialist military equipment is in Figure 2.1 below.

Figure 2.1: Simplified representation of organisational arrangements supporting Defence sustainment of specialist military equipment


Note: Military units also undertake some operational-level sustainment activities.

Source: ANAO

2.12 Navy, Army and Air Force policy requires six-monthly reviews of MSAs. These are known within Defence as ‘Fleet Screenings’.23 While ‘Fleet Screening’ procedures vary among the Services, the intent is broadly similar: that is, to review the funding allocated to sustainment through the MSAs, and make decisions about changes to funding levels, equipment operation, or performance indicators.24 For example, additional funding may be required for equipment that has recently returned from operations. Fleet Screenings comprise a meeting or series of meetings between the Services and Capability Acquisition and Sustainment Group.

Relationship between sustainment and acquisition

2.13 Generally, the lifecycle of Defence’s military assets falls into two periods: acquisition and sustainment. There has long been a strong internal perception in Defence that sustainment attracts less management attention than acquisition. An internal survey a decade ago found that Systems Program Offices attributed the problem to Defence management in Canberra:

sustainment still does not get sufficient level of attention and needs to be recognised as the major component of the life cycle. [Systems Program Offices] claim that policies and new tools are developed which do not sit well with sustainment and, because Canberra is acquisition-focused, sustainment advice is not taken.25

2.14 Five years later, the Rizzo Review into naval sustainment drew a similar conclusion:

The need for the sustainment of assets is understood in Defence and DMO, but it is not given the same rigorous attention as asset acquisition. Sustainment costs can exceed those of the original procurement and the challenges can be more complex.26

2.15 Coles took a similar view a few months later, in the Collins Class Sustainment Review, arguing that Defence should give sustainment much higher attention and priority during the initial phases of the asset’s lifecycle (‘Needs’, ‘Requirements’, and ‘Acquisition’ phases). Failure to pay sufficient attention to sustainment early on would increase the cost of ownership to Defence. That review found ‘sustainment is still being treated as a “poor relation” compared to the generally higher-profile acquisition work’.27 The Sustainment Complexity Review (2010), which found ’50 plus’ sustainment business models in DMO’s Systems Program Offices, explained this diversity by reference to the greater attention Defence had given acquisition. Defence had developed, by the time of that review, a consistent approach to acquisition. But ‘no such global improvement approach’ had been implemented across sustainment. The Review found, in terms similar to those of Coles: ‘In fact, the general comment is that “sustainment is the poor cousin to acquisition.”’28 In June 2017 Defence advised the ANAO that it was attempting to address this perception through the current round of reforms, such as the new Capability Life Cycle and the December 2016 release of Support Procurement Strategies.

Review of Systems Program Offices

2.16 Two recent reviews of the operations of Systems Program Offices have identified major challenges for the future management of sustainment. The first is a Defence internal review conducted in late 2015; the second comprises work done following the First Principles Review as part of the Systems Program Office reform and consolidation project.

Review of contracted services arrangements in Systems Program Offices

2.17 In late 2015, Defence’s Audit and Fraud Control Division reviewed the contracted service arrangements within three Systems Program Offices in response to a request by Defence’s Chief Finance Officer Group. That review found shortcomings in the application of procurement principles, which exposed Defence to risks. Specifically, the review identified instances where:

  • Defence had not established a strong value-for-money case for the continual renewal of supplier contracts. This practice has created a risk of dependency on particular suppliers; raises probity concerns, as contractors and APS staff involved in the procurement may not be dealing with each other at ‘arm’s length’; and means that Defence may not be getting value for money.
  • Defence suppliers have access to potentially commercially sensitive Defence financial information which could provide an unfair advantage to some service providers.
  • Defence has entered into sole source contracts with suppliers outside of the Defence system established to manage such arrangements. As a result, service providers may have been contracted to provide services that Defence had previously determined they were not suitable to deliver, and/or may charge higher rates than those previously negotiated with Defence; and there is a risk that the documentation required to support the procurement decision may not be adequate.
  • Defence procurement guidance did not reflect some of the requirements of the Commonwealth Procurement Rules in place at the time of the review.

2.18 Defence makes extensive use of external service providers and contractors to deliver acquisition and sustainment services. Further, Defence’s Capability Acquisition and Sustainment Group is implementing reforms that will potentially increase reliance on industry to deliver these services. Defence therefore requires a robust framework to manage the risks associated with the widespread use of contracted services. Defence informed the ANAO that it had updated its procurement guidance in April 2017 and intends to address the other concerns raised in the 2015 review over the next 18 months.

Reform and consolidation of Systems Program Offices

2.19 Following the First Principles Review, Capability Acquisition and Sustainment Group is undertaking a project to reform and consolidate Systems Program Offices. In its submission to the JCPAA inquiry into sustainment expenditure about this project (February 2017), Defence advised that:

[Capability Acquisition and Sustainment Group] Systems Project Offices Reform project is undertaking a review of all SPOs across [Capability Acquisition and Sustainment Group]. The outcome from this review is alignment of the SPO activity to Planning, Governance and Assurance roles, and enabling industry to undertake the management and delivery of sustainment activity, in particular the transactional functions required to maintain capability.

2.20 Defence records indicate that, as at February 2017, the project had reviewed 24 Systems Program Offices across three ‘domains’ (Maritime, Aerospace, and Joint), comprising 37.5 per cent of the Group’s business delivery units. Systemic issues identified by Defence from the reviews of Systems Program Offices to date are set out in Box 3. They include:

  • the experience, skills and competencies of Systems Program Office staff;
  • duplication of effort; and
  • overhead costs.

2.21 The review is examined in more detail in Chapter 5 of this audit report, while the transparency of internal costs is examined in Chapter 3.

Box 3: Systemic issues identified by Defence in the reform of Systems Program Officesa

Capability Manager Engagement

The Support and Operating Intent of the capability are not being adequately articulated by Capability Managers, which results in:

  1. [Systems Program Offices (SPOs)] not being able to manage to an effective asset management plan;
  2. Disrupted maintenance schedules, affecting availability and increasing costs;
  3. Shortened life of type, and increased whole-of-life cost; and
  4. SPOs prevented from engaging industry using optimised outcome-based performance contracts.

Activities assigned to the Services that affect asset management and asset condition are not being completed, and limited feedback is being provided to the Capability Manager Representative regarding the impact of non-completion on cost and availability.

Supplier Management

SPO staff have limited experience, skills, and competencies needed to effectively establish, govern, and assure industry delivery of capability.

  1. The majority of contracts could be reshaped to enhance industry accountability for asset management outcomes, which could result in more effective time and materials management; and
  2. Revised KPIs could more effectively support more targeted performance based contracting outcomes.

The bulk of SPO staff consist of engineering and logistics experts who are trained to solve technical and practical problems. They tend to manage outcomes by testing quality and auditing activity rather than assuring capability outcomes by governing supplier processes; this results in ‘man marking’b [industry], duplication of effort, rework, and delays. The number of SPO staff supporting activities such as planning, contract management, and assurance may not be sufficient and this is a risk to achieving FPR goals (including SPO reform) and ongoing delivery of [Integrated Investment Program] activities.

Hidden Overhead

SPO overhead has increased through a number of avenues:

  1. Management:

    i. The recruitment freeze has resulted in the incremental engagement of multiple individual contractors versus a single Integrated Support Contractor; and

    ii. Ongoing legacy reform activities that continue to require oversight and do not align with current reform activities;

  2. Reporting: SPOs report to multiple bodies for performance management and technical assurance, and all require different information, formats, and timings;
  3. Policy: implementing changes to policy and process from multiple sources with limited guidance or support; and
  4. Shared services: do not provide what is needed by SPOs, resulting in significant use of SPO staff on Financial Management and HR tasks.

Transactional Work

SPOs continue to manage and perform work that industry may be able to do more effectively or efficiently, whilst insufficient resources are being applied to governing and assuring. The review team assesses that in some cases, 25–30 per cent of current SPO activity is involved in auditing and ‘man marking’ industry, making basic modifications, and the management of these activities.

Note a: Defence, CASG Executive Advisory Committee position paper reflecting the views of the SPO Reform Project, 13 February 2017.

Note b: ‘Man-marking’ is where Defence staff in the Systems Program Office supervise the performance of one or a small number of contractor personnel.

3. Defence’s performance framework for materiel sustainment

Areas examined

This chapter examines Defence’s performance framework to support the management and external scrutiny of materiel sustainment for specialist military equipment.

Conclusion

The development of an effective sustainment monitoring system remains a work-in-progress, and the effectiveness of Defence’s internal reporting system for sustainment could be improved in several areas. Opportunities also remain to increase the completeness and transparency of publicly reported information regarding materiel sustainment.

Areas for improvement

The ANAO has recommended that Defence institute a risk-based quality assurance process for information included in the Quarterly Performance Report. Defence’s sustainment ‘gate reviews’ could usefully contribute to this assurance process.

Defence should also clarify whether the Quarterly Performance Report system reports on sustainment activities across Defence or is limited to the activities of the Capability Acquisition and Sustainment Group.

3.1 To assess whether Defence has in place an effective performance framework to support the management and external scrutiny of materiel sustainment, the ANAO examined:

  • monitoring systems for sustainment;
  • internal reporting practices for sustainment;
  • reviews of sustainment performance;
  • whether Defence has established measures of efficiency and productivity for sustainment activity;
  • whether Defence assesses whole-of-life costs and the total cost of ownership of its assets; and
  • public reporting on sustainment.

Does Defence have effective monitoring systems for sustainment?

With the introduction of its Sustainment Performance Management System (SPMS), Defence continues to develop a basis for an effective monitoring system for sustainment. Once fully implemented this system should be capable of systematically reporting against a suite of performance indicators settled in agreement with Capability Managers.

There remains potential to improve some core key performance indicators used within the system—for example to more usefully determine the total cost of the capability to Defence. The SPMS system was not fully implemented during the ANAO’s audit fieldwork but is expected to be fully operational by the end of June 2017. In the longer term the system is to be expanded to cover acquisition.

The Monthly Reporting System (MRS) and the Sustainment Performance Management System (SPMS)

3.2 Since mid-2004, the reporting system relied upon by Capability Acquisition and Sustainment Group and, before it, the DMO to track major acquisition projects has been the web-based Monthly Reporting System (MRS). At the time DMO developed MRS it was noted internally that ‘DMO currently spends around 3 billion dollars per year on sustainment activities and at this stage has no system that can adequately report on the effectiveness or efficiency of these activities’.29 Work began in September 2005 to introduce sustainment reporting capacity into MRS.

3.3 Several major reports have criticised sustainment reporting over the last decade, including Mortimer (2008) and Rizzo (2011).30 Rizzo was highly critical of existing performance indicators:

Each current [MSA] Product Schedule has inadequate Key Performance Indicators … They only include a small subset of the ‘contractual’ measures that should be placed on each party and there are no consequences associated with non‐compliance. At worst, non‐delivery will result in a red traffic light status in DMO Sustainment Overview Reports which it seems that few stakeholders read.31

3.4 This led to the commencement of work on a new IT system, the Sustainment Performance Management System (SPMS), in mid-2011 to replace the sustainment module in MRS.32 Defence agreed in August 2012 to the ‘continuing refinement and implementation of a shared Sustainment Performance Management Framework and reporting system for use in the sustainment of all products jointly managed by Defence and DMO’.33 An objective of developing SPMS is that the system should provide ‘the central source of truth for sustainment management data and reporting for sustainment management personnel at all levels of both Capability Acquisition and Sustainment Group and Capability Manager organisations’.34 This was contrasted with the MRS, which reportedly required substantial work-arounds involving double and triple handling. MRS and, to an increasing extent, SPMS, are now used to populate a Quarterly Performance Report on both acquisition and sustainment (see paragraph 3.11 onwards).

How SPMS works

3.5 In common with MRS, SPMS is a web-based system designed to provide performance reports for Capability Acquisition and Sustainment Group and Capability Managers. Data is entered monthly by subject matter experts, usually based in the relevant Systems Program Office. The Systems Program Office Director reviews the data and comments for each measure and provides comments for a set of ‘key performance indicators’ and an additional set of ‘key health indicators’.35 Further comments can be added up the hierarchy to the relevant Capability Acquisition and Sustainment Group division head. By way of example, Appendix 2 of this audit report lists Navy’s MSA Performance Framework core Key Performance Indicators and Key Health Indicators.36 Defence advised the ANAO that Air Force has also developed a corresponding set of indicators and Army is progressing similar work.

SPMS implementation

3.6 SPMS was introduced, first, for Navy (May–July 2015) following work to develop its sustainment performance framework with a standard suite of performance indicators.37 The system was being implemented in Air Force and Army from late 2015 through 2016. Implementation is incremental and subject to feedback and review. Defence expects that all sustainment products will be reporting in SPMS by the end of June 2017.

3.7 A joint review of Navy’s performance framework, completed in March 2016 by Navy and Capability Acquisition and Sustainment Group, concluded that there was ‘universal acceptance that the framework provides a sound basis for the assessment of sustainment outcomes’.38 However, the review decided to remove one of Navy’s core key performance indicators (cost per materiel-ready day achieved) on the basis that the measure required the Systems Program Office to enter four separate data elements and ‘stakeholder compliance’ was reportedly low. The indicator was replaced with a measure ‘CASG-related Product Cost per materiel-ready day’, whose value could be generated automatically and would not require comment by the Systems Program Office.39 This would be reported in SPMS but not be represented as a key performance indicator40 because it did not include the cost of all the fundamental-inputs-to-capability and did not measure the cost per materiel-ready day. Defence informed the ANAO that both measures are based on direct product costs incurred by the Capability Acquisition and Sustainment Group and do not include sustainment costs incurred by other parts of Defence.

3.8 Neither indicator is useful in determining the total cost of the capability to Defence. Knowledge of the total cost could contribute to a better understanding of the whole-of-life costs of Navy assets and, over time, could provide useful insight into whether cost-effectiveness is improving or deteriorating.41
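
The gap between the two indicators can be shown with a simple calculation. The sketch below is illustrative only (the function, cost categories and figures are the ANAO's assumptions, not Defence's data model), but it captures why a ‘CASG-related’ cost per materiel-ready day will understate the total cost of the capability to Defence:

```python
# Illustrative sketch of the two cost indicators discussed above.
# All names, cost categories and figures are assumptions, not Defence data.

def cost_per_mrd(costs, materiel_ready_days):
    """Cost per materiel-ready day: total of the costs divided by days achieved."""
    if materiel_ready_days == 0:
        raise ValueError("no materiel-ready days achieved")
    return sum(costs) / materiel_ready_days

# CASG-related measure: direct product costs incurred by the Group only ($m).
casg_direct_costs = [42.0, 8.5]      # e.g. maintenance, spares
# Total-cost view: adds sustainment costs borne elsewhere in Defence ($m).
other_defence_costs = [6.0, 3.5]     # e.g. workforce, fuel

mrd = 5000  # materiel-ready days achieved across the fleet in the year

casg_measure = cost_per_mrd(casg_direct_costs, mrd)
total_measure = cost_per_mrd(casg_direct_costs + other_defence_costs, mrd)

print(f"CASG-related cost per MRD: ${casg_measure * 1e6:,.0f}")
print(f"Total cost per MRD:        ${total_measure * 1e6:,.0f}")
```

On these illustrative figures, excluding the costs incurred elsewhere in Defence understates the cost per materiel-ready day by around 16 per cent.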

Further development of SPMS

3.9 In September 2016, Defence decided to expand SPMS to cover acquisition. When implemented, the system will be renamed the ‘Program Performance Management System’.42 In the long term, Defence intends to build an ‘enterprise solution’ to encompass both sustainment and acquisition management and reporting using SAP software.

Does Defence have effective internal reporting for sustainment?

The effectiveness of Defence’s internal reporting system for sustainment could be improved in several areas. The Quarterly Performance Report is the primary way by which Defence provides information to government and senior Defence personnel about the status of major acquisition and sustainment activities. However, based on the ANAO’s review of a Quarterly Performance Report produced during the audit, its contents are neither complete nor reliable, it takes two months to produce, and it is sometimes difficult to understand. The ANAO’s analysis found that the report may not include additional information available to Defence that is critical to the reader’s ability to understand the status of significant military platforms. It provides only a partial account of materiel sustainment within Defence and is potentially at odds with the ‘One Defence’ model promoted by the First Principles Review.

The ANAO has recommended that Defence institute a risk-based quality assurance process for information included in the Quarterly Performance Report.

3.10 Reviews of Defence over the years have highlighted the importance of management reporting on its activities. The Mortimer Review (2008, p. 48) noted that the efficiency and effectiveness of DMO sustainment performance will not improve unless it is measured.

The Quarterly Performance Report

3.11 The Quarterly Performance Report is the primary way by which Defence provides information to government and senior Defence personnel about the status of major acquisition and sustainment activities.43 The report was developed in 2015 in consultation with the office of the Minister for Defence. Staff in the Capability Acquisition and Sustainment Group compile the report manually, incorporating information contained in MRS and, increasingly, SPMS. The report includes performance summaries for the Top 30 major acquisition projects, all major acquisition projects reported in the Major Projects Report and the Top 30 sustainment products. The report also includes an overview of projects of concern and underperforming acquisition projects and sustainment products.

3.12 In a recent submission to the JCPAA, Defence stated that ‘The Defence Ministers are provided with a Quarterly Performance Report, which includes Projects of Concern, Projects and Products of Interest and Performance Summaries for the Top 30 Sustainment Products’.44 The stated intent of the Quarterly Performance Report is to provide ministers and senior Defence personnel with:

a clear and timely understanding of emerging risks and issues in the delivery of capability to our Australian Defence Force end-users. These risks and issues are highlighted so that stakeholders can respond in a coordinated manner to guide the conduct of remediation actions.

3.13 This broad focus is consistent with Defence advice to the ANAO that some sustainment services for some equipment items are also undertaken by Capability Managers and other enabling Groups, including Joint Logistics Command.45 The primary data source for the report is SPMS—the ‘central source of truth for sustainment management data and reporting for sustainment management personnel at all levels of both Capability Acquisition and Sustainment Group and Capability Manager organisations’.46 The approach, as advised to the JCPAA, also reflects the First Principles Review recommendation that an end-to-end approach should be taken for accountability for capability—the ‘One Defence’ model.47

Figure 3.1: Preparation of the Quarterly Performance Report


Source: ANAO

3.14 However, Defence advised the ANAO in June 2017 that the Quarterly Performance Report’s ‘primary focus is on products and projects delivered by the Capability Acquisition and Sustainment Group, and does not encompass the complete sustainment enterprise’. This narrower focus risks telling only a partial account of sustainment within Defence and is potentially at odds with the ‘One Defence’ model promoted by the First Principles Review.

Assessment of the report

3.15 Under current arrangements for its production, the Quarterly Performance Report is neither timely—it is more than 50 days old by the time it reaches ministers—nor clear—it is dense with acronyms and jargon. Some terms are used in an unusual way: for example, expenditure of more funds than had been budgeted is referred to as an ‘overachievement’.

3.16 One potentially useful feature is a list of underperforming sustainment products.48 This is based on information provided through MRS, SPMS and gate reviews. Eight underperforming products were listed in the second quarter of 2016, and ten in the report for the subsequent quarter (July–September 2016). In that later report the list was labelled ‘sustainment products of interest’ rather than ‘underperforming’, though the items were still listed ‘in order of concern’.

Case study—performance reporting on Army’s Armed Reconnaissance Helicopter

3.17 The ANAO observed marked differences in the information provided about Army’s Armed Reconnaissance (Tiger) helicopter between:

  • a Quarterly Performance Report to the Defence Minister produced during the audit (April–June 2016) and, in contrast,
  • a recent ANAO performance audit49, the 2015–16 Major Projects Report50, a sustainment gate review51 and the Houston review of Army Aviation.52

3.18 A comparison shows that the Quarterly Performance Report does not include information available to Defence that is critical to the reader’s ability to understand the gap between the expectation of capability and reality for the Tiger helicopter program (See Appendix 3).

3.19 In this case, the Quarterly Performance Report’s usefulness is reduced by the following:

  • The use of the ‘traffic light’ indicators ‘Green’ (acceptable performance) or ‘Amber’ (early signs of underperformance). Neither of these is a reasonable assessment of the aircraft’s capacity to meet the required level of capability expected by government.
  • The report’s traffic lights, in particular, focus attention on the measures contained in contractual and intra-Defence agreements (for example, aircraft ‘availability’) and not the measures that matter to Army, the end-user (aircraft actually able to be flown).53
  • The year-end spend performance indicator is blank.
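
The first two points can be illustrated with a hypothetical example (all names, figures and thresholds below are assumptions, not Defence data): a product can report ‘Green’ against a contracted availability target even when the number of aircraft actually able to be flown is much lower.

```python
# Hypothetical illustration of the gap between a contracted 'availability'
# traffic light and the end-user measure (aircraft able to be flown).
# All names, figures and thresholds are assumptions for illustration.

def traffic_light(achieved, target, amber_band=0.1):
    """Map achievement against a contracted target to a traffic-light status."""
    ratio = achieved / target
    if ratio >= 1.0:
        return "Green"
    if ratio >= 1.0 - amber_band:
        return "Amber"
    return "Red"

fleet = 22                  # aircraft in the fleet
contracted_available = 10   # contracted 'availability' target
achieved_available = 10     # aircraft 'available' under the contract
flyable = 6                 # aircraft actually able to be flown

print(traffic_light(achieved_available, contracted_available))  # prints "Green"
print(f"but only {flyable} of {fleet} aircraft able to be flown")
```

The contracted measure is met, so the report shows ‘Green’; the measure that matters to the end-user is not captured by the traffic light at all.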

3.20 The contrasting information identified in this case indicates that the Quarterly Performance Report would benefit from the application of a quality assurance process to ensure that government and senior Defence personnel are presented with a frank and balanced assessment of performance (see paragraph 3.27 below).

Does Defence conduct effective reviews of sustainment performance?

Defence conducts reviews of sustainment performance through sustainment gate reviews that help Defence to obtain insight into a project or product’s progress and status. The effectiveness of these reviews could be increased if the lessons obtained from gate reviews were routinely incorporated into management reporting on sustainment and if gate reviews were extended to contribute to the proposed quality assurance mechanism for Quarterly Performance Reports.

3.21 Defence has implemented a strategy in recent years that should help to draw problems in individual sustainment products to senior management attention. This has occurred with the extension of DMO’s earlier program of gate reviews to cover sustainment as well as acquisition.54

3.22 In 2009 Defence commenced gate reviews for capital acquisition projects as an internal assurance process.55 They have provided insight into a project’s progress and an opportunity for project staff to discuss difficult issues with senior management and seek guidance. Defence later recognised that gate reviews also have potential value for sustainment, especially where they allow management to become aware of maintenance problems that might otherwise remain hidden. The Rizzo Review into naval sustainment raised this problem in the following terms:

To avoid being seen to fail personally, there is a danger (especially in the can do, make do environment) that staff will choose to not raise bad news. This can result in bad news remaining at lower levels in the organisation, increasing enterprise risk and only becoming apparent when recovery is expensive, difficult or even impossible.56

3.23 Defence commenced sustainment gate reviews in December 2015. Gate reviews are conducted to ‘provide high quality and reliable advice to Defence and Government regarding the health and outlook of both acquisition projects and sustainment products’. Defence requires that the first sustainment gate review be held within 12 months of the review held before final operational capability is reached.57 Thereafter, further such reviews are to be undertaken periodically (every one, two or three years) with the exact timing determined on a risk basis. Defence has stated that gate reviews ‘review overall performance of the sustainment system and its fitness for purpose rather than just its outcomes for the past month.’

3.24 Gate reviews can draw management attention to sustainment risks in individual fleets. For example, sustainment cost has been identified as a ‘key risk’ for the MRH90 helicopter:

  • An ANAO performance audit in 2013–14 found that, at the time of approval (June 2004), Defence had estimated the sustainment cost for 40 aircraft at $85.2 million a year.58
  • By 2009–10 when only 15 aircraft had been delivered, the cost of sustainment for those aircraft had already exceeded $85 million per annum.
  • During 2012 DMO endeavoured to capture and model the expected cost of ownership for the 47 MRH90 aircraft over their planned life. That modelling indicated a cost of ownership of between $240 million and $360 million per year (October 2012 prices), or $5.1 million to $7.7 million per aircraft per year.
  • In May 2016, a sustainment gate review identified the cost of sustainment as a key risk for this program. Defence confirmed in March 2017 that this risk remains.
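
The per-aircraft figures in the 2012 modelling point follow directly from the fleet-level range. As a quick arithmetic check (the figures are those quoted above; the calculation itself is the ANAO's illustration):

```python
# Arithmetic behind the 2012 MRH90 cost-of-ownership estimates.
fleet_size = 47              # planned MRH90 fleet
low, high = 240e6, 360e6     # modelled annual cost of ownership ($, Oct 2012 prices)

per_aircraft_low = low / fleet_size
per_aircraft_high = high / fleet_size

# prints "$5.1m to $7.7m per aircraft per year"
print(f"${per_aircraft_low / 1e6:.1f}m to ${per_aircraft_high / 1e6:.1f}m "
      "per aircraft per year")
```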

3.25 Information from sustainment gate reviews is not routinely incorporated into management reporting on sustainment. The ANAO suggests that Defence could devise a means of doing this, thereby capturing additional useful information on sustainment performance and current risks. For example, Quarterly Performance Reports could include a section providing an update on how the issues identified in the sustainment gate reviews are being addressed over time.

3.26 Moreover, there is an opportunity for Defence to extend the use of its Sustainment Gate Review program to have those reviews consider the quality of the information provided in the most recent Quarterly Performance Report for the asset under review, to help quality assure the accuracy, clarity and completeness of that information. As discussed, a quality assurance process for the Quarterly Performance Report would provide a firmer basis for internal and external scrutiny of sustainment reporting.

Recommendation no.1

3.27 The ANAO recommends that Defence institutes a risk-based quality assurance process to ensure the accuracy, completeness and relevance of the information included in the Defence Quarterly Performance Report.

Entity response: Agreed

3.28 Defence will institute a Quality Assurance process covering the Defence Quarterly Performance Report.

Has Defence established measures of efficiency and productivity for sustainment activity?

Defence has not implemented measures of efficiency and productivity for all sustainment products. Reviews have consistently emphasised the need for Defence to improve the efficiency of its operations, including in sustainment. Most recently, the First Principles Review recommended immediate implementation of measures of productivity, a related concept. The ANAO found that, 18 months after implementation commenced, there had been limited progress.

3.29 The efficiency of sustainment has been raised repeatedly over the years. In 2003, a few years after the formation of the DMO, a ‘zero-based’ review found that there was a ‘significant opportunity to reduce resourcing levels by re-engineering processes and improving systems’—in other words, improve efficiency.59 Specifically, there was potential to ‘optimise resource utilisation through judicious outsourcing to original equipment manufacturers in appropriate circumstances’. Later (April 2004), senior DMO managers stated that there was scope to achieve a better risk balance in sustainment program management, where well-defined tasks had, in the past, been transferred to industry successfully with net savings, thereby improving efficiency.

3.30 More recently, after the release of the 2015 First Principles Review, one of the Review panel members characterised inefficiency as the most prominent problem across Defence:

The big problem that the Review team saw was one of efficiency and not of effectiveness. By any reasonable standard the military output—the effectiveness of the ADF—is world class, but does the Department arrive at that effectiveness in the most efficient way possible? Can the Department assure the Government of the day and the Australian taxpayer that the resources—the people, processes and tools—used to create that output are being utilised in the most efficient way? The Review team thought the answer to that was ‘No’.60

3.31 To develop sound measures of efficiency, Defence also needs to understand the cost of ownership of its assets (see paragraph 3.34 forward) and to take account of staff costs in assessing the total cost of its activities (see paragraph 3.44 forward).

3.32 The First Principles Review found no direct measures of productivity in the existing management information systems, though some cost and schedule information is included in MRS and SPMS. The Review proposed the immediate implementation of productivity measures at a project level:

It is also incumbent upon the organisation to have a process in place that effectively measures its cost, schedule and productivity at a project level. This should be implemented immediately and individual leaders from the first line to the Deputy Secretary Capability Acquisition and Sustainment must be accountable for cost and schedule targets.61

3.33 On the above recommendation to measure productivity, Defence referred the ANAO to the Capability Acquisition and Sustainment Group Business Framework and, within it, the proposed process for the Capability Acquisition and Sustainment Group Business Review Cycle. The Business Review Cycle provides for monthly Project/Product Performance Reviews (PPRs) and, at increasing levels of aggregation, Branch monthly reviews and Domain monthly reviews. Deployment of the PPR was, at that point (September 2016), ‘on pause’ pending further review of the prototype arrangements. The envisaged end state is the deployment of a Program Performance Management System (PPMS) (see paragraph 3.9 of this audit report), covering both acquisition and sustainment, commencing 30 June 2017.62

Does Defence effectively assess whole-of-life costs and the total cost of ownership of Defence assets?

Defence has recently improved its whole-of-life costing of proposals to acquire major capital equipment but remains unable to measure or report reliably the total cost of ownership. It is now planning to implement a new model, which seeks to capture the full cost of ownership throughout the life of an asset, with implementation planned for completion in July 2018. The First Principles Review has pointed out that Defence has treated Systems Program Office staff costs at the project level as a free good, reducing the transparency of the cost of sustainment work and providing inaccurate price signals and a distorted incentive structure to Capability Managers.

3.34 Achieving value for money is the core rule of the Commonwealth Procurement Rules. When conducting procurement, an official must consider the relevant financial and non-financial costs and benefits including, among other things, whole-of-life costs.63 Sustainment expenditure is a major contributor to whole-of-life costs—in some cases, the largest contributor.

3.35 Defence major capability decisions have substantial long‐term financial consequences for Defence and the Commonwealth budget. Nevertheless, developing whole‐of‐life cost estimates for major capital projects has been a challenge for Defence for decades. A Defence instruction on lifecycle costing has been in place since at least November 1992.64 In addition, a ‘Through Life Support Manual Volume 6—Life Cycle Costing Analysis’ has been available since 2001 (and remains current).65 The issuing of instructions and the availability of guidance have not, however, ensured adequate costing practice.

3.36 Numerous reviews and audits have urged greater focus on whole-of-life costs and better knowledge of the total costs of ownership of military assets, including sustainment, but have found that Defence has struggled to establish the skills, systems and data for this activity66:

  • Defence was urged to use lifecycle costing throughout the acquisition lifecycle by an ANAO performance audit in early 1998. The audit had found, among other things, that it was not generally used in the ‘in-service’ (sustainment) stage.67 The audit included, as an appendix, a brief better practice guide to lifecycle costing.
    • Defence agreed to the recommendation that total cost information be provided to relevant Defence committees, with qualification.68 When Defence reported in 1998 on the implementation of the then recent Defence Efficiency Review, it stated that ‘A project has been initiated by the Joint Logistic Systems Agency to establish a common process to address through-life support arrangements, including … lifecycle costing … ’.69
  • In 2008, the Mortimer Review recommended that decisions to purchase new equipment or maintain existing systems should be based on the through-life cost of each option.70
  • By July 2011, the Rizzo Review stated that ‘The Team recognises that Defence has a policy instruction on life‐cycle costing … However, it is considerably out‐of‐date and compliance is inadequate’.71

Presenting whole-of-life cost estimates to government for approval

3.37 In recent years, Defence has generally estimated whole-of-life costs when seeking government agreement to a proposed acquisition. An ANAO performance audit in 2013 found that, in most cases, submissions to government included whole-of-life cost estimates. However, these estimates were not presented as prominently as acquisition costs, nor was their significance made clear to decision-makers.72

3.38 In the course of this audit, the ANAO examined nine Defence major capital equipment submissions approved by government between July 2015 and April 2016 to review the presentation of whole-of-life costs.73 In all but one submission examined, whole-of-life cost estimates are more prominent than in the submissions examined for the 2013 ANAO performance audit. In one case, whole-of-life cost estimates were not prominent with only the acquisition cost and ‘net personnel and operating costs’ referred to in recommendations to government.74

Defence has plans to improve its whole-of-life costing

3.39 The 2015 First Principles Review found that: ‘Costing methodology does not account for all of the inputs to capability, at acquisition and over project life, and the true total cost of ownership is opaque’.75 It also found that, to manage its major equipment efficiently and effectively, Defence needs to manage on a whole-of-life basis. This requires visibility of the total cost of ownership of major equipment. Such visibility would enhance Defence’s ability to benchmark its sustainment performance against allied countries with similar equipment.

3.40 In October 2016, Defence concluded that inadequate attention to managing its equipment on a whole-of-life basis had resulted in funding shortfalls for ongoing operating, maintenance and support costs. Defence is seeking to address the issue.76 In the same month, it endorsed a total cost of ownership approach to estimating whole of life costs for all major capital equipment, infrastructure, and information and communications technology projects. The model seeks to capture the full cost of ownership throughout the life of an asset.77

3.41 Defence also decided that, in the absence of tender quality cost information it would mandate an analogous78 or a parametric79 approach to developing cost estimates for all Defence major capital equipment projects. It also agreed that deviation from the endorsed approach will be permitted only in exceptional circumstances, and with the approval of Defence’s costing authority—its Chief Finance Officer.

3.42 To provide the historical cost data to support analogous or parametric cost estimates, Defence’s Chief Finance Officer will build a database of historical costs and cost attributes. The data will be obtained from: Defence’s internal cost data; industry suppliers; industry-maintained databases; and cost data made available to Defence from the United States, United Kingdom, New Zealand and Canada.80

3.43 In developing this new approach to cost estimating, Defence sought information from Defence partner nations about the structures, data and approaches they use to develop cost estimates for major capital projects. Defence’s view is that, once implemented, this initiative will improve Defence’s cost estimates for major capital equipment projects and will bring Defence practice into line with the approach adopted by capital intensive industries and Defence partner nations. Defence has an implementation plan with target dates, concluding in July 2018.

Staffing costs are not included in Materiel Sustainment Agreements

3.44 The Mortimer Review (2008) observed that ‘through-life maintenance and support account for more than half of the DMO annual budget and involve about two-thirds of its workforce’. The cost of that workforce is not included in the internal price paid by Defence capability managers for the sustainment services provided by DMO/Capability Acquisition and Sustainment Group through MSAs. Those prices comprise only the costs incurred by the relevant Systems Program Office for goods and services supplied under contract.

3.45 The First Principles Review found that, at the project level, Defence treats staff as a ‘free good’ across the department:

Employee costs are not considered part of a project’s costs and a manager cannot adjust numbers of staff based on project need. An alternative approach is to manage a total project budget that includes employee expenses. This encourages managers to exercise judgement and discretion to ‘trade’ within that operating budget (assessing their need for how many staff and what levels, skills and contracted expertise are required) to most efficiently and effectively deliver the outcomes required.81

3.46 Treating staff as costless lessens transparency of the true cost of activity and introduces risks because of the inaccurate price signals and distorted incentive structure then facing Systems Program Office managers and the capability manager. For example, this approach could make ‘in-sourced’ labour appear costless and put outsourcing at a comparative disadvantage. This risks encouraging insourcing in preference to outsourcing even where the total real cost to the Commonwealth of the former is greater.82 Defence has noted this internally (February 2017):

Any reduction of SPO [full-time equivalent staff], without a commensurate reduction of activity, will result in work being outsourced to industry and therefore increase the cost to the [Capability Manager].

Acquisition and sustainment salaries are held by [Capability Acquisition and Sustainment Group] and not funded by [Capability Managers], so reductions in this Commonwealth APS workforce do not generate a saving for [Capability Managers]. … The anticipated transfer [of] transactional workloads to industry will therefore increase the unit cost of capability delivery as far as the [Capability Managers] are concerned, regardless of the expected overall improvement in Defence’s budget.83

3.47 Systems Program Offices have had between three and four thousand staff over the last decade (Figure 3.2, below). Defence advised the ANAO that Capability Acquisition and Sustainment Group’s expenses on all employees in 2015–16 were $490.4 million. In this light, the cost of those staff is a substantial omission from the apparent price of services facing the capability manager. This is especially so, given the recent observation by Defence that, in some cases, ‘[Systems Program Offices] have been spending … more money on salaries than they were on actual acquisitions or sustainment’.84

Figure 3.2: Numbers of staff in DMO/Capability Acquisition and Sustainment Group Systems Program Offices, 2004–16

Note a: In compiling the data underlying this figure, PMKeys Reporting is not able to identify all those organisational units now recognised by Capability Acquisition and Sustainment Group as Systems Program Offices. This data is likely therefore to understate the total number of staff in Systems Program Offices.

Note b: These figures include Australian Public Service and Australian Defence Force staff.

Source: Defence, PMKeys Reporting, Full-time equivalent staff in all Systems Program Offices DMO/Capability Acquisition and Sustainment Group, by month, July 2004 to June 2016.

3.48 On the inclusion of staff costs in the new Product Delivery Agreements, Defence informed the ANAO in March 2017 that ‘The PDA remains under development, and it is inappropriate to speculate in any detail on how anticipated elements of the document will be used’.

3.49 In October 2016, Defence’s Chief Finance Officer stated that, under current proposals, actual costs will be reported in Defence’s central financial management system and that system will be consistent with the total cost of ownership model. Capability Acquisition and Sustainment Group also undertook in July 2016 that ‘over the next 12 months, Capability Acquisition and Sustainment Group will commence effort capture (where staff time is tracked against the projects and products we support), … led by the Group Business Manager’.85

Does Defence produce high quality public reporting for sustainment?

Opportunities remain to improve the quality and transparency of Defence’s publicly reported information regarding materiel sustainment, while being sensitive to national security concerns. Areas for improvement include:

  • achieving a clearer line of sight between planning and reporting documents—for both expenditure and descriptive information at the corporate and project levels; and
  • the use of consistent time series data and analysis of sustainment expenditure.

Defence’s second corporate plan prepared under the Public Governance, Performance and Accountability Act 2013 (PGPA Act) identifies sustainment more clearly than the first corporate plan. Under the PGPA Act, Defence will need to report meaningfully against this plan in its annual performance statement.

Defence’s 2015–16 public reporting of sustainment activity included expenditure information and other descriptive material. However, it was not: complete; consolidated in one easy-to-locate area; prepared in a manner which permitted the comparison of actual expenditure against estimates; or consistent in its presentation of clear reasons for full year variances. Further, performance summaries were highly variable and inconsistent between public planning documents—Defence’s Portfolio Budget Statements and Portfolio Additional Estimates Statements—and the Annual Report.

Defence has not published program level expenditure data on a consistent basis over time, or time series analysis, to assist with external scrutiny of its sustainment expenditure.

3.50 To assess Defence’s public reporting for the sustainment of specialist military equipment, the ANAO considered:

  • public reporting principles and requirements, including under the Public Governance, Performance and Accountability Act 2013;
  • Defence’s public annual reporting of sustainment activity; and
  • Defence’s public reporting of program level estimates over time.

Public reporting principles and requirements, including under the Public Governance, Performance and Accountability Act

3.51 The Public Governance, Performance and Accountability Act 2013 (PGPA Act) ‘aims to improve the line of sight between what was intended and what is delivered’. A key focus of the enhanced Commonwealth performance framework established under the PGPA Act is ‘ensuring that programme managers, accountable authorities, ministers, the parliament and the public are able to use performance information to draw clear links between the use of public resources and the results achieved’.86

3.52 Under the PGPA Act, Commonwealth entities must prepare and publish a corporate plan by the last day of August each year. The plan must include the purposes and activities that the entity will pursue, the results it expects to achieve and planned performance measures. It must cover a minimum of four years.

3.53 Commonwealth entities must report—in an annual performance statement—their actual results against the planned performance criteria outlined in their corporate plan and Portfolio Budget Statements. The report is expected to be a direct acquittal of the performance measurement and reporting intentions identified in the corporate plan.87

3.54 Under section 17 of the PGPA Act, one of the functions of a Commonwealth entity’s audit committee is to provide independent advice and assurance to the entity’s accountable authority—the Secretary of Defence—on performance reporting for the entity. The Defence Audit and Risk Committee will therefore require appropriate information on sustainment reporting to enable it to provide this advice and assurance to the Secretary.

3.55 Sustainment has been identified as a distinct category of expenditure since 1 July 2005.88 In effect, this was the point at which the term ‘sustainment’ came into regular Defence use.

3.56 The first planning and reporting cycle under the PGPA Act requirements was for 2015–16. The 2015–16 Defence Corporate Plan did not include an explicit reference to sustainment in the Purposes statement. The 2016–17 Defence Corporate Plan incorporated a revised set of Defence Purposes and performance measures.89 These identify sustainment more clearly than in the previous plan, under the revised Purpose 2, ‘Deliver and sustain Defence capability and conduct operations’. A performance measure under this Purpose is ‘Military capability is sustained consistent with Government requirements’. This, it states, is to be measured tri-annually and reported annually for each year of the corporate plan. The assessment method is to be ‘Assessment of sustainment against capability manager requirements’. Defence has informed the ANAO that it will report against this measure in its Annual Report for 2016–17.

Defence’s public annual reporting of sustainment activity

3.57 Defence provides information about sustainment in its Portfolio Budget Statements (PBS), Portfolio Additional Estimates Statements (PAES) and Annual Reports.

3.58 Since 2007–08 Defence has reported financial and descriptive information for the Top 20 or Top 30 sustainment products (Top 20 to 2012–13 and Top 30 from 2013–14). In general, reporting has included: a description of sustainment activity; estimated expenditure; actual expenditure; the variation between estimated and the actual expenditure; and some explanation of the variation between estimates in the PBS and PAES.

3.59 At the time of this audit, the latest set of these documents where both prospective and retrospective elements are available was for the financial year 2015–16. The following paragraphs provide an overview of the contents of the 2015–16 performance documents. Further detail is provided in Appendix 4.

3.60 The Defence PBS 2015–16 reports estimates for Defence’s Capability Sustainment Program and Top 30 sustainment products, as well as some descriptive information on Defence’s Top 30 sustainment products. This information is presented in multiple tables and sections of the PBS and an explanation of the variation between tables is not provided.90 Further, some of the descriptive information about sustainment activity is not always expressed in clear language, which detracts from its usefulness. The Defence PAES 2015–16 provides revised estimates for the 2015–16 financial year, the variation between the 2015–16 estimate from the PBS, and, for the Top 30 sustainment products, some explanation of the variation, plus a ‘project performance update’.

3.61 The Defence Annual Report 2015–16 includes financial information and some descriptive material for the same Top 30 sustainment products reported on in the PBS and PAES.91 However, the Annual Report does not report the actual expenditure for Defence’s Capability Sustainment Program, for which estimates were reported in the Defence PBS and PAES.92 Additionally, there is no reconciliation of the ‘total sustainment product funds available’ amounts reported in the Annual Report93 and the total funding for the Capability Sustainment Program reported in the PBS and PAES tables.

3.62 In summary, the ANAO’s review of public reporting on sustainment activity indicates that:

  • reported sustainment expenditure is not complete in that it does not include staffing costs, which are significant94;
  • information on sustainment expenditure and descriptive material is not consolidated in one easy-to-locate area;
  • actual expenditure against estimates is not reported for the Defence Capability Sustainment Program overall;
  • reported expenditure information is not fully reconciled between different information sets;
  • there was no explanation in the 2015–16 annual report for expenditure variations between the PAES estimate and the actual expenditure for the Top 30 sustainment products—therefore the reasons for the full year variances are unclear; and
  • the performance summaries are highly variable and inconsistent between planning documents (PBS and PAES) and the Annual Report. For some Defence systems the planning documentation includes a description of the system followed by a statement of the primary focus for that asset for the forthcoming financial year, and in the annual report there is an account of what has been achieved. In other cases, this information is so general as to be of limited value.

Defence’s public reporting of program level estimates over time

3.63 Defence has not published program level estimate data on a consistent basis over time, and no time series analysis is published by Defence to assist with external scrutiny of sustainment expenditure.

3.64 Published estimates comprise two separate data series. From 1 July 2005 until the DMO was delisted on 30 June 2015, Defence reported its sustainment expenditure estimates in its Portfolio Budget Statements (and Portfolio Additional Estimates Statements). This was funding to be spent by DMO on sustainment on behalf of Defence.95 From 2013–14 forward, Defence also reported the estimated cost of capability sustainment for the whole of Defence. These estimates, based on the latest available figures each year (generally from the Portfolio Additional Estimates Statements), are set out below (in Figure 3.3) for the period 2005–06 to 2020–21 (the end of the current forward estimates period as at May 2017).

Figure 3.3: Defence estimates of the cost of its capability sustainment program, 2005–06 to 2020–21

Source: Defence, Portfolio Budget Statements and Additional Estimates Statements, 2005–06 to 2017–18.

3.65 The first series, covering the period of DMO’s existence, represents expenditure by DMO on capability sustainment excluding DMO service fees/DMO departmental appropriation (representing DMO staff and other operating costs). The second series, commencing in 2013–14, represents the reported totals for Defence’s capability sustainment program.

3.66 There is no direct explanation in the estimates documents reconciling or explaining the difference or helping the reader to identify any trends. It is apparent, however, that different and additional items are included in the latter series.

3.67 The estimates documents generally provide components of these estimates, divided among the Services and other Defence Groups (Navy sustainment, Army sustainment and so on). Again, this presentation is not consistent: in certain years, for example, individual product sustainment costs are identified as separate line items.96 Nevertheless, Navy, Army and Air Force sustainment costs can be traced to identify trends. This is reflected in Figure 3.4. (These figures do not include DMO/Capability Acquisition and Sustainment Group staff costs.)

Figure 3.4: Estimates of Service sustainment costs, 2006–07 to 2020–21

Note: Excludes Defence staff costs.

Source: Defence, Portfolio Budget Statements and Additional Estimates Statements, 2006–07 to 2017–18.

3.68 For each Service, comparing the amounts presented in the first and last years of this analysis yields an average annual (compound) rate of growth in the estimates.97

Table 3.1: Increases in sustainment estimates by service, 2006–07 to 2020–21 ($m)

Service                             Navy        Army        Air Force
2006–07                             1 325.30      942.40    1 153.60
2020–21                             2 175.70    1 704.50    2 703.50
Annual rate of growth (compound)       3.60%       4.32%       6.27%

Source: Defence, Portfolio Budget Statements and Additional Estimates Statements; ANAO analysis.
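The growth rates in Table 3.1 are standard compound annual growth rates, (end/start)^(1/n) − 1, with n = 14 annual steps between 2006–07 and 2020–21. A brief sketch of the calculation, using the figures from Table 3.1:

```python
# Compound annual growth rate of sustainment estimates between the
# first (2006-07) and last (2020-21) years of the series: 14 annual steps.
def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

estimates = {                     # $ million, per Table 3.1
    "Navy": (1325.30, 2175.70),
    "Army": (942.40, 1704.50),
    "Air Force": (1153.60, 2703.50),
}

for service, (start, end) in estimates.items():
    print(f"{service}: {cagr(start, end, 14):.2%}")
# Navy: 3.60%, Army: 4.32%, Air Force: 6.27%
```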

3.69 When sustainment costs are presented cumulatively, the contribution of items other than Navy, Army and Air Force sustainment to sustainment cost growth over the forward estimates can be seen. As others have observed, the apparent growth flows mainly from the inclusion of certain Estate and Infrastructure Group costs (such as garrison support) and, in the last years of the period, the balance of Defence’s estimated future sustainment funding that Defence has yet to allocate to a Defence Group within the broader Capability Sustainment Program.98 This is apparent for the period following the delisting of DMO and the inclusion of items other than specialist military equipment in Defence’s overall sustainment estimates (Figure 3.5).

Figure 3.5: Cumulative estimates of Defence sustainment costs, by year, 2006–07 to 2020–21

Source: Defence, Portfolio Budget Statements and Additional Estimates Statements, 2006–07 to 2017–18.

4. Smart Sustainment reforms

Areas examined

The ANAO examined the reforms to Defence sustainment—known as ‘smart sustainment’—carried out as part of the Strategic Reform Program from 2009 forward.

Conclusion

The 2015 First Principles Review was preceded by earlier major reform initiatives, notably the Strategic Reform Program (SRP) begun in 2009. Defence records show that the department made substantial efforts to keep track of the large number of diverse initiatives identified across the department under the ‘smart sustainment’ reforms associated with the SRP, including internal reporting to management. However, Defence did not adequately assess the outcomes from the ‘smart sustainment’ reforms.

Areas for improvement

Defence would benefit from evaluating future major change programs. This matter is also taken up in the next chapter.

4.1 Sustainment practices have been subject to numerous reform initiatives over the last decade, especially since the Mortimer Review (2008).99 In 2015, the First Principles Review concluded that the change programs associated with previous reviews ‘have resulted in only incremental change’.100 This remark explicitly covered all phases of the capability lifecycle, including sustainment. The most substantial reform effort was Smart Sustainment under the Strategic Reform Program, associated with the 2009 Defence White Paper. The ANAO examined, first, a 2008 DMO initiative to cut sustainment costs and, second, Smart Sustainment, to identify what became of those reforms.

Did DMO achieve the intended savings to sustainment costs from its 2008 initiative?

There is no record of whether the intended savings to sustainment costs were achieved from DMO’s 2008 initiative to identify efficiencies. A target expenditure cut of five per cent of sustainment costs over a financial year was set by the Chief Executive Officer of the DMO in February 2008. Defence expected savings from: reducing spares; making more use of performance-based contracting; discouraging unnecessary end-of-year spending; travel efficiencies; and improved maintenance philosophies.

4.2 In February 2008, the Chief Executive Officer (CEO) of DMO announced a target of reducing by five per cent in real terms, in 2008–09, the sustainment cost of each of the 100 or so fleets of equipment then sustained by DMO.101 The CEO announced this measure as ‘an ongoing reform program that aims to identify efficiencies within the DMO and reduce the cost of ownership to the Commonwealth in the operation of sustainment fleets’.102 This, he wrote, was to satisfy a key priority of the government to reduce inefficiencies across the public sector.

4.3 The CEO wrote to all Systems Program Office directors seeking their help. He described the areas where he expected they could make cost reductions, including by: reducing spares; more performance-based contracting; discouraging end-of-year spending to meet budget; travel efficiencies; and improved maintenance philosophies. The initiative was not prescriptive except that capability and safety should not be compromised by any savings proposal. The cut was subsequently referred to within DMO as ‘a harvesting of “fat” within the products … readily absorbed by relevant Systems Program Offices and Divisions’.103 The CEO DMO also wrote to the CEOs of companies undertaking sustainment work for DMO seeking their support. That letter also identified a savings target of $200 million a year for re-use elsewhere in the portfolio.

4.4 Defence informed the ANAO that it believed it had achieved a five per cent sustainment budget cut in 2008–09. However, the department’s approach to monitoring savings does not provide sufficient information to determine whether that occurred, and how the reported savings were achieved.

What did Smart Sustainment achieve?

Defence has not kept a systematic record of the outcomes of the Smart Sustainment initiative associated with the 2009 Strategic Reform Program. In 2011, a ‘health check’ by external consultants found some early successes with cost reductions and changes in practice. Nevertheless, the consultants considered the program was failing because of shortcomings in governance and program management, and because of Defence’s approach to reform, described by a major vendor as ‘minor reform, driven by piecemeal, top down budget pressure’.

The 2009 Smart Sustainment initiative

4.5 The government announced Defence’s Strategic Reform Program in May 2009. Defence began its Smart Sustainment reforms in early 2010 as part of the Strategic Reform Program Implementation package. The Program was largely based on the findings of the 2008 Audit of the Defence Budget (the Pappas Review)104, led by consultant Mr George Pappas and undertaken in parallel with the development of the 2009 Defence White Paper, companion reviews, and the Mortimer Review. The Pappas Review had observed that sustainment was ‘an area with significant opportunities for efficiencies’.105

4.6 The Strategic Reform Program comprised 15 separate streams of reform, several of which were intended to reduce costs.106 Overall, Defence expected the program to yield $20.6 billion, to be reallocated within Defence over the following decade, with Smart Sustainment contributing $5.5 billion of that figure.107 This would make sustainment savings the largest single source of funds under the Strategic Reform Program. Defence planned to reallocate the $20.6 billion to other areas of Defence capability, supporting the government’s 2009 Defence White Paper, Defending Australia in the Asia Pacific century: Force 2030. To facilitate the reforms, Defence made $360 million of ‘investment funds’ available for projects that would yield net savings.

4.7 Defence’s 2010 publication The Strategic Reform Program—Making It Happen stated:

Defence’s budget to 2019 has already been adjusted to take account of the $20.6 billion in reinvestment required for Force 2030—this requires us to think less in terms of adding up savings and more in terms of living within our means.108

4.8 The harvesting and reallocation of the $20.6 billion from Defence’s Strategic Reform Program are not apparent in Defence’s Portfolio Budget Statements and Annual Reports. Further, the ANAO understands that the adjustments related to the expected savings were not identified in the Commonwealth’s central budget management system. This indicates that any actual adjustments, if made, were to internal Defence allocations.

4.9 Defence’s Sustainment Complexity Review in mid-2010 warned that most Systems Program Offices ‘did not have the capacity to undertake Smart Sustainment initiatives’.109 Nevertheless, in late 2010, an internal ‘health check’ on Defence’s governance arrangements for Smart Sustainment found that, despite some difficulties, Defence had the fundamentals in place to support their success.110

4.10 Defence gave examples of activities it considered successful in its 2010–11 Annual Report (pp. 22 and 81):

  • More efficient support services for the Jindalee Operational Radar Network, reportedly saving $100 million over ten years;
  • Navy using Telstra’s NextG network for several minor war vessels, relinquishing satellite communications subscriptions and saving $2.3 million a year; and
  • A service optimisation program for Army Land Rovers, saving $590 000 a year.111

4.11 In 2011, a Chief of Service advised DMO’s audit committee112 that an emerging issue from the Strategic Reform Program, common across all the ADF, was a shortage of future funding for sustainment. In July 2011, a further health check—this time by external consultants—found that Smart Sustainment had achieved some early successes with cost reductions and changes in practice. Nevertheless, it was at risk of failure because of shortcomings in governance, program management, and Defence’s approach to reform.113 Its major findings included:

  • Defence’s devolved and fragmented leadership approach to the program meant that Defence had no holistic vision of what the reform program was setting out to achieve, and therefore no logical program of work designed to support it.
  • Fragmented management meant Defence did not have an enterprise-wide view of the many Smart Sustainment initiatives, and was missing opportunities for wider reform.
  • Defence did not track expenditure or benefits from the $360 million allocated to support the program.
  • The lack of understanding of whole-of-life costings and poor management of sustainment funding within Defence meant that the increasing levels of future cost reductions expected were at risk (the review estimated a shortfall of $1.8 billion could increase to a $3 billion shortfall).
  • The multiple approaches, of varying quality, used to determine baselines and metrics for the improvements expected from the program, together with inadequate tracking of benefits, would limit Defence’s assurance about improvements and cost reductions.
  • Defence had not developed robust capability and cost baselines for all capabilities, and did not fully understand sustainment cost drivers or whole-of-life costs for platforms.
  • Sustainment costs were increasing because of the inappropriate application of DMO policies and procedures, and a lack of business acumen in contracting and engagement with industry.
  • Defence had not defined the cultural changes it expected from the program and therefore may not deliver the behavioural changes required for long-term reform.
  • Many Systems Program Offices had concerns that capability and safety were at risk.

4.12 A year later, in August 2012, a Defence internal audit of Strategic Reform Program savings and performance reporting questioned the integrity of reporting and found it had focused mainly on reporting savings rather than achieving reform.114 A Defence update to Defence Ministers on the Smart Sustainment reform program (August 2012) reported that:

Currently more than 800 Smart Sustainment initiatives are either underway or planned; about half of these have quantified the estimated savings, equating to about $3.7b across the remaining years of SRP (against a residual target of about $4.6b). Work continues with the Capability Managers to develop more systemic demand-related reform activities for implementation in the medium to longer term.

4.13 In contrast to the advice to ministers, in September 2012, an internal progress report on the action items from the 2011 health check showed that Smart Sustainment was still struggling to gain traction. There had been limited progress on the 35 action items identified in the 2011 health check, with seven action items implemented, five abandoned, and 23 outstanding (of which Defence intended to re-write 14).

4.14 Also in September 2012, Defence advised the minister that the performance of the broader Strategic Reform Program was continuing to decline, losing credibility as a transformation program, and that the program’s financial performance ‘was particularly difficult to assess’. Defence informed the minister there was a significant risk that it would fall short of the cost reduction target for the Program ($20.6 billion) by some $4 billion, almost one quarter of the remaining target ($17.6 billion). Smart Sustainment was showing ‘significant stress’. One major contractor noted that ‘Smart sustainment … has achieved limited success to date through cost cutting and minor reform driven by piecemeal, top down budget pressure’.115

4.15 Defence records show that the department made substantial efforts to keep track of the large number of diverse initiatives identified across the department under Smart Sustainment, including internal reporting to management. However, Defence did not adequately assess whether their intended outcomes were realised or enduring.

Did Defence achieve the expected savings from Smart Sustainment?

Smart Sustainment had a ten-year savings target of $5.5 billion. In its 2014–15 Annual Report, Defence claimed to have achieved $2 billion of savings from the initiative in its first five years. Defence has not been able to provide the ANAO with adequate evidence to support this claim, nor an account of how $360 million allocated as ‘seed funding’ for Smart Sustainment initiatives was used.

4.16 Defence documents indicate that by June 2013, it had determined that the Strategic Reform Program targets were no longer valid and Defence was developing a revised approach to reform. This was to ‘focus more holistically on benefits realised to the organisation through reform, rather than only acquitting financial targets’. Defence expected that the comprehensive reform benefits and reporting for the revised reform model would be further refined in the near future.116 The government agreed that Defence could cease reporting savings from Smart Sustainment and Defence undertook to report to government annually, detailing achievements and benefits of the new reform approach across productivity, cultural and ADF Service-specific reforms.

4.17 There is no evidence that Defence further developed a planned approach to a reform program supported by clear objectives or benchmarks, or that it provided reports annually to government on the achievements and benefits of this new approach. Following the change in government in 2013, Defence terminated work on the Strategic Reform Program.

4.18 By June 2015, Defence was claiming approximately $2 billion in savings from the Smart Sustainment Program since its inception in 2009.

4.19 Notwithstanding Defence’s effort to keep track of the claimed and planned savings from Smart Sustainment, Defence has not been able to provide the ANAO with adequate evidence that the savings were realised. Defence was able to provide the ANAO with copies of a suite of standard letters of assurance from managers throughout DMO, each stating that, ‘to the best of their knowledge’, cost reductions of various values had been implemented for the respective financial year. These letters of assurance provide no details of the specific initiatives or how the claimed savings were calculated.

4.20 In October 2015, Defence’s First Principles Review Implementation Team decided that work on Smart Sustainment, along with other reform activities underway within Capability Acquisition and Sustainment Group at that time, would cease and the ‘relevant elements’ were to be incorporated into work on the new Capability Life Cycle.

4.21 There are reasons to doubt some of the savings estimates for Smart Sustainment:

  • It is likely that some unquantifiable portion of claimed savings derive from the five per cent across-the-board cut among Systems Program Offices imposed by the Chief Executive Officer before the Strategic Reform Program commenced. This was, strictly speaking, a previous initiative.
  • Internal Defence briefs and letters of assurance reveal that Defence believed it could claim some $500 million in savings from no longer having to sustain the Super Seasprite helicopter following the project’s cancellation. This equipment never entered service: the cancelled project yielded no capability whatsoever at a cost of $1.5 billion.117 It is not clear how this windfall reduction in future sustainment costs can properly be attributed to the program. Moreover, the project was cancelled in March 2008, over a year before the government announced Defence’s Strategic Reform Program.
  • An unspecified proportion of Smart Sustainment savings of $197 million in 2009–10 flowed from a change in fuel pricing. This is another windfall benefit which cannot be properly attributed to the program.118
  • Some $360 million was allocated as ‘seed funding’ to be invested in projects intended to yield savings under Smart Sustainment. In 2011, a review of the Smart Sustainment program by consultants engaged by Defence identified that there were no records to support how these funds were spent or what benefits had been realised. Defence advised the ANAO in March 2017 that it was ‘looking into what became of the [unquantified] residual seed funds’.

Lessons from Smart Sustainment

4.22 The Smart Sustainment initiative lost momentum and its costs and benefits are uncertain. There was no systematic evaluation of the constituent projects at their conclusion nor overall evaluation of this major program. The main insights into what became of the initiative can be derived only from external consultancy reports. It is now too late to revisit these events; nonetheless, Defence could benefit from evaluating future major change programs. This matter is also taken up in the next chapter.

5. Materiel sustainment reform—First Principles Review

Areas examined

This chapter examines the First Principles Review to identify implications for Defence’s management of sustainment. It also considers how Defence—particularly within the Capability Acquisition and Sustainment Group—is implementing the reforms most relevant to the management of sustainment, and the progress of those reforms. The chapter also examines Defence’s asset management project, which is relevant to sustainment and has been underway for many years.

Conclusion

Reforms to the management of sustainment flowing from the 2015 First Principles Review remain at an early stage, and this stream of activity is likely to take much longer than the expected two years. For example, the Systems Program Office reviews are not yet complete and Defence has provided no evidence that decisions have been taken on any changes to their structure and functioning.

Defence has engaged industry expertise to guide and help it with the First Principles reforms relating to acquisition and sustainment, including contracts for services valued at $107 million with a single company that are not performance-based. Reform is expected to lead to greater outsourcing of functions currently performed in-house by Defence’s Systems Program Offices.

Areas for improvement

The ANAO has made a recommendation aimed at evaluation of the reforms now underway.

Discussion of sustainment in the First Principles Review

5.1 The most recent major review of Defence, the 2015 First Principles Review, was a comprehensive examination with a suite of recommendations for reform. The Review’s terms of reference included one specific reference to sustainment—a requirement that Defence ‘ensure a commercially astute, focused and accountable materiel acquisition and sustainment capability’. The terms of reference were also accompanied by a range of issues for consideration, among which was ‘How Defence sustains and supports capabilities and extending lessons from the Coles review [of Collins Class submarine sustainment] across all areas of sustainment’. Subsequent reports indicate that the 2012 Coles Review has been both successful and insightful.119

5.2 The report of the First Principles Review refers to sustainment in the discussion of its recommendation that Systems Program Offices each be examined and analysed:

In some cases it would be appropriate to outsource completely the acquisition and sustainment functions conducted by a Systems Program Office. In other cases it may be appropriate to outsource sustainment and partner with industry for the acquisition phase.120

5.3 Otherwise, the report does not discuss sustainment as a topic separate from the acquisition stage of the equipment lifecycle. It does not explicitly analyse the Coles Review and its lessons (such as the value of international benchmarking, discussed below).121 Rather, the report passes the task of learning from Coles back to Defence, stating that the implementation of the First Principles Review’s proposed approach ‘should encompass the progress made through the Rizzo and Coles reforms’.

Box 4: The Coles Review

The Coles Review identified five root causes of the problems it observed. Appendix 5 of this audit report sets out both the root causes and those of the Coles Review’s consequential recommendations that have wider applicability in Defence sustainment.

The value of international benchmarking

The Coles Review (2012) used benchmarking against international comparator navies to review the sustainment of Defence’s Collins Class submarines.a The Coles team used data from an international benchmarking report compiled by Defence to compare the performance of the Collins Class boats against submarine fleets of international navies. That data showed the performance of the Collins Class submarines fell well below that of international comparator submarine fleets.b The review also found the cost effectiveness (cost per materiel ready day)c of the international navies’ fleets was at least twice that of the Collins Class sustainment program.

In 2016, the Coles review team found significant improvements in the performance of the Collins Class submarines. The review also found that while the cost effectiveness of the Collins Class submarine sustainment program was showing signs of improving as submarine availability increased, it was yet to achieve international benchmarks.d The review team recommended that Defence should now focus on cost reductions to improve the efficiency of the Collins Class submarine sustainment program.e

Benchmarking the performance of the Collins Class submarines gave Defence a baseline against which to measure its progress in remediating the sustainment program. Defence has advised that it has done limited benchmarking elsewhere, in particular, within the Aerospace Domain for two platforms: C17 (Globemaster transport aircraft) and C130J (Super-Hercules transport aircraft). Work towards benchmarking was begun in 2013 for Navy’s surface ships and shore capabilities in response to a Chief of Navy directive. There is also evidence of a planned ‘benchmarking exercise to identify appropriate operating and functional models that can support DMO’s future End State Design’ (around May 2015). Defence has not explained what became of these initiatives.

Note a: The Collins Class submarine sustainment program has been a ‘project of concern’ since November 2008.

Note b: The availability of the Collins Class was about half that achieved by comparable submarine fleets; time spent in planned maintenance was about one third greater; and maintenance overruns and percentage days lost due to defects were approximately double.

Note c: Defence defines a ‘materiel ready day’ as a day in which a submarine is not in planned maintenance or does not have defects that prevent it from being at sea.

Note d: In late 2016, Defence estimated that the Collins Class submarine sustainment program would achieve international benchmarks for cost per materiel ready day by 2022–23.

Note e: The 2016 report from the Coles review team states that ‘Attaining benchmark performance was a higher priority than efficiency. With benchmark availability on the verge of being achieved, the focus should now be on efficiency improvements and cost reductions across the sustainment program. Such cost reductions may be required to re-invest into inventory, obsolescence remedies, new infrastructure to manage an ageing fleet and the transition to future submarines. The achievement of this should be greatly enhanced by the application of the cost model’. Defence has advised that Collins Class sustainment cost per materiel ready day has decreased by approximately 40 per cent. This has largely been achieved by increasing the number of materiel ready days through the adoption of a 10+2 year Usage and Upkeep Cycle, but also by reducing the cost of maintenance periods, particularly Full Cycle Dockings.

Reforms which will affect sustainment and its management

5.4 Although sustainment does not attract a separate focus in the First Principles Review, the review’s findings and recommendations have major implications for how sustainment is performed. The management of sustainment arises mainly in the discussion leading to First Principles Review Key Recommendation 2:

Establish a single end-to-end capability development function within the department to maximise the efficient, effective and professional delivery of military capability.122

5.5 A subsidiary recommendation (‘2.2 disbanding the DMO and transferring its core responsibilities in relation to capability delivery to a new Capability Acquisition and Sustainment Group’) has led to the creation of Capability Acquisition and Sustainment Group in Defence. Although this largely represents the absorption and re-badging of the former DMO, other subsidiary recommendations are directed at substantial restructuring and reformulating the business practices in this part of Defence. In particular, the other recommendations with substantial implications for the management of sustainment are as follows:

We recommend:

2.3 developing a new organisational design and structure as part of the implementation process for the Capability Acquisition and Sustainment Group with reduced management layers;

2.4 examining [and analysing] each Systems Program Office to determine where each fits within the smart buyer function, [and] the most appropriate procurement model [for delivering the capability] and achieving value for money; and

2.11 significant investment to develop an operational framework which comprehensively explains how the organisation operates and the roles and responsibilities within it; a detailed set of life cycle management processes which provide project and engineering discipline to manage complex materiel procurement from initiation to disposal; and reviewing architecture to reinforce accountability at all levels and bringing together information upon which good management decisions can be made.123

5.6 Recommendation 2.11 aggregates three First Principles Review recommendations which appear after the following conclusions in the text:

In an organisation which routinely manages complex projects and programs we found it remarkable that there is no common project management architecture or artefacts to support it. There are no standardised reporting mechanisms (reporting is informal, anecdotal, local or crisis based) or management processes, with all divisions having different methodologies and management systems. There has been no consistent application of fundamental tools such as Earned Value Management. Such tools will be required to support the new leadership team.124

5.7 Defence’s Capability Acquisition and Sustainment Group has its own ‘CASG First Principles Review Implementation Plan’, which contains a range of elements, some of which correspond directly to the summary set of recommendations in the First Principles Review (such as ‘[Systems Program Office] Consolidation and Reform’). There are others which relate to the more detailed findings of the First Principles Review, including those identified in the paragraph above (such as a common approach to project management).125 All elements have implications for both acquisition and sustainment.

5.8 The ANAO identified the following streams of work—either directly flowing from the First Principles Review or from Capability Acquisition and Sustainment Group’s own list of deliverables—as having the most apparent potential to affect the management of sustainment:

  • examining each Systems Program Office to determine where each fits within the smart buyer function, and the most appropriate procurement model for delivering the capability and achieving value for money. The primary element of this reform is the consolidation and review of the Systems Program Offices;
  • performance-based contracting;
  • establishing centres of expertise as part of a matrix organisation within Capability Acquisition and Sustainment Group to make best use of specialist skills that are common across its Systems Program Offices126;
  • developing the new operational framework (‘Business Framework’) to explain comprehensively how the organisation (specifically, Capability Acquisition and Sustainment Group) operates;
  • development of the ‘smart buyer’ model; and
  • though not part of the First Principles Review, any further work Capability Acquisition and Sustainment Group carries out on its asset management strategy, a stream of activity that has been under way for some years.

5.9 The First Principles Review specifies few targets or measures of performance. However, it does set two years as a timeframe for delivery of the changes it recommends.127 In setting that objective it allows for a ‘tail of activity’ that might run beyond two years for which it gives, as an example, completing the roll-out of its new ‘smart buyer’ model.

5.10 The ANAO considered how Capability Acquisition and Sustainment Group is managing the reform process, including through the extensive use of industry expertise. The ANAO considered progress briefly for two of the above elements (business framework, centres of expertise) and in more detail for three others (smart buyer, Systems Program Office reform and performance-based contracting).

Has Defence developed relevant capability to support its implementation of sustainment reforms?

Defence has drawn heavily on contracted industry expertise to support its implementation of the program of organisational change relating to acquisition and sustainment that has followed the First Principles Review. This represents a major investment of at least $120 million, including contracts for services from the principal provider (Bechtel) valued at some $107 million. Contracts with the principal provider are not performance-based.

5.11 A major feature of the implementation of the First Principles Review has been the use of industry expertise. Capability Acquisition and Sustainment Group officers have stated that, because of a lack of internal capacity and expertise, the group has depended heavily on consultants and contractors to deliver its reforms. Reflecting the First Principles Review theme of better engagement with industry, Defence has engaged a USA-based engineering company, Bechtel, to help with implementing the Review’s recommendations. Bechtel was formerly a contender to manage and operate the UK counterpart organisation to Capability Acquisition and Sustainment Group, Defence Equipment and Support. Bechtel is now, with others, providing the UK with tailored private sector expertise and human resources, indicating relevant experience in the Defence procurement field.

5.12 Bechtel had made contact with Defence in July 2013 seeking an opportunity to help with reform of the DMO, presenting a paper ‘Australian Defence Organisation—DMO Transformation Opportunities’. Following the First Principles Review, Defence first engaged Bechtel from October 2015 to April 2016 (extended to June 2016) to provide initial implementation help.128

5.13 Senior Bechtel representatives met with Defence officials again in December 2015 and subsequently presented a new paper ‘Delivering CASG Transformation’. This expanded on various models of contract, including ‘Management & Operations (M&O) contracts, as vehicles for enterprise transformation’ and offered suggestions as to how Capability Acquisition and Sustainment Group might approach the following 18 months of reform.129 The paper set out a range of options for organisational transformation.

5.14 Defence offered Bechtel significant additional work in May 2016, engaging them to provide advisory services and a senior manager, seconded into a Capability Acquisition and Sustainment Group division head role.130 The estimated cost of all current contracts with Bechtel is about $107 million.131

5.15 The contracts with Bechtel are not performance-based. The Commercial Division in Capability Acquisition and Sustainment Group expressed concern that Defence needed to articulate clearly its ‘objective/end state’ before agreeing to contracts of this cost and magnitude. It pointed out that a similar contract with a counterpart organisation in another jurisdiction included a performance element where Bechtel’s fee was based on outcomes achieved. It expressed concern that without such a feature Defence was exposed to risk. Defence advised the ANAO that it would have been difficult to implement the contract as a performance-based contract:

Unlike standard Defence capability projects, Bechtel has been contracted to develop Defence’s capacity and capability to deliver rather than for Bechtel to deliver itself. Bechtel’s role is to support, train and transfer knowledge to Defence staff. The success of the contract will be measured by Defence’s ability to operate without Bechtel support at the end of the contract, rather than by any specific product delivered by Bechtel.

5.16 As at November 2016, the total estimated value of contracts awarded to support Defence activity in Capability Acquisition and Sustainment Group directly or indirectly associated with implementing the First Principles Review, excluding the Bechtel contract, was $40 million.

Box 5: Tasks being undertaken by Bechtel

Defence has engaged Bechtel to ‘assist [the] Deputy Secretary of Capability Acquisition and Sustainment, … on an as required basis, to design and implement First Principles Review deliverables that relate to Capability Acquisition and Sustainment Group and its functions’. Bechtel provides this assistance through Bechtel staff embedded within Defence.

Defence has summarised Bechtel’s tasks under the contract as follows:

  • Provide high level advice to Defence Senior Leaders on reform, business improvement and change management initiatives;
  • Support the design and implementation of the new Capability Acquisition and Sustainment Group business framework;
  • Assist Capability Acquisition and Sustainment Group in strengthening the project controls discipline across the Group including monthly reporting, business process improvement and cost transparency, mentoring and coaching Defence leaders, and providing advice on international best practice;
  • Assist with implementing the smart buyer function across Capability Acquisition and Sustainment Group, including conducting pilots in Chief Information Officer Group and Estate and Infrastructure Group;
  • Assist Defence to redesign and restructure Systems Program Offices to align with required reform activities, including but not limited to the smart buyer function, business framework and capability life cycle reforms;
  • Assist Capability Acquisition and Sustainment Group with the establishment of the Commercial Centre of Expertise including business process improvement, mentoring and coaching Defence leaders and providing advice on international best practice;
  • Assist Capability Acquisition and Sustainment Group with the establishment of a Program Management Office at the Group level to support the Capability Acquisition and Sustainment Senior Leadership Group in managing the program;
  • Assist Defence to measure the efficiency gained through the above reform activities (in terms of both cost and schedule);
  • Undertake other reform activities as required by Defence.

The contract is fee-based with no performance measures.

As noted in paragraph 5.14, in July 2016, a former senior Bechtel executive was seconded to the position of First Assistant Secretary Program Performance in Capability Acquisition and Sustainment Group, under the day-to-day direction of the senior Capability Acquisition and Sustainment Group leadership.

What progress has been made with the reforms?

Systems Program Office consolidation and reform is one of the largest reforms to Capability Acquisition and Sustainment Group deriving from the 2015 First Principles Review. This stream of activity is likely to take much longer than the two years expected for implementation. In June 2017, Defence advised the ANAO that implementation plans for significant reforms to Systems Program Offices would be developed in the second half of 2017.

Other reforms flowing from the First Principles Review remain underway:

  • Introducing performance-based contracting into Defence sustainment has been underway for over a decade.
  • Defence does not yet have a complete register of its acquisition and sustainment contracts, though since January 2016 it has had a facility in place and has begun populating it.
  • Initial establishment of ‘centres of expertise’ in Capability Acquisition and Sustainment Group is underway. Defence expects full implementation to take a further two years. Similarly, the new Capability Acquisition and Sustainment Group Business Framework is expected to take ‘many years’ to fully implement.
  • Defence has developed its own meaning for the term ‘smart buyer’, which does not clearly articulate the intent of the First Principles Review recommendation. This introduces risks related to ensuring that Systems Program Office staff working on outsourcing have the necessary skills and competencies.

Defence has also been developing its approach to asset management over many years, having obtained substantial advice from internal and external sources to adopt a Defence-wide asset management strategy to underpin a sustainment business model for specialist military equipment. It is now unclear whether Defence will continue with this work.

5.17 As set out above (paragraph 5.8), the activities most likely to affect the management of sustainment are:

  • consolidation and review of Systems Program Offices;
  • performance-based contracting;
  • establishing centres of expertise;
  • developing a new operational framework;
  • development of the ‘smart buyer’ model; and
  • any further development of Defence’s approach to asset management.

Consolidation and review of Systems Program Offices

5.18 When Defence established Systems Program Offices in 2000 as the focus of the newly created DMO, Systems Program Offices were to have the following characteristics:

  • a collection of multi-functional, multi-disciplined teams;
  • staffed to achieve a balance between environmental, capability system and technical needs in whole-of-life management;
  • organised along class/fleet/weapon system lines; and
  • ideally located with the principal user but, dependent on the maturity and nature of the current, new and enhanced materiel systems, may be located at more than one site.132

5.19 Defence has more recently (April 2016) defined Systems Program Offices as:

a business unit that works directly with Capability Managers and Suppliers as necessary to deliver agreed capability across the [Capability Life Cycle], coordinating the [Fundamental Inputs to Capability] elements required to achieve this objective.133

5.20 The purpose and structure of Systems Program Offices have remained essentially the same since their inception in 2000. A consultancy found in March 2013 that:

DMO evolved through the amalgamation of many dispersed business units, without undergoing typical post-merger integration activities needed to drive efficiency. Consequently, DMO retains over 50 SPOs [Systems Program Offices] each operating relatively independently of others and in different positions in the value chain. SPOs vary widely in their shape, size, supplier relationships and services to Capability Managers.134

5.21 The report found that the lack of Systems Program Office commonality added complexity, blurred accountability and confounded the ability to build enterprise solutions for common services. Moreover, the many changes to DMO, in response, for example, to the Mortimer and Kinnaird reviews, had not significantly altered its size, shape or footprint.

5.22 In May 2016, Defence completed a stocktake of its Systems Program Offices to determine how many existed, and which were candidates for further reform. Defence subsequently increased the number of organisational units it considered to be Systems Program Offices from 54 to 78, with staff numbers ranging from three to 178.135

5.23 As discussed in paragraph 3.47 of this report, Systems Program Offices have had between three thousand and four thousand staff over the last decade. Defence advised the ANAO that Capability Acquisition and Sustainment Group’s expenses on all employees in 2015–16 were $490.4 million. The First Principles Review observed that, at the project level, Defence treats staff as a ‘free good’ (see paragraph 3.45), and the cost of those staff is a substantial omission from the apparent price of services facing the capability manager. Systems Program Office staff are involved in both acquisition and sustainment. Defence is not easily able to identify the effort, in terms of staff numbers, expended on each. In March 2017, Defence informed the ANAO that work was underway to address this.

5.24 According to Defence documentation, the degree of outsourcing among Systems Program Offices has long been diverse, ranging from a totally outsourced model to a high degree of transaction-based activities carried out in-house.136 A 2015 review of Defence’s Systems Program Offices concluded that Defence manages the majority of its acquisition and sustainment activity in-house.137

Purpose of the reform of Systems Program Offices

5.25 The 2009 report from the Pappas Review identified opportunities for efficiencies in maintenance activities carried out by Systems Program Offices by consolidating maintenance facilities; restructuring outsourcing contracts to ensure the contractual conditions create the right incentives for performance improvements; and implementing lean techniques in workshop management.138 From the formation of DMO, Defence received similar external advice from different sources that substantial improvements in efficiency were possible in the Systems Program Offices.

5.26 In 2015, the First Principles Review concluded that Defence had too many Systems Program Offices139, and that Defence could make savings by restructuring them and reducing their staff.140 The Review drew on a 2013 consultant’s report that identified ways to reduce the DMO workforce.141

5.27 The First Principles Review recommended that each Systems Program Office be examined and analysed to determine:

  • Where it fits within the smart buyer function; and
  • The most appropriate procurement model for delivering the capability and achieving value for money.142

5.28 The First Principles Review also saw outsourcing as an important element of the reform of Systems Program Offices.143 Under the First Principles Review reforms, Defence expects to reduce the number of Systems Program Offices and, from 2015 levels, the number of staff directly undertaking sustainment activity, though no estimates have been published and no targets have been set.144

Progress with the reform of Systems Program Offices

5.29 In September 2015, Defence engaged a consulting firm to review and develop an optimal design for one of its Systems Program Offices. Defence received the final report in December 2015.145 In August 2016, Defence re-engaged the same consulting firm to complete reviews of a further 53 Systems Program Offices. Defence advised that, during the course of this contract, a number of Systems Program Offices have merged into larger organisations, resulting in a total of 48 being reviewed (as at end February 2017). The value of the contracts for this work is $8.8 million. Defence is now undertaking a rolling program of reviews of all Systems Program Offices, adding a further 17 to those being reviewed. Defence advised that a contract variation has been applied to accommodate this additional work, bringing the total cost to $16.3 million.

5.30 In August 2016, Defence finalised a Systems Program Office Design Guide to provide Capability Acquisition and Sustainment Group senior managers ‘with sufficient guidance to design or redesign’ Systems Program Offices.146

5.31 In March 2017, Defence advised that the first 25 Systems Program Office reports had been provided to the Capability Acquisition and Sustainment Group Executive Advisory Committee for exposure and awareness. There is no evidence of any decisions having been made.147

5.32 Defence is in the early stages of the Systems Program Office reform envisaged by the First Principles Review. Completing this work is likely to take longer than the two years intended for the First Principles Review Implementation.

Performance-based contracting

5.33 Defence defines performance-based contracting as ‘an outcomes-oriented contracting method that ties a range of monetary and non-monetary consequences to the contractor based on their accomplishment of performance requirements’.148

5.34 Performance-based contracting rose to prominence in the United States military in the late 1990s. The 2003 Australian Defence Aerospace Industry Sector Strategic Plan, produced jointly by Defence and industry representatives, recommended this approach, as well as a move to long-term through-life support contracts.149 Aerospace Systems Division then took the lead in this reform area, producing a ‘Performance-Based Contracting Handbook’ in 2005.

5.35 The problem with existing contract arrangements was described as follows by an RAAF officer in 2005, with a particular focus on ageing aerospace platforms:

The typical [support entity] entails a Systems Program Office managing a variety of small to large volume-based, process-oriented contracts. The nature of the contracts provides an inherent disincentive for contractors to improve platform supportability of their own accord as the quantum of their financial reward is often directly linked to the volume of repairs conducted.150

5.36 Performance-based contracting was described in a chapter on managing sustainment in the DMO’s 2007 Acquisition and Sustainment Manual, but was not made a requirement or a priority.151 In 2008, a Defence internal audit evaluating the effectiveness of Defence’s use of performance-based contracting found that the maturity of management systems for such contracts varied, central oversight and support were inadequate, contracting arrangements were inconsistent, and outcomes were less effective in many cases.

5.37 Defence undertook substantial work in 2009 and 2010 to explore greater use of the technique so as to enhance Defence’s ability to meet capability preparedness requirements, and reduce the total cost of ownership.152 DMO advised industry CEOs at a meeting in July 2009 that ‘For greenfield sustainment and other new contracts there will be a greater focus on contracting for availability’.153 Defence established a Performance-Based Contracting Centre of Excellence at that time and produced a new Defence contracting template with performance-based contracting integrated into it.154 Defence has claimed that ‘almost all projects since 2011 have used a performance-based contracting template for materiel support services and have enjoyed the support of the Performance-Based Contracting Centre of Excellence’.155 It has not been able to substantiate this claim (Box 6).

5.38 Major industry organisations have continued to urge Defence to progress performance-based contracting as a means of improving capability while reducing sustainment costs.

Box 6: Defence’s Contracts Register

Until January 2016, Defence had no central facility to maintain a register of contracts for sustainment (or, indeed, acquisition). According to Defence, there has been no way to easily identify and report on contracts from existing Defence systems because those systems were designed for financial accounting, with very limited information recorded about procurement and contracting.a

A Defence Central Contracts Register facility commenced operation in January 2016.b This enables efficient collection of data for new Defence contracts: it does not include existing contracts, which can be added only by a separate stream of work. Defence decided to migrate only material, high-risk existing contracts onto its central register. It engaged a consultancy (at an estimated cost of $1.2 million) to undertake the data identification and cleansing for active major contracts for the ‘top 15 Acquisition and Sustainment projects/products (about 60 contracts) by 30 June 2016’.c After approximately 15 major contracts had been migrated, the project was suspended due to competing priorities within Capability Acquisition and Sustainment Group. The total spend was about $350 000.

Note a: Many entries on the AusTender site represent individual payments rather than separate contracts. The system supporting the current AusTender site is described by Defence as ‘in a state of decay as it is over a decade old and no longer viable to upgrade and sustain’. Defence advises that ‘Over the past 12 months the CASG e-Procurement Team and the DoF AusTender Team, and SAP Australia Canberra Office have been progressing a whole of Government project to address the technical and business process risks posed by the current SAP AusTender Contract Reporting data capture and reporting solution’. In Defence’s case, this should eliminate the need for some 20 000 manual update transactions to AusTender each year.

Note b: This is part of a substantial project within Defence to improve both the recording of contracts and purchasing within Defence. Major steps include the development and delivery of a single (electronic) procurement form.

Note c: Defence, ‘Current e-Procurement Projects’, June 2016.

Current position

5.39 To determine whether performance-based contracting is yielding the outcomes it expects, Defence is undertaking an ‘academic analysis of the Critical Success Factors for PBC and their role in Program success’. Defence expects that this research will be complete in 2018.

5.40 The analysis could usefully have regard to the outcomes of past reviews and audits, which provide insight into the effectiveness of Defence’s contracting arrangements for specific fleets of equipment. For example, in ANAO Report No.52 2013–14, Multi-Role Helicopter Program, the ANAO concluded that the pre-2012 sustainment contract provisions for Defence’s Multi-role Helicopter Program were largely ineffective.156 The audit report noted that this was largely due to Defence not applying its own processes to the standards required. The report also acknowledged that contract amendments were made to introduce a revised performance management regime. In May 2016, the Sustainment Gate Review for the program found that the commercial incentive in the contract was insufficient to obtain the desired rate improvement.

Establishing centres of expertise

5.41 The proposal to establish Centres of Expertise is reminiscent of the ‘business improvement teams’ envisaged under the Sustainment Business Model project (Appendix 6), and a further similar proposal to establish a ‘centre of procurement excellence’ recommended in 2000.157 The novel element is the adoption of a matrix management arrangement for the control and deployment of a range of areas of expertise. The idea is to establish a managed resource pool that provides skilled personnel for CASG activities. This proposal is regarded as a key principle, integral to the acquisition and reform agenda. It is also seen as needed to streamline and consolidate the Systems Program Offices.

5.42 A design guide for the establishment of the Centres of Expertise has been produced, and initial establishment of these centres is scheduled for April 2017, with the establishment of a ‘balanced matrix’ expected to occur over a two-year period.158

Developing a new operational framework

5.43 Developing the Capability Acquisition and Sustainment Group Business Framework is the first component of First Principles Review Recommendation 2.11, which provides: ‘that there be significant investment in the development of an operational framework which briefly but comprehensively explains how the organisation operates and the roles and responsibilities within it’ (the full recommendation is set out in paragraph 5.5, above). Such a framework would need to articulate how an organisation responsible for both acquisition and sustainment would carry out those major tasks. It is an overarching element in progressing First Principles Review implementation in Capability Acquisition and Sustainment Group.

5.44 Defence began development of the Capability Acquisition and Sustainment Group Business Framework on 31 July 2015 with a view to completion by 1 February 2016. In February 2016, the Deputy Secretary, Capability Acquisition and Sustainment Group, informed the First Principles Review Implementation Committee that ‘the business framework project has commenced and been resourced to deliver a fundamental rebuild of all Group processes’.159 The completion date was extended to 1 July 2016 and the objective was to have a ‘High-level business framework approved’.

5.45 A review of progress by a consultant in May–June 2016 found that no significant progress had been made: ‘As a result, my review changed from “progress made” to why none had been made’.160 The consultant concluded that ‘Lack of decision making has frankly constipated this project’.161

5.46 Work was then reinvigorated and Defence’s First Principles Review Implementation Committee approved the Business Framework on 1 September 2016. Defence published a final, complete edition, comprising 18 pages plus appendices, dated 12 December 2016. Although the document provides a description of the intended new organisational structure and the functions and responsibilities of its elements, it is not apparent that a ‘fundamental rebuild of all Group processes’ has yet been delivered. Defence advised that the new Business Framework ‘will take many years to fully implement’.162 Between February 2016 and February 2017, Defence entered into contracts with the International Centre for Complex Project Management (ICCPM) worth a total of $2.94 million for work described as relating to the development and implementation of the business framework. Based on AusTender records, Defence also engaged two other companies, Cordelta and Eveille Consulting, to assist with the Business Framework. The value of these contracts totals about $835 000.

Development of the ‘smart buyer’ model

5.47 Operating as a smart buyer is an essential function for an organisation seeking to outsource work. To be a smart buyer, an organisation requires the capacity and technical knowledge to articulate its requirements and to recognise, independently of the vendor, whether the vendor is meeting those requirements.163 Past Australian use of the ‘smart buyer’ concept (discussed in Appendix 7 of this audit report) is consistent with this approach.

5.48 The First Principles Review recommended that Defence establish a single end-to-end capability development function and that this would require, among other changes, ‘Moving to a leaner “smart buyer” model that better leverages industry, is more commercially oriented and delivers value for money’.164 The Review cites a US Government Accountability Office report which is also consistent with past Australian usage:

A smart buyer is one who retains an in-house staff who understands the organization’s mission, its requirements, and its customer needs, and who can translate those needs and requirements into corporate direction. A smart buyer also retains the requisite capabilities and technical knowledge to lead and conduct teaming activities, accurately define the technical services needed, recognize value during the acquisition of such technical services, and evaluate the quality of services ultimately provided. As long as the owner retains the in-house capabilities to operate as a smart buyer of facilities, there does not appear to be any greater risk from contracting out a broad range of design review-related functions, so long as such functions are widely available from a competitive commercial marketplace. If the owner does not have the capacity to operate as a smart buyer, the owner risks project schedule and cost overruns and facilities that do not meet performance objectives’.165

Capability Acquisition and Sustainment Group’s implementation of ‘smart buyer’

5.49 Defence has developed its own meaning for the term ‘smart buyer’, which does not clearly capture the intent of the First Principles Review recommendation. At its core, this is a two-step decision-making framework, comprising risk analysis and a tailored strategy, undertaken at the earliest practicable stage of a project:

  • Step 1—develop a risk profile for the project, which identifies the severity of risk in nine categories for the acquisition phase and eight for the sustainment phase; and
  • Step 2—develop a risk-based tailored project execution strategy, covering approvals, project management, and through-life asset management, including a coherent approach to the acquisition and sustainment phases.166

5.50 Defence considers that ‘The game changer for Capability Acquisition and Sustainment Group will be the “Smart Buyer” model that will unleash our organisational capability’.167 Defence sees the main benefit of this process as early identification of risks and, as a result, better allocation of Defence resources.168 Its ‘smart buyer’ paper sets out detailed issues for consideration in areas of risk separately for both acquisition and sustainment.

Box 7: The current Defence definition of ‘smart buyer’

A Smart Buyer first and foremost achieves good outcomes for its Customers. A Smart Buyer will also enable appropriate financial return for its Suppliers.

A Smart Buyer undertakes the roles that government must perform, and effectively outsources other functions when that is the Smart thing to do.

A Smart Buyer maintains ethical, transparent and constructive engagement with industry, establishing trusted long term relationships that can better be relied upon to deliver capabilities for its Customers.

A Smart Buyer has the organisation, skills, and suitable decision-making frameworks to make timely decisions on the optimum procurement, project management and approvals strategies for each acquisition or sustainment program, based on its critical risk features. It has the agility to refine those strategies as new information becomes available.

A Smart Buyer collaborates with Industry and engages early and throughout the Capability Life Cycle. It uses industry best practice tools and techniques to execute projects throughout the Capability Life Cycle, including through Sustainment, in a way that strikes the optimum balance between performance, time and cost.a

Note a: Defence, ‘Smart Buyer—Detail Design’, approved 20 October 2016, p. 1.

5.51 In its implementation of the smart buyer concept, Capability Acquisition and Sustainment Group does not clearly articulate the competencies required to specify outsourcing requirements. These competencies include: specifying requirements independently of suppliers; determining where best to source those requirements; and knowing how to assess what has been received on behalf of the Commonwealth. These are all fundamental skills for an organisation seeking to engage more with industry and contract out sustainment work wherever that represents value for money. This is a risk in light of the systemic issue identified by Defence in its work towards reforming Systems Program Offices: ‘SPO staff have limited experience, skills, and competencies needed to effectively establish, govern, and assure industry delivery of capability’ (see Box 3, p. 23 of this audit report).

5.52 The same risk has been highlighted in a number of internal Defence reviews including:

  • a 2016 sustainment gate review of the Multi-Role Helicopter (MRH) project which commented on the need to clarify the roles and responsibilities of Defence and industry, and ensure the Systems Program Office has the appropriate skills and sufficient capacity to discharge its responsibilities169;
  • the 2016 Houston Review into Army Aviation, which observed that the Systems Program Office for the Tiger and the MRH helicopters relied heavily on several key contractors170; and
  • a sustainment gate review of Air Force’s Airborne Early Warning and Control fleet, which observed that all Systems Program Office staff needed to be fully capable to deal with a complex platform and small fleet of aircraft.171

5.53 A relevant finding also emerged in the recent Parker Review in the UK of the shipbuilding strategy for the Royal Navy. The Review found a need for the Ministry of Defence ‘to rebuild its capability as an intelligent client for warship design and build’.172 It concluded that, to ensure pace could be maintained, ‘additional technical expertise in the form of a Client Friend (or Friends) is procured early, in particular to assist in the development of a detailed specification’. This suggests that, where in-house expertise is not (or is no longer) available, expertise could be bought in from industry. Care would need to be taken to avoid any conflict of interest among vendors in these circumstances.173

Asset management

5.54 Defence has long aspired to introduce a systematic approach to asset management for its specialist military equipment (see Appendix 6 of this audit report). Significantly, the Coles review recommended that Defence develop an asset management strategy for sustainment (see Appendix 5 of this audit report) and Chief of Navy subsequently expressed a strong desire for a total asset management framework.174

5.55 In November 2016, Defence informed the ANAO that it intends to use the principles of the ISO 55000 Asset Management Standard as a guide to good practice and to further develop Defence’s framework for the management of sustainment.175 In March 2017, it advised that the policy issued under the earlier DMO governance framework was gradually being reviewed and updated. Defence has continued a program offering its staff professional development options in asset management.

Proposals that asset management underpin a Sustainment Business Model

5.56 Appendix 6 of this audit report summarises proposals between 2008 and 2011 to introduce a Defence Sustainment Efficiency Office and a Sustainment Business Model. Three significant reviews into Defence sustainment around this time observed that Defence did not have an asset management approach to sustainment.176 An asset management approach seeks to derive the best possible value and return on investment from assets across their lifecycle.177 According to the standards, the approach is ‘vital for organisations that are dependent on the function and performance of their physical assets in the delivery of services or products, and where the success of an organisation is significantly influenced by the stewardship of its assets’.178

5.57 A briefing to DMO Division Heads in September 2011 summarised the shortcomings of Defence’s approach to sustainment in the absence of an asset management approach:

  • The absence of an asset management approach within Defence made it ‘difficult to piece together a holistic asset management view across the myriad individual policies’ which focused on discrete processes.
  • Reporting and compliance was ‘devoted to “soft” governance rather than to “hard” asset management decisions, based on sound financials, across the whole capability lifecycle’.
  • Defence had multiple repositories and versions of asset management information, making it hard to assess whether Defence was aligned to best practice.
  • Information sharing about asset management was inconsistent and fractured.

5.58 In 2011, DMO set about developing a Sustainment Business Model drawing on generically applicable asset management principles.179 There is little evidence of substantial progress on the project over the following years. In 2013, however, a further report by external consultants stated that Defence needed an integrated, whole-of-life approach to sustainment and that Defence could realise significant benefits by modelling the management of its assets on the approach taken by commercial organisations—an asset management approach.

5.59 Defence paid $5.5 million over four years (2011–12 to 2014–15) for consultancy services related to its sustainment business model. It also incurred related staff costs of $3.1 million over six financial years (2011–12 to 2016–17). Defence informed the ANAO that it had used this work to reform Maritime asset management, inform its submissions to the First Principles Review, underpin the Systems Program Office reform guide and develop its Systems Program Office review methodology.

5.60 In May 2015, DMO released a sustainment policy stating that Defence assets would be managed in accordance with the international standard for asset management. However, a September 2015 assessment by external consultants of Capability Acquisition and Sustainment Group policy and guidance against the requirements of the Asset Management Standards concluded180:

While the organisation can demonstrate clear leadership and commitment to delivering and sustaining asset capabilities that meet its strategic business and operational intent, its ability to uniformly translate this intent into actions that demonstrate the consistent delivery of the business outcomes from its assets, is hindered by incomplete process documentation and [the absence of] an integrated Business Management System for its assets.181

5.61 The report noted that ‘it is impossible to compare different operational business processes within the organisation, in order to determine which process is the most effective for the entire organisation, until the organisation has determined the base policies (rules) by which it will conduct the comparisons’.182

5.62 The same view was reflected in advice received by Capability Acquisition and Sustainment Group from the Chief Defence Scientist, also in September 2015. The Defence Science and Technology Group’s Aerospace Division had concluded that a systematic and top-down approach such as enterprise-wide asset management was particularly suitable to address the challenges of reforming sustainment management in Capability Acquisition and Sustainment Group. The Defence Science and Technology Group had been helping individual Systems Program Offices with asset management but believed its real potential was at the enterprise level. It was therefore important to approach this from a Capability Acquisition and Sustainment Group or Defence perspective.183

5.63 The Sustainment Business Model project has not delivered all that DMO expected. In October 2015, some four years after the Sustainment Business Model project started, Defence’s First Principles Review Implementation Team decided that work on developing the Sustainment Business Model, along with other reform within Capability Acquisition and Sustainment Group at that time, would cease and the ‘relevant elements’ were to be incorporated into the new Capability Life Cycle initiative. The documentation does not specify which elements remained relevant.

5.64 The current Capability Life Cycle manual and its detailed design document do not mention a business model for sustainment nor any overarching sustainment framework.

Has Defence developed an evaluation plan for the reform of Capability Acquisition and Sustainment Group?

Defence has not put in place plans to evaluate either the reforms themselves or its implementation of them. There is a risk that insights into a very substantial reform process could be lost. This was the case with the earlier Smart Sustainment reforms. The ANAO has recommended that Defence develop and implement an evaluation plan.

5.65 The audit found limited evidence that Defence was preparing to evaluate the reforms to Capability Acquisition and Sustainment Group, its execution of the reforms or their outcome in terms of improvements to either acquisition or sustainment. Current proposals appear to be limited to an ‘independent health check’ on progress.184 On the face of it, the scope of the health check is narrow, focused on closing recommendations and assessing progress. Given the substantial resources involved, both in the expenditure on sustainment and the reform process, Defence would benefit from assessing all aspects of these changes carefully.

Recommendation no.2

5.66 The ANAO recommends that Defence develop and implement an evaluation plan to assess the implementation of the recommendations of the First Principles Review.

Entity response: Agreed

5.67 Defence already has an evaluation process developed that will be implemented by the end of 2016–17.

Appendices

Appendix 1 Defence’s Systems Program Offices

Table A.1: Key data for each of Defence’s Systems Program Offices: staff, assets and major contracts

| Systems Program Office (a) | No. of ADF staff (b) | No. of APS staff (b) | No. of contract staff (b) | Total staff | Assets under sustainment / consumables supplied | No. of sustainment contracts with a value greater than $1 million | Total value of contracts with a value greater than $1 million ($m) |
|---|---|---|---|---|---|---|---|
| Maritime domain (17 Systems Program Offices) |  |  |  |  |  |  |  |
| DDG [Guided Missile Destroyer] | 9 | 3 | 0 | 12 | Air Warfare Destroyer | 1 | 62 |
| Amphibious and Afloat Support | 18 | 33 | 0 | 51 | Amphibious fleet (excluding LHD) and supply ships | 5 | 165 |
| ANZAC | 43 | 38 | 1 | 81 | ANZAC class frigates | 1 | 774 |
| FFG [Guided Missile Frigate] | 19 | 37 | 2 | 56 | Adelaide class frigates | 3 | 306 |
| Directorate, Maritime Operational Support | 2 | 8 | 0 | 10 | Maritime targets and ranges and Dock Management at Garden Island | 7 | 110 |
| Landing Helicopter Dock | 21 | 16 | 1 | 37 | Canberra class Amphibious Assault Ship | 5 | 373 |
| Maritime Cross Platform | 9 | 100 | 6 | 109 | Common logistics items | 7 | 332 (c) |
| Hydrographic | 12 | 22 | 12 | 34 | Hydrographic fleet, minor fleet support vessels and Army amphibious fleet | 1 | 80 |
| Mine Warfare and Clearance Diving | 19 | 29 | 13 | 48 | Mine hunting and clearance diving materiel | 5 | 84 |
| Patrol Boat | 14 | 21 | 11 | 35 | Armidale class | 2 | 718 |
| Pacific Patrol Boat | 1 | 4 | 0 | 5 | Pacific Patrol Boats | 3 | 117 |
| Landing Helicopter Dock Project Office | 10 | 62 | 0 | 74 | None – acquisition office only | 0 | 0 |
| Submarine | 43 | 164 | 80 | 273 | Collins Class Weapons System | 4 | 1262 |
| SEA 5000 [Future Frigate] Project Office | 22 | 79 | 70 | 102 | None – acquisition office only | 0 | 0 |
| Specialist Ships Acquisition | 3 | 39 | 31 | 43 | None – acquisition office only | 0 | 0 |
| Air Warfare Destroyer program | 32 | 26 | 10 | 58 | None – acquisition office only | 0 | 0 |
| Boats, Upgrades & Infrastructure | 2 | 27 | 10 | 29 | None – acquisition office only | 0 | 0 |
| Land domain (12 Systems Program Offices) |  |  |  |  |  |  |  |
| Armament | 6 | 41 | 2 | 47 | Small, medium and heavy arms, and surveillance systems | 10 | 44 |
| Clothing | 7 | 36 | 0 | 43 | All clothing and apparel for all ADF elements | 11 | 25 |
| Health | 1 | 38 | 0 | 39 | Health materiel and combat rations for all ADF elements | 9 | 26 |
| Soldier Modernisation | 13 | 47 | 11 | 60 | Chemical, Biological, Radiological, Nuclear and Explosive Equipment; Personal Field Equipment; Aerial Delivery and Adventurous Training Equipment; Combat Protective Equipment; and Special Forces equipment and needs | 12 | 23 |
| Combat Support | 9 | 44 | 6 | 53 | Indirect fire support, radar and ground based air defence systems and associated simulation systems | 37 | 516 |
| Combat Support Vehicle | 9 | 50 | 27 | 59 | Combat Support Vehicles, including Fire Vehicles, C and D Vehicles, Bulk Liquid Distribution equipment, Special Forces Vehicles and Engineer Equipment | 7 | 17 |
| General Support | 2 | 27 | 0 | 29 | Management of vehicle maintenance and support equipment, electrical systems and general stores | 11 | 19 |
| Mounted Combat | 8 | 47 | 2 | 55 | Acquisition and through life support for armoured fighting vehicles, including ASLAV, M1A1 Main Battle Tank, M88 Recovery Vehicle, M113 Armoured Personnel Carrier and the Protected Mobility Vehicle (Bushmaster), and associated equipment | 34 | 324 |
| Commercial and General Service Vehicle | 3 | 46 | 0 | 49 | ADF’s G-Wagon fleet, the commercial vehicle fleet and the B-vehicle fleet | 23 | 220 |
| LAND 121 PH 3B Project Office [Project Overlander – Medium Heavy Capability, Field Vehicles, Modules and Trailers] | 4 | 21 | 15 | 25 | None – acquisition office only | 0 | 0 |
| LAND 121 PH 4 Project Office [Protected Mobility Vehicles (Light) – Hawkei] | 3 | 14 | 8 | 17 | None – acquisition office only | 0 | 0 |
| LAND 400 [Combat Reconnaissance Vehicles, Infantry Fighting Vehicles, Manoeuvre Support Vehicles, and an Integrated Training System] | 14 | 20 | 3 | 34 | None – acquisition office only | 0 | 0 |
| Air domain (19 Systems Program Offices) |  |  |  |  |  |  |  |
| Air Combat & Electronic Attack | 45 | 24 | 16 | 69 | F18 Super Hornet and Growler | 20 | 7002 |
| Tactical Fighter | 69 | 101 | 36 | 171 | F18 Classic Hornet and Lead In Fighter | 34 | 1542 |
| Airborne Early Warning & Control | 35 | 20 | 7 | 55 | Wedgetail AEWC | 21 | 8904 |
| Aircrew Training Project Office | 19 | 10 | 1 | 29 | None – acquisition office only | 0 | 0 |
| Maritime Patrol | 79 | 65 | 0 | 144 | P3 Orion, P8 Poseidon, Heron UAV, and Aeronautical Life Support Equipment | 14 | 1023 |
| Training Aircraft | 30 | 32 | 5 | 62 | PC9/A, B300 and B300 ILT aircraft | 8 | 1447 |
| Aerospace Materiel | 40 | 85 | 0 | 125 | Ground support equipment, aircraft common spares, aerospace support and test equipment, and technical data and publications | 4 | 151 |
| Air Lift | 61 | 60 | 0 | 121 | C27J and C130J aircraft, gas turbines | 7 | 682 |
| Heavy Air Lift | 32 | 30 | 2 | 62 | C17 and KC-30 aircraft | 7 | 868 |
| AIR7000Ph2B | 36 | 20 | 0 | 56 | None – acquisition office only | 0 | 0 |
| Air Mobility & Tanker Program Office (AMTPO) | 42 | 25 | 0 | 71 | None – acquisition office only | 0 | 0 |
| Directorate Aerospace Simulators & Special Aircraft (DASSPA) | 4 | 15 | 0 | 19 | Simulators and VIP aircraft | 6 | 336 |
| Army Aviation | 43 | 59 | 57 | 102 | Tiger (ARH), Taipan (MRH-90), Blackhawk and Kiowa helicopters | 5 | 3477 |
| Cargo Helicopter & Unmanned Surveillance Project Office | 29 | 31 | 14 | 60 | Chinook and Unmanned Aerial Vehicle aircraft | 1 | 16 |
| MH60R [Romeo Helicopters] Project Office | 12 | 15 | 1 | 27 | None – acquisition office only | 0 | 0 |
| Navy Aviation Systems | 34 | 97 | 4 | 131 | Squirrel and Seahawk helicopters and aerial targets | 13 | 2075 |
| Helicopter Aircrew Training System Project Office | 6 | 8 | 8 | 14 | None – acquisition office only | 0 | 0 |
| Multirole Helicopter [MRH-90] Project Office | 11 | 15 | 13 | 26 | None – acquisition office only | 0 | 0 |
| Joint Strike Fighter Project Office | 56 | 37 | 54 | 93 | Joint Strike Fighter | 2 | 59 |
| Joint domain (16 Systems Program Offices) |  |  |  |  |  |  |  |
| Civil Military Air Traffic | 6 | 2 | 31 | 8 | None – acquisition office only | 0 | 0 |
| Surveillance and Control | 60 | 95 | 15 | 155 | Surveillance and control capability | 10 | 414 |
| Wide Area and Space Surveillance | 20 | 31 | 29 | 51 | Wide area and space surveillance capability | 12 | 537 |
| Information Assurance | 4 | 28 | 0 | 32 | Information assurance capability | 4 | 40 |
| Maritime Strategic Communications | 6 | 45 | 0 | 51 | Maritime strategic communications capability | 5 | 419 |
| Satellite Communications | 7 | 62 | 6 | 69 | Satellite communications capability | 7 | 144 |
| Battlespace Communications | 33 | 86 | 0 | 119 | Battlespace communications capability | 4 | 138 |
| Joint C4I [communications, command, control, computers and intelligence] | 13 | 49 | 0 | 62 | Joint communications, command, control, computers and intelligence capability | 14 | 311 |
| Land C3 [communications, command and control] | 16 | 50 | 0 | 66 | Land communications, command and control capability | 6 | 64 |
| Maritime C2 [command and control] | 6 | 34 | 0 | 40 | Maritime command, control, and communications capability | 2 | 26 |
| Joint & Air Intelligence, Surveillance, Reconnaissance and Electronic Warfare | 25 | 51 | 4 | 76 | Joint & air intelligence, surveillance, reconnaissance and electronic warfare capability | 12 | 213 |
| Land Intelligence, Surveillance, Reconnaissance and Electronic Warfare | 9 | 56 | 0 | 65 | Land intelligence, surveillance, reconnaissance and electronic warfare capability | 1 | 10 |
| Maritime Intelligence, Surveillance, Reconnaissance & Electronic Warfare | 10 | 39 | 24 | 49 | Maritime intelligence, surveillance, reconnaissance and electronic warfare capability | 5 | 49 |
| Aeronautical Explosive Ordnance | 41 | 44 | 15 | 85 | Aeronautical explosive ordnance | 41 | 553 |
| Maritime Explosive Ordnance | 19 | 157 | 21 | 176 | Maritime explosive ordnance | 42 | 577 |
| Land Explosive Ordnance | 9 | 51 | 43 | 60 | Land explosive ordnance | 50 | 580 |
| TOTAL (all 64 Systems Program Offices) | 1 325 | 2 768 | 727 | 4 168 |  | 555 | 36 922 |

Note a: Data is current as at 24 February 2017. The number of contract staff is as advised on 2 June 2017.

Note b: Staff numbers are rounded to the nearest whole number.

Note c: Defence advised the ANAO that this amount ‘includes elements considered “acquisition” due to the impossibility of separating these items in this context’. SPOs with a preponderance of contractor activity in acquisition include Submarines and those SPOs designated ‘Project Offices’.

Source: Department of Defence.

Appendix 2 Navy core Key Performance and Health Indicators for sustainment

Table A.2: Navy Materiel Sustainment Agreement Performance Framework – Core Key Performance Indicators

| Key Performance Indicator | Name | Description | Measurement interval |
|---|---|---|---|
| K 1 (1) | Monthly Materiel Ready Days | An output measure, which directly reflects achievement against the Materiel Availability outcome. The Materiel Ready Days measure is applicable to platform-related products only. Materiel Ready Days achievement, measured as a percentage achieved against planned, reflects the reliability of the Mission System and the Support System. | Monthly |
| K 1 (2) | Service Level Achievement | An output measure, which directly reflects achievement against the Materiel Availability outcome. This measure is applicable to service/commodity products only and assesses achievement against the agreed service levels contained within the Requirements element of the Product Schedule. | Monthly |
| K 1 (3) | Rate of Effort / Aircraft Availability Achievement | An output measure, which directly reflects achievement against the Materiel Availability outcome. This measure is applicable to aviation-related products only. Measured as a percentage achieved against planned, it reflects the reliability of the Mission System and the Support System. | Monthly |
| K 2 | Achievement of External Maintenance Period Planning Milestones | An output measure, which directly reflects achievement against the Materiel Availability outcome. Failure to achieve planning milestones has been identified as a key constraint and has a direct impact on the ability of Capability Acquisition and Sustainment Group to deliver platforms out of maintenance periods on time. | Monthly |
| K 4 | Conformance to Operating Intent | An output measure, which directly contributes to the Materiel Availability and Materiel Confidence outcomes. There is a proven link between lack of conformance to operating intent/requirement and reduced platform life and reduced Sustainment Efficiency. This Key Performance Indicator assesses conformance to product operation within the Statement of Operating Intent and adherence to cyclical maintenance schedules. | Monthly |
| K 5 | Price Reliability | A group of measures (monthly and year-to-date) which assess the reliability of price prediction by Capability Acquisition and Sustainment Group, represented as a measure of achievement against in-year phasings. It is an output measure, which directly reflects achievement against the Sustainment Efficiency outcome. These measures track expenditure against phasing and are a key indicator of the quality of Systems Program Office planning and costing processes across the financial year. | Monthly |

Source: Defence.

Table A.3: Navy Materiel Sustainment Agreement Performance Framework - Core Key Health Indicators

| Key Health Indicator | Name | Description | Measurement interval |
|---|---|---|---|
| H 1 | Positions Filled | The availability of appropriately skilled, qualified and authorised staff (Navy and civilian) to fill positions in the Systems Program Office and Capability Manager Representative organisations is a key contributor to all three of the Materiel Sustainment Agreement outcomes and a key constraint to long-term Materiel Confidence. | Monthly |
| H 2 | Funding Adequacy over the Defence Management Finance Plan | The lack of funds to deliver necessary Mission Support outcomes across the DMFP will result in increased risk to Materiel Confidence in outer years. Collateral impact will be transferred to Sustainment Efficiency (expedient choice making) and Materiel Availability (poor supply chain or maintenance output). | Annually, at end of May each year |
| H 3 | Raised / Open Priority 1 / Priority 2 Maintenance Deficiency Reports (Urgent Defects) | The existence of open Priority 1 / Priority 2 defects has a direct influence on the achievement of Materiel Ready Days (Materiel Availability outcome) due to reduced seaworthy materiel. High levels of unplanned maintenance will affect Sustainment Efficiency through uncontrolled cost of repairs and a probable increase in cannibalisations. | Monthly |
| H 4 | Achieved Demand Satisfaction | Demand Satisfaction Rate measures the effectiveness of the supply chain. Poor Demand Satisfaction outcomes will ordinarily be reflected in an increase in Urgency of Need Designator A and B demands, or directly in Priority 1 stores-related Urgent Defects. Primary impact will be on achievement of Materiel Ready Days (Materiel Availability outcome), with secondary impact upon the Sustainment Efficiency outcome as alternate supply chains or cannibalisation will be required. | Monthly |
| H 5 | External Maintenance Period Demand Fill Rate | Ongoing failure to provide spares in full to support External Maintenance Period demands will drive deferral of maintenance, or partial completion of external maintenance, resulting in a direct impact on Completion of Maintenance to Plan and an increasing likelihood of system failure resulting in open Priority 1 Urgent Defects. Failure of this Key Health Indicator will affect the achievement of all three of the Materiel Sustainment Agreement outcomes. | Monthly |
| H 6 | NAVALLOW (a) Configuration Effectiveness | An indicator of the accuracy of the NAVALLOW configuration record for each platform. Deficiencies in the configuration record indicate potential configuration variance and will impact the availability of necessary inventory for planned and corrective maintenance tasks, resulting in increased risk to achieving Materiel Ready Days (Materiel Availability outcome). | Monthly |
| H 7 | Cannibalisations (number) | Measures the number of occurrences where cannibalisation was approved by the Capability Manager Representative. Cannibalisation reflects an inability of the supply chain to support requirements, but is not an indicator of supply chain performance alone, as the requirement may be the result of induced failure from a number of domains. Consequently, while cannibalisations are undertaken to avoid failure against Materiel Ready Days (Materiel Availability outcome), their true impact is on Materiel Confidence, with secondary impact upon Sustainment Efficiency outcomes. | Monthly |
| H 8 | Obsolescence Liability | This measure seeks to quantify the degree to which the supply chain is constrained by obsolescence of systems or components. Obsolescence will result in increased Urgent Defects, cannibalisations and reduced Demand Satisfaction Rates. Accordingly, this Key Health Indicator has a primary impact upon Materiel Confidence, with a secondary, often more obvious, impact on Materiel Availability. | Monthly |
| Maintenance |  |  |  |
| H 9 | External Maintenance Period Activity Cost Growth | This Key Health Indicator measures the cost growth associated with individual External Maintenance Periods. Poor planning, poor compliance with External Maintenance Period Planning Milestones, transference of organic maintenance tasks to External Maintenance Periods due to poor organic maintenance completion rates, or failure to undertake a reliable Pre Refit Condition Assessment will result in increased External Maintenance Period cost. The primary impact of poor outcomes in this Key Health Indicator is on the Sustainment Efficiency outcome. | Monthly |
| H 10 | Organic Maintenance Backlog | Maintenance Completion (tasks) to Plan achievement is an output measure, which directly reflects achievement against the Materiel Confidence outcome. It is both a lead and a lag indicator, which has a direct correlation to decreased near-term and future unit availability and increased corrective maintenance and Priority 1 Maintenance Deficiency Reports (Urgent Defects). | Monthly |
| H 11 | External Maintenance Backlog | Maintenance Completion (tasks) to Plan is an output measure, which directly reflects achievement against the Materiel Confidence outcome. It is both a lead and a lag indicator, which has a direct correlation to decreased near-term and future unit availability and increased corrective maintenance and Priority 1 Urgent Defects. | Monthly |
| H 12 | External Maintenance Effectiveness | Defects post Maintenance Period. This Key Health Indicator serves to measure the quality and appropriateness of the work undertaken during maintenance periods. Due to its nature, it seeks to measure disruption post External Maintenance Period and the consequential impact on Materiel Ready Days (Materiel Availability outcome). Poor outcomes in this Key Health Indicator will also have an impact upon the Sustainment Efficiency outcome, as rework will require additional funding and stores to remedy. | Monthly |
| Engineering |  |  |  |
| H 13 | Open Variations (number) | This Key Health Indicator relates to the ability of Capability Acquisition and Sustainment Group to manage temporary variations to the platform’s configuration. Poor outcomes in this Key Health Indicator result in primary risk to Materiel Confidence outcomes (supportability) and secondary risk to achieving Materiel Availability. | Monthly |
| H 14 | Open ECP (number) | This Key Health Indicator relates to Engineering Change Proposal actions that are yet to be finalised – either raised and not actioned, or incomplete. While poor outcomes in this Key Health Indicator represent risk across all Materiel Sustainment Agreement outcomes, these risks are more acute in relation to Materiel Confidence and Materiel Availability. Increased duration for Engineering Change Proposals also presents latent risk to Sustainment Efficiency outcomes due to a potential increase in cannibalisations and the requirement to fund future Engineering Change Proposals in out years. | Monthly |
| H 15 | Configuration Baseline | Configuration baseline accuracy is an output measure, which directly reflects achievement towards the Materiel Confidence outcome. The ability to maintain, change and control the configuration baseline of the product is key to ensuring long-term supportability and the health of maintenance and supply support outcomes. | Quarterly |

Note a: Navy Allowance (NAVALLOW) is the Navy logistics system.

Source: Defence.

Appendix 3 Reporting on the Armed Reconnaissance (Tiger) helicopter

The performance of the Armed Reconnaissance (Tiger) Helicopter is reported thus in the Quarterly Performance Report for April – June 2016, produced in August 2016:

The Final Acceptance Milestone under the Acquisition Contract with Airbus Group Australia Pacific (formerly Australian Aerospace) was declared achieved on 28 May 2013, approximately five years later than originally contracted. All 22 Tiger helicopters, simulators, ground support systems and contracted training devices have been accepted and introduced into service within project budget approvals. Final Operational Capability was declared with caveats by Chief of Army (CA) on 14 April 2016.

The Final Materiel Release (FMR) Report Approval Certificate has been signed by all stakeholders with caveats. The red traffic light for FMR is due to the late achievement against the MAA milestone. The last item of Support & Test Equipment requires codification before it can be formally accepted by the Commonwealth of Australia (CoA). Late delivery has minimal impact on the overall development of the Armed Reconnaissance Helicopter (ARH) capability.

[Materiel Acquisition Agreement] Closure and accompanying Product Schedule Change Proposal (PdCP) to transfer the delivery of the Deployable Aircraft Maintenance Rig capability to CA12 completed and awaiting Assets Under Control (AUC) balance adjustments before paperwork can be processed for signature.

The final Contract Acceptance milestone was achieved on 28 May 2013 within the approved acquisition budget envelope.

Final Operational Capability was declared in April 2016 and ministerial advice was noted on 3 May 2016. Closure of the Materiel Acquisition Agreement and project closure will be completed during 2016.

The Tiger fleet achieved a total of 312.0 flying hours in June against a planned 698 hours. This month closes out the Financial Year total at 3995.2 against the planned 5848 hours, a result of 68.32 percent. Improvements are still being seen in Rate of Effort (ROE) generation, since the implementation of the renegotiated ARH Sustainment contract arrangement. However, even with higher rates of availability, improved fleet serviceability and Repairable Item (RI) turnaround time, flying hours are still lower than expected.

Tiger is not currently deployed on operations.

Tiger sustainment is continuing to mature. Performance has been effective but unstable during the recent high demand collective training exercise series period that culminated in Exercise Hamel. This high tempo period of exercises has tested the robustness of the logistics system and deficiencies have been identified. Poor product reliability has contributed to weak serviceability at the 1st Aviation Regiment in Darwin.
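The rate-of-effort result quoted above is simple arithmetic: achieved flying hours expressed as a percentage of planned hours. A minimal sketch of the calculation, using the figures reported for 2015–16 (the function name is illustrative, not drawn from any Defence system):

```python
def rate_of_effort_achievement(achieved_hours: float, planned_hours: float) -> float:
    """Flying hours achieved, expressed as a percentage of planned hours."""
    return achieved_hours / planned_hours * 100

# Figures quoted in the April-June 2016 Quarterly Performance Report.
year_result = rate_of_effort_achievement(3995.2, 5848)  # ~68.32 per cent
june_result = rate_of_effort_achievement(312.0, 698)    # ~44.70 per cent
```

The full-year figure reproduces the 68.32 per cent result reported above; the June figure shows how far below plan that single month fell.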

The availability of the aircraft is either shown as ‘Green’ (acceptable performance) or ‘Amber’ (early signs of underperformance), depending on the part of the report examined.

The project has spent $133.966 million on sustainment in 2015–16 against a budget for the year of $133.679 million, and $1867 million of an acquisition budget of $2033 million.

In the Quarterly Performance Report, Defence does not identify the Tiger as an under-performing acquisition or sustainment project, even though the other sources suggest just that. For example, those sources state:

  • that the aircraft has not met, and is unlikely ever to meet, the required level of capability expected by government. It was not expected to meet key sustainment metrics such as Reliability/Availability/Maintainability, Rate of Effort and cost of ownership, and should be replaced;
  • the Chief of Army has placed caveats on its use, effectively limiting what the aircraft can be used for and where it can be used;
  • Army did not deploy the helicopter in its engagements in Afghanistan or Iraq;
  • the program has consistently underperformed;
  • on average only 3.5 aircraft of the operational fleet of 16 aircraft were serviceable at 10 am on any given day in 2015, against a requirement of 12 aircraft;
  • availability for the aircraft is predominantly limited by premature component level failures and is constrained by inadequate sustainment arrangements185;
  • despite the lower than expected hours flown, by June 2014 sustainment costs for the Tiger had exceeded the original sustainment contract (2004–19) value of $571 million, with five years still remaining; and sustainment costs to June 2016 total $921 million;
  • in June 2016, the cost per flying hour for the Tiger fleet was $30 335, compared to a target of $20 000 (the long-term average in June 2016 was $39 472 per flying hour); and
  • the project was approved in March 1999 with a budget of $1.58 billion (December 1999 prices). Since 2001, the budget has increased by $418.2 million from price index variation, and $121.5 million from exchange rate variation. The current approved budget is $2.03 billion.
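The cost-per-flying-hour figure cited above is likewise a unit-cost ratio: sustainment expenditure divided by hours flown, compared against a target. A hedged sketch with illustrative inputs (the dollar and hour values below are hypothetical, chosen only to reproduce the reported $30 335 rate, and are not drawn from the audit):

```python
def cost_per_flying_hour(sustainment_cost_dollars: float, hours_flown: float) -> float:
    """Unit cost of the capability: sustainment spend divided by flying hours."""
    return sustainment_cost_dollars / hours_flown

TARGET = 20_000  # per-flying-hour target cited above

# Hypothetical month: $9.46m of sustainment spend supporting 312 flying hours.
unit_cost = cost_per_flying_hour(9_464_520, 312)  # 30335.0 dollars per hour
exceeds_target = unit_cost > TARGET               # True: above the $20,000 target
```

Low hours flown inflate this metric even when absolute spend is stable, which is why the long-term average ($39 472 per hour) sits well above the in-month figure.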

The 2016 Houston Review also found that, even when the Australian Tiger program reaches its full potential, it will not provide the robust, reliable and sustainable aviation reconnaissance and attack capability that Army requires.

Appendix 4 Sustainment activity public reporting 2015–16

Defence Portfolio Budget Statements 2015–16

  • Table 6: Capability Sustainment Program lists estimates for nine elements of Defence’s Capability Sustainment Program (Navy Sustainment, Army Sustainment and so on) over the four financial years of the forward estimates period.186 This table also includes estimates for items other than sustainment of military equipment.
  • Table 94: Top 30 Sustainment Products by End of Financial Year Outcome 2015–16 sets out a list of the Top 30 sustainment products and the estimate, at Budget time, of expenditure on sustainment of each of those products over the 2015–16 financial year.187 There is no reconciliation of the ‘total sustainment product funds available’ figure reported in this table with the ‘total sustainment’ estimate in Table 6 (discussed above).
  • Under the heading ‘Top 30 Sustainment Product Descriptions’, Defence provides a description for each of the Top 30 sustainment products. Generally, this comprises a description of the asset and an account of the focus of Defence’s sustainment effort during the prospective year. This is potentially useful information, though not always written in clear language. For example, the description for the Canberra Class ships states that during 2015–16, ‘the focus will be on embedding and fostering the critical enabling functions for the initial period of LHD and LLC operations’.188 This conveys very little insight into what Defence is planning to achieve.
  • The PBS includes other occasional references to sustainment. For example, a note to Table 21: Army Deliverables, which in relation to the Army’s Armed Reconnaissance (Tiger) Helicopter, states ‘while Tiger reliability, availability and maintainability continue to improve, rate of effort is affected by the low rate of availability of aircraft with fully serviceable mission equipment due to poor but improving contracted sustainment support and the requirement for unscheduled maintenance’.189
  • The PBS also contains broader commitments on sustainment management, for example: ‘the DMO will continue the implementation of a more standardised Sustainment Model to promote better and more consistent practice’.190

Defence Portfolio Additional Estimates 2015–16

  • Table 10: Capability Sustainment Program corresponds to Table 6 in the PBS (see above). This table lists, for each of the nine elements of Defence’s Capability Sustainment Program: the 2015–16 budget estimate (as presented in the 2015–16 PBS); a revised estimate for 2015–16; and the variation.191
  • Table 77: Top 30 Sustainment Products by End of Financial Year Outcome 2015–16 corresponds to Table 94 in the PBS (see above). This table lists the same Top 30 sustainment products presented in Table 94 of the PBS, in the same order. For each sustainment product, the table includes: the budget estimate for 2015–16 (as presented in the 2015–16 PBS); a revised estimate for 2015–16; the variation; and an explanation of the variation plus a ‘project performance update’. Reasons for the variations in 2015–16 estimates vary, but changes in exchange rates and increased effort to support operations are among the more common. A notable feature of this particular table is that financial data and information about performance are contained in the same place, whereas they are presented in separate tables in both the PBS and Annual Report (see below).192

Defence Annual Report 2015–16

The Defence Annual Report 2015–16 provides information relating to the PBS and PAES tables, referred to above, in online web tables. The web tables are accessible through the online version of the Annual Report on Defence’s website, specifically at the end of Chapter 8. The two most relevant tables for sustainment reporting are Web Table 8.11: Top 30 sustainment products by expenditure, as forecast in the Portfolio Budget Statements 2015–16, and Web Table 8.2: Top 30 sustainment products, performance summary as at 30 June 2016.193

  • Web Table 8.11 includes a list of the same Top 30 sustainment products reported on in the PBS and PAES. However, they are in a different order, which may reflect changes in organisation following the delisting of DMO. For example, the category listed in the PBS ‘Helicopters, Tactical Unmanned Aerial Systems and Guided Weapons’ no longer appears and the items are distributed among other categories. The table reports, for each Top 30 sustainment product: the budget estimate for 2015–16 (as reported in the PBS); the revised estimate for 2015–16 (as reported in the PAES); the actual expenditure for 2015–16; and the variation against the revised estimate. The table also reports the budget estimate for 2015–16, revised estimate for 2015–16, actual expenditure for 2015–16, and the variation against the revised estimate for ‘other approved sustainment products’ and ‘support to operations’.194 The information reported in Web Table 8.11 is reproduced in Table A.4 below.
  • Web Table 8.2 provides a one-paragraph ‘performance summary’ by product for each of the same Top 30 sustainment products.195 In contrast to Web Table 8.11, Web Table 8.2 treats each product in the same order as the PBS and PAES.
  • The 2015–16 Annual Report does not report the actual expenditure for the nine categories of estimated sustainment expenditure reported in Table 6 of the PBS and Table 10 of the PAES. Additionally, there is no reconciliation of the ‘total sustainment funding’ amounts reported in Web Table 8.11 of the 2015–16 Defence Annual Report and the Capability Sustainment Program in the PBS and PAES tables discussed above.

Table A.4: Web Table 8.11: Top 30 sustainment products by expenditure, as forecast in the Portfolio Budget Statements 2015–16

| Product name | Budget estimate 2015–16 $m | Revised estimate 2015–16 $m | Actual expenditure 2015–16 $m | Variation (under)/over revised estimate $m |
|---|---|---|---|---|
| Aerospace |  |  |  |  |
| F/A-18A/B Classic Hornet Weapon System | 243 | 248 | 228 | (20) |
| E-7A Airborne Early Warning and Control Capability System | 214 | 218 | 202 | (16) |
| Multi-Role Helicopter Weapon System | 161 | 197 | 173 | (24) |
| F/A-18F Super Hornet Weapon System | 180 | 184 | 211 | 27 |
| Armed Reconnaissance Helicopter System | 119 | 135 | 133 | (2) |
| C-130J-30 Weapon System | 125 | 129 | 135 | 6 |
| AP-3C Orion Weapons System (Electronic Warfare) | 120 | 121 | 119 | (2) |
| MH-60R Seahawk Romeo Weapon System | 97 | 94 | 56 | (38) |
| Lead-in Fighter Hawk Weapon System | 89 | 91 | 106 | 15 |
| C-17 Heavy Air Lift Weapons System | 79 | 82 | 71 | (11) |
| KC-30A Weapon System | 67 | 68 | 63 | (5) |
| PC-9/A Weapon System | 51 | 51 | 57 | 6 |
| S-70B-2 Seahawk Weapon System | 52 | 51 | 44 | (7) |
| Special Purpose Aircraft | 50 | 51 | 51 | 0 |
| Joint |  |  |  |  |
| Navy Munitions, Army Munitions, Explosive Ordnance | 304 | 306 | 288 | (18) |
| Navy, Army and Air Force Guided Weapons | 201 | 115 | 150 | 35 |
| Wide Area Surveillance Capability and Air Force Minor Projects | 101 | 102 | 102 | 0 |
| Command and Intelligence Systems Software Applications | 54 | 57 | 81 | 24 |
| Battlespace Communications Systems | 55 | 45 | 36 | (9) |
| ADF Tactical Electronic Warfare Fleet | 51 | 31 | 30 | (1) |
| Land |  |  |  |  |
| Commercial Vehicle Fleet | 68 | 82 | 82 | 0 |
| General Service B Vehicle Fleet | 61 | 64 | 73 | 9 |
| ADF Clothing | 79 | 59 | 67 | 8 |
| Health Systems Fleet | 53 | 54 | 56 | 2 |
| Maritime |  |  |  |  |
| Collins Class Submarines | 521 | 523 | 513 | (10) |
| Anzac Class Frigate | 343 | 379 | 417 | 38 |
| Adelaide Class Frigate | 138 | 148 | 132 | (16) |
| Canberra Class Landing Helicopter Dock | 92 | 90 | 81 | (9) |
| Auxiliary Oiler Replenishment HMAS Success | 75 | 83 | 89 | 6 |
| Huon Class Mine Hunter Coastal | 53 | 61 | 60 | (1) |
| Total – top 30 products | 3,894 | 3,922 | 3,906 | (16) |
| Other approved sustainment products | 1,288 | 1,429 | 1,461 | 32 |
| Total sustainment product funds available | 5,182 | 5,350 | 5,367 | 17 |
| Support to operations | 357 | 257 | 210 | (47) |
| Total sustainment and operations funding | 5,539 | 5,608 | 5,577 | (31) |

Source: Department of Defence, Annual Report 2015–16, Web Table 8.11 available from <http://www.defence.gov.au/annualreports/15-16/Chapters/chapter-8.asp> [accessed 20 April 2017].
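The internal arithmetic of Table A.4 above can be checked mechanically: each variation is actual expenditure minus the revised estimate, and the subtotal rows sum the rows above them. A minimal sketch using the totals rows of the table (figures in $m; the revised-estimate column does not sum exactly because of rounding, so only the budget and actual columns are checked):

```python
# Totals rows from Table A.4, as (budget, revised, actual) in $m.
top_30         = (3894, 3922, 3906)
other_products = (1288, 1429, 1461)
product_funds  = (5182, 5350, 5367)
support_to_ops = (357, 257, 210)
total_funding  = (5539, 5608, 5577)

def variation(row):
    """Variation = actual expenditure minus revised estimate; negative = underspend."""
    _budget, revised, actual = row
    return actual - revised

assert variation(top_30) == -16         # reported as (16)
assert variation(total_funding) == -31  # reported as (31)

# Subtotal consistency in the actual-expenditure column:
assert top_30[2] + other_products[2] == product_funds[2]         # 3906 + 1461 = 5367
assert product_funds[2] + support_to_ops[2] == total_funding[2]  # 5367 + 210 = 5577
```

A check of this kind is essentially the ‘reconciliation’ that the audit notes is absent between the Web Table 8.11 totals and the Capability Sustainment Program figures in the PBS and PAES.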

Appendix 5 Lessons from Coles

Analysis based on 2012 Coles Report (Phase 2 and 3)

1. The 2012 Coles Report identified five root causes of problems with the sustainment of Collins Class Submarines, and made 25 recommendations.196 Based on the ANAO’s analysis, the recommendations with potentially broader application across Defence sustainment are listed below.

2. Unclear sustainment requirements—Defence lacked a clear unclassified top-level operational requirement for the platform. A single defined and realistic requirement is critical to target setting, future planning, and the development of enterprise behaviours.

Recommendations with potential Defence-wide application:

  • Set realistic performance targets and document them in the Materiel Sustainment Agreements.197
  • Define and document a clear, unclassified requirement for the sustainment program.

3. Lack of a performance based ethos—Defence had failed to define performance targets for individuals and organisations and this had been a key driver of inefficiency and sub-optimal behaviours that affected sustainment performance.

Recommendations with potential Defence-wide application:

  • Design and implement sustainment arrangements that encourage performance-based behaviour.
  • Defence needs to be an intelligent customer for sustainment.
  • Create a collaborative framework for the management of sustainment without diluting individuals’ responsibilities. As part of the governance system, create a forum to bring together all suppliers to raise issues and identify opportunities.
  • Develop and implement a contracting strategy to improve performance-based contracting [Management Recommendation].
  • Improve leadership skills, knowledge and experience.

4. Poor planning—Defence lacked a clearly stated long-term strategic plan. This prevented accurate lower-level plans and targets (for example, for maintenance and operations) from being established and achieved. The absence of an asset management strategy resulted in poor obsolescence management and a lack of reliability management and, consequently, a ‘bow wave’ of reliability defects and obsolescence.

Recommendations with potential Defence-wide application:

  • Develop an Asset Management Strategy for sustainment.
  • Performance measures in Materiel Sustainment Agreements must be realistic and must be consistent with the requirements in sustainment performance artefacts such as the sustainment contract.
  • Develop a through-life capability management plan.
  • Define and endorse an Asset Management Plan.

5. Unclear lines of responsibility—Many key sustainment roles and responsibilities were not clearly defined or understood, from both an individual and organisational perspective. This resulted in the various parties failing to act on, or enforce, their responsibilities within the sustainment space.

Recommendations with potential Defence-wide application:

  • Develop clear lines of authority and responsibility for sustainment.
  • Develop and implement a strategy to address skills shortages in sustainment roles.

6. Defence lacked a single set of accurate information to inform decision making—The review found that, whilst significant levels of information and analysis exist across the Collins Class Sustainment Program, the basis for long-term decision making was not always consistent or accurate, with multiple systems and datasets in use for financial, maintenance and supply chain activities. In many cases, these are not linked, resulting in data integrity issues. This affects financial planning, as it makes it extremely difficult to set an agreed baseline position and accurately link expenditure to outputs. Without a single version of the truth from a financial perspective, it is difficult to assess the impacts of variances in funding appropriations on sustainment objectives.

Recommendations with potential Defence-wide application:

  • Develop an enterprise-wide IT strategy and information management strategy.
  • Develop cost baselines, models, and supporting processes for the sustainment program.

Summary of lessons learned about sustainment from previous reviews (as summarised by Coles)

  • Defence should give sustainment much higher attention and priority during the initial phases of the asset’s life cycle (‘Needs’, ‘Requirements’, and ‘Acquisition’). Failure to pay sufficient attention to sustainment early on will increase the cost of ownership to Defence. Retrofitting the necessary sustainment arrangements is more challenging than getting them right at the outset.
  • Defence should base its selection of systems/equipment on systems sourced from credible/proven suppliers.
  • Defence should give proper attention to asset management, logistic support and engineering requirements.
  • The capability management role needs to be resourced appropriately to be able to take the lead roles of the asset owner and the sustainment customer.

Appendix 6 Earlier attempts in Defence to take a systematic approach to asset management

The Mortimer Review (2008)

The effectiveness of the reforms implemented following the 2003 Defence Procurement Review (the Kinnaird Review) was evaluated in 2008 by the Defence Procurement and Sustainment Review (the Mortimer Review).198 Mortimer made specific recommendations to improve sustainment, including that Defence create ‘an independent Sustainment Efficiency Office to measure, benchmark, and find ways to improve the efficient delivery of sustainment’.

This could have provided, for the first time, a systematic approach to the organisation of sustainment activities across the Systems Program Offices. The government agreed to the recommendation but Defence did not implement it.199 By December 2013, some five years later, Defence decided that it would not meet the recommendation’s intent. Defence then rewrote the Mortimer recommendation to read ‘implement a system200 to improve oversight of DMO sustainment performance’.201 Thus Defence took a passive role, focusing on developing a system to monitor Systems Program Offices’ work rather than the more active analysis and management of sustainment across the organisation envisaged by Mortimer.

The Sustainment Complexity Review (2010) found that diversity remained

A separate reform stream began in July 2010, when consultants engaged by DMO found its sustainment business model was strikingly different from those in the private sector. DMO’s approach was ‘extremely variable and inconsistent’, particularly when compared with most commercial and other government organisations:

It has inherited 50+ business models that existed before the DMO was formed in 2000. There is no evidence that specific merger integration activities have been undertaken to ensure consistent integrated operating models. From the perspective of the field review, there is some indication that the sustainment models continue to drift apart.202

The consultants attributed the diversity to ‘historical issues around original parent organisations, [and] individual service influences’. Their report made many recommendations generally aimed at reducing complexity in sustainment management, including greater use of performance-based contracts, a more consistent operating model across sustainment and improving commercial asset management expertise.

Defence attempted to develop a Sustainment Business Model

In 2011, DMO established a project to develop a Sustainment Business Model to address its ‘extremely variable’ approach to sustainment that was ‘inconsistent compared to most commercial and other government organisations’.203 DMO saw such an enterprise-wide approach to the management of sustainment as a critical component of one of the strategic recommendations of the Rizzo Review of naval sustainment, a review which had come about because of inadequate maintenance and sustainment practices.204

Appendix 7 The ‘Smart Buyer’ concept

Origins

The term ‘smart buyer’ is a term of art in public administration, relating to outsourcing, which arose around 1990 following work in the USA by the GAO and the Brookings Institution.205 The key paragraph in an early GAO study states:

The government must have an active role in the decision-making process. Government officials must be the ones applying discretion and making value judgments throughout the process. A key criterion in determining whether consulting contracts would be appropriate is whether the government could maintain sufficient in-house capacity to be thoroughly in control of the policy and management functions of the agency. This includes the capacity to adequately direct, supervise, and monitor contracts. The government must be a “smart buyer” when purchasing assistance services and be able to make independent judgments about policy recommendations.206

Other, related GAO work has used the term in relation to the management of sustainment:

privatizing total support on new and future weapon systems can make it difficult for the organic depots [equivalent to Systems Program Offices] to acquire and sustain technical competence on new systems, leading edge technologies, and critical repair processes necessary to maintain future core capabilities, provide a credible competitive repair source, and be a smart buyer for those logistics activities that will be contracted out.207

Past use in Defence

With a substantial amount of its sustainment work contracted out, Defence has often emphasised its desire to have the internal expertise to be a smart buyer:

  • In a submission to the Joint Committee of Public Accounts and Audit inquiry into contract management in the Australian Public Service in 2000, Defence, through the then Under Secretary for Defence Acquisition—almost a direct counterpart position to the current Deputy Secretary, Capability Acquisition and Sustainment Group—stated that: ‘Defence must see that it has available the technical, financial and managerial expertise to be a smart buyer’.208
  • Similarly, in an international seminar on Defence acquisition in July 2000, an issue identified by Defence at the time was ‘the challenge to be a smart buyer’.
  • The concept was also included in the initial Materiel Sustainment Agreements in 2005 between DMO and the Defence Service Chiefs.
  • Defence refers to itself as a ‘smart buyer’ in the Defence Industry Policy Statements in 2007209 and in 2010.210

Defence’s Defence Science and Technology Group has long been represented as providing the technical and scientific expertise to enable Defence to be a smart buyer. A 2016 Parliamentary inquiry into the capability of Defence’s physical science and engineering workforce noted that:

Several submissions and witnesses emphasised the need for Defence to have a sufficient [science and engineering] workforce to allow it to be a ‘smart’ buyer or an informed customer. For example, Dr Davies211 noted that in his experience, having in-house expertise from public service engineers who could examine bids for work from external firms made him ‘a much smarter buyer’.212

The Committee recommended that Defence commit to maintaining its physical science and engineering workforce capabilities in key areas to allow it to be both a smart buyer and a technically proficient owner of materiel. The Government agreed to the recommendation.

Footnotes

1 See, for example, ANAO Report No.40 2016–17, 2015–16 Major Projects Report.

2 The MPR Guidelines, which are endorsed by the JCPAA, provide that data of a classified nature is to be prepared in such a way as to allow for unclassified publication (see ANAO Report No.40 2016–17, 2015–16 Major Projects Report, paragraph 1.16, p. 463).

3 In February 2017, Defence advised the Joint Committee of Public Accounts and Audit in a submission to the Committee’s Inquiry into Defence Sustainment Expenditure that ‘A recent review by [the] Defence Intelligence Organisation determined that the current public reporting regime is “safe”. In this context any proposal for new reporting requirements will need to consider if the new information might be aggregated to disclose classified information on capability readiness and availability.’

4 Defence advice, March 2017.

5 At the strategic level, guidance is given by the Chief of the Defence Force’s Preparedness Directive.

6 Defence Materiel Handbook (Sustainment Management) DMH (SUS) 4-0-001, September 2015, p. 1. ‘CASG customers’ means ‘internal customers’: the Capability Managers for whom the Capability Acquisition and Sustainment Group undertakes sustainment.

7 Recent Parliamentary inquiries include: Joint Standing Committee on Foreign Affairs, Defence and Trade, Procurement procedures for Defence capital projects, August 2012, p. xxvii; Joint Standing Committee on Foreign Affairs, Defence and Trade, Review of the Defence Annual Report 2011–12, Canberra, June 2013, p. 90; Joint Committee of Public Accounts and Audit, Report 442: Inquiry into the 2012–13 Defence Materiel Organisation Major Projects Report, Canberra, May 2014, and Report 448: Review of the 2013–14 Defence Materiel Organisation Major Projects Report, Canberra, May 2015, pp. 27–32.

8 Joint Committee of Public Accounts and Audit, Report 442, pp. 22–9.

9 Executive Minute on Joint Committee of Public Accounts and Audit Report No. 442, Review of the 2012–13 Defence Materiel Organisation Major Projects Report.

10 Joint Committee of Public Accounts and Audit, Report 448, p. 30.

11 Military units regulate the technical integrity of capability and undertake some operational level maintenance.

12 For example, the Kinnaird and Mortimer reviews recommended an independent entity. The First Principles Review recommended integration. The DMO was a half-way house with a separate chief executive and badging, but staff and funds continued to be provided by the department.

13 Defence, First Principles Review: Creating One Defence, 2015—‘the First Principles Review’.

14 For the great majority of Defence’s military assets, Capability Acquisition and Sustainment Group (CASG) is responsible for sustainment. Navy, Army, Air Force units and Joint Logistic Command business units also conduct sustainment activities at a tactical level, inclusive of maintenance, inventory management and regulation of the technical integrity of materiel within units.

15 Defence advised the ANAO that this comprised 3933 ongoing and non-ongoing public servants and 1616 Australian Defence Force regular and reservist personnel. Defence further advised that approximately 850 contractors were employed in the Group in March 2017.

16 Defence, Capability Acquisition and Sustainment Group, Accountability, available from <http://www.defence.gov.au/casg/AboutCASG/Accountability/> [accessed 13 April 2017]. The Chief of the Defence Force’s Preparedness Directive provides the strategic preparedness guidance for Defence Groups, Services and Capability Acquisition and Sustainment Group. The tasks articulated in the Directive cover the full range of Defence activities from humanitarian assistance and disaster relief to peace and stability operations, while ensuring the joint force retains the necessary baseline posture to mobilise in the self-reliant defence of Australia.

17 ANAO comment: The Secretary is the Department’s accountable authority under the Public Governance, Performance and Accountability Act 2013. The Chief of the Defence Force is Australia’s senior military leader. These two positions make up Defence’s leadership ‘diarchy’. Defence comprises the Department of Defence and the Australian Defence Force.

18 Where a Systems Program Office manages multiple equipment lines, one or more product managers with responsibility for specific equipment types may support the Systems Program Office Director.

19 The capability manager for most assets is one of the three Service chiefs (Navy, Army or Air Force).

20 In April 2016, Capability Acquisition and Sustainment Group defined a Systems Program Office as a ‘business unit that works directly with Capability Managers and Suppliers as necessary to deliver agreed capability across the CLC [capability lifecycle], coordinating the FIC [fundamental-inputs-to-capability] elements required to achieve this objective’.

21 In April 2015, the ANAO completed a performance audit of Materiel Sustainment Agreements (ANAO Report No.30 2014–15, Materiel Sustainment Agreements). The audit found that Defence had established and continued to refine a generally sound Materiel Sustainment Agreement (MSA) framework to facilitate the management of sustainment activity for specialist military equipment. The framework had enabled Defence to clearly identify roles and responsibilities at a functional level, and individual agreements documented funding, deliverables, risks and performance measures for sustainment products. The development and maintenance of the MSA framework had also encouraged and facilitated collaboration between Defence and the (then) DMO at both the management and operational levels (see paragraph 12).

22 Performance measurement and reporting on sustainment is considered in Chapter 3 of this report.

23 ‘Fleet Screenings’ is a term originally used by Navy. Army refer to the reviews as ‘Sustainment Financial Screens’, and Air Force call their reviews ‘Sustainment Assessment Reviews’. These meetings typically involve representatives of the Capability Managers, CASG and other Groups such as Joint Logistics Command.

24 Navy’s Fleet Screening instructions state that the focus of Navy’s review has evolved beyond a financial emphasis in light of the Rizzo Review to include consideration of how Navy and Capability Acquisition and Sustainment Group are meeting their obligations for whole of life sustainment of Navy capability.

25 Defence, Inspector General’s Group, Review of Support Provided to DMO Systems Program Offices by the DMO Operations Divisions, 2006, p. 16.

26 Paul J Rizzo, Plan to Reform Support Ship Repair and Management Practices, July 2011, p. 8—‘the Rizzo Review’.

27 Defence, Collins Class Sustainment Review, November 2011, p. 10. Coles also draws an insightful distinction between acquisition and sustainment work in relation to submarines. On its face, this is arguably of wider applicability: ‘[Acquisition and sustainment each have] a distinct character, but need to be carefully linked to ensure a smooth passage from acquisition (design, build, test and commission) to sustainment (maintain, operate, support logistically). The skill sets required in sustainment are quite different from those required for design and build: repairing the range of equipments in a submarine poses problems of access and control which are not likely to be experienced during build, and the design issues which crop up during sustainment are focused on keeping equipment operational rather than on design for performance’.

28 The same issue continues to arise. For example, in April 2015, Defence’s ‘Lessons Learned’ report for the Air 87 Phase 2 Armed Reconnaissance Helicopter project stated: ‘By its nature the project definition and contract negotiation stages of the project focus on cost-capability aspects of the weapon system. Establishing the validity of the logistic support framework is a secondary consideration’ (p. 14).

29 Defence, ‘Operational Concept Document for DMO Sustainment Reporting System’ (Version A, December 2004, Sponsored by Director General Standardisation).

30 Defence, Going to the Next Level, the report of the Defence Procurement and Sustainment Review [Mortimer Review], 2008, p. 49, Recommendation 4.2; Defence, Plan to Reform Support Ship Repair and Management Practices, [Rizzo Review] July 2011, p. 46.

31 Defence, Rizzo Review, p. 46.

32 Defence, Decision Brief—Sustainment Performance Management System Development, 8 August 2011.

33 Defence internal documents. The Smart Sustainment reform stream of the Strategic Reform Program funded the work (see paragraph 4.6).

34 Defence, Decision Brief—Sustainment Performance Management System Development, 8 August 2011.

35 Key performance indicators are intended to relate directly to a product outcome, for example, ‘materiel-ready days’. Key health indicators are, generally, lead indicators of factors contributing to future outcomes, for example, ‘external maintenance backlog’.

36 Not all Performance or Health Indicators will apply to all Navy sustainment products.

37 The performance framework was set out in some detail in ANAO Report No.30 2014–15, Materiel Sustainment Agreements, p. 103 forward. The audit found that scope remained for Defence to enhance its sustainment management through the implementation, use and refinement of newly developed performance measures.

38 Defence, Navy/MSD [Maritime Systems Division of Capability Acquisition and Sustainment Group] Performance Framework Review, Summary of Issues and Recommendations, 4 March 2016.

39 Capability Acquisition and Sustainment Group Cost is the MSA-related expenditure.

40 Rather, it would be a ‘Strategic Sustainment Analytic’, a ‘high-level sustainment health indicator that can be used for broad cross-product comparison’.

41 Defence was able to provide a report from SPMS for February 2017 showing ‘Capability Acquisition and Sustainment Group cost per materiel ready day’ for five fleets within Maritime Systems Division’s responsibility.

42 Defence advised the ANAO in June 2017 that no decision had been made on the scope of the Program Performance Management System, nor its roll-out plan.

43 The Quarterly Performance Report, introduced in late 2015, has evolved from earlier management reporting designed to identify potential problems in the formative stages of the project life cycle (Defence advice of March 2017).

44 Joint Committee of Public Accounts and Audit, Inquiry into Defence Sustainment Expenditure, submission from the Department of Defence, February 2017.

45 Defence advice of March 2017.

46 Defence, Decision Brief—Sustainment Performance Management System Development, 8 August 2011.

47 Defence, First Principles Review, p. 17.

48 This is separate from the long-established ‘Projects of Concern’, which generally include acquisition projects rather than sustainment products.

49 ANAO Audit Report No.11 2016–17, Tiger—Army’s Armed Reconnaissance Helicopter.

50 ANAO Report No.40 2016–17, 2015–16 Major Projects Report.

51 Defence, CA12—Armed Reconnaissance Helicopter (ARH) Weapon System, Sustainment Performance Gate Review, 8 July 2016.

52 Defence, Houston Review into Army Aviation, November 2016.

53 The 2008 Audit of the Defence Budget found that the lack of joint metrics for the serviceability of the FA-18 Hornets contributed to a silo mentality between the Systems Program Office and the Air Force squadron using the aircraft. The former focused on ‘availability’ (aircraft made available to the squadron), while Air Force focused on ‘serviceability’ (aircraft actually able to be flown). While the Systems Program Office was generally meeting targets for ‘availability’, the number of aircraft able to be flown was markedly lower.

54 During 2016, Defence renamed its gate reviews ‘independent assurance reviews’. However, they are not conducted according to any auditing standard.

55 ANAO Audit Report No.52 2011–12, Gate Reviews for Defence Capital Acquisition Projects, June 2012.

56 Defence, Plan to Reform Support Ship Repair and Management Practices, [Rizzo Review] July 2011, p. 68.

57 Final operational capability is a milestone in a Defence major capability project that marks the point at which the Capability Manager agrees that final capability can be operationally employed.

58 ANAO Audit Report No.52 2013–14, Multi-Role Helicopter Program.

59 Defence Materiel Organisation, ‘Zero-Based Review—Phase 2, Final Report’, 9 December 2003.

60 Defence, Jim McDowell Workshop, First Principles Review Implementation, 15 August 2015.

61 Defence, First Principles Review, p. 40. This recommendation was not included in the summary of First Principles Review recommendations at the front of the report of the review, whereas the recommendations set out in the preceding and subsequent paragraphs of the report are included in that summary (recommendations 2.12 and 2.13 respectively). Progress in implementing the quoted recommendation has not been explicitly tracked with the other recommendations such as 2.12 and 2.13 in regular reporting on progress of the First Principles Review.

62 Until the implementation of the full system, project performance review data will continue to be captured using spreadsheets. Defence has done earlier work relevant to the assessment of its productivity. A primary example is the detailed, technical paper produced by Navy in 2006, ‘A Quantitative Approach to Navy Performance Measurement’. This followed a joint speech by the Secretary of Defence and Chief of the Defence Force stating that improving Defence productivity was a priority.

63 Commonwealth Procurement Rules, July 2014, p. 13. An amended version commenced on 1 March 2017. The value-for-money requirement remains, in the amended version, the core rule for all procurements (p. 11).

64 The instruction was Defence Instruction (General) LOG 4–5‐004 ‘Defence Policy on Life‐Cycle Costing Analysis’; this was updated on 14 November 2003. The instruction was cancelled in March 2014 and replaced by the Defence Logistics Manual, DEFLOGMAN Part 2, Volume 10, Chapter 16, Life Cycle Costing Analysis.

65 Other significant Defence documentation has also referred to the cost of ownership. For example, the 2012–17 Defence Corporate Plan refers to ‘reduced cost of ownership’ as a ‘Key Benefit’ of certain strategies addressing capability development, acquisition and sustainment. No estimates of the reduction are stated.

66 Defence, Report on Defence Governance, Acquisition and Support (prepared by KPMG), April 2000; Defence Procurement Review (Malcolm Kinnaird AO, chairman), August 2003—‘the Kinnaird Review’; Going to the Next Level: the Report of the Defence Procurement and Sustainment Review (David Mortimer AO, chairman), September 2008—‘the Mortimer Review’; 2008 Audit of the Defence Budget (George Pappas, consultant), April 2009—‘the Pappas Review’; and Review of the Defence Accountability Framework (Associate Professor Rufus Black), January 2011—‘the Black Review’.

67 ANAO Report No. 43 1997–98, Lifecycle Costing in the Department of Defence, examined Defence’s 1992 policy on lifecycle costing, finding that while Defence’s policy and procedures advocate the use of whole-of-life costings in acquisition decisions, in practice, Defence’s use of whole-of-life costing was uneven.

68 The Defence qualification was that ‘The amount and type of LCC [life cycle costing] information presented to committees will vary according to the issues being considered and only pertinent information should be included’.

69 Defence Annual Report 1997–98, p. 224.

70 Defence, Mortimer Review, Recommendation 4.4.

71 Defence, Rizzo Review, Chapter 4, p. 29.

72 ANAO Report No.6 2013–14, Capability Development Reform, Chapter 7.

73 The ANAO examined nine Second Pass or Combined First and Second Pass approvals including submissions related to Army, Navy, and Air Force capital acquisitions. Second and Combined approvals were selected, as Defence is expected to understand costs better at the Second Pass approval stage than at First Pass.

74 In all cases Defence included estimates for ‘Net Personnel and Operating costs’, even though the First Principles Review (p. 40) had recommended that Defence stop using this concept: ‘Net Personnel and Operating Cost is Defence’s current estimate of the [additional] personnel, operating and sustainment cost of a proposed capability. It is an estimate of the differential cost above that of the current capability and is included in submissions to government for approval. This approach is problematic as it does not inform government of the total cost of the project across the life of the capability. Net Personnel and Operating Cost is also hard to estimate and often inaccurate. We recommend this process cease immediately’.

75 Defence, First Principles Review, April 2015, p. 14.

76 Defence, ‘Defence Integrated Investment Program—Total Cost of Ownership’, paper put to the Defence Committee, October 2016.

77 The total cost of ownership (‘whole-of-life costing’ or ‘lifecycle costing’) includes the costs of: capital acquisition; bringing the equipment into service; operating it; sustainment; disposal; and financing.

78 The analogy estimating method uses actual costs from a similar asset/system and then relies heavily on expert judgement to adjust these costs to account for differences between the similar asset/system and the new asset/system to develop an estimated cost for the new asset/system. This method produces a rough order-of-magnitude estimate, and is generally used early in the life cycle of a new asset/system when technical definition is immature and insufficient cost data is available. Source: NASA Cost Estimating Handbook, v.4.0, 2015, Appendix C: Cost Estimating Methodologies, available from <https://www.nasa.gov/offices/ocfo/nasa-cost-estimating-handbook-ceh> [accessed 21 December 2016].

79 Parametric cost estimates are based on historical cost data and mathematical relationships between those costs and other variables or cost drivers (for example, weight, area, volume). The implicit assumption is that future costs will be influenced by the same factors as past costs. Source: NASA Cost Estimating Handbook, v.4.0, 2015, Appendix C: Cost Estimating Methodologies, available from <https://www.nasa.gov/offices/ocfo/nasa-cost-estimating-handbook-ceh> [accessed 21 December 2016].

80 An example of a potential comparator database is the US Navy’s Visibility and Management of Operating and Support Costs (VAMOSC) database.

81 Defence, First Principles Review, p. 66.

82 In the case of the Smart Sustainment program (discussed later) Defence calculated any savings from the program on the basis of changes to MSA costs. Therefore it did not take into account any reductions (or increases) in staff costs in assessing that program’s outcome.

83 Defence, FPR Implementation Committee paper, ‘SPO Reform Implementation—Avoiding Cost Transfers’, February 2017.

84 Mr Kim Gillis, Deputy Secretary, Capability Acquisition and Sustainment Group, Defence, quoted in Ziesing, K, ‘From the Source: Deputy Secretary CASG Kim Gillis’, (interview) Australian Defence Magazine, 1 November 2016.

85 Deputy Secretary, Capability Acquisition and Sustainment Group, 5 July 2016, email Message [to CASG staff].

86 Department of Finance, Resource Management Guide No. 130, Overview of the enhanced Commonwealth performance framework, p. 4, available from <http://www.finance.gov.au/resource-management/performance/> [accessed 21 April 2017].

87 Department of Finance, Resource Management Guide No. 130, Overview of the enhanced Commonwealth performance framework, July 2016.

88 Previously, the relevant expenses would have been allocated across several categories, including repair and overhaul, and inventory supplies.

89 Defence advised the ANAO in March 2017 that ‘The 2015–16 purposes have been replaced by the 2016–17 purposes and will not be reported on in 2016–17’.

90 In the 2017–18 Defence PBS, Defence presents the estimates and descriptive information for its Top 30 sustainment products in a single table. See Defence Portfolio Budget Statements 2017–18, Table 68: Top 30 Sustainment Products by End of Financial Year Outcome 2017–18, pp. 130–139, available from <http://www.defence.gov.au/Budget/17-18/PBS.asp> [accessed 10 May 2017].

91 Defence Annual Report 2015–16, Chapter 8, Web Tables 8.2 and 8.11.

92 See Defence Portfolio Budget Statements 2015–16, p. 21, Table 6: Capability Sustainment Programme; Defence Portfolio Additional Estimates Statements 2015–16, p. 21, Table 10: Capability Sustainment Programme.

93 See Defence Annual Report 2015–16, Web Table 8.11, available from <http://www.defence.gov.au/AnnualReports/15-16/Chapters/chapter-8.asp> [accessed 24 April 2017].

94 Defence has advised that around two-thirds of Systems Program Office staff are engaged in sustainment work. As noted in Figure 3.2, Systems Program Office staff have numbered between three thousand and four thousand over the last decade.

95 Since 2013–14, Defence has also reported estimates for sustainment by Defence Group/Service.

96 For example, in the Portfolio Additional Estimates 2007–08 (pp. 35–6), items such as Super Hornet fighter aircraft (operating costs) and C-17 transport aircraft (operating costs) are shown as separate line items.

97 The Pappas Review (2008, p. 30), in discussing the long-run cost of Defence, estimated the real growth rate of maintaining ‘today’s capability’ for military equipment (capital and sustainment) at 3.5 per cent. The real growth rate for the then recommended force structure option was 4.2 per cent.

98 Australian Strategic Policy Institute, The Cost of Defence: ASPI Defence Budget Brief 2016–17, p. 58.

99 Some of those attempts are examined later (paragraph 5.54 forward).

100 Defence, First Principles Review, p. 14.

101 Any savings in sustainment costs were to be retained by Defence for reinvestment primarily in operating costs for new equipment.

102 Email from Steve Gumley, CEO, to DMO General Managers, Division Heads and Branch Heads, 26 February 2008.

103 DMO internal paper, ‘Reform Issues’, April 2011.

104 Defence, 2008 Audit of the Defence Budget, April 2009.

105 ibid., p. 22.

106 The reforms were aimed at: ‘simplifying our internal processes to reduce time and/or waste; consolidating where process work is conducted so it is not duplicated in other parts of the organisation; aligning some of our more complex processes, like the acquisition of new capability, so there is a clear linkage between the identified need and the final product; ensuring our policies reflect contemporary standards; improving our decision making around expending resources; reducing our demand for goods and services; and building a cost-conscious culture in Defence’. Implementing the Mortimer Review recommendations became, as a whole, one of the 15 streams of the Strategic Reform Program in 2010.

107 The Pappas Review examined a sample of equipment fleets and interviewed senior Defence personnel. Based on this analysis the Review suggested indicative savings targets. Defence analysed these targets and developed overall targets to finalise the Defence White Paper and design of the Strategic Reform Program.

108 Defence, The Strategic Reform Program: Making it Happen, 2010, p. 4.

109 Defence, Helmsman Sustainment Complexity Review, July 2010, p. 5. DMO engaged the Helmsman Institute to review the complexity of DMO sustainment operations. The review considered the causes of this complexity, made comparisons with commercial organisations with similar operations and analysed the impact of this complexity on DMO. Additionally, it provided recommendations to reduce this complexity.

110 The report did, however, make 25 recommendations ‘to increase the chances of success and decrease risks in the Smart Sustainment Stream’.

111 The ANAO has not verified the reported savings in the three cases quoted here.

112 The committee was called the ‘Materiel and Risk Committee’. It terminated upon the delisting of DMO.

113 Defence, Smart Sustainment Reform Health Check, June 2011.

114 Defence, Final Audit Report, Audit Task: 12-041, SRP Savings and Performance Reporting, Audit and Fraud Control Division, August 2012.

115 Raytheon Australia, Smart Sustainment Solutions, March 2013.

116 Defence, advice to the Minister.

117 ANAO Audit Report No.41 2008–09, The Super Seasprite.

118 The Australian Strategic Policy Institute commented: ‘Why a savings [sic] should be claimed from a fluctuation in fuel prices is unclear. Will future rises in fuel prices be subtracted from savings?’ (The Cost of Defence: ASPI Defence Budget Brief 2012–13, 2012, p. 138).

119 Australian Strategic Policy Institute, ‘Graphs of the week: finally getting the Collins class we paid for’, 27 October 2016, available from <https://www.aspistrategist.org.au/graphs-week-finally-getting-collins-class-paid/> [accessed 28 April 2017].

120 Defence, First Principles Review, p. 36. The report of the Review, at various points, advocates outsourcing of work of a transactional nature but otherwise gives little guidance on criteria for outsourcing decisions.

121 Annex A to the First Principles Review report indicates that the reviewers considered six FPR recommendations, including one key recommendation, to have been influenced by Coles’ earlier review.

122 Defence, First Principles Review, pp. 32–42. A preference in the report of the First Principles Review for a whole-of-life focus (rather than on acquisition and sustainment as distinct spheres) is evident from the discussion of total cost of ownership (pp. 14 and 40).

123 Defence, First Principles Review, p. 10.

124 Defence, First Principles Review, p. 40.

125 Defence’s current plans for systems development envisage the department implementing support for earned value management in late 2019.

126 This mirrors similar changes underway in the counterpart UK organisation, Defence Equipment and Support.

127 Defence, First Principles Review, Key Recommendation 6, p. 73.

128 In mid-2015, Defence established a standing offer panel to support the implementation of the First Principles Review. Bechtel was one of the companies on this panel. In July 2015, Defence issued a request for quote and tasking statement to Bechtel. In September 2015, Defence entered into a contract with Bechtel for $8.5 million, to conclude on 1 April 2016. In March 2016, Defence extended Bechtel’s contract until 30 June 2016, which increased the value to $9.7 million. In May 2016, Defence entered into another contract with Bechtel for $65.8 million, to conclude in June 2018. In November 2016, the contract was amended and increased by $32 million.

129 A ‘Management and Operations contractor’ approach involves an external organisation being contracted to take on all responsibility for the day-to-day running of the organisation, including its transformation, but with the government maintaining ultimate ownership and strategic control over the body.

130 In May 2016, Defence entered into another contract with Bechtel for $65.8 million, to conclude in June 2018. In November 2016, the contract was amended and increased by $32 million. The Bechtel team assisting Defence comprises around 60 people (January 2017).

131 This is as of January 2017.

132 Defence considered not using the term ‘Systems Program Office’ as this could be confused with the use of the same term in the USA, where Systems Program Offices were said to be concerned only with acquisition and not sustainment. Notwithstanding these concerns, Defence adopted the term ‘Systems Program Office’ where Systems Program Offices perform both functions.

133 Defence internal correspondence, April 2016.

134 Defence, Strategy& [sic] (formerly Booz & Company), ‘Right Sizing the DMO—Using a capability-driven approach: same outcomes, fewer staff’, Canberra, March 2013, p. 2.

135 These numbers are for 30 June 2016 and based on the pre-stocktake number of Systems Program Offices.

136 Defence advised the ANAO in March 2017 that ‘examples of “fully outsourced” models in the Air Domain include AEW&C, C-17, the Lead-in-Fighter and the Heron Unmanned Aerial System’.

137 Defence, SMS–Indec, Defence Materiel Organisation: Executive Summary, Future Operating Model, Work Streams 1 to 3, p. 17, 13 July 2015.

138 Pappas describes ‘lean’ as ‘a way of working that focuses on (among other goals), the continual reduction of seven forms of “waste” … By eliminating waste substantial productivity improvements are made: quality is improved, available time is increased, and cost is reduced’. Source: Defence, 2008 Audit of the Defence Budget, 2009, pp. 279–280.

139 It is not clear from the report how the First Principles Review drew this conclusion. The Review makes no recommendation about the right number of Systems Program Offices.

140 Defence, First Principles Review, p. 36.

141 Defence, DMO End State Design Final Report, December 2013. According to that consultancy, most savings were expected to come from sustainment and support functions.

142 Defence, First Principles Review, p. 36.

143 The First Principles Review (p. 36) states: ‘The Capability Acquisition and Sustainment Group should outsource significant elements of the project management associated with sustainment contracts to an appropriate provider or providers … We envisage efficient implementation would result in reductions in the public service workforce and the transfer of military personnel to other functions within Defence’.

144 The First Principles Review (p. 36) states: ‘We consider that it would be possible to rationalise and reduce the Systems Program Office structure and staffing levels as more sophisticated contracting models are established and their individual support functions are consolidated as part of the service delivery reform’.

145 According to AusTender records, the cost of the review and resulting report was $613 000.

146 Defence, Systems Program Office Design Guide, August 2016, p. 7.

147 A summary of the findings of the first tranche of Systems Program Offices reviews is set out in Chapter 2.

148 Defence advice, 16 August 2016.

149 Defence, The Australian Defence Aerospace Sector Strategic Plan: Strengthening Military Aerospace Capability in Australia, 2003, Recommendation 4, p. 8.

150 D Abraham, ‘Performance based contracting—a panacea for supporting legacy aerospace platforms?’, Defence Geddes Papers, 2005, available from <http://www.defence.gov.au/ADC/publications.asp#Geddes> [accessed 16 January 2017].

151 Defence, DMO Acquisition and Sustainment Manual (2007), pp. 92–3.

152 Defence, Next Generation Performance-Based Support Contracts—Achieving the Outcomes that Defence Requires (PBC Discussion Paper v1.0), February 2010. There is no record of targets for these objectives.

153 Defence [DMO], CEO Roundtable Discussions [Note: document dated 23 July 2008; correct date believed to be July 2009]. ‘Contract for availability’ is a term used in British military circles and is equivalent to ‘performance-based contracting’. In the USA, the term ‘performance-based logistics’ (PBL) is also used.

154 The template was referred to as ‘ASDEFCON (Support) version 3.0’, June 2011.

155 Defence, ‘A History of Performance Based Contracting in Defence’, [2016].

156 ANAO Report No.52 2013–14, Multi-Role Helicopter Program.

157 Defence, Report on Defence Governance, Acquisition and Support, (KPMG), April 2000, p. 51.

158 Defence, CASG First Principles Review Implementation Plan, version 2.9, 15 December 2016; Capability Acquisition and Sustainment Group—Matrix Management, 8 September 2016.

159 Defence, First Principles Review Implementation Committee, Outcomes, 11 February 2016.

160 Defence, Review by Mr Jeff Worley, 30 May to 3 June 2016.

161 ibid.

162 Defence advice of March 2017.

163 Defence informed the ANAO that it is seeking to address the capacity and technical knowledge of Defence staff in CASG through the introduction of the matrix management approach.

164 Defence, First Principles Review, pp. 32–3. In essence, this First Principles Review recommendation is very similar to the finding of the Defence Governance and Accountability Review of 2000 which advocated outsourcing provided ‘sufficient in−house competency is retained to allow [Defence] to be a smart customer and that [Defence] has sufficient contract management capability’.

165 United States General Accounting Office, GAO/GGD-00-172R, Study on Facility Design Reviews, July 2000, available from <http://www.gao.gov/assets/90/89986.pdf>.

166 Defence, ‘Smart Buyer—Detail Design’, approved 20 October 2016. Defence has elaborated on its approach to being a ‘smart buyer’ in its submission to the Joint Select Committee on Government Procurement, April 2017.

167 Deputy Secretary, Capability Acquisition and Sustainment Group, Defence, CASG Bulletin, Issue 4, 2015.

168 See, for example, Deputy Secretary, CASG, address to Australian Defence Magazine, 16 February 2017.

169 Defence, Observations and conclusions of the Gate Review Board, ‘CA48—MRH90 Multirole Helicopter: Sustainment Performance Gate Review’, 12 May 2016.

170 This view is supported by a Defence ‘Lessons Learned Report’ on the Tiger project (8 April 2015), which stated that ‘The Commonwealth did not adequately enforce the provisions of the TLS [through-life support] Contract and if it had, then some of the dispute around sustainment performance may have been avoided. Inadequate contract management staffing levels, a lack of Commonwealth commercial acumen and poor leadership all contributed to a weakening of the Commonwealth’s position’ (p. 16).

171 Defence, Airborne Early Warning and Control Sustainment Performance Gate Review, 30 July 2015.

172 Sir John Parker GBE FREng, An Independent Report to inform the UK National Shipbuilding Strategy, November 2016, p. 11.

173 In an Australian context, Simon Domberger pointed out that loss of skills to an organisation may be seen as a contracting cost. However, the skills may be retained in the marketplace and the real issue is whether the organisation loses the capability of being a smart purchaser (The Contracting Organization: A Strategic Guide to Outsourcing, Oxford, 1998, p. 70).

174 Defence, Ship Zero Concept, (QL131 Report) 10 October 2016, p. 9.

175 ISO 55000 is an international standard for the management of physical assets.

176 ‘The Helmsman Sustainment Complexity Review’ [Helmsman Review], July 2010; Plan to Reform Support Ship Repair and Management Practices [Rizzo Review], July 2011; and Study into the Business of Sustaining Australia’s Strategic Collins Class Submarine Capability [Coles Review], November 2012. Rizzo’s Recommendation 1 was that ‘Navy and DMO should jointly establish practical methodologies for integrated through life asset and sustainment management’ (p. 38). Asset management might not apply to items at the ‘commodities’ end of the sustainment spectrum, but would apply to platforms and may apply to products (see Box 1, p. 2). Asset management is not mentioned in the 2007 DMO Acquisition and Sustainment Manual.

177 Defence, ‘Defence Asset Management Landscape Document’, May 2012, p. 1.

178 British Standards Institution, Publicly Available Specification for the optimal management of physical assets PAS 55-1:2008: Asset Management, cited in Defence, ‘Defence Asset Management Landscape Document’, May 2012, p. 1.

179 At the time, the British Standards Institution’s Publicly Available Specification for the optimal management of physical assets (PAS 55) was the benchmark international standard. In January 2014, the International Organization for Standardization released the ISO 55000 series of standards for asset management.

180 Defence engaged a consulting firm to carry out a high-level asset management maturity assessment of CASG policy and guidance documentation against the requirements of the Asset Management Standards. The report assessed the maturity level of asset management policy and supporting guidance documentation only.

181 Defence, ‘ISO55000 Gap Assessment, Capability Acquisition and Sustainment Group’, September 2015, p. 6.

182 ibid.

183 Defence, minute ‘Effective implementation of asset management in Defence’, 17 September 2015.

184 Defence, First Principles Review Oversight Board Proposal—Health Check on Progress, 7 February 2017.

185 For example, a 2015 Lessons Learned Report prepared by Defence states that Defence ‘did not adequately enforce the provisions of the [sustainment] contract …’.

186 Defence Portfolio Budget Statements 2015–16, Table 6: Capability Sustainment Programme, p. 21, available from <http://www.defence.gov.au/Budget/15-16/PBS.asp> [accessed 20 April 2017].

187 Defence Portfolio Budget Statements 2015–16, Table 94: Top 30 Sustainment Products by End of Financial Year Outcome 2015–16, p. 190, available from <http://www.defence.gov.au/Budget/15-16/PBS.asp> [accessed 20 April 2017].

188 Defence Portfolio Budget Statements 2015–16, Top 30 Sustainment Product Descriptions, pp. 191–202, available from <http://www.defence.gov.au/Budget/15-16/PBS.asp> [accessed 20 April 2017].

189 Defence Portfolio Budget Statements 2015–16, Table 21: Army Deliverables (Rate of Effort – Flying Hours), p. 43, available from <http://www.defence.gov.au/Budget/15-16/PBS.asp> [accessed 20 April 2017].

190 Defence Portfolio Budget Statements 2015–16, Section 2: DMO Outcomes and Planned Performance, p. 151, available from <http://www.defence.gov.au/Budget/15-16/PBS.asp> [accessed 20 April 2017].

191 Defence Portfolio Additional Estimates Statements 2015–16, Table 10: Capability Sustainment Programme, p. 21, available from <http://www.defence.gov.au/Budget/15-16/PAES.asp> [accessed 20 April 2017].

192 Defence Portfolio Additional Estimates Statements 2015–16, Table 77: Top 30 Sustainment Products by End of Financial Year Outcome 2015–16, Appendix F, pp. 108–115, available from <http://www.defence.gov.au/Budget/15-16/PAES.asp> [accessed 20 April 2017]. In the 2017–18 Defence PBS, Defence presents the estimates and descriptive information for its Top 30 sustainment products in a single table. See Defence Portfolio Budget Statements 2017–18, Table 68: Top 30 Sustainment Products by End of Financial Year Outcome 2017–18, pp. 130–139, available from <http://www.defence.gov.au/Budget/17-18/PBS.asp> [accessed 10 May 2017].

193 Defence Annual Report 2015–16, Chapter 8, available from <http://www.defence.gov.au/annualreports/15-16/Chapters/chapter-8.asp> [accessed 20 April 2017].

194 Defence Annual Report 2015–16, Chapter 8, Web Table 8.11: Top 30 sustainment products by expenditure, as forecast in the Portfolio Budget Statements 2015–16, available from <http://www.defence.gov.au/annualreports/15-16/Chapters/Chapter8-WebTable8.11.xls> [accessed 20 April 2017].

195 Defence Annual Report 2015–16, Chapter 8, Web Table 8.2: Top 30 sustainment products, performance summary as at 30 June 2016, available from <http://www.defence.gov.au/annualreports/15-16/Chapters/Chapter8-WebTable8.2.pdf> [accessed 20 April 2017].

196 Defence, Study into the Business of Sustaining Australia’s Strategic Collins Class Submarine Capability, November 2012.

197 The 2012 Coles Review characterised its recommendations as Strategic, Management or Operational.

198 Defence, Going to the Next Level, the report of the Defence Procurement and Sustainment Review [Mortimer Review], 2008, p. 46.

199 In its 2008–09 Annual Report, Defence stated that to implement this recommendation, it had created the Sustainment Reinvestment Office to co-ordinate DMO’s role in achieving the $5.5 billion in savings expected from Smart Sustainment under the Strategic Reform Program. Smart Sustainment is examined in Chapter 4.

200 This system was the Sustainment Performance Management System (SPMS). The minute did not inform the minister that the SPMS had been under development for over two years at that point.

201 Defence sought, and gained, approval from the Defence Minister to rewrite Mortimer Review Recommendation 4.3. In its advice to ministers on the matter, it provided no explanation of its inability to implement the recommendation.

202 Defence, ‘The Helmsman Sustainment Complexity Review’ [Helmsman Review], July 2010, p. 29.

203 ibid.

204 The Rizzo Review of naval sustainment was triggered by early decommissioning of HMAS Manoora, the extended unavailability of HMAS Kanimbla and the temporary unavailability of HMAS Tobruk. The Review made 24 recommendations and identified seven of these as strategic in scope because of their significant and enduring impact on the management of sustainment in DMO. Recommendation 1 was that ‘Navy and DMO should jointly establish practical methodologies for integrated through life asset and sustainment management’. Source: Defence, Plan to Reform Support Ship Repair and Management Practices, [Rizzo Review] July 2011, p. 38.

205 See, in particular, D.F. Kettl, Sharing Power, Brookings Institution, 1994; and GAO/GGD-92-11 Government Contractors: Are Service Contractors Performing Inherently Governmental Functions?, 1991. In examining why contracting out work to the private sector had not worked well during the Reagan administration, these studies find that government has often not proved to be an intelligent consumer of the goods and services it has purchased. They provide recommendations as to how government can become a ‘smart buyer,’ knowing what it wants and judging better what it has bought.

206 General Accounting Office, GAO/GGD-92-11 Government Contractors: Are Service Contractors Performing Inherently Governmental Functions?, 1991, p. 30, available from <http://www.gao.gov/products/GGD-92-11> [accessed 16 January 2017].

207 General Accounting Office, GAO/T-NSIAD-97-110 Defense Outsourcing: Challenges Facing DOD as It Attempts to Save Billions in Infrastructure Costs, 1997, pp. 22–3, available from <http://www.gao.gov/products/T-NSIAD-97-110> [accessed 16 January 2017]. Subsequently, the term has been used widely by others, including in a military acquisition context. A RAND study, for example, ‘Maintaining the [US] Army’s “Smart Buyer” Capability in a Period of Downsizing’, states that by smart buyer capability it means: ‘the Army has sufficient in-house technical expertise to stand up to its industry counterparts when dealing with technical issues of the conceptual design, research and development (R&D), and procurement of new military systems’. Source: K. Horn, C. Wong, E. Axelband, P. Steinberg, and I. Chang, RAND white paper, ‘Maintaining the Army’s “Smart Buyer” Capability in a Period of Downsizing’, 1999. In Australia, see also Pat Barrett AM, Auditor-General for Australia, ‘Some issues in contract management in the public sector’, presentation to Australian Corporate Lawyers Association/Australian Institute of Administrative Law conference on outsourcing, Canberra, 26 July 2000.

208 Internally, other areas within Defence had earlier recognised the need for being smart buyers in procurement. For example, the Defence Science and Technology Organisation (DSTO) referred in a 1997 publication to an ‘initiative to help the Australian Defence Organisation (ADO) be a smart buyer’ in relation to software acquisition. See Fisher, P., A Study of Defence Publications Relating to the Procurement of Software-intensive Systems, DSTO, 1997. In the Defence Annual Report 2004–05 (p. 242), DSTO nominated ‘ensuring Australia is a smart buyer of defence equipment’ as one of the ways in which it supports Australia’s defence.

209 Defence, ‘Defence and Industry Policy Statement 2007’, p. 28: ‘Defence’s in-house S&T organisation, DSTO, enables the ADF to be a smart buyer, user and adaptor of military technology’.

210 In the latter case, this is in relation to having the technical knowledge to build the ‘future submarine’.

211 Dr Andrew Davies, Director of Research, Australian Strategic Policy Institute.

212 Senate Foreign Affairs, Defence and Trade References Committee, Capability of Defence’s physical science and engineering workforce, April 2016, paragraph 3.16.