The audit objectives were to examine the effectiveness of Defence’s management of the test and evaluation (T&E) aspects of its major capital equipment acquisition program; and to report on Defence’s progress in implementing T&E recommendations made in the Senate Foreign Affairs, Defence and Trade References Committee’s August 2012 report, Procurement procedures for Defence capital projects.

Summary and recommendations

Background

1. Defence’s capital equipment acquisition program includes aircraft, maritime vessels and land-based equipment in various stages of engineering development and delivery. In 2013–14 that program consisted of some 180 approved projects with a total value of $79 billion. The 2012 Defence Capability Plan contains an additional 111 projects, or project phases, planned for either First or Second Pass government approval over the four-year Forward Estimates period. In 2012, the estimated capital cost of these projects was $153 billion.

2. Each of the above projects relies on test and evaluation (T&E) processes to identify areas of cost, schedule and capability risk to be reduced or eliminated. T&E is a key component of systems engineering and its primary function is to provide feedback to engineers, program managers and capability managers on whether a product or system is achieving its design goals in terms of cost, schedule, function, performance and sustainment. It also enables capability acquisition and sustainment organisations to account for their financial expenditure in terms of the delivery of products or systems that are safe to use, fit for purpose and that meet the requirements approved by government. In July 2001, the then Defence Secretary, Dr Allan Hawke, emphasised the importance of T&E as follows:

T&E is an important tool in our plans for the management of Defence capability to ensure successful achievement and maintenance of operational effectiveness. As Defence moves to consider its governance strategies in a theme of organisational renewal it is timely for us to consider T&E as a key management tool.1

Audit objectives and criteria

3. The audit objectives were to examine the effectiveness of Defence’s management of the T&E aspects of its major capital equipment acquisition program; and to report on Defence’s progress in implementing T&E recommendations made in the Senate Foreign Affairs, Defence and Trade References Committee’s August 2012 report, Procurement procedures for Defence capital projects.

4. To form a conclusion against the objectives, the ANAO adopted the following high-level criteria:

  • Defence’s organisational structures, roles and responsibilities enable the coordinated application of adequate T&E at each stage of the capital equipment project life cycle;
  • Defence’s T&E policy and procedures are suitably designed and applied as intended;
  • Defence invests in a broad range of training and skills development for T&E personnel to enable the application of necessary T&E expertise throughout the capital equipment project life cycle; and
  • the T&E aspects of capital equipment acquisition are transparently reported to inform decision making and management of technical risks that may impact the development and maintenance of the major systems component of the Fundamental Inputs to Capability.

Conclusion

5. Over recent years, Defence has strengthened its enterprise-level management of T&E conducted in support of major capital equipment acquisitions. Defence established a lead authority for T&E (the Australian Defence Test and Evaluation Office) to provide advice and consultancy support across Defence, a T&E Principals’ Forum to foster consistency of approach, and has developed an overarching policy on T&E. That said, the conduct of T&E remains distributed across 12 Defence organisations, placing a premium on the effectiveness of Defence’s T&E governance as a means of mitigating the risk of inconsistent conduct of T&E. Defence’s administration of T&E would be further strengthened by introducing arrangements to provide enterprise-level advice to senior responsible leaders on key issues, introducing performance measures and compliance assurance for T&E, and completing reforms to T&E personnel competency and training arrangements. These measures would provide greater assurance over the administration of Defence T&E and would be consistent with reforms underway within Defence, following the recent First Principles Review, to establish a stronger ‘strategic centre’.

6. The case studies examined in this audit highlight the important role played by T&E in managing acquisition risks for major capital equipment. Defence faces challenges in balancing the methodical conduct of T&E for major acquisitions to assure Capability Managers that full contracted capability is delivered, while also seeking to meet delivery schedules and other priorities. In the case of the first Landing Helicopter Dock (LHD), HMAS Canberra, key management decisions were usefully informed by Defence’s T&E, which identified numerous defects and deficiencies for resolution. Defence decided, on balance, to accept HMAS Canberra on the understanding that the deficiencies would be addressed during the ship’s operational phase. In doing so, the Chief of Navy accepted greater risks than would have been the case had System Acceptance been based on more complete objective quality evidence of compliance with contracted specifications, and had Initial Materiel Release been based on less qualified findings by Defence’s regulators concerning compliance with technical, operational and safety management system requirements. As operational T&E is still underway and is not due for completion until the fourth quarter of 2017, it remains to be seen what impact, if any, this elevated risk has on the achievement of Final Operational Capability.

Supporting findings

Key developments in Defence’s test and evaluation management arrangements

7. Defence has 12 organisations that conduct T&E activities. While this approach is beneficial for developing, maintaining and applying specialised T&E expertise, it increases the risk of inconsistent conduct of T&E. This risk places a premium on Defence having effective T&E leadership and governance arrangements.

8. In 2007 Defence established a lead authority for T&E–the Australian Defence Test and Evaluation Office–to provide advice and consultancy support on T&E issues and to develop Defence’s overarching policy on T&E. Defence has recently completed and published its overarching T&E policy in the Defence Capability Development Manual (DCDM). This manual expands upon former Defence Instructions concerning Defence T&E policy and completes a longstanding commitment to the Parliament.

9. Defence has more to do to provide a comprehensive and integrated T&E framework to its project offices by ensuring: the DCDM aligns with Navy, Army and Aerospace regulatory management manuals; the DCDM is aligned with new organisational structures arising from the implementation of the First Principles Review2; and that subsidiary T&E policy and procedural guidance manuals used by the various project offices are consistent with the DCDM.

10. Defence’s application of its T&E policy and procedures to the acquisition projects examined in this audit and past performance audits has varied, driven primarily by the relative complexity of Defence projects. Some project offices manage complex developmental T&E with Defence as the prime system integrator, while other offices manage acceptance T&E of mature products that have already undergone acceptance T&E by another jurisdiction.

11. Defence has made slow progress in implementing the 2012 Senate Inquiry recommendations relating to T&E personnel competency and training requirements. No whole-of-Defence T&E personnel competency and training needs analysis has been conducted and T&E personnel training and competency requirements management vary significantly between the armed Services and the Defence Materiel Organisation (DMO, now the Capability Acquisition and Sustainment Group (CASG)). The ANAO has made a recommendation aimed at strengthening the enterprise-level management of Defence’s T&E workforce.

Managing preview test and evaluation

12. Defence has implemented the 2012 Senate Inquiry recommendation that it mandate a default position of engaging specialist T&E personnel prior to seeking First Pass government approval for acquisition options to proceed into detailed analysis.

13. When implemented well, Defence preview T&E has mitigated acquisition risks, particularly with respect to off-the-shelf (OTS) equipment acquisitions. OTS defence equipment acquisitions are not risk-free, and acquisition risks still need to be managed through the conduct of preview T&E and operational T&E.

14. The mixed experience with the MRH90 helicopter acquisition and other recent OTS acquisitions, such as Land 121 (medium and heavy vehicle fleet) and Land 125 (F88 Steyr rifle), underscores the importance of OTS equipment being subjected to T&E sufficient to allow the mitigation of cost, schedule and capability risks.

Managing development, acceptance and operational test and evaluation for Defence’s amphibious capability acquisition

15. Initial Materiel Release for LHD 1, HMAS Canberra, was declared in October 2014 with system acceptance test procedures ongoing and many test reports not submitted for Defence’s approval. LHD 1’s report of materiel state provided Navy with an assessment of remaining technical, operational and safety management system risks at the time of the ship’s release by DMO to Navy. Navy decided to accept the risks identified by the T&E, along with remediation plans, and the ship received Initial Operational Release into operational T&E in November 2014. The ANAO has made a recommendation aimed at reducing risk in the transition of capability from the acquisition phase to operations.

16. Early operational T&E of HMAS Canberra commenced against a backdrop of significant work required to verify contractual compliance with 451 function and performance specifications, which had not occurred at the time of System Acceptance and Initial Materiel Release. Among those 451 were known defects and deficiencies within the ship’s communications system, radar system, combat management system, sewage system and logistic support system.

17. HMAS Canberra is scheduled to achieve its Initial Operational Capability milestone during the fourth quarter of 2015, at which time it will be able to undertake its humanitarian assistance and disaster relief support role. Both LHDs and their landing craft are scheduled to undergo Final Operational Capability T&E during Exercise Talisman Sabre in 2017, and Defence anticipates declaring Final Operational Capability in the fourth quarter of 2017, some 12 months later than originally scheduled. The MRH90 helicopters to be embarked upon the LHDs are forecast to achieve Final Operational Capability in July 2019, some five years later than originally scheduled.

Future directions for the governance of Defence test and evaluation

18. The establishment of the Australian Defence Test and Evaluation Office in 2007 and a T&E Principals’ Forum in 2008, along with the finalisation of an overarching T&E policy in 2015, have provided Defence with a stronger basis for the management of T&E. Notwithstanding these positive developments, this performance audit has highlighted the inherent challenges in Defence’s entity-level management and conduct of T&E, which remains distributed across 12 internal organisations. Scope remains to improve key aspects of Defence’s administration, specifically: assuring consistency in the conduct of T&E and compliance with whole-of-entity T&E requirements; establishing entity-level performance measures for T&E; and strengthening T&E personnel competency management.

19. The ANAO has made a recommendation aimed at strengthening enterprise-level governance and advisory arrangements for T&E, in the context of Defence’s implementation of reforms arising from the April 2015 First Principles Review.

Recommendations

Recommendation No. 1

Paragraph 2.48

To strengthen the enterprise-level management of the T&E workforce, the ANAO recommends that Defence:

  1. identifies the training and competencies of the existing Defence T&E workforce;
  2. conducts a T&E personnel competency and training needs analysis for the whole entity; and
  3. monitors the availability of sufficient, appropriately trained T&E personnel in specific competency areas and takes steps to address any gaps identified.

Department of Defence response: Agreed.

Recommendation No. 2

Paragraph 4.22

To reduce risk and assist the transition of capability from the acquisition phase to operations, the ANAO recommends that prior to System Acceptance, Defence ensures that material deficiencies and defects are identified and documented, and plans for their remediation established.

Department of Defence response: Agreed.

Recommendation No. 3

Paragraph 5.14

In the context of its implementation of reforms arising from the First Principles Review, the ANAO recommends that Defence introduce arrangements to provide the Vice Chief of the Defence Force and Capability Managers with enterprise-level advice on the coordination, monitoring and evaluation of the adequacy and results of Defence T&E activities.

Department of Defence response: Agreed.

Summary of entity response

The Department of Defence’s summary response to the proposed report is provided below, while its full response is at Appendix 1.

Defence places a high priority on Test and Evaluation (T&E) to inform decision-making in project acquisitions and risk management during the acceptance into service. Defence develops complex capabilities and T&E is an important practical means to help to mitigate the risk inherent in such endeavours. As such Defence welcomes this audit.

The audit is timely as Defence has published centralised policy on T&E and is undertaking fundamental organisational reform through the First Principles Review (FPR). The ANAO’s three key recommendations are agreed by Defence and will be used to strengthen governance and leadership in T&E, and to ensure Defence decision-making is, wherever possible, based on real and independent test results as early as possible in every project. Defence is committed to addressing the recommendations of the ANAO Report via the FPR reforms of the Defence Capability Life Cycle, which will include improved outcomes for Defence T&E.

Defence notes the ANAO concerns in the report regarding the T&E of the Canberra Class amphibious ship, HMAS Canberra, which was the one major case study used in the audit. This amphibious capability is one of the most complex maritime capabilities brought into ADF service. While Defence T&E has matured over the last several years, the new Defence T&E policy and governance framework was not implemented in time to influence the T&E planning in support of acceptance into service of this major capability. Navy made well-considered risk-based decisions in accepting the Canberra Class amphibious ship into service, having taken into account matters such as the materiel data presented, the overall schedule of the introduction of the capability, and the management of ongoing issues such as defects and safety. The Chief of Navy has undertaken, and continues to undertake, with all Defence stakeholders, a detailed and deliberate assessment of the risks that are present in this new capability.

Defence’s T&E focus remains the timely achievement of capability that is safe, effective and suitable for ADF activities and operations.

1. Background

Introduction

1.1 Australian Defence Force (Defence) capability comprises major systems, such as ships, aircraft and land vehicles operated by Defence personnel, and other elements such as supplies and facilities that contribute to the conduct of Defence operations. The acquisition of major systems is subject to Defence’s test and evaluation (T&E) arrangements that seek to provide:

… decision-makers [with] factual information to help assess risks to achieving the desired capability. T&E in Defence is a deliberate and evidentiary process applied … to ensure that a system is fit-for-purpose, safe to use and that Defence personnel have been trained and provisioned with the enduring operating procedures and tactics to be an effective military force. As such T&E contributes to confirming legal obligations are met and documented in areas like fiduciary, environmental compliance and workplace health and safety.3

1.2 Defence’s T&E arrangements cover the following phases:

  • Preview T&E–assists analysis and refinement of major equipment acquisition options presented to government for First or Second Pass approval.4
  • Developmental T&E–assists the design and development of a system during the acquisition phase or during system upgrades. It provides Defence with the information needed to evaluate a product’s progress toward achieving contractually specified function and performance requirements.
  • Acceptance T&E–provides Defence with the information needed to verify contractual compliance of equipment offered by contractors for System Acceptance approval by the acquisition authority.5 Acceptance T&E also informs the Initial Materiel Release decision to transition a system from its construction phase to its in-use phase. Defence’s Capability Managers6 rely on acceptance T&E to inform their decision to approve a system’s Initial Operational Release into operational T&E.7
  • Operational T&E–provides Defence with the information needed to determine if the acquired equipment is operationally effective and suitable when examined under realistic operating conditions by representative operational personnel. It is used to inform the Capability Manager of a system’s Initial Operational Capability, and to determine when a system has achieved Final Operational Capability.8,9 Defence equipment may undergo follow-on operational T&E once Final Operational Capability has been achieved. This T&E phase reviews a system’s effectiveness and suitability in the context of changing operational requirements.

1.3 Figure 1.1 provides a simplified illustration of Defence’s T&E arrangements, respective capability system life cycle milestones and the authorities responsible for approving milestone achievement.

Figure 1.1: Capability life cycle – T&E phases, acquisition milestones and approval authorities

[Figure not reproduced: diagram mapping preview, developmental, acceptance and operational T&E phases to capability life cycle milestones and their approval authorities.]

Source: ANAO analysis of Defence records.

Audits and external reviews of Defence test and evaluation

1.4 The ANAO conducted a performance audit of Defence’s T&E program in 2001–02 which found that there was little evidence of effective corporate initiatives to support the efficient and effective use of Defence’s T&E resources.10 Since that time, aspects of T&E have featured in a number of ANAO audits11, and there has also been ongoing Parliamentary interest in this subject. Most recently, in August 2012, the Senate Foreign Affairs, Defence and Trade References Committee (the 2012 Senate Inquiry) reported on an inquiry into Procurement procedures for Defence capital projects. The report identified several deficiencies in the way T&E was being utilised to support Defence major capital equipment acquisitions and made a range of recommendations. Five of the recommendations, to which the then Government agreed, were directly related to T&E (see Appendix 2).

1.5 The findings of the 2012 Senate Inquiry were broadly consistent with previous audits and reviews, which also raised issues relating to the conduct, oversight and resourcing of Defence T&E (see Table 1.1). In December 2012 and May 2014, the Joint Committee of Public Accounts and Audit identified Defence T&E as an audit priority of Parliament in recognition of the ongoing importance of T&E.

Table 1.1: Common themes in previous reviews

| Theme | 2002 ANAO T&E audit | 2003 Senate Inquiry | 2003 Kinnaird Review | 2008 T&E Roadmap | 2012 Senate Inquiry |
| --- | --- | --- | --- | --- | --- |
| Inconsistent conduct of T&E | ✔ | ✔ | ✔ | ✔ | ✔ |
| Inadequate oversight of T&E training | ✔ | ✘ | ✘ | ✔ | ✔ |
| Inadequate resources for T&E | ✔ | ✔ | ✔ | ✔ | ✔ |
| Poor translation of T&E policy and process into practice | ✔ | ✔ | ✘ | ✔ | ✔ |
| Misunderstanding of T&E’s role as an assurance mechanism for the delivery of expected capability | ✔ | ✔ | ✘ | ✔ | ✔ |

Note A: A tick indicates that the issue was raised in the review; a cross indicates that it was not.

Note B: ANAO Audit Report No. 30 2001-02 Test and Evaluation of Major Defence Equipment Acquisitions, January 2002; Senate Foreign Affairs, Defence and Trade References Committee, Commonwealth of Australia, Report on the inquiry into materiel acquisition and management in Defence (2003); Defence Procurement Review (Malcolm Kinnaird AO, chairman), August 2003—‘the Kinnaird Review’; Department of Defence, Defence Test and Evaluation Roadmap, Defence, Canberra, 2008; and Senate Foreign Affairs, Defence and Trade References Committee, Commonwealth of Australia, Procurement procedures for Defence capital projects, August 2012.

Source: ANAO analysis of reviewed documents.

Audit approach

1.6 The audit objectives were to examine the effectiveness of Defence’s management of the T&E aspects of its major capital equipment acquisition program; and to report on Defence’s progress in implementing T&E recommendations made in the Senate Foreign Affairs, Defence and Trade References Committee’s August 2012 report, Procurement procedures for Defence capital projects.

1.7 To form a conclusion against the objectives, the ANAO adopted the following high-level criteria:

  • Defence’s organisational structures, roles and responsibilities enable the coordinated application of adequate T&E at each stage of the capital equipment project life cycle;
  • Defence’s T&E policy and procedures are suitably designed and applied as intended;
  • Defence invests in a broad range of training and skills development for T&E personnel to enable the application of necessary T&E expertise throughout the capital equipment project life cycle; and
  • the T&E aspects of capital equipment acquisition are transparently reported to inform decision making and management of technical risks that may impact the development and maintenance of the major systems component of the Fundamental Inputs to Capability.12

1.8 The audit scope included: Defence’s T&E policy; T&E organisational structures, roles and responsibilities; coordination of T&E between the then Defence Capability Development Group (CDG), the then Defence Materiel Organisation (DMO) and the armed Services; the management of T&E personnel competence development; and the implementation of T&E for related major equipment acquisitions. The audit scope also responded to ongoing Parliamentary interest in Defence’s T&E program.

1.9 On 1 April 2015, the Australian Government released the First Principles Review of Defence13, which contained recommendations regarding Defence’s organisational structure, including the establishment of a single end-to-end capability development function. This involved merging the then DMO and some functions of CDG to form a Capability Acquisition and Sustainment Group (CASG). Consequently, this audit report primarily focuses on Defence’s T&E organisational structures, roles and responsibilities prior to the implementation of First Principles Review recommendations. However, the audit report pays specific attention to the position of T&E in Defence’s future structure.

1.10 The audit method involved analysis of:

  • Defence T&E policy and procedures and T&E personnel skills development;
  • Defence records relating to the implementation of recommendations made by the 2012 Senate inquiry that relate to T&E;
  • project records relating to Defence’s evolving amphibious deployment and sustainment capability; and
  • records relating to off-the-shelf equipment acquired through US Foreign Military Sales contracts and direct commercial sales contracts.

1.11 The audit was conducted in accordance with ANAO auditing standards at a cost to the ANAO of approximately $544 500.

2. Key developments in Defence’s management arrangements for test and evaluation

Areas examined

This chapter examines key developments within Defence relating to T&E institutional arrangements, T&E policy and processes, and the management of T&E personnel.

Conclusion

Defence has made progress towards its goal of a ‘unified’ approach to T&E, particularly in terms of establishing a lead authority for T&E, and finalising its overarching T&E policy framework. However, ensuring consistency in the conduct and use of T&E remains inherently challenging, as Defence has 12 organisations that conduct T&E—each with its own procedures and reporting accountabilities.

Defence’s progress in establishing T&E personnel competency frameworks has been slow and remains incomplete. No whole-of-Defence training and competency needs analysis for T&E personnel has been conducted, and the competency frameworks in use vary significantly between the armed Services and the Capability Acquisition and Sustainment Group.

Areas for improvement

The ANAO has made a recommendation aimed at strengthening the enterprise-level management of Defence’s T&E workforce.

Introduction

2.1 A well-managed T&E program consists of suitably qualified and experienced personnel undertaking T&E in accordance with sound policy and with support from appropriate institutional arrangements. As noted in Chapter 1, since the early 2000s several ANAO audits and external reviews of Defence have identified deficiencies in these aspects of Defence’s T&E program and provided recommendations for improvement.

2.2 In this chapter the ANAO examined key developments in Defence’s:

  • institutional arrangements supporting the conduct of T&E;
  • T&E policy and procedures; and
  • T&E personnel training and competencies.

Does Defence have a unified approach to the conduct of test and evaluation?

Defence has 12 organisations that conduct T&E activities. While this approach is beneficial for developing, maintaining and applying specialised T&E expertise, it increases the risk of inconsistent conduct of T&E. This risk places a premium on Defence having effective T&E leadership and governance arrangements.

In 2007 Defence established a lead authority for T&E–the Australian Defence Test and Evaluation Office–to provide advice and consultancy support on T&E issues and to develop Defence’s overarching policy on T&E.

2.3 Defence has recognised, since at least the mid-1990s, the benefits available from a ‘unified’ approach to T&E, which facilitates the effective and efficient use of available T&E resources and avoids unnecessary duplication of effort and resources.14 A key institutional reform was the establishment in 2007 of the Australian Defence Test and Evaluation Office (ADTEO), which replaced the then ‘Directorate of Trials’ as Defence’s lead authority for T&E. ADTEO is responsible for providing advice and consultancy on T&E issues, and for managing joint and single-Service trials in support of T&E activities. ADTEO is also responsible for the provision of Defence policy on T&E through the Defence Capability Development Manual (DCDM), which provides more detailed authoritative guidance on T&E than the Defence Instructions for T&E that it replaces.

2.4 ADTEO’s establishment was supported by the publication in 2008 of the Defence Test and Evaluation Roadmap, which identified the need for wide-ranging T&E management improvements. These included the need to:

  • centralise, through ADTEO, the coordination of all T&E resources, policy and processes to ensure the most efficient and effective application of T&E;
  • support existing Defence T&E agencies and other stakeholders in their delivery of T&E by the provision of adequately skilled T&E professionals who are trained and equipped to engage with emerging technologies15 in future capability systems; and
  • centralise and coordinate the management of T&E facilities and equipment to ensure projects in the Defence Capability Plan16 can be delivered into service on time.

2.5 During 2012, the Chief of the Defence Force tasked ADTEO with examining Defence T&E policy and practice and reporting its findings to the Defence Chiefs of Service Committee in September 2012. ADTEO’s report recommended seven areas for improvement, including consolidation of T&E policy, management of Army and Navy T&E personnel and consistency between Joint and Service T&E. The Chiefs of Service Committee agreed to six of the seven recommendations.17

Defence’s test and evaluation organisational structure

2.6 Successive Defence reforms in relation to the institutional arrangements supporting T&E have emphasised centralisation, coordination and consolidation of key functions, principally in ADTEO. However, responsibility for developing specific procedures for the conduct of T&E and the conduct itself of T&E remains distributed among 12 organisations within Defence, as shown in Figure 2.1.

Figure 2.1: Outline of Australian Defence T&E agencies

[Figure not reproduced: organisational outline of Defence’s 12 T&E agencies; acronyms are expanded in Note A below.]

Notes for Figure 2.1

Note A: Acronyms used in Figure 2.1:

ADTEO: Australian Defence Test and Evaluation Office.

RANTEAA: Royal Australian Navy Test, Evaluation, and Acceptance Authority.

LEA: Land Engineering Agency.

ARDU: Aircraft Research and Development Unit.

AMAFTU: Aircraft Maintenance and Flight Trials Unit.

JPEU: Joint Proof and Experimental Unit.

AMTDU: Air Movements Training and Development Unit.

FEG OT&E Cells: Force Element Group Operational T&E Cells.

JEWOSU: Joint Electronic Warfare Operational Support Unit.

ADFTA: ADF Tactical Data Link Authority.

CIOG: Chief Information Officer Group.

I&S: Intelligence and Security Group.

Note B: CIOG and I&S are independently responsible for the conduct of T&E across the capability system life cycle for projects under their purview.

Note C: While the acquisition organisation is primarily responsible for acceptance T&E, Service and Joint T&E organisations may provide support as required.

Note D: Developmental T&E is conducted to assist with the development of a system, product or service. Therefore, for most Defence projects and programs, developmental T&E is undertaken by the contractor developing the system, product or service. The outcomes of this T&E are reviewed by Defence personnel and used in the system design and development process to support verification of technical or other performance criteria and objectives.

Source: ANAO analysis.

2.7 Defence’s 12 T&E organisations include acquisition organisations, single and Joint Service T&E organisations, and specialist T&E organisations.18 Each of these organisations has its own T&E manuals, reports its T&E activities independently to its respective Capability Manager, and is required to comply with one or more of Defence’s three technical regulatory management manuals.19 The exception to those arrangements is the Army, which has embedded a significant proportion of its operational T&E staff in ADTEO.

2.8 Defence’s decentralised approach to T&E is beneficial to developing, maintaining and applying specialised T&E expertise. For example, fixed-wing flight testing is carried out by the Air Force, embarked rotary-wing flight testing in the maritime environment is conducted by the Navy, and the Army conducts its own rotary-wing testing. Where centralised T&E management is practical, it has been undertaken: for example, ADTEO conducts all Early Test planning and CASG conducts acceptance T&E. However, the generally decentralised structure increases the risk of an inconsistent approach to T&E, which makes it ‘difficult to identify gaps to ensure T&E is managed and conducted to an appropriate level’.20 This risk places a premium on the effectiveness of Defence leadership and governance arrangements, particularly as T&E governance needs to extend over 12 T&E organisations. Chapter 5 discusses future directions for Defence’s T&E governance arrangements.

Does Defence have a policy framework to support the consistent conduct of test and evaluation?

Defence has recently completed and published its overarching T&E policy in the Defence Capability Development Manual (DCDM). This manual expands upon former Defence Instructions concerning Defence T&E policy and completes a longstanding commitment to the Parliament.

Defence has more to do to provide a comprehensive and integrated T&E framework to its project offices by ensuring: the DCDM aligns with Navy, Army and Aerospace regulatory management manuals; the DCDM is aligned with new organisational structures arising from the implementation of the First Principles Review1; and that subsidiary T&E policy and procedural guidance manuals used by the various project offices are consistent with the DCDM.

2.9 Each of Defence’s internal T&E organisations (identified in Figure 2.1) conducts T&E in accordance with regulatory manuals, and with a range of project management policy and procedural guidance manuals. The capstone of these documents is Part Three of the DCDM21, which provides centralised policy describing:

  1. how T&E is to be conducted by Defence;
  2. the requirements for organisations and individuals conducting T&E;
  3. what T&E planning is required for major projects, from project initiation to withdrawal from service; and
  4. how T&E policy compliance is to be assured and non-compliance resolved.

Updating Defence test and evaluation policy and procedures

2.10 In 2003, Defence advised the Parliament22 that a review and redevelopment of Defence T&E policies had been initiated by ADTEO’s predecessor organisation. This followed the tabling of the ANAO’s 2002 audit report, which had recommended that Defence review and update its T&E policy, organisation and responsibilities, and articulate the way that the policy is to be implemented.23 However, by 2008, the Defence T&E Roadmap (discussed in paragraph 2.4) indicated that ADTEO was still ‘reviewing all policies, procedures and guidance in order to develop a standardised framework which ensures consistency and alignment at all levels’.24 Further, the review had not been completed by the time of the 2012 Senate Committee inquiry into Defence procurement, and the Committee recommended the immediate finalisation of central Defence policy on T&E, to be implemented by Defence’s Capability Managers. This recommendation was in line with the committee’s recommended shift of full accountability to Capability Managers for all technical assessment of capability procurement and sustainment (independently assessed in conjunction with the then Defence Science and Technology Organisation).25

2.11 In September 2012, Defence decided that ADTEO should consolidate tri-service T&E strategic direction, policy and procedures into the DCDM. In November 2013, the Chiefs of Service Committee and DMO agreed to a revised Part Three of the DCDM, and by December 2014, the revised T&E policy was in use by all Defence T&E agencies and an incomplete version of Part Three of the DCDM26 was published. A completed version of Part Three of the DCDM was published in June 2015, some 12 years after Defence had originally advised Parliament that a review and redevelopment of Defence T&E policy and procedures was underway.

2.12 The DCDM requires specialist T&E personnel to be engaged pre-First Pass, during system development and acceptance for all projects including military off-the-shelf/commercial off-the-shelf acquisitions. This requirement addresses a longstanding Parliamentary interest that Defence’s T&E policy should adopt a ‘cradle to grave’ philosophy.27

2.13 While the publication of Part Three of the DCDM completes a longstanding commitment to the Parliament, it is also important that Defence puts in place a process to ensure that: the DCDM aligns with Navy, Army and Aerospace technical regulatory management manuals; the DCDM is aligned with new organisational structures arising from the implementation of the First Principles Review (see paragraph 1.9); and that subsidiary T&E policy and procedural guidance manuals prepared by the various T&E organisations are consistent with the DCDM. In October 2015 Defence informed the ANAO that:

The process to ensure alignment with Technical Regulatory Framework resides within [the T&E Principals’ Forum28] where each year issues can be raised and policy improvements discussed. In general, an annual update of the T&E policy is planned following the outcomes of the T&E Principles Meeting, held in [April or May] each year, with follow up meetings if required.

2.14 In August 2015, Defence advised the ANAO that it will be aligning its T&E policy with technical regulatory manuals and acquisition project management manuals during the next DCDM update, planned for November 2015. Defence also expects to update the DCDM in 2016, to incorporate First Principles Review outcomes and new Defence Capability Life Cycle processes. In addition, Defence is implementing a newly developed Quality Management System, which is to streamline and standardise the lower level procedures used by Defence’s capability acquisition and sustainment project offices. As these project offices have T&E responsibilities, Defence will need to ensure that their procedures align with T&E policy set out in the DCDM.

Policy changes in response to 2012 Senate Committee inquiry

2.15 The 2012 Senate Committee report included a recommendation that Capability Managers should require their developmental T&E practitioners to be equal stakeholders with the then Defence Science and Technology Organisation in the pre-First Pass risk analysis, and specifically to conduct the pre-contract evaluation so that they are aware of risks before committing to a project.29

2.16 In response, Defence included guidance covering preview T&E in the DCDM. ADTEO oversees or conducts preview T&E and provides preview T&E reports to assist First or Second Pass approval submissions to government. Preview T&E may involve Defence Science and Technology Organisation personnel; however, any Defence T&E organisation or delegated staff/units may be tasked by ADTEO to conduct preview T&E. As discussed in Chapter 3, well-conducted preview T&E should result in improved pre-First Pass risk analysis and pre-contractual cost and benefit evaluation, making Defence more aware of acquisition risks and better informing the First and Second Pass approval process.

2.17 The 2012 Senate Inquiry also recommended improvements in Defence’s Technical Risk Assessment and Technical Risk Certification processes, which contribute to the development of First and Second Pass approval submissions. The committee saw a need for a technical risk reporting structure to be transparent such that the assessments could not be ignored without justification to the key decision-makers (such as the Minister for Defence).30 In conducting this audit, the ANAO observed significant improvements in Defence’s Technical Risk Assessment policy and process manual. The DCDM would be improved if it included a more complete reference to this manual.

Is Defence’s test and evaluation policy framework being applied consistently?

Defence’s application of its T&E policy and procedures to the acquisition projects examined in this audit and past performance audits has varied, driven primarily by the relative complexity of Defence projects. Some project offices manage complex developmental T&E with Defence as the prime system integrator, while other offices manage acceptance T&E of mature products that have already undergone acceptance T&E by another jurisdiction.

2.18 The DCDM includes policy guidance on arrangements to assure compliance with T&E policy and to resolve non-compliance. Also, Defence equipment acquisition personnel are required by Defence’s technical regulatory management instructions and manuals to conduct T&E and/or examine T&E reports. This activity enables these personnel to certify that the products they are accepting on behalf of Defence comply with specified standards and are safe and technically fit for service in their intended role.

2.19 T&E policy and procedural compliance arrangements are described below, along with examples of these arrangements in practice at key phases in the Defence capability life cycle, drawn from the case studies included in subsequent chapters in this report.

Preview test and evaluation policy compliance arrangements

2.20 The Director General T&E31 is responsible for determining compliance with preview T&E policy as each Defence Capability Plan submission progresses through its First and Second Pass approval processes. ADTEO is responsible for reporting to the Chief of CDG any non-conformance with preview T&E policy.

Case study 1. Example of preview test and evaluation: Land 121 Phase 3B project

In 2009, ADTEO implemented preview T&E in the LAND 121 Phase 3B project to ensure that prior to acquisition contracts being signed, the vehicles selected for Army complied with specified standards and were technically fit for service in their intended role. This followed the project’s failure in 2007 to conduct adequate T&E during the project’s initial tender process. This failure resulted in Defence abandoning acquisition contract negotiations and undertaking a tender resubmission process. Preview T&E conducted in 2009 enabled Defence to identify those vehicles suitable for inclusion in the tender resubmission process.A

Note A: The Land 121 Phase 3B acquisition is examined in detail in ANAO Report No.52 2014–15, Australian Defence Force’s Medium and Heavy Vehicle Fleet Replacement (Land 121 Phase 3B), June 2015.

Developmental and acceptance test and evaluation policy compliance arrangements

2.21 Managing compliance with developmental and acceptance T&E policy is a shared responsibility between the acquisition authority and the applicable Service T&E agency. A project’s T&E Planning Group is required to report to the associated Project Management Stakeholder Groups and Capability Manager Stakeholder Groups if a T&E program is failing to adequately ensure:

  1. T&E managers and key T&E staff assigned to the acquisition are competent in their respective roles;
  2. Defence T&E staff assigned to contractors or foreign military programs appropriately report T&E results;
  3. contractors report T&E results and progress;
  4. T&E working groups are effective;
  5. stakeholders, including affected T&E agencies, remain involved in their relevant aspect of the T&E program; or
  6. T&E Master Plans remain current and suitable.

2.22 If T&E policy non-compliance is not resolved by the appropriate T&E Planning Group, Project Management Stakeholder Group or the Capability Manager Stakeholder Group (once notified), the matter is to be referred to ADTEO, which is then required to investigate and, if necessary, raise the issue at the project’s next Gate Review Board or to the Defence Capability and Investment Committee. This process aligns with the standard tiered approach to issues resolution, whereby resolution is attempted at the lowest level practicable, with unresolved issues elevated to higher organisational levels if they are not resolved promptly.

Case study 2. Example of issues resolution processes in operation: Landing Helicopter Dock project

In August 2012, the JP2048 Phase 4A/B Landing Helicopter Dock Project’s (LHD Project) Performance Gate Review Board was made aware that the LHD Project was proactively addressing workforce pressures as a result of 17 vacant positions, including eight considered critical to the completion of test and verification activities.

By the time of the August 2013 Performance Gate Review Board, two T&E positions remained vacant, and this risk was being mitigated through temporary transfers of ADTEO personnel to the LHD Project Office.

In this example, the T&E policy compliance mechanism functioned in identifying issues for resolution, and then acted to resolve these issues, including by temporarily transferring ADTEO personnel into the LHD Project Office.

Operational test and evaluation policy compliance arrangements

2.23 Each Capability Manager has operational T&E policy and processes that are to comply with policy set out in the DCDM. The approach taken varies according to the Service to which a project relates. Aerospace and maritime projects use internal test readiness and compliance checking functions provided by each Service32, while land projects rely on ADTEO to provide this function. However, there can be variability in the extent to which Defence’s arrangements to assure compliance with T&E policy are translated into practice. An example of such variability was the processes used to release the first LHD, HMAS Canberra, and the MRH90 helicopters into operational T&E.

Case study 3. Example of variability: Amphibious capability seaworthiness and airworthiness board reviews

LHD acceptance was governed by the Navy Regulatory Framework, which involves the Royal Australian Navy Test, Evaluation and Acceptance Authority, and Navy’s technical, operational and safety management system regulators reviewing the ship’s report of material state and Safety Case Report, and making recommendations concerning the LHD’s Initial Operational Release into operational T&E. On the basis of these recommendations, the ship received approval to operate within limitations set by the Chief of Navy. As at October 2015, the first LHD, HMAS Canberra, had not been the subject of Navy’s Seaworthiness Board reviews, the first of which is planned for the fourth quarter of 2016.

The MRH90 helicopters, which are to operate from the LHDs, have been introduced into service within a regulatory framework subject to annual review by an independent Airworthiness Board. The Airworthiness Board conducted six Special Flight Permit reviews and three Airworthiness Board reviews for the MRH90. In April 2013 the MRH90 was awarded an Australian Military Type Certificate.A

In both instances, operational T&E is continuing within operational limitations set by the Chief of Navy and the Chief of Air Force.

Note A: See ANAO Audit Report No.52, 2013–14, Multi-Role Helicopter Program, June 2014, pp.170–171.

2.24 The variability highlighted above illustrates that this is an area requiring ongoing management focus if the intent behind the T&E policy is to be assured. As the example below illustrates, there are existing mechanisms that could, if implemented well, provide periodic assurance to Defence that acquisition project offices have adhered to T&E policy.

Case study 4. Example of Defence’s technical regulation instructions and management manuals

In addition to the DCDM policy requirements, Defence acquisition and sustainment project offices are subject to technical regulatory instructions and associated management manuals. Under this arrangement, organisations are granted Engineering Authority once they are recognised by the Technical Regulatory Authorities as being competent in assessing the design, manufacture or maintenance of Defence materiel. Defence advised the ANAO that all post-Second Pass approved project offices had been granted Engineering Authority status.

Technical Regulatory Authorities may audit materiel to ensure that it is properly certified. These compliance assurance audits review evidence recorded in support of design certificates and confirm that recognised organisations are employing sound processes in their approved engineering and quality management systems. The audits may also examine evidence of policy and process compliance, report on areas of non-compliance and issue ‘Corrective Action Requests’. If non-compliances are not resolved, the regulatory authority may make a recommendation regarding the suitability of the organisation to retain its Engineering Authority.

While this process nominally provides assurance to Defence that project offices have been assessed as compliant with T&E policy, Defence advised that, to date, no Corrective Action Requests relating to the conduct of T&E by Engineering Authorities had ever been issued.A This outcome would imply that no substantive deficiencies have ever been found in any audits conducted. The ANAO notes that this positive outcome is hard to reconcile with the significant technical, operational and safety management system deficiencies identified in other high-profile acquisition projects previously audited by the ANAO, and the shortcomings identified by Defence’s regulators in the LHD Project’s report of materiel state (known as the TI 338 report) and Safety Case Report (see Chapter 4).

Note A: Defence was unable to advise the total number of audits conducted. However, Army and Navy advised that, on average around 8-10 audits and 7-8 audits respectively are conducted annually, while Air Force advised that they have conducted an average of 22 audits of Defence’s AEOs per year over the last five years.

Does Defence have an adequate framework for managing test and evaluation personnel training and competency requirements?

Defence has made slow progress in implementing the 2012 Senate Inquiry recommendations relating to T&E personnel competency and training requirements. No whole-of-Defence T&E personnel competency and training needs analysis has been conducted and T&E personnel training and competency requirements management vary significantly between the armed Services and DMO (now CASG).

2.25 Defence’s approach to the training of T&E personnel has been the subject of ongoing concern. In 2001, the ANAO found that Defence’s approach to providing T&E training was decentralised and ad hoc, and not well linked in terms of coordination or information sharing.33 Defence’s 2008 Roadmap also identified substantial ongoing deficiencies in the area of T&E training, and emphasised ADTEO’s role in overseeing T&E training across the entity. The 2012 Senate Inquiry was sufficiently concerned about the issue to suggest that each Capability Manager should ensure adequate skilled resources are available to oversee all T&E activity in line with central policy, as part of all acquisitions, including off-the-shelf equipment, and as part of the Capability Managers’ total responsibility for procurement, both prior to and after Second Pass.34 The committee also recommended that Defence build on the capability already extant in aerospace to identify training and experience requirements for operators and engineers in the land and maritime domains.35 The ANAO assessed Defence’s progress in addressing these recommendations.

Application of policy requirements across Defence

2.26 Under the DCDM, each Defence T&E organisation is responsible for establishing the minimum competencies required of Defence personnel employed in planning, managing or conducting T&E for their respective organisation. While ADTEO is responsible for ensuring that these competencies are specified in the DCDM, it does not make any assessment as to the suitability of specified competencies.

2.27 As at June 2015, 11 of Defence’s 12 T&E organisations (identified in Figure 2.1) had specified the T&E competencies for their personnel and included these in the DCDM; the Intelligence and Security Group had not. The ANAO also observed substantial variation in the T&E competency frameworks across these organisations. For example, the ADTEO competency framework provides a breakdown of required experience and training by position, whereas the Land Engineering Agency framework states more broadly the requirement for personnel conducting system integration and verification to have undertaken training in Systems Engineering.

Assessing and monitoring competency needs and training gaps

2.28 At the time of this audit, no comprehensive, whole-of-Defence T&E personnel competency and training needs analysis had been conducted. An analysis of competency levels (qualifications, training and experience) would assist Defence to improve its T&E personnel training arrangements by providing an enterprise-wide view of the competencies available, employed and required by each armed Service and CASG project offices.

2.29 In discussions with the ANAO, Defence personnel advised that in practice, the tracking and monitoring of T&E qualifications and experience across the entity remains highly decentralised and variable.36 The ANAO sought to draw together the available data on the number of Defence personnel with formal T&E training, the number of positions within each Service requiring T&E qualified personnel, and the number of qualified personnel filling each of those positions. This data was not readily available and there were substantial limitations on the reliability of data that could be provided by Defence. In August 2015, CASG informed the ANAO that:

whilst it is true Defence currently does not hold this workforce data at the enterprise level, it needs to be highlighted that for any business unit operating under technical regulator authority or being NATA [National Association of Testing Authorities] accredited they would hold this workforce data locally.

2.30 Within the limitations of the available data, the ANAO has observed that Defence has made extensive investments in T&E training, with more than 1000 qualified T&E personnel identified across Defence. However, in the absence of an enterprise-wide understanding of the existing T&E workforce and an analysis of training and competency needs, the data does not provide Defence management with the information required to assess whether the number of trained personnel and their competency levels meets the entity’s needs.37

Test and evaluation personnel competency development within the armed Services and the Capability Acquisition and Sustainment Group

2.31 The 2012 Senate Inquiry recommended that Defence build on the capability already extant in aerospace to identify training and experience requirements for operators and engineers in the land and maritime domains. The inquiry noted that Capability Managers will need to invest in a comparable level of training to enable their personnel to conduct (or at least participate in) developmental testing. The committee stated that the intention of this recommendation was ‘to provide a base of expertise from which Defence can draw on as a smart customer during the [F]irst [P]ass stage and to assist in the acceptance testing of capability.’38

2.32 Implementation of the 2012 Senate Inquiry recommendations varies between the Services and CASG. In September 2012, the Chief of the Defence Force directed the Chief of Navy and Chief of Army to review the management and independence of T&E specialists assigned to Maritime and Land projects in order to determine the best method for developing skills and managing career progression. Progress was slow, and in January 2014, the Chief of the Defence Force agreed to grant the Army and Navy a further 12 months to complete their reviews. In November 2014, the Chiefs of Service Committee determined that Army and Navy’s responses—discussed below—were appropriate and acceptable for implementation.

Navy T&E personnel

2.33 As reported above, the 2012 Senate Inquiry commented that Capability Managers will need to invest in a comparable level of training to the Air Force to enable their personnel to conduct (or at least participate in) developmental testing, as occurs in Aerospace projects. However, at present, the Navy’s T&E authorities, the Royal Australian Navy Test, Evaluation and Acceptance Authority, and the Aircraft Maintenance and Flight Trials Unit are predominantly responsible for operational T&E. This limits their ability to conduct (or at least participate in) developmental T&E as envisaged by the Committee, and hence assist in the acceptance testing of Navy capability. In August 2015, Defence advised that little (if any) developmental T&E is conducted within maritime projects, as Defence’s general approach is to procure equipment with a mature design and a high technology readiness level.39,40

2.34 In July 2014, the Deputy Chief of Navy decided that Navy would not establish a specific T&E specialisation, but requested that Head Navy Engineering, Head Navy Capability, and Commodore Warfare41 identify and report the T&E skills necessary for personnel employed in their areas of responsibility. Head Navy Engineering reported that, while he had no defined requirements for T&E specialists:

The lack of a clear and agreed T&E operating model between Navy, CDG, and the DMO in regards to Developmental Testing and Acceptance Testing inhibits the identification of associated training requirements for the Navy Engineering Division, Navy technicians, and Navy Engineer Officers.

2.35 Head Navy Engineering also noted that under the career continuums developed through the Navy’s Rizzo Reform program42, all Navy technicians, Engineer Officers and Naval Engineers will necessarily be exposed to T&E activities, and that a fundamental knowledge of T&E was a requirement at senior ranks. In June 2015, Navy advised the ANAO that Naval Engineer Officers complete an intensive T&E module as part of their training, and a specialist operational T&E course is provided to Navy personnel if they are posted to specified operational T&E positions.

2.36 The improvements to Navy’s management of T&E personnel training and competency requirements described above have helped strengthen Navy’s understanding of its T&E workforce. Nevertheless, further work remains for Navy with respect to completing reforms arising from the Rizzo Reform program and the 2012 Senate Inquiry recommendation, such as the formal incorporation of T&E education into the career continuums of Navy civilian engineers and technical sailors.

Army T&E personnel

2.37 In September 2014, Army established a management framework for a newly formed T&E sub-specialisation. The framework includes annual checks of T&E positions and competencies, with the first of these checks conducted on 19 March 2015. At this inaugural meeting, Army remained uncertain about the required makeup of its T&E workforce: it had yet to determine whether T&E experience was essential or desirable for the T&E positions it had identified, or how many such positions were required. A significant body of work remains for Army to complete in order to give effect to the 2012 Senate Inquiry recommendation.

Air Force T&E personnel

2.38 T&E personnel training and competency management within the Air Force is generally regarded as more mature than in the Navy and Army, reflecting the requirements to demonstrate airworthiness and to meet technical regulatory requirements.43 In late 2012, the Air Force introduced a framework for managing T&E personnel following a review of flight test aircrew management.

Capability Acquisition and Sustainment Group T&E personnel

2.39 T&E tasks undertaken by DMO (now CASG) project offices vary considerably. Some project offices manage complex developmental T&E with Defence as the prime system integrator, while others manage acceptance T&E of mature products that have already undergone acceptance T&E by another jurisdiction. As a result, each project office has specific personnel competency requirements, informed by the relevant Acquisition Categories described below, and depends on DMO (now CASG) meeting its needs on a project-by-project basis.

Box 1: Defence Acquisition Categories

Acquisition Category I projects are Defence’s most strategically significant major capital equipment acquisitions. They are characterised by very high project and schedule management complexity and very high levels of technical difficulty, operating, support and commercial arrangements.

Acquisition Category II projects are major capital equipment acquisitions that are strategically significant to Defence. They are normally characterised by high levels of complexity, such as project and schedule management complexity, and high levels of technical difficulty, operating, support and commercial arrangements.

Acquisition Category III projects are major or minor capital equipment acquisitions that have a moderate strategic significance to Defence. They are normally characterised by moderate levels of complexity, such as project and schedule management complexity, and moderate levels of technical difficulty, operating, support and commercial arrangements.

Acquisition Category IV projects are major or minor capital equipment acquisitions that have a lower level of strategic significance to Defence. They are normally characterised by low levels of complexity, such as project and schedule management complexity, and low levels of technical difficulty, operating, support and commercial arrangements.

2.40 The DCDM notes that Acquisition Category I and Acquisition Category II project offices should be resourced with substantially better T&E management competency than the prescribed minimum. At the time of the audit, DMO had not reported any non-compliance with the qualification and experience requirements set out in DMO T&E procedures. Nevertheless, an audit of DMO’s engineering and technical workforce management conducted in the first half of 2014 by a management consultancy firm found numerous deficiencies in DMO’s approach. The audit noted that DMO needed to attract, recruit and retain those engineers and technically skilled individuals who are best qualified to deliver DMO and government requirements, but found that DMO’s workforce management system was neither capable nor robust enough to deliver against these goals.

2.41 The audit recommended that DMO advise the Defence Learning Branch to conduct a skills audit of the DMO Materiel Engineering and Technical workforce and associated positions. Subsequently, in March 2015, DMO completed a whole-of-workforce Skills and Training Census in order to improve understanding of its personnel training requirements.44 As CASG (and the then DMO) does not have a distinct job family for T&E45 it is difficult to draw any direct conclusions about the health of the CASG T&E workforce from the results of the survey. However, the survey did include two questions that can provide some insight, as shown in Table 2.1 below.

Table 2.1: DMO Skills and Training Census

Question 1: As a manager, managing and conducting Verification and Validation activitiesA inclusive of supplier quality management and calibration is a critical skill for my work area. (Response: 76 per cent)

  • Good depth of skills in team?B (85 per cent)
  • Robust mitigation strategy already in place?B (38 per cent)

Question 2: Managing and conducting Verification and Validation activities inclusive of supplier quality management and calibration is a critical skill for my current or previous role. (Response: 66 per cent)

  • Substantial experience (in current or previous role).C (78 per cent)
  • Formal qualifications/training (in current or previous role).C (74 per cent)

Note A: Verification and validation is, broadly speaking, the term used by DMO for the activities reported throughout this report as test and evaluation.

Note B: Ratings of depth of skill and mitigation strategies are based on managers who rated the skill as critical to their work area. These results are linked to their parent question.

Note C: Ratings of substantial experience and formal qualifications/training are based on all materiel engineering and technical workforce respondents. These results are linked to their parent question.

Source: ANAO analysis, DMO 2015 Skills Census results.

2.42 Table 2.1 indicates that managers generally believe that their team has a good depth of skill for managing and conducting verification and validation activities, but were less confident that robust mitigation strategies were in place to manage skills shortages. Of those personnel indicating that verification and validation was a critical skill for their current or previous role, some three quarters indicated that they had substantial experience, formal qualifications/training or both.

2.43 These results may provide some limited assurance to Defence that, at an entity level, CASG personnel conducting verification and validation activities have the experience and qualifications/training required for their role. However, there remains scope for Defence to gain a greater appreciation of the skills and competency of its personnel at an individual level in order to facilitate the more effective and efficient application of these skills across Defence as a whole.

Joint projects T&E personnel

2.44 In May 2007, the then Chief of the Defence Force set out a vision for a revised joint force operations concept. This concept was developed further in 2011, and requires the individual Services and their enabling Groups to acquire capabilities that can be integrated to provide joint warfighting functions. These functions include: force application; force deployment; force protection; force generation and sustainment; command and control; and information dominance. The 2012 Defence Capability Plan lists 48 projects requiring varying amounts of joint T&E.

2.45 Defence relies on joint operational T&E to determine how well joint force capability development is progressing. As shown in Figure 2.1, Air Force and Navy have established operational T&E agencies and Army has adopted a centralised model, embedding a large proportion of its operational T&E staff in ADTEO. This structure leaves ADTEO responsible for the remainder of the ‘joint’ projects and for projects where the allocated Capability Manager has no recognised operational T&E capability.46

2.46 ADTEO has three personnel dedicated to joint T&E, and at the time of this audit they were assisted by two contractors. This team is required to develop ADTEO’s joint T&E capability and skills base, and to plan and coordinate the conduct of joint T&E for new projects. At the time of this audit, over 20 joint projects were reporting that their operational T&E from previous phases was ‘deficient, disjointed or incomplete’, and so were relying on future phases to remediate past operational T&E gaps and shortfalls.

2.47 Defence recognised in 2012 that ADTEO lacked the personnel, infrastructure and test resources required to implement joint operational T&E at the level required by the joint force operational concept. In 2012, it was assessed as highly unlikely that these deficiencies could be remedied from within Service and/or Australian Public Service personnel resources, given the volume and diversity of capability requiring joint operational T&E from 2013 to 2018. Given current resourcing pressures and levels of demand, this assessment has continued relevance, and resourcing issues require ongoing management focus to ensure the successful implementation of joint T&E.

Recommendation No.1

2.48 To strengthen the enterprise-level management of the T&E workforce, the ANAO recommends that Defence:

  1. identifies the training and competencies of the existing Defence T&E workforce;
  2. conducts a T&E personnel competency and training needs analysis for the whole entity; and
  3. monitors the availability of sufficient, appropriately trained T&E personnel in specific competency areas and takes steps to address any gaps identified.

Entity response: Agreed.

2.49 Noting proposed name change from ‘training needs analysis’ to ‘performance needs analysis’ in alignment with the systems approach to Defence learning and the Defence learning manual.

2.50 Defence acknowledges that decentralised T&E organisations are a governance challenge to achieve consistent use of T&E in support of all Defence acquisition projects. Defence agrees with the ANAO that competency of the T&E workforce is key to achieving more consistent use of T&E. Defence agrees to undertake a whole-of-Defence performance-needs-analysis for T&E competencies. Defence’s recent issue of a centralised policy for T&E is a first step that will be followed by better T&E governance to attain compliance and an assured T&E workforce.

3. Managing preview test and evaluation

Areas examined

This chapter examines Defence’s management of preview T&E for major off-the-shelf equipment acquisitions.

Conclusion

Defence’s recently released Defence Capability Development Manual reflects the 2012 Senate Inquiry recommendation that Defence mandate a default position of engaging specialist T&E personnel pre-First Pass. When implemented well, Defence preview T&E has mitigated acquisition risks.

Introduction

3.1 As discussed in Chapter 1, preview T&E is used to assist the analysis and refinement of major equipment acquisition options presented to government at First and Second Pass approval.

3.2 Defence has long recognised the risk reduction value of T&E conducted early in the acquisition lifecycle. As noted in the 2008 Defence Test and Evaluation Roadmap:

… the cost of correcting defects increases significantly the later in the life cycle that they are identified because of increasing system and sub-system complexity and integration. Because defects found earlier in the life cycle will cost significantly less to resolve than those identified later in the development process, T&E should be applied early in the system life cycle to help control system development costs.47

3.3 Defence has represented these escalating costs in its ‘Life Cycle Cost Escalation Model’ (see Figure 3.1). The figure shows that it may be extremely costly to fix requirements, design or construction defects found during a project’s operational T&E phase. The intent should be to detect and correct defects as early as possible while there remain sufficient financial and schedule resources to achieve the project’s approved outcomes.

Figure 3.1: Life Cycle Cost Escalation Model

Source: Department of Defence, Defence Test and Evaluation Roadmap, Defence, Canberra, 2008, p. 19.

3.4 The further a system’s development has satisfactorily progressed, in T&E terms, from initial concept into operations, the lower the risk of costly defects. It is therefore critical that acquisition options are thoroughly tested and evaluated to determine their risk profile in terms of operational concepts, system design, manufacturing and production, and operations. Defence has previously ranked the risk associated with the various acquisition options as follows:

a. modifying extant platforms or combat systems which can be a lower risk, quicker and less costly option;

b. acquiring off‐the‐shelf (OTS) items, either military off‐the‐shelf (MOTS) or commercial off‐the‐shelf (COTS), without modification and accepting the trade‐offs necessary if they do not fully meet the requirement;

c. acquiring and modifying OTS items, either MOTS or COTS, recognising that this can be a high‐risk option;

d. integrating existing systems, military or commercial, which can again be a high‐risk option; or

e. pursuing new designs, which is the highest risk option.48

3.5 Consistent with Defence’s risk ranking, preview T&E will be particularly important in the case of OTS equipment acquisitions, which are a recognised mechanism for Defence to reduce equipment cost, schedule and performance risks. As indicated, OTS acquisitions may be categorised as MOTS equipment or COTS equipment. In both instances these products have already been delivered to another military, government body or commercial firm and hence are expected to have completed wide-ranging T&E.

Case study 5. Example of Defence’s acquisition of off-the-shelf equipment

Defence has increasingly pursued OTS acquisitions in recent years. Defence sources its OTS equipment predominantly from the US through the US Government’s Foreign Military Sales program. The program manages government-to-government agreements for the sale of US defence articles and services, as authorised by the US Arms Export Control Act. The program operates on a ‘no profit/no loss’ basis, and must be funded in advance by the customer. Figure 3.2 (below) shows the then-yearA Australian Dollar value of Foreign Military Sales contracts from 2000 to 2015.B These expenditures totalled US$19.523 billion. Australia’s purchases in 2011 included two C-17 Globemaster III aircraft and associated equipment, parts, training and logistical support; AIM-120C-7 Air-to-Air missiles for the Air Force’s F/A-18E/F Super Hornet fleet; and through-life-support for Navy’s 24 MH-60R Seahawk helicopters. Australia’s purchases in 2012 included MH-60R Seahawk helicopter mission avionics systems, cockpits and support elements; and 10 C-27J Spartan aircraft and associated equipment, long lead spare parts and logistical support.

Note A: Then-year prices are based on the cost of labour and materials and currency exchange rates at the time the expenditure occurred.

Note B: A 3.8 per cent Foreign Military Sales Administrative Surcharge is applicable to all purchases made through the US Foreign Military Sales program. Other additional Foreign Military Sales fees include a Contract Administration Services Surcharge of 1.5 per cent, and a Nonrecurring Cost fee for pro rata recovery of development costs. The amount of cost recovery is decided during negotiation of a Foreign Military Sales case and may be waived.
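As an illustration of how the percentage-based fees in Note B compound, the sketch below applies the 3.8 per cent Administrative Surcharge and the 1.5 per cent Contract Administration Services Surcharge to a hypothetical US$100 million purchase. The purchase value is an assumption chosen for illustration, and the case-specific Nonrecurring Cost fee is omitted:

```python
# Illustrative Foreign Military Sales fee calculation.
# Surcharge rates are those quoted in Note B; the purchase value is hypothetical,
# and the negotiable Nonrecurring Cost fee is not modelled.
ADMIN_SURCHARGE = 0.038  # FMS Administrative Surcharge
CAS_SURCHARGE = 0.015    # Contract Administration Services Surcharge

def fms_total(purchase_usd: float) -> float:
    """Return the purchase cost plus the two percentage-based FMS surcharges."""
    return purchase_usd * (1 + ADMIN_SURCHARGE + CAS_SURCHARGE)

print(fms_total(100_000_000))  # hypothetical US$100 million case
```

On these assumptions, a US$100 million purchase attracts US$5.3 million in percentage-based surcharges before any Nonrecurring Cost recovery is negotiated.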

Figure 3.2: US Foreign Military Sales to Australia, 2000–01 to 2014–15

Source: Department of Defence.

3.6 The 2012 Senate Committee inquiry considered the issue of T&E in the context of OTS acquisitions at length49, and made a recommendation with regard to the use of preview T&E.50

3.7 In this chapter, the ANAO examines Defence’s:

  • progress in implementing the 2012 Senate Inquiry recommendations; and
  • approach to preview T&E for three OTS equipment acquisitions.

Has Defence fully implemented the 2012 Senate Committee’s recommendation to mandate preview test and evaluation?

Defence has implemented the 2012 Senate Inquiry recommendation that it mandate a default position of engaging specialist T&E personnel prior to seeking First Pass government approval for acquisition options to proceed into detailed analysis.

When implemented well, Defence preview T&E has mitigated acquisition risks, particularly with respect to OTS equipment acquisitions. OTS defence equipment acquisitions are not risk free and acquisition risks still need to be managed through the conduct of preview T&E and operational T&E.

The mixed experience with the MRH90 helicopter acquisition and other recent OTS acquisitions, such as Land 121 (medium and heavy vehicle fleet) and Land 125 (F88 Steyr rifle), underscores the importance of OTS equipment being subjected to T&E sufficient to allow the mitigation of cost, schedule and capability risks.

3.8 The 2012 Senate Inquiry expressed concern about Defence projects experiencing ‘very serious problems’, and attributed the cause in part to risks not being managed properly.51 The Senate Inquiry recommended that Defence:

… mandate a default position of engaging specialist T&E personnel pre-First Pass, during the project and on acceptance in order to stay abreast of potential or realised risk and subsequent management. This requirement was also to apply to military off-the-shelf/commercial off-the-shelf (MOTS/COTS) acquisition.52

3.9 To give effect to this recommendation, Defence has mandated the conduct of preview T&E and relies on its technical regulatory systems to ensure adequate T&E is performed throughout the Defence capability life cycle (see Figure 1.1). The DCDM makes the Director General T&E responsible for determining compliance with preview T&E policy, as each Defence Capability Plan submission progresses through its First and Second Pass approval processes. ADTEO is responsible for reporting to the Chief of CDG any non-conformance with preview T&E policy.

3.10 From the overall capability life cycle perspective, Defence’s technical regulations require equipment suppliers to conduct sufficient T&E to enable their products to be certified as complying with specified standards and as technically fit for service in their intended role. Design acceptance organisations, such as DMO (now CASG), are required to certify that they have validated the design by proving, through examination of designers’ claims and supporting evidence, that the specified end-use of a product or system has been accomplished in its intended environment. Technical Regulatory Authorities may conduct compliance assurance audits that review the evidence supporting design certification and confirm that recognised organisations were employing sound processes within their engineering and quality systems.

Test and evaluation for off-the-shelf equipment

3.11 As noted, OTS acquisitions are products that have already been delivered to another military, government body or commercial firm. Consequently, OTS acquisitions may, in systems engineering terms, progress directly from the Requirements Definition Phase to System Acceptance as developmental and acceptance T&E will have already been completed by the originating source. However, acquisition risks and issues still need to be managed through the conduct of preview T&E and operational T&E to validate an acquisition’s ability to meet Defence requirements and its ability to integrate with Defence’s Fundamental Inputs to Capability.53 Technical and reliability deficiencies experienced by OTS equipment have required Defence to make ad hoc changes to equipment designs, and to adjust operational tactics, techniques and procedures. This has resulted in significant delays in achieving required Final Operational Capability.

3.12 ADTEO has reported that recent acquisitions of OTS equipment, including some deployed directly into military operations, experienced significant unanticipated operational limitations. That outcome may stem from OTS items being specifically designed for their country of origin with design features that rely on a different set of Fundamental Inputs to Capability. These differences have affected the equipment’s ability to satisfy Defence’s operational needs.

MRH90 Helicopter

3.13 An example of such equipment recently audited by the ANAO is the $3.746 billion (March 2015) acquisition of the MRH90 helicopters.54 Prior to Defence receiving First and Second Pass approval for the acquisition of the MRH90 helicopters in August 2004 and April 2006 respectively, no preview T&E covering the MRH90 was conducted as the aircraft was still under development. The first MRH90 completed System Acceptance and entered service in December 2007. As at June 2015, the MRH90 was still undergoing developmental and operational T&E (see paragraphs 4.55 to 4.61 and 4.66). This ongoing T&E is due to the slow resolution of design and sustainment issues discovered post-System Acceptance, which have required Defence to adjust operational tactics, techniques and procedures. Defence records indicate that by March 2015, 33 MRH90 aircraft had been accepted; Initial Operational Capability for both Army and Navy had been achieved; and the MRH90 had been used in a humanitarian assistance and disaster relief role during Operation Pacific Assist in Vanuatu, following Cyclone Pam in March 2015. Defence advised that MRH90 Final Operational Capability will be delayed by some 60 months from the original schedule.

Army field vehicles

3.14 Similarly, the acquisition of field vehicles and trailers for the Army through the $3.387 billion Land 121 Phase 3B project55 is illustrative of the benefits of conducting preview T&E. The initial process to select a supplier in 2007 did not include adequate T&E. Subsequently, it was assessed that the vehicles initially selected did not meet the requirements set by Defence, and Defence was required to abandon contract negotiations and undertake a tender resubmission process. This second tender process in 2008 involved preview T&E, which provided assurance that the vehicle options Defence submitted to government in July 2013 for Second Pass approval met the minimum capability requirements.

F88 Steyr Rifle

3.15 A more recent example of the benefits of timely T&E relates to improvements Defence is seeking to make to its F88 Steyr combat rifles. In August 2005, the Land 125 Phase 3C project received First Pass approval to proceed with detailed analysis of suitable options to improve the F88 design in terms of reliability, ergonomics, weapon balance, reduced weapon signature, reduced platform mass and interoperability with NATO ammunition.56 Between September 2014 and March 2015, ADTEO in conjunction with personnel from the responsible Project Office and Army’s 3rd Brigade, conducted preview T&E for 24 OTS options to improve the F88 rifle. The T&E involved operational assessments based on user evaluations and preferences obtained through the conduct of operational scenarios. The aim was to: confirm the most operationally suitable F88 configuration; and assist Defence in developing the project’s business case, capability definition documents and Second Pass approval submissions. The trials report identified the most operationally suitable improvements to seven areas of the F88 design.

Key lesson

3.16 The key lesson from the MRH90 acquisition and other recent OTS acquisitions, such as Land 121 and Land 125, is that OTS equipment should be subjected to T&E sufficient to allow the mitigation of cost, schedule and capability risks, including its ability to interface with the Fundamental Inputs to Capability. To that end, Defence’s T&E policy set out in the DCDM specifies that OTS systems are to undergo preview T&E in order to report, as a minimum: the configuration of the equipment under test; the operational environment; the operational scenarios used for testing; recommendations for the Australian Operational Concept Document; recommended Australian modifications; and the recommended test concepts for inclusion in First and Second Pass approval submissions.

4. Managing test and evaluation for Defence’s amphibious capability acquisition

Areas examined

This chapter examines Defence’s development, acceptance and operational T&E for the evolving amphibious deployment and sustainment capability, which is based predominantly on Defence’s newly acquired Landing Helicopter Dock (LHD) ships.

Conclusion

Key management decisions concerning the first LHD, HMAS Canberra, were informed by Defence’s T&E, which identified numerous defects and deficiencies for resolution. Defence decided, on balance, to accept HMAS Canberra on the understanding that the deficiencies would be addressed during the ship’s operational phase. In doing so, the Chief of Navy accepted greater risks than would have been the case had System Acceptance been based on more complete objective quality evidence of compliance with contracted specifications, and had Initial Materiel Release been based on less qualified findings by Defence’s regulators concerning compliance with technical, operational and safety management system requirements.

T&E for HMAS Canberra and the MRH90 helicopters to be embarked upon it is an extensive, multi-year process and remains ongoing. Defence advised the ANAO that Final Operational Capability for the ship will be achieved in the fourth quarter of 2017, some 12 months later than originally scheduled.A Final Operational Capability for the helicopters is scheduled for July 2019, five years later than initially planned.

Areas for improvement

The ANAO has made a recommendation aimed at reducing risk in the transition of capability from the acquisition phase to operations.

Note A: Defence advised the ANAO in October 2015 that the additional operational T&E opportunities provided by participation of the amphibious capability in Talisman Sabre 2017 had led Defence to defer final operational capability from mid-2017 to the fourth quarter of 2017.

Introduction

4.1 T&E for complex materiel acquisitions, such as the LHDs and MRH90 helicopters discussed in this chapter, is an extensive, multi-year process.

4.2 As outlined in Chapter 1, developmental and acceptance T&E processes are used by Defence to provide evidence regarding an acquisition project’s progress toward achieving contractually specified function and performance requirements, and suitability for Initial Materiel Release. Operational T&E processes are used by the Services to provide Capability Managers with evidence that an acquired system has been proven as effective and suitable for the intended role, and that in all respects it is ready for operational service. Operational T&E may also provide Capability Managers with the information needed to manage risks and issues resulting from capability deficiencies.

4.3 For over a decade, ANAO audits of major Defence acquisitions57 have highlighted the importance of developmental, acceptance and operational T&E to support the successful acquisition and integration of major new Defence capability. ANAO audits have underscored the importance of developmental and acceptance T&E being conducted, and completed, so that operational T&E does not become a ’voyage of discovery’ in the final stages of the project.58 This chapter examines the application of T&E to Australia’s evolving amphibious deployment and sustainment capability, specifically:

  • development and acceptance T&E for two Canberra Class Landing Helicopter Docks (LHDs)59; and
  • operational T&E for the two LHDs, 47 MRH90 helicopters60, and 12 LHD Landing Craft.61

4.4 The overall total approved acquisition budget for these elements of Defence’s amphibious capability is $7.072 billion (June 2015). Of that amount, $5.600 billion had been spent by June 2015.

4.5 Australia’s two LHDs are intended to provide Defence with an increased amphibious deployment and sustainment capability using a combination of helicopters and landing craft. They are also intended to significantly increase Australia’s capacity to respond to crises, including humanitarian assistance and disaster relief missions. In addition to a crew of 357, each LHD is designed to carry an embarked force of 1122 personnel available for operations ashore, helicopter operations, logistics, command and other support activities. The LHDs are expected to displace up to 27 830 tonnes each when fully loaded.

4.6 The first LHD, HMAS Canberra, commenced contractor sea trials on 3 March 2014, completed a second tranche of sea trials in August 2014 and was commissioned into Royal Australian Navy service on 28 November 2014. The second LHD, the Adelaide, commenced contractor sea trials in June 2015. The LHD acquisition contract contained 287 progress payment milestones to be paid to BAE Systems Australia, the prime contractor62, and of those 37 directly related to the satisfactory completion of T&E. By 30 June 2015, Defence had approved the payment, or partial payment, for 21 of these milestones and these payments totalled $130.763 million.

Figure 4.1: HMAS Canberra

Source: Department of Defence.

Were key decisions to accept Landing Helicopter Dock 1 and declare Initial Materiel Release adequately informed by completed test and evaluation?

Initial Materiel Release for LHD 1, HMAS Canberra, was declared in October 2014 with system acceptance test procedures ongoing and many test reports not submitted for Defence’s approval. LHD 1’s report of materiel state provided Navy with an assessment of remaining technical, operational and safety management system risks at the time of the ship’s release by DMO to Navy. Navy decided to accept the risks identified by the T&E, along with remediation plans, and the ship received Initial Operational Release into operational T&E in November 2014.

LHD System Acceptance

4.7 Sufficient developmental and acceptance T&E needs to be conducted to establish the ‘objective quality evidence’ required by Defence to grant System Acceptance.63 Systems on offer need to be assessed as safe and suitable for service, and satisfactorily meet the specified requirements including technical, operational and safety management system regulatory requirements. Acquisition contracts provide a key mechanism for specifying the conduct of T&E and the evidence required to support contractor claims that systems on offer comply with contracted specifications. They are also the key mechanism for specifying the conduct and evidence required to support technical, operational and safety management system regulatory compliance.

LHD 1 Test Program

4.8 BAE Systems Australia offered LHD 1 to Defence for System Acceptance on 29 September 2014. System Acceptance was declared by Defence on the following day, and an Acceptance Ceremony was held at Williamstown in October 2014.

4.9 The LHD acquisition contract required the completion of a number of maritime systems tests and trials activities prior to System Acceptance. Table 4.1 summarises the results of HMAS Canberra’s tests and trials program as at November 2014—one month after System Acceptance.

Table 4.1: HMAS Canberra’s test program results, November 2014

| Item | System Integration Tests | Harbour Acceptance Trials | Sea Acceptance Trials |
| --- | --- | --- | --- |
| Number of Test ProceduresA | 38 | 62 | 38 |
| Number of Test Procedures not commencedB | 0 | 2 | 1 |
| Number of Test Procedures conducted but Test Report not submitted | 0 | 12 | 11 |
| Number of Test Procedures ongoingC | 0 | 29 | 11 |
| Total number of Approved Test ReportsD | 38 | 19 | 15 |
| Percentage of Test Reports approved | 100 per cent | 31 per cent | 39 per cent |
| Percentage of requirements fully closedE | – | 72 per cent | 60 per cent |

Note A: All Acceptance Test Procedures are applicable to both LHD 1 and 2.

Note B: Of these three items, two are pending commencement upon finalisation of defect rectification, and one has been deferred due to testing asset availability.

Note C: Subsystem retesting or finalisation of Test Report is required following defect rectification.

Note D: Number of tests conducted in full and Test Reports approved for LHD 1.

Note E: Percentage of the requirements scheduled to be fully closed during the trials.

Source: Defence Materiel Organisation, LHD Project Office.

4.10 A number of trials had not been completed prior to the declaration of System Acceptance on 30 September 2014, including tests of:

  • the LHD stern door’s ability to act as a steel beach for amphibious vehicles and as a ramp for transferring vehicles in harbour, which was deferred until after System Acceptance64; and
  • the sewage treatment plant, the oily water separation system, and the ship’s communication, radar and combat management systems, which were not completed due to defects.65

4.11 In August 2015, Defence informed the ANAO that it expected those trials not completed at System Acceptance would be completed when the assets and fully trained and experienced personnel were available. In the case of the sewage treatment plant, Defence informed the ANAO that trials would be completed when it was practical to embark a suitable number of personnel.

4.12 At System Acceptance, only 31 per cent of Harbour Acceptance Trials Test Reports and 39 per cent of Sea Acceptance Trials Test Reports had been approved. By August 2015, nearly 12 months after System Acceptance, the percentage of approved Harbour and Sea Acceptance Trials Test Reports had risen to 38 per cent and 60 per cent respectively.66
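The approval percentages cited above can be reproduced from the counts in Table 4.1, and the four status categories in the table account for every test procedure. A short consistency check, using figures taken directly from the table, can be sketched as:

```python
# Cross-check of Table 4.1 (HMAS Canberra test program results, November 2014).
# Tuple fields: (procedures, not commenced, report not submitted, ongoing, approved)
trials = {
    "System Integration Tests": (38, 0, 0, 0, 38),
    "Harbour Acceptance Trials": (62, 2, 12, 29, 19),
    "Sea Acceptance Trials": (38, 1, 11, 11, 15),
}

for name, (total, not_started, no_report, ongoing, approved) in trials.items():
    # Every test procedure should fall into exactly one status category.
    assert not_started + no_report + ongoing + approved == total
    pct_approved = round(100 * approved / total)
    print(f"{name}: {pct_approved} per cent of Test Reports approved")
```

The computed figures match the table: 100 per cent for System Integration Tests, 31 per cent for Harbour Acceptance Trials (19 of 62) and 39 per cent for Sea Acceptance Trials (15 of 38).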

4.13 To obtain System Acceptance, BAE Systems Australia was required to complete and present a signed Supplies Acceptance Certificate to the LHD Project Manager and provide supporting evidence that the supplies complied with requirements of the contract, including confirmation of the successful completion of acceptance testing. The ANAO observed that LHD 1’s Supplies Acceptance Certificate included a listing of 451 defects and deficiencies, including those outstanding trials discussed above. Notwithstanding the defects and deficiencies reported, LHD 1 was accepted on 30 September 2014, and the LHD Project Manager signed the Supplies Acceptance Certificate on 3 October 2014. BAE Systems Australia advised the ANAO that, in its experience, complex warships:

… will typically be delivered and start to be used with a “punch list” of open items that the contractor is responsible to rectify post [system] acceptance. The items are known, documented and evaluated by the customer as minor enough that they can be reasonably resolved at a later date.

4.14 In August 2015, Defence informed the ANAO that listed defects and deficiencies are ‘tracked by the LHD Project Office against established rectification plans through to their close out’.

4.15 LHD 1 contractual requirements verification has consistently lagged behind forecasts throughout the life of the project, and has continued to do so after System Acceptance. Figure 4.2 shows LHD 1’s requirements verification progress, from the start of compliance verification in March 2011 to January 2015.

Figure 4.2: Timeliness of LHD 1’s requirements verification, March 2011 to January 2015

Source: ANAO adaptation, Defence Materiel Organisation, LHD Project Office.

4.16 Defence advised that the under-performance set out in Figure 4.2 was predominantly driven by the rate of delivery of documents by BAE Systems Australia for review. As demonstrated in Figure 4.2, in late 2014 BAE Systems Australia slowed delivery of compliance verification and assurance documents in order to focus on the conduct of testing and Test Report production.

Separation of duties

4.17 The ANAO reviewed Defence’s compliance with its policy in relation to the LHD 1 System Acceptance decision. Defence did not implement its policy of ensuring a separation of duties between the Project Charter owner (the Project Director/Manager) and the System Acceptance approving authority.

4.18 In June 2011, Defence agreed to an ANAO recommendation that the then DMO ensure that the delegate authorised to approve Systems Acceptance is an executive with seniority commensurate with the importance of the project, who is external to the Systems Program Office and who is designated in the Project Certification Plan. In agreeing to that recommendation, Defence stated that the over-riding principle is that there must be a separation of duties between the Project Charter owner (the Project Director/Manager) and the System Acceptance approving authority. Defence informed the ANAO that it would develop guidance regarding appropriate authorities, by position, within the DMO who would be delegated by the then CEO DMO to approve Systems Acceptance.

4.19 The DMO Project Management Manual was subsequently amended to include the following requirement:

… the Chair of the PMSG [Project Management Stakeholder Group] is to act as an independent authority, within DMO, who has been delegated the responsibility of approving the system acceptance for the purpose of transitioning the materiel system to the Capability Manager at the Initial and Final Materiel Release Milestones.

4.20 LHD 1 System Acceptance was exercised by the then LHD Project Manager, a Navy Captain67, rather than the Chair of the Project Management Stakeholder Group, a three-star equivalent officer.

4.21 In summary, at the time of System Acceptance, the T&E program for LHD 1 had identified key areas of risk to be addressed in later stages of the acquisition, but the documentary evidence base (in the form of completed Acceptance Trials Test Reports) had not been finalised, and a number of key system trials had not been completed. The decision to approve System Acceptance was also made contrary to policy requirements concerning the separation of duties between the Project Manager and the System Acceptance approving authority, and the required seniority of the approving delegate.

Recommendation No.2

4.22 To reduce risk and assist the transition of capability from the acquisition phase to operations, the ANAO recommends that prior to System Acceptance, Defence ensures that material deficiencies and defects are identified and documented, and plans for their remediation established.

Entity response: Agreed.

4.23 This recommendation implies all System Acceptance activities must be complete before Acceptance. There will be occasions when outstanding Acceptance testing may be risk managed by the Capability Manager to maintain Capability schedule. This should be an agreed, documented outcome between the Capability Manager and the Acquisition Agency.

4.24 Defence has robust processes for determining with its acquisition contractors any deficiencies and associated deviations or waivers for new capabilities; however, Defence agrees the quality of those processes depends critically on the cooperative T&E effort of all stakeholders.

4.25 Defence’s published T&E policy will be applied consistently to ensure that cooperative T&E planning and conduct by the Services and its acquisition contractors is integral to all acquisition contracts, and that these T&E plans are updated regularly thereafter through cooperative and documented regular T&E planning groups and workshops.

LHD Initial Materiel Release

4.26 Initial Materiel Release is the major acquisition milestone immediately following System Acceptance. It involves the acquisition authority (then DMO, now CASG) declaring that the materiel state of a system is such that it may transition from its construction phase to its in-use phase. When the Capability Manager, in this case the Chief of Navy, agrees to that transition, the system is considered for Initial Operational Release. Initial Operational Release is approved by the Capability Manager when satisfied that the materiel state of the system is such that it is sufficiently safe, fit for service and environmentally compliant for entry into initial operational T&E.

4.27 The period leading to LHD 1’s System Acceptance and subsequent Initial Materiel Release was characterised by competing priorities: Navy properly desired that full contracted capability be delivered, while also being concerned about delays in LHD 1’s delivery schedule.

4.28 DMO had originally agreed to deliver LHD 1 in January 2014. However, Defence records indicate that LHD 1’s delivery was delayed by eight months, the result of a two month delay in LHD 1’s hull delivery from Spain and productivity issues at the Williamstown Shipyard.

4.29 As a result of delays to LHD 1 delivery, Navy had by June 2014 extended HMAS Tobruk’s service life until at least June 2015 at a cost of approximately $8 million.68 This extension was to enable HMAS Tobruk to be available for the 2014-15 cyclone season. Further service life extensions would have attracted additional costs and impacted the release of HMAS Tobruk’s crew for the purpose of crewing LHD 2. To avoid these extra costs and crewing issues, System Acceptance and Initial Materiel Release for LHD 1 needed to be achieved by early November 2014.69

4.30 In June 2014, the LHD Project Manager reported to the Initial Materiel Release Gate Review Board70 that numerous requirements were still to be verified before delivery, which at that time was scheduled to take place in late-August or early-September 2014. The Capability Manager’s representative advised the board that full contracted capability was more important than schedule, and that the Chief of Navy had no intention of accepting less.

4.31 On 8 July 2014, the LHD Project Manager assessed that the Commissioning71 of LHD 1 should be scheduled for no earlier than 15 December 2014. However on 1 August 2014, the then Chief Executive Officer of DMO advised the Chief of Navy that:

I am able to confirm with some confidence the forecast to conduct final contractor Sea Acceptance Trials from 12 to 29 August 2014. Assuming these trials are successful, this is to be followed by delivery of Canberra to the DMO in late September and subsequent delivery to Navy in late October 2014.

… the DMO intends to claim IMR [Initial Materiel Release] at the end of October …

4.32 Subsequently, the Chief of Navy advised the then Minister for Defence that:

… DMO has advised that they are expected to declare Initial Materiel Release (IMR) in late October 2014 following acceptance of the ship from [BAE Systems Australia]. Canberra is expected to arrive at Fleet Base East mid-November 2014, where I plan to accept the ship from DMO in late-November 2014, marking Initial Operational Release (IOR) of the LHD capability.

As [a] consequence of advice from DMO (based on [BAE Systems Australia] expected work completion data and level of confidence around the current sea trial period) and Navy preparations to accept the ship, I have decided to Commission Canberra in Sydney on 28 November 2014.

4.33 LHD 1’s Initial Materiel Release was declared by the then LHD Project Manager on 31 October 2014.

Management assurance to support Initial Materiel Release

4.34 The Materiel Acquisition Agreement between the then DMO and Defence72 included a range of criteria DMO needed to satisfy in order to declare Initial Materiel Release. These criteria included that LHD 1 was released in compliance with the relevant Technical Regulatory Framework and in accordance with the approved certification basis.73

4.35 The Service Chiefs are accountable for the technical integrity of materiel introduced into service. Consequently, technical, operational and safety management system regulations require evidence that equipment being offered is fit for service and only poses acceptable risk to personnel, public safety and the environment. This requirement is factored into each Navy equipment acquisition project’s Certification Plan and System Safety Program Plan which, when executed, result in the issue of Reports of Materiel and Equipment Performance State (also known as TI 338 reports)74, and Safety Case Reports.75 These reports rely on evidence provided by developmental and acceptance T&E, and they are to be reviewed by Navy’s operational, technical and safety management system regulatory authorities, generally over a six to eight week period. The operational, technical and safety regulatory reviews of the TI 338s and Safety Case Reports form key components of submissions to the Chief of Navy regarding materiel being considered for release into operations.

4.36 As set out below, Navy’s regulators identified shortcomings in the development of the TI 338 Report and the Safety Case Report for LHD 1.

Box 2: LHD 1’s TI 338 and Safety Case Report key shortcomings

1. The TI 338 Report

The LHD 1 TI 338 report was presented to the Navy regulatory authorities as a collection of spreadsheets. According to a November 2014 report by the operational regulatory authority, these spreadsheets:

… fail[ed] to provide the requisite level of evidence and intent such that all implications, risks and associated mitigation actions can be clearly understood.

… [had] no supporting narrative provided to integrate the disparate sections of the TI 338, to explain implications and associated risks of the numerous deficiencies and to articulate planned remediation activities.

… Part 1 and Part 2 Certificate wording differ[ed] from that prescribed by [Navy’s operational T&E manual], with no explanation given as to why.

In December 2014, the LHD Project Office informed the ANAO that the format and process for the development and endorsement of the TI 338 report for LHD 1 was agreed through regulator review meetings, and was intended to meet the intent of, rather than a strict interpretation of, Navy’s operational T&E manual. The Project Office commented that adherence to Navy’s operational T&E manual was not stipulated in the Materiel Acceptance Agreement between Navy and the DMO.

2. The Safety Case Report

Soon after LHD 1 was presented by DMO to Navy in November 2014, its Safety Case Report received conditional endorsement from Navy’s regulators. However, in commenting on the LHD Safety Management Program’s information management and its effect on safety claim validity, Navy regulators reported that the physical volume of Safety Case information was not organised in a controlled and disciplined manner, resulting in many occurrences where evidence was either not linked to safety claims or the links between evidence and claims were not explained. The regulators reported that there were no data management procedures or protocols provided to demonstrate traceable and repeatable application of the processes used to create the Safety Case.

4.37 When seeking clarification as to why the Safety Case received conditional endorsement from the regulators, Defence informed the ANAO that during the regulators’ review of the Safety Case, Commander Surface Force and the Director General Navy Capability Transition and Sustainment met and responded to the regulatory issues being identified. Their response included gaining commitments to improve and manage safety for HMAS Canberra and the establishment of a Safety Case remediation program. On that basis, on 14 November 2014, the Director General Navy Certification and Safety provided conditional authorisation of HMAS Canberra’s Safety Case until 30 August 2015. Subsequent authorisation was to be contingent upon compliance with the Navy Safety Systems Manual, the LHD certification basis and with other existing regulatory system requirements. In October 2015, Defence informed the ANAO that a request for an extension to December 2015 had been approved concerning the LHD Project’s requirement to provide an LHD Safety Case Report that complies with Navy’s Safety Systems Manual.

4.38 The ANAO was also informed that a combination of Navy verification activities and the planned progressive realisation of LHD capability gave Navy’s operational regulatory authority sufficient confidence to make a conditional recommendation regarding LHD 1’s Safety Case. Further, the Joint Amphibious Capability Implementation Team76 and the Royal Australian Navy Test, Evaluation and Acceptance Authority’s analysis of the TI 338 revealed that most of the identified Safety Case deficiencies were recorded in the LHD Safety Case and TI 338. However, the implementation team acknowledged that the deficiencies present in these documents made risk assessments regarding entry into the operational T&E phase more difficult than they needed to be.

4.39 One contributing factor to difficulties encountered with the LHD Safety Case was that Navy’s regulators found that BAE Systems Australia had used an LHD Safety Management Program that had been tailored by BAE Systems Australia, and approved for use by the then DMO. BAE Systems Australia informed the ANAO that:

Tailoring of the SMP [Safety Management Program] is normal practice and allowed for by the technical regulation requirements. A joint workshop was held between BAE Systems, DMO and Regulators in 2011 to establish each party’s responsibilities and this ultimately led to the Safety Case Report being approved prior to ship delivery … Navy’s regulators did provide input to the safety case and the safety case is an artefact that is initially developed by the shipbuilder up to the level required by the contract to address safe operation of the ship and its systems, but then must be ‘owned’ and further developed by Defence to take [and] incorporate the “operational” aspects that only the ships operators (the RAN [Navy]) can provide.

4.40 Navy’s regulators reported that this tailoring enabled safety criticality assessments to be based on the risk to safety of system failures, rather than also being based on departures from safe operations. Navy regulators were concerned that safety critical analysis ‘may not have occurred’ for all LHD software, other than for some combat system software. Consequently, DMO and Navy personnel were required to complete the LHD Safety Case Report’s operational safety argument and compile relevant evidence.

4.41 On 26 November 2014, the Chief of Navy, on the advice of and with verbal assurances from key stakeholders77, granted HMAS Canberra Initial Operational Release within specified limitations regarding the ship’s authorised Statement of Operating Intent.

4.42 HMAS Canberra was accepted with known defects and deficiencies but also with plans for their rectification.78 In granting Initial Operational Release, the Chief of Navy became the Executive Authority for accepting and managing the risks associated with operating HMAS Canberra. At the same time, the then DMO, in conjunction with the prime contractor BAE Systems Australia, retained responsibility for rectifying material defects and deficiencies identified at acceptance as well as warranty and latent defects covered by the provisions of the LHD acquisition contract. The warranty and latent defect provisions expire 12 months and 60 months after System Acceptance respectively.

4.43 In agreeing to transition HMAS Canberra into its initial operational phase, the Chief of Navy accepted greater risks than would have been the case had System Acceptance been based on more complete objective quality evidence of compliance with contracted specifications, and had Initial Operational Release been based on less qualified findings by Defence’s regulators concerning compliance with technical, operational and safety management system requirements.

What is the status of operational test and evaluation for key elements of Defence’s amphibious deployment and sustainment capability?

Early operational T&E of HMAS Canberra commenced against a backdrop of significant work required to verify contractual compliance with 451 function and performance specifications that had not been verified at the time of System Acceptance and Initial Materiel Release. Among those 451 were known defects and deficiencies within the ship’s communications system, radar system, combat management system, sewage system and logistic support system.

HMAS Canberra is scheduled to achieve its Initial Operational Capability milestone during the fourth quarter of 2015, at which time it will be able to undertake its humanitarian assistance and disaster relief support role. Both LHDs are scheduled to undergo Final Operational Capability T&E during Exercise Talisman Sabre 2017, and Defence anticipates declaring Final Operational Capability in the fourth quarter of 2017, some 12 months later than originally scheduled. The MRH90 helicopters to be embarked in the LHDs are forecast to achieve Final Operational Capability in July 2019, some five years later than originally scheduled.

4.44 As discussed, the key elements of Defence’s evolving amphibious deployment and sustainment capability are: two LHDs, 12 LHD Landing Craft and supporting aviation assets.79 The ANAO examined the management of the operational T&E for key elements of the capability, which was in progress during the audit.

HMAS Canberra’s operational test and evaluation

4.45 The Royal Australian Navy Test, Evaluation and Acceptance Authority is responsible for planning, managing and conducting Navy operational T&E, and for providing assessments and recommendations concerning a naval vessel’s initial operational and materiel state, including any deficiencies in the Fundamental Inputs to Capability. This operational T&E includes Mission Based Testing, which is designed to focus on the operational missions required of the platform, and will incrementally increase in size and complexity. It will include Defence’s command and control systems, medical facilities, embarked forces and logistic support systems.

4.46 Operational T&E of HMAS Canberra commenced in January 2015, and both LHDs are scheduled to complete this T&E phase and achieve Final Operational Capability in the fourth quarter of 2017, against an initial schedule of November 2016. Defence advised the ANAO that this delay is in order to capitalise on the opportunity provided by Exercise Talisman Sabre 2017 to conduct further operational T&E of the amphibious capability as a whole.80 The budgeted cost of HMAS Canberra’s operational T&E is $0.489 million in 2014–15 and $0.200 million in 2015–16.81

4.47 The LHDs’ operational T&E plans include the MRH90 helicopters, Armed Reconnaissance Helicopters, heavy transport helicopters and LHD Landing Craft which are to operate from the LHDs. Joint operational T&E of the LHDs, MRH90s and LHD Landing Craft commenced in 2015. The Armed Reconnaissance Helicopters and heavy transport helicopters are scheduled to commence operational T&E from the LHDs in February 2016 and September 2016 respectively.

HMAS Canberra’s deficiencies impacting operational T&E

4.48 HMAS Canberra is scheduled to achieve its Initial Operational Capability milestone during the fourth quarter of 2015. This initial capability relates to the ship’s ability to fulfil its humanitarian assistance and disaster relief support role. In October 2015, Defence informed the ANAO that operational T&E had been completed to support this initial capability state, and that Initial Operational Capability would be declared in late November 2015.

4.49 As stated earlier in this chapter, at the time of HMAS Canberra’s Initial Materiel Release on 31 October 2014, the ship’s TI 338 listed 451 requirements as not yet certified as complying with the LHD function and performance specifications. By February 2015, that number had been reduced to 305, and BAE Systems Australia advised the ANAO that as at late October 2015, more than 90 per cent of defects identified at System Acceptance had been rectified.

Box 3: HMAS Canberra Urgent Defects

By mid-April 2015, a number of items listed in HMAS Canberra’s TI 338 report had been classified as Urgent Defects. Urgent Defects are characterised as Priority 1, 2 or 3:

Priority 1 Urgent Defects are separated into two sub-categories: A Priority 1 (Safe) Urgent Defect is a safety-related defect, condition or deficiency that precludes the ship remaining at sea or sailing; while a Priority 1 (Operations) Urgent Defect is a defect, condition or deficiency that prevents the ship from completing a significant activity, assigned tasking, operation or mission that requires rectification at the first opportunity.

Priority 2 Urgent Defects are those defects or conditions that significantly limit seaworthiness, personnel safety or operational capability, but do not preclude scheduled operational activities; or that significantly increase the probability of being unable to complete any potential tasking; and that require rectification at the next suitable opportunity in the existing program.

Priority 3 Urgent Defects are those defects or conditions that do not warrant classification as either of the above, but place a significant burden on ship’s staff and are to be rectified at the next scheduled maintenance availability or earlier.

HMAS Canberra was reporting one Priority 1 (Operations), 44 Priority 2 and 24 Priority 3 Urgent Defects. While many of these related to stores, some were related to the ship’s galley deck; the forward aircraft elevator; the reverse osmosis water purifiers; the stern door; and the aircraft power supply.

In August 2015, Defence informed the ANAO that, with the exception of the stern door defect, all of the Urgent Defects were LHD 1 TI 338 or warranty items. Under the LHD contract, BAE Systems Australia is responsible for remediating items listed in the TI 338 and warranty items.

4.50 Regarding HMAS Canberra’s logistics support, the TI 338 listed significant shortfalls in equipment spares lists and provisioning lists.82 Individual non-conformances within LHD logistics support numbered in the thousands. Defence records indicate that these logistics support shortfalls, combined with administrative and logistics delay times, ‘were creating a potentially unacceptable impact’ upon the operational capability of HMAS Canberra and its landing craft. At the same time, Navy identified continuing significant deficiencies in maintenance data held within the Asset Management Planning System83, and in spreadsheets originally developed to overcome those deficiencies. These issues were considered by Navy to place HMAS Canberra’s sea trial program at potential risk due to the limitations in equipment maintenance combined with an inability to assure the technical integrity of the platform.

4.51 By April 2015, the LHD System Program Office and Navy had raised some 200 Requests for Problem Resolution. These concerned a variety of issues ranging from defects and deficiencies already listed in the TI 338 and in Urgent Deficiency reports, to logistics support items such as spare parts deliveries and equipment support information management. By late July 2015, the number of Requests for Problem Resolution under management by the LHD System Program Office and Navy had risen to approximately 470.

4.52 In August 2015, Defence informed the ANAO that during the months leading to and including the scheduled June–July 2015 maintenance period for HMAS Canberra, work continued on rectifying TI 338 defects and deficiencies. At the same time, the LHD Project Office and BAE Systems Australia addressed emergent warranty claims, and CASG’s LHD System Program Office addressed maintenance and non-warranty rectification work, such as that experienced with the stern door, through the Transition In-Service Support Contract.

Helicopter operations from the Landing Helicopter Docks

4.53 Navy is taking an incremental approach to LHD aviation capability certification, which includes T&E, given the number of aircraft types, the number of aircraft landing spots and the inherent complexity of large flight deck operations. The LHDs are required to be certified as providing facilities capable of supporting safe and effective flight operations. As noted previously, a number of different helicopter types are required to operate from the LHDs, and each type requires flight testing from the LHDs in support of the LHDs’ achievement of Final Operational Capability. In August 2015, Defence informed the ANAO that on 20 August 2015 the 1000th Deck Landing Practice serial took place on board HMAS Canberra as part of Navy’s operational T&E program. The aircraft sortie for this serial was a Navy MRH90 helicopter.

MRH90 operational T&E

4.54 At a budgeted cost of $3.746 billion (March 2015), the Multi-Role Helicopter (MRH90) Program is to acquire 47 helicopters and their support systems. The first two of these helicopters were accepted by DMO in December 2007.84 As at 30 June 2015, DMO had accepted 33 MRH90 aircraft and associated support systems, and Program expenditure was $2.730 billion. Of the 47 MRH90, six will be employed in the Maritime Support Role for Navy on Amphibious Afloat Support Ships (not just LHDs). The remaining 41 MRH90 will be used by Army in roles such as Counter Terrorism, Aero Medical Evacuation, Training, Battlefield Mobility, and Amphibious Operations.

Figure 4.3: Multi-Role Helicopter MRH90


Source: Department of Defence.

Navy Maritime Support Helicopter role

4.55 In their Maritime Support Helicopter role, these aircraft are required to embark on selected Navy vessels for periods of up to 120 days. Each LHD will have the capacity to hangar up to 12 helicopters.

4.56 The acceptance into operational service for the Maritime Support Helicopter capability builds incrementally through three key stages. Operational Capability Maritime (OCM) 1 is defined as a single aircraft embarked (not necessarily on an LHD) for day-only operations in a low threat environment; OCM 2 is two embarked flights (of a single aircraft each) capable of day and night operations in a medium threat environment; and OCM 3 is three embarked flights (all single-aircraft flights) capable of day and night operations in a high threat environment.85 OCM 1, OCM 2 and OCM 3 were initially planned to be achieved by June 2010, December 2011 and December 2012 respectively.

4.57 The MRH90 received Initial Operational Release for limited OCM 1 operations in July 2013, some 36 months later than initially scheduled. Limitations revealed by operational T&E include the need for MRH90 redesigns. For example, in April 2014, Defence informed the ANAO that an MRH90 aircraft cargo hook redesign was underway at a cost of some $16 million.86 In November 2014, Defence was informed that a redesigned cargo hook would take 33 months to introduce fully.

4.58 At the time of fieldwork for this audit, the MRH90 were yet to achieve the complete scope of the specified OCM 1 operations, and operational T&E involving more complicated MRH90 operations remained incomplete.87 Several aircraft design changes were yet to be accepted and implemented to enable the MRH90 to achieve the multi-role capability specified in the MRH90 Operational Concept Document and function and performance specifications.88 For example, Identification Friend or Foe Mode 4 and Electronic Warfare Self Protection issues need to be resolved prior to the operational T&E needed for the declaration of OCM 3 by Navy. OCM 3 is now scheduled to be achieved in the fourth quarter of 2016, almost four years later than originally scheduled.

Army amphibious operations role

4.59 As mentioned previously, the MRH90 aircraft are intended to enhance the Army’s amphibious support capability. The Army’s objective is to generate one squadron of MRH90 aircraft for amphibious operations for periods of up to 45 days, operating from the LHDs. Army plans to achieve this through a program of four Amphibious Aviation Capability Increments.89

4.60 Presently, achievement of full capability for Army operations from the LHDs is expected to be delayed by 28 months overall. Operational Capability Amphibious 1 was achieved on 4 December 2014 with 12 limitations. Final amphibious capability (Operational Capability Amphibious 4) is now not expected to be achieved until 2017, against an initial schedule of December 2015.

4.61 The need for ongoing MRH90 development has resulted in Defence having to manage wide-ranging system development risks, which continue to delay the achievement of Operational Capability and add additional costs to the program. As discussed, a range of modifications which are necessary for the MRH90s to achieve Final Operational Capability remain outstanding. Consequently, the MRH90 fleet is not expected to achieve Final Operational Capability until July 2019, five years later than originally scheduled and 12 years after DMO accepted the first MRH90s in December 2007.

Aviation facilities certification

4.62 In order for the LHDs to achieve Initial Operational Capability by the fourth quarter of 201590 and Final Operational Capability in 2017, Navy will need to complete a series of helicopter flight trials and receive Defence Technical Airworthiness Authority and Operational Airworthiness Authority certificates relating to the integration of the various helicopter types with the LHDs. These certificates include Aviation Support Systems Certificate and Authority to Operate, which involves the Aircraft Maintenance and Flight Trials Unit conducting T&E in consultation with Defence’s Director General Technical Airworthiness. In August 2015, Navy informed the ANAO that the Aircraft Maintenance and Flight Trials Unit reported that overall the LHD aviation facilities were sufficient to conduct multi-aircraft T&E.

Helicopter flight trials

4.63 The helicopter flight trials include the development of Ship Helicopter Operating Limits, aircraft handling procedures and embarkation routines, which aim to determine the limitations and procedures for establishing a safe and operationally effective interface between these helicopters and the LHDs.

4.64 During the audit, MRH90 initial flight trials aboard HMAS Canberra had not progressed as planned, due to shortcomings in compliance assessment processes and MRH90 design issues. On 17 February 2015, the Director General Technical Airworthiness reported deficiencies in the LHD Project’s 9 February 2015 technical compliance assessments. These assessments related to the commencement of flight trials involving HMAS Canberra and MRH90 helicopters, scheduled for March 2015. The compliance assessments related to design data and test evidence compared to benchmark standards. In summary, the LHD Project’s compliance assessments were found to contain evidence that failed to meet normal objective quality evidence expectations and lacked appropriate rigorous analysis. Further, late delivery of test evidence led to the Director General having a degree of uncertainty in recommending the commencement of First of Class Flight Trials.

4.65 While the Director General Technical Airworthiness made a positive recommendation for commencing flight trials, this was accompanied by limitations to the flight deck operations of the MRH90. In making the recommendation, the Director General identified a series of issues requiring further assessment, deficiencies requiring further analysis and risk mitigations requiring implementation. The Director General recommended Navy’s Aircraft Maintenance and Flight Trials Unit include these issues and deficiencies within the Flight Trial plan to enable verification evidence to be captured for compliance assessments and to fully inform risk management decisions.

4.66 Flight trials for HMAS Canberra were conducted between March and April 2015. During those trials an MRH90 experienced significant main rotor head damage. Preliminary analysis indicates that MRH90 aircraft are susceptible to main rotor head damage in certain environmental conditions while the blades are rotating slowly when starting and stopping. This damage is caused by excessive blade sailing when relative wind speeds exceed 25 knots.91 Similar substantial main rotor system damage was also experienced by an MRH90 aboard HMAS Success in March 201492, which was not detected until routine post-flight maintenance. That damage prevented that aircraft from completing its MH370 search mission.93

4.67 The MRH90 Flight Manual limits the maximum wind speed for starting/stopping rotors while embarked to 60 knots, reducing to 30 knots in some conditions. In August 2015, Navy released a Special Flying Instruction imposing further limitations for starting/stopping MRH90 rotors when embarked on the LHDs, to an absolute maximum wind speed of 45 knots.

Landing Craft operations from the Landing Helicopter Docks

4.68 At a budgeted cost of $253.7 million, the JP2048 Phase 3 Amphibious Watercraft Replacement project is to acquire 12 LHD Landing Craft. The first batch of four LHD Landing Craft arrived in Australia in April 2014. On 9 October 2014, Chief of Navy approved their Initial Operational Release94, and they entered their initial operational T&E phase some five months later than initially scheduled. Figure 4.4 shows an LHD Landing Craft loading an Army Medium Heavy Truck and trailer, as part of its first of class testing.

Figure 4.4: LHD Landing Craft undergoing First of Class testing


Source: Department of Defence.

4.69 The landing craft are designed to transfer vehicles, personnel and equipment between the LHDs and shore, with each LHD carrying four landing craft. They are intended to carry and transfer all current and future ADF heavy equipment that may be embarked in the LHDs, including the Abrams M1A1 Main Battle Tank95 in service with the Australian Army.

4.70 First of class testing conducted during the Navy operational T&E period sought to verify the landing craft capability to carry typical humanitarian aid/disaster relief loads, independent from the LHDs. The remaining operational T&E requires integration with HMAS Canberra.

4.71 At the time of audit fieldwork, Navy planned to complete operational T&E for the landing craft in May 2015, declare Initial Operational Capability in June 2015 (some eight months behind schedule) and to declare Final Operational Capability in the second quarter of 2016.

5. Future directions for the governance of Defence test and evaluation

Areas examined

This chapter focuses on future directions for Defence’s entity-level governance of T&E.

Conclusion

Defence faces challenges in balancing the methodical conduct of T&E for major acquisitions, which assures Capability Managers that full contracted capability is delivered, against delivery schedule and other priorities.

The establishment of the Australian Defence Test and Evaluation Office and the finalisation of an overarching T&E policy has provided Defence with a stronger basis for the management of T&E. However, against a backdrop of decentralised internal arrangements, where the conduct of T&E is distributed across 12 Defence organisations, there remains scope to improve Defence’s whole-of-entity T&E governance arrangements.

Areas for improvement

The ANAO has made a recommendation aimed at strengthening enterprise-level governance and advisory arrangements for T&E, in the context of Defence’s implementation of reforms arising from the April 2015 First Principles Review.

The ANAO has also suggested that the T&E Principals’ Forum should make the establishment of key T&E performance indicators a matter of priority, and has identified a number of aspects of Defence’s administration of T&E that could usefully be considered in the context of Defence’s implementation of reforms arising from the recent First Principles Review.

5.1 It has long been recognised that T&E is a key risk mitigation technique capable of providing managers with the information feedback needed to effectively manage risk. Acquisition planners need that feedback to assess investment costs and benefits. System engineers need that feedback to identify and resolve equipment function, performance and sustainment issues. System users need that feedback to maximise reliable operational effectiveness.

5.2 Successive ANAO audits have demonstrated that effective T&E contributes to the achievement of successful Defence project outcomes. Most recently, ANAO Report No. 52 2014–15, Australian Defence Force’s Medium and Heavy Vehicle Fleet Replacement (Land 121 Phase 3B), documented the effective use of preview T&E to mitigate significant cost, schedule and capability risks. Conversely, inadequate system development and acceptance T&E has resulted in Defence capability being placed into limited service, with costly delays in achieving expected operational performance and availability.96

5.3 Over the last decade, the Parliament has highlighted the importance of T&E by reviewing Defence’s T&E arrangements and providing recommendations directed toward improving T&E effectiveness.97 The Joint Committee of Public Accounts and Audit has also identified that an ANAO performance audit into Defence T&E was an audit priority of the Parliament. The case studies presented in this audit report illustrate the challenges faced by Defence in balancing the methodical conduct of T&E for major acquisitions—to assure Capability Managers that full contracted capability is delivered—while seeking to meet delivery schedule and other priorities.

5.4 In 2013–14 Defence’s capital equipment acquisition program consisted of some 180 approved projects with a total value of $79 billion. The 2012 Defence Capability Plan contains 111 projects, or project phases, worth approximately $153 billion in capital costs. These are planned for either First or Second Pass approval over the four year Forward Estimates period.98 Each project will increase Defence’s T&E workload and there will be additional challenges relating to the T&E of joint projects.

The governance of Defence test and evaluation

5.5 In July 2001, the then Defence Secretary, Dr Allan Hawke, emphasised the importance of T&E and its associated governance in the following statement:

T&E is an important tool in our plans for the management of Defence capability to ensure successful achievement and maintenance of operational effectiveness. As Defence moves to consider its governance strategies in a theme of organisational renewal it is timely for us to consider T&E as a key management tool.99

5.6 The establishment of the Australian Defence Test and Evaluation Office in 2007 and a T&E Principals’ Forum in 2008, discussed below, along with the finalisation of an overarching T&E policy in 2015, have provided Defence with a stronger basis for the management of T&E.

5.7 In August 2015, Defence advised the ANAO that entity-level T&E governance is ‘taken care of’ by the Defence T&E Principals’ Forum. The Forum is expected to convene at least annually, with a view to fostering a corporate and standard approach to the planning, management and application of T&E.100

5.8 The T&E Principals’ Forum provides a mechanism for the engagement of, and meaningful consultation amongst, Defence’s T&E organisations. For example, the Forum discusses the results of some T&E activities completed in each preceding twelve months, and the minutes of the 2014 Forum noted that one activity was ‘a very successful testing exercise which produced impressive results’.

5.9 Importantly, in 2013, the T&E Principals’ Forum recognised that there was a need for Defence to establish entity-level performance measures for T&E.101 By 2015 this had not occurred, and the Defence Capability Development Manual (discussed in Chapter 2) was amended to allocate responsibility for the development and monitoring of key T&E performance indicators to the Forum. Establishing these indicators will be an important milestone in the Forum’s development as Defence’s peak governance mechanism for T&E, and the Forum should accordingly make their establishment a matter of priority.

5.10 Notwithstanding these positive developments, this performance audit has highlighted the inherent challenges in Defence’s entity-level management and conduct of T&E, which remains distributed across 12 internal organisations. Scope remains to improve key aspects of Defence’s administration of T&E, specifically:

  • assuring consistency in the conduct of T&E and compliance with whole of entity T&E requirements;
  • establishing entity-level performance measures for T&E; and
  • improving the management of T&E personnel competencies.

5.11 These aspects of Defence’s administration could usefully be considered in the context of Defence’s implementation of reforms arising from the recent First Principles Review.

First Principles Review

5.12 During the Forward Estimates period, Defence intends to implement the recommendations of the April 2015 First Principles Review, which include significant organisational design changes affecting T&E governance, policy and responsibility allocations. The Review’s key recommendation that Defence establish a strong strategic centre to strengthen accountability and top-level decision-making102 presents an opportunity for Defence to rethink the institutional positioning of its T&E governance function and the provision of advice to senior leaders, recognising the key role played by T&E in mitigating risk.

5.13 While decisions on particular institutional and advisory arrangements are matters for Defence’s senior leadership, the integrity of Defence’s T&E activities would be reinforced by strengthening existing governance arrangements to provide advice to the Vice Chief of the Defence Force and Capability Managers on the coordination, monitoring and evaluation of Defence equipment T&E and on the adequacy and results of T&E.

Recommendation No.3

5.14 In the context of its implementation of reforms arising from the First Principles Review, the ANAO recommends that Defence introduce arrangements to provide the Vice Chief of the Defence Force and Capability Managers with enterprise-level advice on the coordination, monitoring and evaluation of the adequacy and results of Defence T&E activities.

Entity response: Agreed.

5.15 Defence concurs that the First Principles Review (FPR) Implementation is an opportunity to strengthen Defence T&E and governance to provide enterprise-level advice on T&E. This T&E governance will provide more informed and timely decision-making within Defence consistent with the published Defence T&E policy.

5.16 Defence is committed to addressing this recommendation through the FPR reforms of the Defence Capability Life Cycle.

Appendices

Please refer to the attached PDF of the report for the Appendices:

  • Appendix 1: Entity Responses
  • Appendix 2: 2012 Senate Inquiry test and evaluation related recommendations

Footnotes

1 Senate Foreign Affairs, Defence and Trade References Committee, Report on the inquiry into materiel acquisition and management in Defence, March 2003, pp. 98–99.

2 The review is discussed in paragraphs 1.9 and 5.12 of this audit report.

3 Department of Defence, Defence Capability Development Manual, Introduction to Test and Evaluation, December 2014, p. 1.

4 Broadly, First Pass approval is in‐principle authorisation that allows Defence to develop preferred options into more specific proposals for consideration at Second Pass approval.

5 For most projects the acquisition authority is the Capability Acquisition and Sustainment Group, formerly known as the Defence Materiel Organisation.

6 For most projects the Capability Manager is the armed Service Chief responsible for the project once it enters operational service.

7 Initial Operational Release is the initial capability state as specified at Government Second Pass approval, and signifies that the Capability Manager is satisfied that the initial operational and materiel state of the equipment—including any deficiencies in other Fundamental Inputs to Capability—is such that it is sufficiently safe, fit for service and environmentally compliant to proceed into operational T&E.

8 Final Materiel Release marks the completion and release by the acquisition organisation of those project supplies required to support the achievement of Final Operational Capability.

9 Final Operational Capability is the equipment’s final capability as specified by Government at Second Pass approval.

10 ANAO Audit Report No. 30 2001–02, Test and Evaluation of Major Defence Equipment Acquisitions, January 2002.

11 For example, see: ANAO Audit Report No. 36 2005–06, Management of the Tiger Armed Reconnaissance Helicopter Project—Air 87, May 2006; ANAO Audit Report No. 11 2007–08, Management of the FFG Capability Upgrade, October 2007; ANAO Audit Report No. 37 2009–10, Lightweight Torpedo Replacement Project, May 2010; ANAO Audit Report No. 57 2010–11, Acceptance into Service of Navy Capability, June 2011; ANAO Audit Report No. 34 2011–12, Upgrade of the M113 Fleet of Armoured Vehicles, May 2012; ANAO Audit Report No. 52 2013–14, Multi-Role Helicopter Program, June 2014; and ANAO Audit Report No. 52 2014–15, Australian Defence Force’s Medium and Heavy Vehicle Fleet Replacement (Land 121 Phase 3B), June 2015.

12 Defence capability is formed by combining eight ‘Fundamental Inputs to Capability’. In short, they are: Personnel; Organisation; Collective Training; Major Systems; Supplies; Facilities and training areas; Support; and Command and Management. The Fundamental Inputs to Capability are discussed in ANAO Audit Report No. 22 2013–14, Air Warfare Destroyer Program, March 2014, pp. 74–75.

13 Department of Defence, First Principles Review – Creating One Defence, April 2015.

14 ANAO Audit Report No. 30 2001–02, Test and Evaluation of Major Defence Equipment Acquisitions, January 2002, p. 13.

15 For example, increasingly complex and interconnected information and communications systems, direct energy weapons systems and robotic combat systems.

16 The Defence Capability Plan contains those priority projects planned for either First or Second Pass approval over the four year Forward Estimates period.

17 The Chiefs of Service Committee did not agree to include a diagram describing the aims of T&E in the upcoming Defence White Paper.

18 T&E of Defence acquisitions may also be performed by other organisations. For example, Defence equipment acquired through US Government Foreign Military Sales contracts undergoes acceptance T&E by the US Defense Contract Management Agency prior to release to Australia.

19 The Seaworthiness Management System, managed by Navy’s Technical Regulatory Authority; the Technical Regulatory Management System, managed by the Army’s Technical Regulatory Authority; and the Technical Airworthiness Management System, managed by Defence’s Airworthiness Technical Regulatory Authority.

20 Department of Defence, Defence Test and Evaluation Roadmap, 2008, p. 33.

21 In addition to Part Three, Chapter 7 of Part Two of the DCDM addresses T&E processes during the Requirements phase of acquisition projects. For the remainder of this report all references to Part Three of the DCDM also refer to Chapter 7 of Part Two.

22 See Senate Foreign Affairs, Defence and Trade References Committee, Commonwealth of Australia, Report on the inquiry into materiel acquisition and management in Defence, March 2003, p. 102.

23 ANAO Audit Report No. 30 2001–02, Test and Evaluation of Major Defence Equipment Acquisitions, January 2002, Recommendation No. 1, p. 23.

24 Defence Instruction (General) OPS 43–1, Defence test and evaluation policy was published in 2007. Despite having a review date set for 2010, it remained unchanged until it was cancelled following the publication of Part Three of the DCDM.

25 Senate Foreign Affairs, Defence and Trade References Committee, Commonwealth of Australia, Procurement procedures for Defence capital projects, August 2012. Recommendation 23, p. 210.

26 This initial version did not contain competency frameworks for the Intelligence and Security Group Aircraft Research and Development Unit, the Joint Proof and Experimental Unit, the Aircraft Maintenance and Flight Trials Unit or the Joint Electronic Warfare Operational Support Unit.

27 Senate Foreign Affairs, Defence and Trade References Committee, Commonwealth of Australia, Report on the inquiry into materiel acquisition and management in Defence, March 2003, p. 102.

28 The Forum was established in 2008 and is intended to foster a corporate and standard approach to the planning, management and application of T&E (see paragraphs 5.6-5.9 of this audit report).

29 Senate Foreign Affairs, Defence and Trade References Committee, Commonwealth of Australia, Report on the inquiry into materiel acquisition and management in Defence, March 2003, p. 210.

30 ibid., Recommendation 11, p. 165.

31 The Director General T&E is the head of ADTEO.

32 Aerospace projects are subject to independent Airworthiness Board reviews conducted as part of the Australian Military Type Certification process.

33 ANAO Audit Report No. 30 2001–02 Test and Evaluation of Major Defence Equipment Acquisitions, January 2002, p. 20.

34 Senate Foreign Affairs, Defence and Trade References Committee, Commonwealth of Australia, Procurement procedures for Defence capital projects, August 2012. Recommendation 23, p. 211.

35 ibid., Recommendation 24, p. 211.

36 For example, one Service was unable to directly provide the number of personnel who had completed T&E training, and instead relied on data provided by the training provider.

37 Inadequate tracking of personnel training and competencies by Defence is also discussed in ANAO Report No. 45 2014–15, Central Administration of Security Vetting, June 2015.

38 Senate Foreign Affairs, Defence and Trade References Committee, Commonwealth of Australia, Procurement procedures for Defence capital projects, August 2012. Recommendation 24, p. 211.

39 Also, in terms of acceptance T&E, the ANAO was informed that the Royal Australian Navy Test, Evaluation and Acceptance Authority will normally be involved in an observer capacity and require that acceptance T&E results be provided in order to assist in operational T&E assessments.

40 The ANAO has previously examined developmental T&E undertaken by Navy, such as that during the upgrades to Navy’s Adelaide class Frigates and the Air Warfare Destroyer project. See: ANAO Audit Report No.11 2007–08, Management of the FFG Capability Upgrade, October 2007; and ANAO Audit Report No.22 2013–14, Air Warfare Destroyer Program, March 2014.

41 Commodore Warfare is not responsible for the conduct of any T&E activities.

42 The Rizzo Reform program is Navy’s response to the July 2011 Plan to Reform Support Ship Repair and Management Practices.

43 Senate Foreign Affairs, Defence and Trade References Committee, Commonwealth of Australia, Procurement procedures for Defence capital projects, August 2012. Recommendation 24, p. 211.

44 In October 2000, DMO surveyed the competency-based training and work experience held by its professional and technical staff. The survey indicated the probability of gaps between the competency expected by Defence and the actual capabilities of personnel involved in T&E.

45 T&E is one skill that is broadly grouped within the Engineering and Technical workforce within CASG, and the survey results presented below relate to that workforce as a whole.

46 These include projects for the Army, the Chief Information Officer Group and the Intelligence and Security Group.

47 Department of Defence, Defence Test and Evaluation Roadmap, Defence, Canberra, 2008, p. 18.

48 Department of Defence, Capability Systems Life Cycle Management Manual 2002, Chapter 3, p. 3–6. The benefits of including off-the-shelf options in advice to government were observed in ANAO Report No. 6 2013–14, Capability Development Reform, October 2013, pp. 198–210.

49 Senate Foreign Affairs, Defence and Trade References Committee, Commonwealth of Australia, Procurement procedures for Defence capital projects, August 2012. Recommendation 23, pp. 195-211.

50 ibid., Recommendation 25, p. 211.

51 ibid., p. 34.

52 ibid., Recommendation 25, p. 211.

53 The Fundamental Inputs to Capability are discussed in ANAO Audit Report No.22 2013–14, Air Warfare Destroyer Program, March 2014, pp.74–75.

54 The MRH90 acquisition is examined in detail in ANAO Report No. 52 2013–14, Multi-Role Helicopter Program, June 2014.

55 ANAO Report No. 52 2014–15, Australian Defence Force’s Medium and Heavy Vehicle Fleet Replacement (Land 121 Phase 3B), June 2015.

56 Department of Defence, Defence Capability Plan 2012, Public Version, p. 194.

57 For example, see: ANAO Audit Report No. 36 2005–06, Management of the Tiger Armed Reconnaissance Helicopter Project—Air 87, May 2006; ANAO Audit Report No. 11 2007–08, Management of the FFG Capability Upgrade, October 2007; ANAO Audit Report No. 37 2009–10, Lightweight Torpedo Replacement Project, May 2010; ANAO Audit Report No. 57 2010–11, Acceptance into Service of Navy Capability, June 2011; ANAO Audit Report No. 34 2011–12, Upgrade of the M113 Fleet of Armoured Vehicles, May 2012; ANAO Audit Report No. 52 2013–14, Multi-Role Helicopter Program, June 2014; and ANAO Audit Report No. 52 2014–15, Australian Defence Force’s Medium and Heavy Vehicle Fleet Replacement (Land 121 Phase 3B), June 2015.

58 ANAO Audit Report No. 57 2010–11, Acceptance into Service of Navy Capability, June 2011, p. 25.

59 Joint Project 2048 Phase 4A/4B, Amphibious Ships Project, has an approved budget of $3.090 billion. Expenditure as at June 2015 totalled $2.719 billion.

60 Of the 47 MRH90s being acquired, a squadron of 12 are to provide the ADF with extra mobility for forces on operations, particularly amphibious operations; 28 are to replace Army’s Black Hawk aircraft, and six have replaced the retired RAN Sea King helicopters. The MRH90 helicopters are being acquired through the AIR 9000 Phase 2, 4 and 6 Multi-Role Helicopter Project. This project has an approved budget of $3.746 billion. Expenditure as at June 2015 totalled $2.730 billion.

61 Joint Project 2048 Phase 3, Amphibious Watercraft Replacement Project, has an approved budget of $236 million. Expenditure as at 30 June 2015 totalled $150 million.

62 BAE Systems Australia is not the contractor for the LHD Landing Craft or MRH90 helicopters discussed in this chapter.

63 Objective quality evidence is a statement of fact, either quantitative or qualitative, pertaining to the quality of a product or service based on observations, measurements, or tests that can be verified. Evidence will be expressed in terms of specific quality requirements or characteristics. The Navy’s Technical Regulatory Manual states that objective quality evidence will/could take a number of forms ranging from test results to formal certificates issued by a Classification Society. Under the LHD Acquisition contract, the DMO was ‘responsible for conducting a validation of the design, design processes, presented objective quality evidence and the Designer’s Certificate to identify the risk to technical integrity associated with the design when incorporated into Australian Defence Force materiel. The Contractor shall provide objective quality evidence in support of the Contractor’s Designer’s Certificates to facilitate the analysis of the various elements of the design.’

64 BAE Systems Australia advised that all tests and trials of the LHD stern door required by the acquisition contract were performed. However, the ANAO observed that the requirements to act as a steel beach and as a ramp for transferring vehicles in harbour were not specified in the contract. Consequently, Navy was required to conduct the test and trials of these functions. Both the LHD Project Office and the Navy agreed to defer this testing until after System Acceptance.

65 In these circumstances, the practice is for defects and deficiencies to be defined in the System Acceptance documentation suite and evaluated for subsequent rectification or mitigation after ship delivery.

66 BAE Systems Australia advised the ANAO that there were various reasons why reports may not be accepted at the time of System Acceptance, such as the need to prepare subcontractor documentation, and that data analysis sometimes takes longer than anticipated.

67 The Project Manager had the delegated authority to act as the LHD Program’s Executive Authority and Project Representative.

68 HMAS Tobruk was a multi-purpose, roll-on/roll-off heavy lift and transport vessel. It provided an amphibious heavy lift and humanitarian aid capability. HMAS Tobruk was decommissioned in June 2015, after completing its final humanitarian support mission in Vanuatu following Cyclone Pam.

69 Defence informed the ANAO that the November 2014 System Acceptance and Initial Material Release date was also based on its assessment of the readiness of the materiel being offered by BAE Systems Australia.

70 Gate Review Boards were introduced in 2008, with the aim of providing assurance that all identified risks for a project were manageable and that costs and schedule are likely to be under control prior to a project passing various stages of its lifecycle.

71 The Commissioning Ceremony marks the formal acceptance of a ship as a unit of the Royal Australian Navy; from that point, Canberra carried the prefix Her Majesty’s Australian Ship (HMAS).

72 The Materiel Acquisition Agreement set out the materiel and acquisition services that DMO, as supplier, was to deliver to Defence as the customer. The agreement was signed by delegates for the Chief of Navy and the Chief of CDG on behalf of Defence, and by the Chief Executive Officer of DMO.

73 The certification basis comprises the ship’s function and performance design requirements, including requirements relating to safety and seaworthiness. These requirements must be satisfied in order to certify that a vessel is, at all times, fit for service and does not pose a hazard to personnel, public safety or the environment.

74 DMO (now CASG) project offices are responsible for developing the TI 338 reports.

75 Navy requires DMO/CASG to produce Safety Case Reports that provide assurance that major systems being presented to Navy for Initial Operational Release are accompanied by well-reasoned arguments, supported by evidence, that demonstrate how a reasonable person can conclude that a system has achieved acceptable levels of safety.

76 The Joint Amphibious Capability Implementation Team acts as the focal point for the identification, coordination and resolution of Amphibious Deployment and Sustainment joint and single service Fundamental Inputs to Capability issues to enable the integrated and coherent introduction of Defence’s new amphibious capability (of which the LHDs are but one part).

77 Prior to declaring Initial Operational Release, the Chief of Navy was briefed by Head Navy Engineering, Head Maritime Systems, Head Navy Capability, the Commander Australian Fleet and the Director of the Joint Amphibious Capability Implementation Team. He also received a report from the Royal Australian Navy Test, Evaluation and Acceptance Authority, endorsed by the Commander Surface Fleet and supported by the Commander Australian Fleet, which recommended the Initial Operational Release of HMAS Canberra.

78 Defence Instructions allow Capability Managers to transition a capability into service even though it may not be safe, suitable or fully effective for its intended role. That transition is to be contingent on the shortfalls being defined, deficiency reports raised to close out any identified deficiencies, and negotiated resolutions sought from all stakeholders.

79 While a variety of rotary wing aircraft will eventually operate from the LHDs, the ANAO has focussed on the MRH90 helicopters.

80 Exercise Talisman Sabre is a joint exercise with the United States.

81 Costs exclude day to day operating costs of the ship (such as fuel, berthing costs, and victuals) and the additional cost of land forces and enabling components.

82 The provisioning lists identify on-board spares at ‘sail away’ which are considered necessary to support the ship’s preventative and corrective maintenance requirements.

83 This system is used by a ship’s personnel and DMO System Program Offices to plan and conduct defect prevention maintenance. Deficiencies in the planning system included an absence of mandatory planned maintenance information.

84 The MRH90 acquisition program’s cost components and key delivery and capability milestones were set out in ANAO Audit Report No. 52 2013–14, Multi-Role Helicopter Program, June 2014, pp. 61–65.

85 OCM 1 operations involve general maritime helicopter support missions such as personnel and stores transfer to and from ships, and a limited search and rescue and medical evacuation capability.

86 The MRH90 aircraft cargo hook was found to have two deficiencies: it was incompatible with the rigid strop, which would impede the MRH90 aircraft from conducting vertical replenishment operations with other ships that use the rigid strop as standard load rigging equipment; and it had the potential to fail to jettison loads in an emergency at trailing angles of greater than 45 degrees. The requirement for the cargo hook to be interoperable with rigid strop load rigging arrangements is unique to Australia amongst NH90 (the original MRH90 design) customer nations. As part of the negotiations for the design, development and qualification of a compatible cargo hook, the Commonwealth will receive 27 per cent of the value of any of these hooks on-sold to other NH90 customers.

87 OCM 2 and 3 operations involve aero-medical evacuations and wide-ranging military operations.

88 These are, in part, examined in ANAO Report No. 52 2013–14, Multi-Role Helicopter Program, June 2014, pp. 131–146, 157.

89 Operational Capability Amphibious 1 is defined as the operational deployment of up to an Amphibious Airmobile Troop, which consists of up to four MRH90, capable of deploying within the immediate or wider region in a low threat environment. Operational Capability Amphibious 2 is defined as the operational deployment of up to a Troop-sized element, capable of conducting amphibious operations from an LHD by day and night in a low threat environment. Operational Capability Amphibious 3 is defined as the operational deployment of up to a Troop-sized element, capable of conducting amphibious operations from an LHD by day and night in a medium threat environment. Operational Capability Amphibious 4 is defined as up to a Squadron-sized element, consisting of up to 12 MRH90, operating from an LHD by day and night in a high threat environment.

90 The Navy operational T&E program initially focused on achieving Initial Operational Capability in June 2015.

91 This phenomenon is not limited to the MRH90, and can occur on other helicopter types should the required conditions exist. Rotor blade sailing is an aeroelastic phenomenon affecting helicopter rotors when rotating at low speeds in high wind conditions. This is a potentially dangerous blade motion and the excessive blade deflections generated can endanger the airframe, the flight crew and any personnel working close to the aircraft.

92 Navy’s initial technical analysis indicates that the mode of failure for the MRH90 rotor head aboard HMAS Canberra differs from the mode of failure for the incident aboard HMAS Success. For the April 2015 incident, it was identified that, while elements of the MRH90 rotor head design are unreliable, the interaction of wind and motion over the LHD flight deck is more prone to generating blade sail than that over other in-service vessels. In particular, the windward edges of the LHD flight decks are prone to the ‘cliff edge’ effect, where wind acts perpendicular to the edge of the deck.

93 Malaysia Airlines Flight MH370 disappeared on 8 March 2014 en route from Kuala Lumpur to Beijing with 239 passengers and crew on board. The Australian Maritime Safety Authority (AMSA) assumed responsibility for coordination of a search in Australia’s search and rescue region. The surface search was coordinated by AMSA, and supported by the Australian Defence Force and other agencies. See <http://jacc.gov.au/search/index.aspx>.

94 In recommending that the Chief of Navy approve Initial Operational Release, the Royal Australian Navy Test, Evaluation and Acceptance Authority identified a number of internal Navy communication and procedural shortcomings regarding the development of the arguments to support Initial Operational Release. Initial Operational Release of the LHD Landing Craft was originally planned for July 2014. The three-month delay was due to delays in the LHD Project and a need to progress at a slower rate to allow DMO resources to be used in the LHD preparation for Initial Materiel Release.

95 As at August 2015, Defence had not completed testing of the Landing Craft’s ability to carry loads as heavy as the Abrams, which weighs some 68 tonnes, not including fuel, other consumables or personnel (which total some five tonnes).

96 For example, see ANAO Audit Report No. 52 2013–14, Multi-Role Helicopter Program, June 2014.

97 Senate Foreign Affairs, Defence and Trade References Committee, Commonwealth of Australia, Report on the inquiry into materiel acquisition and management in Defence, March 2003, pp. 95–102 and pp. 195–211.

98 Department of Defence, Defence Capability Plan 2012, Public Version, p. i.

99 Senate Foreign Affairs, Defence and Trade References Committee, Report on the inquiry into materiel acquisition and management in Defence, March 2003, pp. 98–99.

100 The Forum is chaired by the Director General T&E and the membership comprises representatives from the armed Services, CDG, CASG, the Defence Science and Technology Group, the Chief Information Officer Group, the Defence Support and Reform Group, and the Vice Chief of the Defence Force Group.

101 The only performance monitoring of T&E occurring at an entity level is ADTEO’s examination of Test and Evaluation Master Plans for each project’s Gate Review Board.

102 Department of Defence, The First Principles Review: Creating One Defence, April 2015, Recommendations 1, 1.6, 1.7, 1.16; p. 85.