
Type: Performance audit
Report number: 11 of 2024-25
Portfolios: Foreign Affairs and Trade
Entities: Department of Foreign Affairs and Trade
Date tabled:

Summary and recommendations

Background

1. The Department of Foreign Affairs and Trade (DFAT) is responsible for issuing passports to Australian citizens in accordance with the Australian Passports Act 2005, with delivery of passport services in Australia and overseas being one of DFAT’s three key outcomes. In July 2006, DFAT established the Australian Passport Office as a separate division to provide passport services. The Australian Passport Office has offices in each Australian capital city and it collaborates with Australian diplomatic missions and consulates to provide passport services to Australians located overseas.

2. DFAT is also the entity responsible for Australia’s international trade agreements. The Commonwealth Procurement Rules (CPRs) incorporate the requirements of Australia’s international trade obligations and government policy on procurement into a set of rules. As a legislative instrument, the CPRs have the force of law.1 Officials from non-corporate Commonwealth entities such as DFAT must comply with the CPRs when performing duties related to procurement. Achieving value for money is the core rule of the CPRs.

Rationale for undertaking the audit

3. The issuing of passports to Australian citizens is an important function of the Department of Foreign Affairs and Trade, undertaken by its Australian Passport Office. Between 1 July 2019 and 31 December 2023, the Australian Passport Office managed 331 contracts totalling $1.58 billion.

4. During the conduct of an earlier audit, Auditor-General Report No. 13 2023–24 Efficiency of the Australian Passport Office, the ANAO observed a number of practices in DFAT’s conduct of procurement through its Australian Passport Office that merited further examination. The Auditor-General decided to commence a separate audit of whether the procurements DFAT conducts through its Australian Passport Office comply with the Commonwealth Procurement Rules and demonstrate the achievement of value for money.

5. The audit provides assurance to the Parliament on the effectiveness of the department’s procurement activities in achieving value for money, and on the ethics of the department’s procurement processes, noting that procurement is an area of continuing focus for the Joint Committee of Public Accounts and Audit.2

Audit objective and criteria

6. The objective of the audit was to examine whether the procurements that DFAT conducts through its Australian Passport Office are complying with the Commonwealth Procurement Rules and demonstrating the achievement of value for money.

7. To form a conclusion against this objective, the following high-level criteria were applied.

  • Have open and competitive procurement processes been employed?
  • Has decision-making been accountable and transparent?

8. The audit focussed on procurement activities by the Australian Passport Office relating to contracts and contract variations that had a start date between 1 July 2019 and 31 December 2023.

Conclusion

9. The procurements that DFAT conducted through its Australian Passport Office did not comply with the Commonwealth Procurement Rules and DFAT’s procurement policies, and did not demonstrate the achievement of value for money.

10. DFAT did not employ open and competitive processes in the conduct of Australian Passport Office procurement. There were no procurements conducted between July 2019 and December 2023 by way of an open approach to the market. Of the 73 procurements examined in detail by the ANAO, 29 per cent involved competition where the department had not identified a preferred supplier prior to inviting quotes.

11. Procurement decision-making was not sufficiently accountable and was not transparent. Procurement practices have fallen short of ethical standards, with DFAT initiating inquiries into the conduct of at least 18 individuals, both employees and contractors, in relation to Australian Passport Office procurement activities examined by the ANAO.

Supporting findings

Open and competitive procurement

12. DFAT did not appropriately plan the procurement activities for its Australian Passport Office. There was no overarching procurement strategy. The department engaged a contractor to develop a multi-year procurement strategy that was never completed. Overall, only 15 per cent of the 62 approaches to market examined by the ANAO met the minimum requirements at the planning stage. (See paragraphs 2.3 to 2.20)

13. None of the 243 contracts totalling $476.5 million that the Australian Passport Office (APO) entered between 1 July 2019 and 31 December 2023 was let via an approach to the open market.

14. DFAT’s AusTender reporting indicates the APO procures by open tender from a panel arrangement 71 per cent of the time. The ANAO examined 53 contracts DFAT had reported this way and identified that for 15 contracts (28 per cent) the APO had deviated from the panel arrangement to the extent that the approach constituted a limited tender. The ANAO also examined 12 contracts valued over the $80,000 threshold reported by DFAT as let by limited tender. The approach taken for six of these contracts (50 per cent) did not demonstrably satisfy the limited tender condition or exemption from open tender that had been reported by DFAT. The department’s approach is inconsistent with the Commonwealth Procurement Rules, which, in turn, reflect the requirements of the Australia-United States Free Trade Agreement (DFAT is the Australian Government entity responsible for Australia’s international trade agreements). (See paragraphs 2.22 to 2.50)

15. A competitive approach was used to establish only 29 per cent of the 73 contracts tested by number, or 25 per cent by value. This involved the APO inviting more than one supplier to quote in a process that did not have a pre-determined outcome. On 19 occasions the procurement approach was not genuine, as the purported competitive process did not, in fact, involve competition. (See paragraphs 2.51 to 2.70)

16. For 14 per cent of contracts tested, evaluation criteria were included in the request documentation and those same criteria were used to assess submissions. (See paragraphs 2.72 to 2.77)

17. There was no documented approval to approach the market for 36 per cent of the 73 contracts examined in detail by the ANAO. In most cases, advice provided to approvers on the outcomes of approaches to market did not demonstrate how value for money was considered to have been achieved. Three-quarters of the time the approval was requested by an embedded contractor, often populating a template as an administrative function and sometimes at the direction of the approver, who told them what to recommend.

18. One quarter of the time, approval was given within a week of the expected contract start date. A 2022–23 practice of approving commitments on the understanding that the Department of Finance would later agree to additional funding to cover the costs was not sound financial management. (See paragraphs 2.79 to 2.104)

Accountable and transparent decision-making

19. For 71 per cent of the procurements examined by the ANAO, an appropriate contractual arrangement was in place prior to works commencing and after approval had been obtained to enter the arrangement. (See paragraphs 3.3 to 3.13)

20. Sound and timely advice was not provided to inform decisions about whether to vary contracts. In aggregate, the contracts the APO entered between 1 July 2019 and 30 June 2023 doubled in value during that period through contract amendment. The approval records for contract variations did not include advice on how value for money would be achieved and, for a number of high value contracts, approval was sought after costs had been incurred. A quarter of the variations tested were entered into after the related services had commenced and/or costs had been incurred. (See paragraphs 3.14 to 3.35)

21. ANAO analysis of AusTender data between 1 July 2019 and 30 June 2023 indicated that DFAT did not meet the Commonwealth Procurement Rules requirement to report contracts and amendments within 42 days of execution at least 22 per cent of the time. The extent of non-compliance increased to 44 per cent when the analysis was based on ANAO examination of the departmental records in a sample of 230 contracts and amendments. The AusTender reporting of 70 APO contracts examined was largely accurate. The reported descriptions of the goods or services procured were usually applicable but usually lacked detail. The reported reasons given for 112 contract amendments examined did not contain sufficient detail to meet the minimum instructions in the AusTender reporting guide 81 per cent of the time. (See paragraphs 3.37 to 3.52)
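
The 42-day reporting requirement referred to in paragraph 21 is, in essence, a date comparison applied across contract records. The following sketch is illustrative only: it is not the ANAO's analysis method, and the sample records and field names are hypothetical.

from datetime import date

REPORTING_DEADLINE_DAYS = 42  # CPR requirement: report on AusTender within 42 days of execution

def is_reported_late(contract_executed: date, published_on: date) -> bool:
    # True if the AusTender notice was published more than 42 days after contract execution
    return (published_on - contract_executed).days > REPORTING_DEADLINE_DAYS

# Hypothetical sample records: (contract execution date, AusTender publication date)
records = [
    (date(2021, 3, 1), date(2021, 4, 5)),   # 35 days: reported on time
    (date(2021, 6, 10), date(2021, 9, 1)),  # 83 days: reported late
]

late = sum(is_reported_late(executed, published) for executed, published in records)
print(f"Non-compliance rate: {late / len(records):.0%}")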

22. Procurement activities fell short of ethical requirements. In response to ethical findings made by the ANAO in relation to a number of the procurements examined as part of this performance audit, the department advised the ANAO that it considers there are clear indications of misconduct involving a number of current or former DFAT officials and contractors, as well as clear cultural issues. The department has commenced, or is considering, investigation (or referral) activity into the conduct of at least 18 individuals in relation to various procurements examined by the ANAO. (See paragraphs 3.53 to 3.83)

23. The department’s central procurement team has not exercised sufficient oversight of the APO’s procurement activities. The APO has not complied with documented departmental risk controls, and this non-compliance should have been evident to, and addressed by, the central procurement team. The department also does not have adequate arrangements in place for the identification and reporting of breaches of finance legislation. (See paragraphs 3.85 to 3.105)

Recommendations

Recommendation no. 1

Paragraph 2.21

The Department of Foreign Affairs and Trade improve its planning of procurement activity for the Australian Passport Office, including but not limited to taking steps to assure itself that procurement planning requirements (internal to the department as well as those required by the Commonwealth Procurement Rules) are being complied with.

Department of Foreign Affairs and Trade response: Agreed.

Recommendation no. 2

Paragraph 2.71

The Department of Foreign Affairs and Trade strengthen its procurement processes for the Australian Passport Office so that there is an emphasis on the use of genuinely open competition in procurement to deliver value for money outcomes consistent with the requirements and intent of the Commonwealth Procurement Rules.

Department of Foreign Affairs and Trade response: Agreed.

Recommendation no. 3

Paragraph 2.78

The Department of Foreign Affairs and Trade include evaluation criteria in request documentation for all procurements undertaken for the Australian Passport Office, and procurement decision-makers ensure those criteria have been applied in the evaluation of which candidate represents the best value for money.

Department of Foreign Affairs and Trade response: Agreed.

Recommendation no. 4

Paragraph 2.106

The Department of Foreign Affairs and Trade strengthen its procurement policy framework by directly addressing the risk of officials being cultivated or influenced by existing or potential suppliers.

Department of Foreign Affairs and Trade response: Agreed.

Recommendation no. 5

Paragraph 3.36

The Department of Foreign Affairs and Trade strengthen its controls to ensure any contract variations are consistent with the terms of the original approach to market, and that officials do not vary contracts to avoid competition or other obligations and ethical requirements under the Commonwealth Procurement Rules.

Department of Foreign Affairs and Trade response: Agreed.

Recommendation no. 6

Paragraph 3.84

The Department of Foreign Affairs and Trade examine whether procurements not included in the sample examined by the ANAO also involved ethical and integrity failures, and subject any such procurements to appropriate investigatory action.

Department of Foreign Affairs and Trade response: Agreed.

Recommendation no. 7

Paragraph 3.94

The Department of Foreign Affairs and Trade strengthen oversight by its central procurement area of the procurement activities of the Australian Passport Office. This should include being represented on the evaluation team for each procurement activity of higher risk or value.

Department of Foreign Affairs and Trade response: Agreed.

Summary of entity responses

24. The proposed report was provided to DFAT. Extracts of the proposed report were also provided to Alluvial Pty Ltd, Brink’s Australia Pty Ltd, Compas Pty Ltd, Community and Public Sector Union, Customer Driven Solutions Pty Ltd, Datacom Systems (AU) Pty Ltd, Deloitte Touche Tohmatsu, Department of Finance, Grosvenor Performance Group Pty Ltd, Hays Specialist Recruitment (Australia) Pty Ltd, Mühlbauer ID Services GmbH, Peoplebank Australia Ltd, Procurement Professionals Pty Ltd, Propel Design Pty Ltd, Randstad Pty Ltd, Serco Citizen Services Pty Ltd, Services Australia, UiPath S.R.L, Verizon Australia Pty Ltd and Yardstick Advisory Pty Ltd. The letters of response that were received for inclusion in the audit report are at Appendix 1. Summary responses, where provided, are included below.

Department of Foreign Affairs and Trade

The department values the ANAO’s independent review of procurement practices at the Australian Passport Office (APO). The audit came at a time when the department was assessing the effectiveness of the current procurement model. As a result of both reviews, the department’s procurement practices will be amended to improve compliance and efficiency. This will include the Finance Division taking more centralised and direct control over procurement activities, and additional resources to implement changes and provide enhanced oversight.

The ANAO audit highlighted the proactive steps the current Executive Director APO took to address procurement and cultural issues when she commenced with the department in early 2023. Work has continued, leading to the creation of a new Procurement, Finance and Assurance Section within APO. Additionally, the Internal Audit Branch has initiated a wide-ranging internal audit of procurement activities across the department.

Following the ANAO audit report and internal reviews, the department will also revise its Compliance and Assurance Framework as it relates to Public Governance, Performance and Accountability Act 2013 obligations. The updated Framework will be purpose-built, adopt a risk-based approach, and include effective assurance mechanisms. The department has initiated activities to address specific areas of concern regarding actions of staff.

Compas Pty Ltd

Compas is concerned that the Proposed Report conveys an imputation that it has engaged in conduct that may not be in accordance with the Commonwealth Procurement Rules.

Such an imputation is incorrect.

The relevant evaluation process was an internal DFAT process over which Compas, rightly, had no visibility. Given this, Compas cannot respond to, nor is it privy to, what processes were taken by DFAT to address the Panel Member’s affiliation to it.

Any deficiencies in the evaluation process cannot be attributed to Compas, and the final report should make this expressly clear in its findings. Any failure to do so could result in a reader being under a misapprehension that Compas had the ability to influence the process, did in fact influence the process improperly and, as a result, gained an improper advantage or benefit.

Should such a misrepresentation occur, this would have an unreasonably adverse effect on Compas’ reputation that it has built over nearly 40 years and have a deleterious effect on our business.

Propel Design Pty Ltd

Propel Design notes the extract provided by the ANAO. Propel Design submitted its tender for the procurement in question in accordance with all requirements under the Digital Marketplace (now BUYICT) and was not aware of any individuals appointed to the evaluation panel. We believe our employee was selected as the preferred contractor based on their skills and experience, as set out in their resumé and our responses to the selection criteria.

Key messages from this audit for all Australian Government entities

25. Below is a summary of key messages that have been identified in this audit and may be relevant for the operations of other Australian Government entities.

Governance

  • Good governance involves entity leaders developing a culture requiring and supporting actions which are not only in compliance with rule frameworks but also with the intent of those frameworks, including those which set standards for ethical practices.
  • Praising or otherwise rewarding staff for getting contracts in place quickly can influence staff to bypass procurement rules and to direct-source incumbent suppliers. Forward planning and reinforcement of rules and processes can remove or reduce time pressure.

Procurement

  • Acting ethically when conducting procurement requires ethical principles to be applied to all actions throughout the entire procurement process and for management action to be taken when ethical issues are identified. It also requires good recordkeeping so that it can be transparently demonstrated that the procurement was conducted ethically.
  • Entities should treat all tenderers, and potential tenderers, in a fair and non-discriminatory manner. This means that entities should undertake genuinely competitive procurements by not identifying a preferred supplier in advance of conducting a procurement process. A practice of inviting market responses when a preferred supplier has already been identified also wastes the other suppliers’ time and resources and reduces trust in the integrity of government procurement.
  • Entities should ensure that their procurement frameworks specifically address how risks of incumbency advantage are managed so that the procurement process is conducted with no bias or favouritism, and to maximise value for money to the Australian Government through competitive selection processes. In addition, the risks of employees and/or contractors having a conflict of interest with potential/actual market respondents, including any incumbent, should be fully considered and addressed.
Type: Performance audit
Report number: 2 of 2024-25
Portfolios: Defence
Entities: Department of Defence
Date tabled:

Summary and recommendations

Background

1. The security of government information and communications technology (ICT) systems, networks and data supports Australia’s social, economic and national security interests as well as the privacy of its citizens. Malicious cyber activity has been identified as a significant threat affecting Australians, exacerbated by low levels of cyber maturity across many Australian Government entities.1

2. The Department of Defence’s (Defence’s) mission and purpose is to defend Australia and its national interests in order to advance Australia’s security and prosperity. Defence’s 2022 Cyber Security Strategy states that ‘Malicious cyber activity now represents one of Defence’s most critical risks.’2

3. The Protective Security Policy Framework (PSPF) was introduced in 2010 to help Australian Government entities protect their people, information and assets, both at home and overseas. The PSPF sets out the government’s protective security policy approach and comprises 16 core policies.3 PSPF Policy 11 Robust ICT systems requires that:

Entities must [emphasis in original] only process, store or communicate information and data on an ICT system that the determining authority (or their delegate) has authorised to operate based on the acceptance of the residual security risks associated with its operation.4

When establishing new ICT systems, or implementing improvements to an existing system, the decision to authorise (or reauthorise) a system to operate must [emphasis in original] be based on the Information Security Manual’s [ISM] six step risk-based approach for cyber security.5

4. Defence has established the Defence Security Principles Framework (DSPF) to support compliance with the requirements of the PSPF. The DSPF outlines Defence’s requirements for ICT assessment and authorisation including that ‘all Defence ICT systems must be authorised prior to processing, storing or communicating official information’.

Rationale for undertaking the audit

5. Through its 2022 Cyber Security Strategy, Defence has recognised that ‘Malicious cyber activity now represents one of Defence’s most critical risks.’ Robust ICT systems protect the confidentiality, integrity and availability of the information and data that entities process, store and communicate. PSPF Policy 11 outlines how entities can safeguard ICT systems through assessment and authorisation activities to support the secure and continuous delivery of government business.

6. Questions regarding Defence’s system authorisation process were raised at hearings of the Senate Foreign Affairs, Defence and Trade Legislation Committee in June 2021, including in relation to:

  • Defence’s use of provisional authorisations beyond 12 months for systems where security concerns have not been sufficiently addressed;
  • deficiencies in Defence’s processes for identifying and assessing risks as part of the authorisation process; and
  • DSPF compliance with the Information Security Manual (ISM).

7. This audit was conducted to provide assurance to the Parliament on Defence’s arrangements for the management of ICT systems security authorisations.6

Audit objective and criteria

8. The audit objective was to assess the effectiveness of the Department of Defence’s arrangements to manage the security authorisation of its ICT systems.

9. To form a conclusion against this objective, the following high-level criteria were adopted.

  • Does Defence have fit-for-purpose arrangements for the security authorisation of its ICT systems?
  • Has Defence implemented its arrangements for the security authorisation of its ICT systems?

Engagement with the Australian Signals Directorate

10. Independent timely reporting on the implementation of the cyber security policy framework supports public accountability by providing an evidence base for the Parliament to hold the executive government and individual entities to account. Previous ANAO reports on cyber security have drawn to the attention of Parliament and relevant entities the need for change in entity implementation of mandatory cyber security requirements, at both the individual entity and framework levels.

11. In preparing audit reports to the Parliament on cyber security in Australian Government entities, the interests of accountability and transparency must be balanced with the need to manage cyber security risks. The Australian Signals Directorate (ASD) has advised the ANAO that adversaries use publicly available information about cyber vulnerabilities to more effectively target their malicious activities.

12. The extent to which this report details the cyber security vulnerabilities of Defence was a matter of careful consideration during the course of this audit. To assist in appropriately balancing the interests of accountability and potential risk exposure through transparent audit reporting, the ANAO engaged with ASD to better understand the evolving nature and extent of risk exposure that may arise through the disclosure of technical information in the audit report. This report focusses on matters material to the audit findings against the objective and criteria.

Conclusion

13. Defence’s arrangements to manage the security authorisation of its ICT systems have been partly effective. Systems have not been authorised in a timely manner and were assessed through processes that did not consistently comply with Protective Security Policy Framework (PSPF) requirements.

14. Defence’s arrangements for the security authorisation of its ICT systems are partly fit for purpose. Defence’s policies, frameworks and processes to support system assessment and authorisation have not been regularly reviewed or updated to align with PSPF and Defence Security Principles Framework (DSPF) requirements. These policy and process documents are internally inconsistent. Defence has not established training to ensure that key personnel involved in the authorisation process remain up-to-date with changing cyber security requirements in the Information Security Manual (ISM) and PSPF.

15. Defence has partly implemented arrangements for the security authorisation of its ICT systems. Defence’s data on its system assessments and authorisations is incomplete and indicates that System Owner obligations to obtain and maintain authorisation of their systems are not being fulfilled.

16. There were deficiencies in relation to Defence’s monitoring and reporting arrangements, including non-compliance with DSPF reporting requirements. Key information on the authorisation status of Defence’s systems was omitted from Defence’s reporting, including not addressing a request from the Minister for Defence to include metrics in reporting on unapproved ICT systems within Defence. Defence’s internal and external reporting on its assessments indicated a more optimistic outlook than was reflected in other internal Defence documentation. Across the ICT systems examined in case studies, deficiencies included: the absence of key data and mandatory security documentation; no evidence of assessment of control implementation; and deficiencies in the peer review process.

Supporting findings

Defence’s arrangements for the security authorisation of its ICT systems

17. Defence has not appropriately maintained its policy and governance framework for the authorisation of its ICT systems. When the DSPF was implemented in July 2018, some sections were not complete, with key authorisation roles listed but not defined for 13 of the 14 Defence Services and Groups. These roles remained undefined until a May 2024 review of the DSPF. Prior to the May 2024 update, DSPF Principle 23 and DSPF Control 23.1 had not been updated since July 2020. This meant that key changes to the mandatory requirements in PSPF Policies 10 and 11 between August 2020 and February 2022 — such as the introduction of the ‘Essential Eight’ and the ISM six-step process for system assessment and authorisation — were not reflected in the DSPF until 10 May 2024. (See paragraphs 2.3–2.25)

18. Directives, instructions, and policies issued by the Australian Defence Force (ADF) services for ICT authorisations for Army, Navy and Air Force systems contain provisions that are either not consistent with or not permitted by the requirements of the DSPF, or PSPF Policy 11. These provisions have allowed for exemptions to Defence’s system authorisation process that are not permitted under the DSPF or PSPF. (See paragraphs 2.26–2.38)

19. A key supporting framework, the Defence ICT Certification and Accreditation Framework (DICAF) — developed to ensure consistency in the authorisation process for all Defence ICT systems that process, store or communicate official, sensitive or classified information — has been in draft since December 2015. As at May 2024, the DICAF remains incomplete, with a placeholder for a key section on the assessment and authorisation process that is yet to be developed. In response to shortcomings identified in the DICAF by an internal audit in May 2020, Defence developed a separate ‘Assessment and Authorisation Framework’ document in December 2021. The framework was approved by Defence’s Chief Information Security Officer (CISO) in February 2024 and released in May 2024. (See paragraphs 2.39–2.51)

20. Defence does not have an up-to-date set of consolidated guidance to support the implementation of its framework in a consistent manner across the organisation. Defence’s assessment and authorisation process guidance is internally inconsistent and a number of supporting templates have not been finalised or are outdated. Separate instructions, directives and policies exist for the Army, Navy and Air Force, which include some requirements that are inconsistent with Defence’s assessment and authorisation process, the DSPF and PSPF. (See paragraphs 2.52–2.73)

21. Defence has not established training to ensure that Security Assessors remain up-to-date on evolving cyber security requirements, instead relying on peer review and Assessment Authority review to mitigate any ‘deficiencies in knowledge’. Deficiencies were identified in Defence’s implementation of the peer review process and Defence does not undertake assurance activities to monitor the extent to which training is completed. The absence of a formalised training approach to support the implementation of DSPF requirements for the assessment and authorisation of ICT systems creates a risk that systems are not being authorised as intended. Defence data on ICT system authorisations shows that 47 per cent of its systems have a status of either ‘Expired’ or ‘No accreditation’, indicating that System Owner obligations in respect of obtaining and maintaining the authorisation of their systems are not being met. (See paragraphs 2.74–2.93)

Implementation of arrangements for the security authorisation of Defence’s ICT systems

22. Defence’s data indicates that the obligations of System Owners to obtain and maintain the authorisation of their systems are not being fulfilled. (See paragraphs 3.5–3.26)

23. Defence self-assesses and reports annually on its compliance with PSPF Policy 11 and has established governance and internal reporting requirements for DSPF controls, including DSPF Control 23.1 ICT Certification and Accreditation. Deficiencies in Defence’s reporting include that:

  • Defence has not reported on the authorisation status of ICT systems at an enterprise level since 2018–19 in its PSPF and DSPF reporting (a key indicator of compliance against DSPF Control 23.1 and PSPF Policy 11);
  • Defence’s PSPF and DSPF reporting is not consistent with, and does not reflect, other information available within Defence on the assessment and authorisation of its ICT systems; and
  • Defence has not complied with the DSPF requirement to provide individual Control Owner reports to the Defence Security Committee since 2019–20. (See paragraphs 3.27–3.68)

24. Defence has not briefed the minister on its ICT assessment and authorisation activities in the last three years. In September 2019, the minister requested that Defence include a metric on the reduction of unapproved systems in an ‘ICT reform stream report’. Defence did not address this request. (See paragraphs 3.69–3.73)

25. Defence has not consistently complied with the requirements of its assessment and authorisation process. For example, for all five systems examined:

  • key supporting data had not been entered in Defence’s ICT authorisation management system, and mandatory security documentation had not been provided to the Security Assessors;
  • Defence was unable to substantiate that document reviews and control implementation assessments took place; and
  • there were shortcomings in the peer review process, including not identifying that mandatory security documentation was missing, and not identifying inaccuracies and errors in Risk Assessments. (See paragraphs 3.74–3.90)

26. There were instances where systems had been re-authorised based on the re-authorisation triggers in the DSPF. These re-authorisations were not always granted prior to authorisation expiry. (See paragraphs 3.91–3.100)

Recommendations

Recommendation no. 1

Paragraph 2.24

The Department of Defence ensure that DSPF roles and requirements for system assessment and authorisation are complete, current, and regularly reviewed for alignment with the PSPF and Group/Service appointments.

Department of Defence response: Agreed.

Recommendation no. 2

Paragraph 2.72

The Department of Defence conducts a review of, and updates, its assessment and authorisation process documentation to ensure:

  1. alignment with current DSPF and PSPF requirements;
  2. consistency across all internal guidance documents, including those developed by the ADF Services; and
  3. that any internal inconsistencies within individual guidance documents are eliminated.

Department of Defence response: Agreed.

Recommendation no. 3

Paragraph 2.89

The Department of Defence:

  1. implements improved training and awareness raising activities to ensure that key personnel involved in the assessment and authorisation process are aware of their obligations under the PSPF and DSPF, and remain up-to-date with evolving cyber security requirements; and
  2. implements a framework to monitor and report on the completion of training and awareness raising activities.

Department of Defence response: Agreed.

Recommendation no. 4

Paragraph 3.25

The Department of Defence develops and implements processes to ensure that information entered into its ICT authorisation management system is complete, accurate, and supports effective monitoring of ICT system authorisations.

Department of Defence response: Agreed.

Recommendation no. 5

Paragraph 3.45

The Department of Defence:

  1. implement enterprise-wide assurance arrangements to support the effective implementation of DSPF system authorisation requirements; and
  2. implement arrangements to ensure that deficiencies and non-compliance identified through Service assurance activities relating to system authorisations are addressed and rectified.

Department of Defence response: Agreed.

Recommendation no. 6

Paragraph 3.67

The Department of Defence implement arrangements to ensure reporting to senior Defence leadership on compliance with system authorisation requirements under the PSPF and DSPF is comprehensive, accurate, and based on available data.

Department of Defence response: Agreed.

Recommendation no. 7

Paragraph 3.72

The Department of Defence:

  1. ensures that relevant ministers are provided with timely and accurate advice on key issues and risks relating to Defence’s ICT security authorisations and its compliance with the PSPF; and
  2. provides regular (at least annual) updates to relevant ministers to support oversight for improvements to its assessment and authorisation policies, frameworks and processes.

Department of Defence response: Agreed.

Recommendation no. 8

Paragraph 3.98

The Department of Defence implements arrangements to ensure that PSPF requirements, DSPF requirements and Defence’s assessment and authorisation process are complied with, including:

  1. ensuring that all required documentation has been completed prior to system assessment and authorisation;
  2. documenting the approval and review of mandatory supporting documentation;
  3. conducting and documenting assessments of the implementation and effectiveness of controls and provisional authorisation conditions against all relevant ISM and DSPF controls; and
  4. ensuring systems are proactively monitored against the conditions for re-authorisation.

Department of Defence response: Agreed.

Summary of the Department of Defence’s response

27. The proposed audit report was provided to the Department of Defence. Defence’s summary response is provided below, and its full response is included at Appendix 1. Improvements observed by the ANAO during the course of this audit are listed in Appendix 2.

Defence welcomes the Auditor-General Report: Defence’s Management of ICT Security Authorisation. Defence agrees to the eight recommendations aimed at improving Defence’s Cyber Security Assessment and Authorisation Framework to more effectively govern and monitor the authorisation of ICT systems and networks and control cyber-related ICT risk.

Defence is committed to strengthening and standardising our approach to safeguarding data from cyber threats and ensuring the secure operation of our ICT systems to protect the continuous delivery of Defence outcomes. Defence is currently reviewing its Cyber Security Assessment and Authorisation Framework, along with the associated policies, practices and processes, as part of Defence’s wider initiative to uplift cyber security governance and its cyber risk management framework. This includes an overhaul of several pertinent Defence Security Principles Framework policies, which are undergoing review, along with a program to drive Essential 8 Maturity.

Key messages from this audit for all Australian Government entities

28. Below is a summary of key messages, including instances of good practice, which have been identified in this audit and may be relevant for the operations of other Australian Government entities.

Governance and risk management

  • Policy and guidance documentation should be kept up-to-date, especially when it relates to activities to manage key entity risks. Periodic review of documentation helps maintain its fitness-for-purpose and alignment with Commonwealth standards.

Performance and impact measurement

  • Accurate and transparent monitoring and reporting on compliance supports effective decision-making and accountability. Monitoring and reporting should be supported by data that is accurate and complete.
Type: Performance audit
Report number: 1 of 2024-25
Portfolios: Defence
Entities: Department of Defence
Date tabled:

Summary and recommendations

Background

1. Security vetting involves the assessment of an individual’s suitability to hold a security clearance at a particular level. Australian Government employees and contractors require a security clearance to access classified resources, which can relate to Australia’s national security, economic and other interests.1 The security vetting and clearance process is an important risk mitigation activity intended to protect the national interest, which can also affect an individual’s employment and the business operations of entities if not managed effectively or in a timely manner.

2. The Australian Government Security Vetting Agency (AGSVA) is part of the Department of Defence (Defence) and provides security clearance assessments as a whole-of-government service. In February 2014, Defence identified the need for long-term and potentially significant investment in ICT solutions because the existing system used by AGSVA to process security clearances, the Personnel Security Assessment Management System (PSAMS), did not have the ‘functionality needed for the future’. The February 2016 Defence Integrated Investment Program (IIP) subsequently outlined a need for ‘expanded security vetting’ as one of the ‘principal areas of focus’ for Defence.2

3. In October 2016, the Australian Government agreed to a suite of reforms to improve government entities’ management of the threat posed by malicious insiders, which included upgrading AGSVA’s ICT system.3

Vetting Transformation Project

4. The ‘Defence and Security Vetting Services 20/20 Reform Program’ was established in December 2016 and consisted of four workstreams: vetting; security policy, services and advice; security governance, assurance and reporting; and cultural change. The objectives for the vetting workstream included delivering: a new vetting security business model; a supporting ICT system; and relevant training, communications and change management activities.

5. The Vetting Transformation Project was established to deliver the vetting workstream objectives, including the design and implementation of a new system that:

  • provides sponsoring entities with information on identified risk factors associated with individual clearance holders;
  • increases automation of clearance decision-making and data collection (including across other government holdings, and online social-media information); and
  • supports continuous assessment of security risk.4

Previous ANAO reports

6. The ANAO previously reviewed Defence’s performance in providing security vetting services through AGSVA in the following performance audits.

  • Auditor-General Report No. 45 2014–15 Central Administration of Security Vetting, which was presented for tabling in Parliament in June 2015. The audit conclusion was that the performance of centralised vetting had been mixed and government expectations of improved efficiency and cost savings had not been realised.5
  • Auditor-General Report No. 38 2017–18 Mitigating Insider Threats through Personnel Security, which was presented for tabling in May 2018. The audit conclusion was that the effectiveness of personnel security arrangements for managing insider threats had been reduced by AGSVA not implementing the government’s policy direction to share information with client entities on identified personnel security risks. The report also observed that AGSVA planned to realise the necessary process improvements through the procurement of a new ICT system, expected to be fully operational in 2023.6

Rationale for undertaking the audit

7. The ANAO undertook this audit, and previous (2015 and 2018) audits of Defence’s provision of security vetting services through AGSVA, as effective personnel security arrangements underpin the protection of the Australian Government’s people, information and assets. Previous audits identified deficiencies in AGSVA’s information systems. In the context of the Joint Committee of Public Accounts and Audit’s (JCPAA) inquiry into the ANAO’s 2018 audit, Defence advised the JCPAA that a project to build a new ICT system had received first-pass approval in April 2018, with delivery of the ‘initial operating capability’ (the base capability) expected in late 2020.7

8. The base capability of the new system was introduced on 28 November 2022. By February 2023, the extent of user issues experienced after the system ‘went live’ was the subject of parliamentary interest. This audit provides independent assurance to the Parliament on the effectiveness of Defence’s procurement and implementation of the new ICT system, now known as myClearance, and Defence’s remediation progress to date.

Audit objective and criteria

9. The objective of the audit was to assess the effectiveness of Defence’s procurement and implementation of the myClearance system to date.

10. To form a conclusion against the audit objective the following high-level criteria were adopted.

  • Did Defence plan effectively and establish fit-for-purpose governance, oversight and reporting arrangements?
  • Was Defence’s implementation of the system effective and supported by procurement processes conducted in accordance with the Commonwealth Procurement Rules (CPRs)?

11. The audit focused on the procurement of the project approval and support services provider (Deloitte), the prime systems integrator (Accenture), the organisational change management partner (KPMG) and the project delivery partner (VOAK Group). The audit also considered the arrangements used to procure the hardware and software components of the myClearance system, and other services to manage the delivery of the Vetting Transformation Project. The audit did not examine Defence’s administration or management of its contracts with the service providers.

Conclusion

12. Defence’s procurement and implementation of the myClearance system to date has been partly effective. The full functionality of the system will not be delivered as key elements, including the continuous assessment, automated risk-sharing and enhanced interface functionalities, were de-scoped from the project in November 2023.

13. Defence’s planning activities were largely effective. Early planning work in 2016 and 2017 focused on industry engagement and assessing the market’s ability to deliver and integrate the new IT system into Defence’s ICT environment. Work to refine the user and system requirements in mid-2018 was not informed by other government entities or stakeholders. Defence designed governance, oversight and reporting arrangements in line with the requirements of its Capability Life Cycle framework. The project governance arrangements were not implemented effectively and there was a lack of clarity on the purpose of and relationship between the various decision-making forums. Project reporting did not support informed, risk-based decision-making as project risks and issues were not clearly communicated to Defence leadership.

14. Defence’s procurement processes were partly effective. The processes to engage project approval and support services and the organisational change management partner were conducted in line with the Commonwealth Procurement Rules (CPRs). The process to engage the prime systems integrator was not consistent with the CPRs. The tender documentation included a list of mandatory products referring to trade names and producers — an approach that did not comply with Defence’s procurement policy framework. Defence’s conduct of the ‘Analysis of Alternatives’ in early 2020 resulted in material changes to the technical solution, schedule and delivery approach and provided opportunities to the preferred supplier that were not provided to other prospective suppliers. Defence’s approach to engaging the Project Delivery Partner in 2022 did not comply with Defence’s Accountable Authority Instructions or the intent of the CPRs.

15. Defence’s implementation of the myClearance system has been partly effective. Identified risks and issues were not resolved in a timely manner. Data cleansing and migration activities were not effective. Testing processes were truncated and were not conducted in line with agreed testing plans or Defence guidance. To address the issues encountered after the core vetting system went live in November 2022, Defence established the myClearance taskforce in February 2023. Defence’s remediation activities have progressively improved the performance of the system since it went live. In July 2023, Defence advised government that it had delivered a system that largely met the initial operating capability requirements. In November 2023 Defence advised government that the myClearance system would not deliver the full functionality as approved in December 2020.

Supporting findings

Effectiveness of planning activities

16. Defence conducted early planning activities between late 2016 and early 2018. Industry engagement and market research was undertaken to assess the market’s ability to design, build and integrate a new IT system into Defence’s ICT environment. Workshops and forums held to refine the user requirements and technical components in June 2018 did not include external stakeholders such as other government entities with ICT systems that AGSVA’s new vetting system would need to integrate or interface with. (See paragraphs 2.7 to 2.29)

17. The financial and technical risks associated with the planned procurement were assessed. To mitigate some of the identified risks, a list of mandatory products referring to trade names and producers was included in Defence’s tender documentation for the IT solution to be delivered by the systems integrator. As a result, the design of the procurement:

  • did not comply with Defence’s procurement policy framework and was inconsistent with the Commonwealth Procurement Rules (CPRs);
  • reduced the opportunity for suppliers to propose alternative solutions based on ‘functional and performance requirements’ that may have met Defence’s requirements; and
  • introduced critical dependencies that increased the integration and schedule risks of the project. These risks were not effectively managed or communicated to senior Defence leadership or government. (See paragraphs 2.30 to 2.52)

Governance, oversight and reporting arrangements

18. Defence established governance, oversight and reporting arrangements for the Vetting Transformation Project in accordance with its Capability Life Cycle Manual — a framework that was designed to govern Defence’s acquisition of complex military equipment and materiel. These arrangements were not implemented effectively. (See paragraphs 2.63 to 2.79)

19. Reporting to decision-making forums accurately assessed the risks and issues that contributed to the problems experienced after the system ‘went live’. The impacts of those risks and issues on the expected functionality and capability of the system were not clearly communicated to Defence leadership. (See paragraphs 2.86 to 2.96)

20. Successive reviews, including independent assurance reviews, found that project governance arrangements were not ‘formally defined and maintained’ and there was a lack of clarity on the purpose of and relationship between each forum within the governance model. At March 2024, Defence had commenced a program of work to address the identified governance issues, including the implementation of a new governance model for the project. (See paragraphs 2.82 to 2.84 and 2.103 to 2.112)

Procurement processes

21. The processes to engage project approval and support services and the organisational change management partner were conducted in accordance with the CPRs. For the prime systems integrator (PSI) procurement, processes such as initial screening, evaluation, value for money assessment, and additional clarification activities were compliant with CPR requirements. Key shortcomings in the design of the PSI procurement resulted in the conduct of activities that were not consistent with the CPRs. These activities involved material changes to the technical solution, schedule and delivery approach and provided opportunities to the preferred supplier that were not provided to other prospective suppliers. These opportunities enabled the preferred supplier to develop a ‘solution to a budget’ and submit costings for work it did not originally tender for. (See paragraphs 3.15 to 3.44)

22. Defence did not comply with its Accountable Authority Instructions for the procurement of the Project Delivery Partner in June 2022. Up to 85 per cent of the project management and other specialist support services were engaged through approaches to single suppliers, selected from a panel on each occasion. This approach was technically compliant with the CPRs but was not consistent with their intent — to drive value for money through competition. (See paragraphs 3.48 to 3.56)

Implementation of the system

23. Identified risks and issues were not resolved in a timely manner and cumulative delays in providing Government Furnished Materials to the Prime Systems Integrator gave rise to risks impacting the critical path of the project. These risks were realised, reducing the time available to test the system as required prior to the core vetting system (the base capability) going live on 28 November 2022. (See paragraphs 3.63 to 3.66, 3.70 to 3.72, and 3.84 to 3.90)

  • Data cleansing and migration activities were not conducted effectively or completed in a timely manner. Representative data (production data) was not used for testing as planned. The impacts arising from these issues on the functionality and capability of the system were not clearly communicated to decision-makers. (See paragraphs 3.103 to 3.110)
  • Testing activities were truncated and were not conducted in line with agreed testing plans or in a manner consistent with Defence guidance. Testing activities that were to be conducted sequentially were conducted in parallel. (See paragraphs 3.111 to 3.123)
  • Defence does not have a program in place to monitor and review privileged user activity and does not have a process to periodically revalidate user accounts for the myClearance system. (See paragraphs 3.91 to 3.100)

24. Throughout 2023, Defence’s myClearance taskforce achieved progressive improvements to the core vetting system. In November 2023, Defence recommended that the government agree to de-scoping the continuous assessment, automated risk sharing, use of artificial intelligence, and enhanced interfaces from the myClearance system. As a consequence, the myClearance system will not deliver the desired capability uplift or provide the full functionality advised to government in December 2020. (See paragraphs 3.135 to 3.139)

Recommendations

25. The ANAO has made two recommendations to improve risk management for complex, high-value ICT projects and to manage and maintain the security of the system.

Recommendation no. 1

Paragraph 2.53

The Department of Defence ensure that risk management plans, comprising a risk appetite statement and risk tolerances, are developed, implemented and maintained for its complex, high value ICT projects.

Department of Defence response: Agreed.

Recommendation no. 2

Paragraph 3.101

The Department of Defence develop and implement a program of work to periodically revalidate user access and monitor privileged user accounts to ensure that management of the myClearance system complies with the requirements of the Information Security Manual.

Department of Defence response: Agreed.

Summary of the Department of Defence’s response

26. The proposed audit report was provided to the Department of Defence. Defence’s summary response is provided below, and its full response is included at Appendix 1. Improvements observed by the ANAO during the course of this audit are listed in Appendix 2.

Defence acknowledges the Auditor-General’s findings that the implementation of the myClearance system was partly effective. Defence is committed to strengthening procurement and governance arrangements, ensuring important projects are delivered in the best interests of Australia’s national security.

Defence has achieved substantial improvements in security clearance processing since the system launched. Following the introduction of myClearance in November 2022, over 110,000 clearances have been processed, with over 75,000 clearances completed in the myClearance system during 2023–24. Vetting timeframes for all clearance levels are also being consistently met.

Defence is committed to increasing ICT project risk oversight and management through three robust lines of assurance to ensure decision makers are well informed of emerging risks and potential impacts. The methodology includes:

  • Establishing robust first-line assurance for ICT projects prior to progressing through gate decisions, ensuring all mandatory project artefacts are complete and performance milestones are achieved;
  • Increasing second-line assurance, assessing ICT project governance implementation and the end-to-end business solution; and
  • Continuing third-line enterprise level objective assessment of adequacy, effectiveness and efficiency of governance, performance and risk management.

Defence is confident this holistic approach to oversight and assurance will enable active identification, robust management and reporting of risks and opportunities.

Key messages from this audit for all Australian Government entities

27. Below is a summary of key messages, including instances of good practice, which have been identified in this audit and may be relevant for the operations of other Australian Government entities.

Governance and risk management

  • Reporting on technical issues and risks should be clearly communicated to senior leaders and decision makers in plain English and in terms of the anticipated impact on the functionality or capability of the system.
Type: Performance audit
Report number: 47 of 2023-24
Portfolios: Defence
Entities: Department of Defence
Date tabled:

Summary and recommendations

Background

1. The Mulwala facility in New South Wales is the sole remaining manufacturing site of military propellants and high explosives in Australia. The nearby munitions facility at Benalla, Victoria, uses some of the output of the Mulwala facility in its operations. Both facilities are owned by the Commonwealth and operated by a third party, Australian Munitions, a wholly owned subsidiary of Thales Australia (Thales).1 Thales has managed and operated the facilities at Benalla and Mulwala under several different contractual arrangements since 1999 (outlined in Appendix 3).

2. The Australian Government announced on 29 June 2020 that the Department of Defence (Defence) had signed a new 10-year agreement valued at $1.2 billion with Thales for the continued management and operation of the Mulwala and Benalla facilities.2 The agreement was intended to provide surety of supply of key munitions and components for the Australian Defence Force (ADF) and maintain a domestic munitions manufacturing capability. The agreement took effect on 1 July 2020 and resulted from a complex multi-year sole source procurement begun in 2016. The sole source procurement followed a terminated competitive procurement process undertaken between 2009 and 2014.

3. The Australian Government also announced on 29 June 2020 a new contract between the Commonwealth and NIOA Munitions (NIOA) for a tenancy at the Benalla munitions factory.3 This agreement was to establish NIOA as a tenant alongside Thales and provide opportunities for domestic manufacturing while enhancing supplies of key munitions for Defence.4

4. On 24 April 2023, the Australian Government released a public version of the final report of the Defence Strategic Review (DSR).5 It referenced the continuing importance of advanced munitions manufacturing, stating that the immediate focus must be on consolidating ADF guided weapons and explosive ordnance (GWEO) needs, establishing a domestic manufacturing capability, and accelerating foreign military and commercial sales. The report further outlined that, to do this, the ADF must hold sufficient stocks of GWEO and have the ability to manufacture certain lines, with the realisation of a GWEO enterprise being ‘central to achieving this objective.’6

5. As at 19 June 2024, the implementation of a GWEO enterprise remains a key government priority, with the domestic manufacture of GWEO and munitions in Australia included: as one of seven ‘Sovereign Defence Industrial Priorities’ in the Defence Industry Development Strategy (announced in February 2024)7; and as part of the ‘immediate priorities’ set out in the public versions of the 2024 Integrated Investment Program (IIP) and the 2024 National Defence Strategy (both announced on 17 April 2024).8

6. On 5 May 2023, the Minister for Defence Industry announced the appointment of a senior responsible officer with responsibility for a Defence GWEO enterprise.9 As at June 2024, Defence’s website stated that the facilities at Mulwala and Benalla ‘are key assets within the GWEO enterprise and will play a role in the expansion of domestic GWEO manufacturing.’10

Rationale for undertaking the audit

7. To establish the arrangements for the operation and maintenance of the Mulwala and Benalla facilities beyond June 2020, Defence undertook a complex and lengthy procurement process that was based on a sole source approach. This audit examined whether this process was effective and in accordance with the Commonwealth Procurement Rules (CPRs).

8. This audit builds on previous work by the ANAO which has examined Defence’s management of the Benalla and Mulwala facilities over time, and provides independent assurance to the Parliament on Defence’s establishment of arrangements for the operation and maintenance of the Mulwala and Benalla facilities beyond June 2020.

Audit objective and criteria

9. The audit objective was to assess whether the arrangements for the operation and maintenance of the Mulwala and Benalla facilities beyond June 2020 were established through appropriate processes and in accordance with the CPRs.

10. To form a conclusion against the audit objective, the following high-level criteria were selected:

  • Did Defence plan effectively for the operation and maintenance of the facilities beyond the expiry of the 2015–20 interim contract?
  • Did Defence conduct an effective sole source procurement process to establish the 2020–30 contractual arrangements?
  • Did Defence effectively manage probity throughout the process?

11. This report is the first of two performance audit reports examining Defence’s establishment and management of the facilities beyond June 2020. It focuses on Defence’s establishment of the 2020–30 operating arrangements, including the tender assessment process, advice to decision makers and the decision to conduct a sole source procurement. Defence’s management of performance against the contract is the focus of a second report, which will be presented for tabling later in 2024.

Conclusion

12. Defence’s conduct of the sole source procurement for the operation and maintenance of the Mulwala and Benalla facilities beyond June 2020 was partly effective. Defence’s management of probity was not effective and there was evidence of unethical conduct.

13. Defence’s planning processes prior to the expiry of the 2015 interim contract were partly effective. While options for the management of the facilities beyond June 2020 were developed, deficiencies were identified in Defence’s subsequent procurement and probity planning processes and in its advice to decision-makers. Defence’s decision to conduct a sole sourced procurement was not informed by an estimated value of the procurement prior to this decision and Defence did not document the legal basis for selecting a sole sourced procurement approach, as required by the CPRs. Probity risks were realised in 2016 when Defence personnel provided Thales with confidential information relating to its Investment Committee (IC) proposal, and advice to decision-makers did not address how value for money would be achieved and commercial leverage maintained in the context of a sole source procurement.

14. Defence’s conduct of the sole source procurement process to establish the 2020–30 contractual arrangements was partly effective. Risk assessments were not timely and appropriate records for key meetings with Thales during the tender process were not developed or retained by Defence. After assessing Thales’ tender response as not being value for money in October 2019, Defence proceeded to contract negotiations in December 2019 notwithstanding internal advice that Defence was at a disadvantage in such negotiations due to timing pressures.

15. The negotiated outcomes were not fully consistent with Defence’s objectives and success criteria. Defence’s approach of negotiating the contract against a set of agreed high-level issues reduced the line of sight between the request for tender (RFT) requirements and the negotiated outcomes. Defence’s advice to ministers on the tender and contract negotiations did not inform them of the extent of tender non-compliance, the basis of the decision to proceed to negotiations, or the ‘very high risk’ nature of the negotiation schedule.

16. Defence did not establish appropriate probity arrangements in a timely manner. A procurement-specific probity framework to manage risks associated with the high level of interaction between Defence and Thales was not put in place until July 2018. Probity risks arose and were realised during 2016 and 2017, including when a Defence official solicited a bottle of champagne from a Thales representative. Defence did not maintain records relating to probity management and could not demonstrate that required briefings on probity and other legal requirements were delivered.

Supporting findings

Planning during the interim contract period

Options development and consideration of facilities management beyond June 2020

17. Defence provided advice to the Minister for Defence during 2014 on a range of options for the management of the facilities beyond June 2020, including: continuing with the status quo; the Commonwealth operating the facilities; and closing the facilities. These options continued to be considered by Defence and the government between 2015 and mid-2017. In 2016, a clear preference emerged to sole source the operation and maintenance of the facilities to the incumbent, Thales. By July 2016, Defence was primarily focused on developing a proposed ‘strategic partnership’ arrangement with Thales. Defence did not document the legal basis (that is, an exemption provided by paragraph 2.6 of the CPRs) for the proposed sole source activity to inform its subsequent procurement planning (see paragraphs 2.1 to 2.51).

18. A procurement-specific probity framework was not put in place until July 2018, to help manage probity risks in the context of pursuing a strategic partnership arrangement with Thales. These risks crystallised during 2016 when:

  • senior Defence personnel advised Thales at an October 2016 summit meeting that Defence’s preference would be to progress a government-owned contractor-operated arrangement with Thales into the future.
  • a Defence official sought assistance from and provided information to Thales in November 2016 on the development of internal advice to the IC, Defence’s committee processes, and internal Defence thinking and positioning. Government information of this sort is normally considered confidential, and the relevant email exchange evidenced unethical conduct (see paragraphs 2.48 to 2.51).

Advice and analysis informing the decision to conduct a sole source process with the incumbent operator

19. Defence’s advice to the IC in December 2016 and the Minister for Defence Industry in mid-2017 on the decision to sole source was not complete. The advice did not address the legal basis for the procurement method, the risks associated with a sole source procurement approach, or value for money issues — including how Defence expected to achieve value for money and maintain commercial leverage in the context of a sole source procurement. When the IC approved the sole source procurement method in December 2016, Defence had not estimated the value of the procurement. This was not consistent with the CPR requirement to estimate the value of a procurement before a decision on the procurement method is made (see paragraphs 2.52 to 2.71).

Establishment of the 2020–30 arrangements

Procurement planning activities

20. Defence’s procurement planning activities were not timely. Prior to mid-2017, Defence’s planning had largely focussed on seeking approval by June 2017 to inform Thales of the arrangements for the facilities beyond June 2020 (as required of Defence under the interim contract) and to enable collaborative contract development with Thales to commence. Defence’s advice to decision-makers was not informed by the results of key planning processes, as required by the CPRs and Defence’s procurement policy framework. These key processes were not conducted until after December 2016, when the sole source procurement method was approved and included:

  • the progressive development of Defence’s requirements for the facilities between March 2017 and July 2019, with assistance from Thales; and
  • internal workshops between October 2017 and May 2018, which identified risks that had not been previously documented. Defence did not develop a risk management plan to actively manage those risks (see paragraphs 3.1 to 3.31).

Development of the request for tender

21. Defence undertook a process which included the principal elements of a complex procurement as set out in Defence’s procurement policy framework, including an Endorsement to Proceed (EtP), RFT process and detailed contract negotiations. A feature of Defence’s process was the high level of interaction with Thales on the contents of the RFT before and after it was issued on 16 August 2019, including during the tender response period. Defence’s Complex Procurement Guide (CPG) identified ‘probity risks inherent in such activities’ and stated that relevant engagement processes and activities ‘should be planned and conducted with appropriate specialist support.’ Seeking specialist advice on the propriety and defensibility of its approach would have been prudent and consistent with the Public Governance, Performance and Accountability Act 2013 (PGPA Act) duty that officials exercise care and diligence (see paragraphs 3.32 to 3.63).

Tender evaluation

22. By October 2019, Defence had determined that Thales’ tender response did not represent value for money, having assessed the proposal as ‘Deficient – Significant’ with ‘High’ risk against all five evaluation criteria and identified 199 non-compliances against the RFT. Defence considered the number of non-compliances to be ‘unprecedented’ and initially agreed, internally, to extend the interim contract with Thales to allow sufficient time to negotiate the non-compliances with the RFT (see paragraphs 3.64 to 3.78).

23. Following senior-level discussions in November 2019 with Thales, Defence decided to conclude the evaluation process on 4 December 2019 and proceed to contract negotiations. This decision was made notwithstanding internal advice that Defence was at a disadvantage in negotiations due to timing pressures. Defence’s internal advice considered that it had no ‘off-ramps’ due to the impending expiry of the interim contract on 30 June 2020. Defence did not clearly document the basis for reducing risk ratings against all the evaluation criteria from ‘High’ to ‘Medium’, following the senior-level discussions with Thales (see paragraphs 3.79 to 3.90).

24. Defence did not prepare or retain appropriate records for key meetings with Thales during the tender where the identified risks required active Defence management in the Commonwealth interest. Defence’s approach to record keeping was not consistent with requirements in the relevant Communications Plan, internal procurement advice, guidance in the CPG, or the CPRs (see paragraphs 3.91 to 3.100).

Negotiation outcomes

25. The negotiated outcomes for the 2020–30 contract were not fully consistent with the objectives and success criteria approved by Defence in July 2019. At the conclusion of negotiations in February 2020, three of the 15 success criteria aimed at incentivising satisfactory performance and reducing the contract management burden and total cost of ownership for the facilities were reported as not achieved. Defence’s approach to negotiations involved agreeing a schedule and high-level negotiation issues with Thales, to guide negotiations between December 2019 and February 2020. Defence did not systematically address the 199 non-compliances it had identified in Thales’ tender response. This approach reduced the traceability between the RFT requirements, risks and issues identified during tender assessment, and the negotiated outcomes in the agreed contract (see paragraphs 3.101 to 3.114).

26. Defence’s advice to its ministers on the tender and 2020–30 contract negotiations did not inform them of key issues such as the extent of tender non-compliance, the basis of the decision to proceed to negotiations, and Defence’s assessment of the ‘very high risk’ nature of the negotiation schedule (see paragraphs 3.115 to 3.133).

Probity management

Establishment of probity arrangements

27. Defence did not establish appropriate probity arrangements in a timely manner. Defence did not have project and procurement-specific probity arrangements in place until July 2018, more than two years after its initial engagement with Thales (in March 2016) about future domestic munitions manufacturing arrangements. Prior to establishing these probity arrangements, Defence did not assess or take steps to manage potential probity risks arising from ongoing direct engagement with the incumbent operator or remind those involved of their probity obligations, including in relation to offers of gifts and hospitality. During this period, probity risks were realised and there was evidence of unethical conduct, including when a Defence official solicited a bottle of champagne from a Thales representative (see paragraphs 4.1 to 4.30).

28. While Defence’s CPG identified ‘inherent’ probity risks in ‘any procurement that involves high levels of tenderer interaction’, Defence did not appoint a probity adviser external to the department. Defence maintained a register of probity documentation but did not retain relevant records for one of the 65 personnel recorded as having completed documentation. For 22 (25 per cent) of the 87 personnel who completed probity documentation, this completion was not recorded in any register. There was no relevant probity documentation for a further six individuals involved for a period in the procurement. Defence’s conflict of interest (COI) register for the procurement was also incomplete. It did not record six instances where a Defence official or contractor declared a potential, perceived or actual COI, including a Tender Evaluation Board member’s declaration of long-term social relationships with Thales staff. Defence was unable to provide evidence that briefings on probity and other legal requirements were delivered in accordance with the Legal Process and Probity Plan for the procurement (see paragraphs 4.31 to 4.50).

Recommendations

Recommendation no. 1

Paragraph 2.31

The Department of Defence document, at the time the proposed procurement activities are decided:

  • the circumstances and conditions justifying the proposed sole source approach, to inform subsequent procurement planning; and
  • which exemption in the CPRs is being relied upon as the basis for the approach and how the procurement would represent value for money in the circumstances.

Department of Defence response: Agreed.

Recommendation no. 2

Paragraph 2.61

The Department of Defence, including its relevant governance committees, ensure that when planning procurements, the department estimates the maximum value (including GST) of the proposed contract, including options, extensions, renewals or other mechanisms that may be executed over the life of the contract, before a decision on the procurement method is made.

Department of Defence response: Agreed.

Recommendation no. 3

Paragraph 2.64

The Department of Defence, including its relevant governance committees, ensure that advice to decision-makers on complex procurements is informed by timely risk assessment processes that are commensurate with the scale, scope and risk of the relevant procurement.

Department of Defence response: Agreed.

Recommendation no. 4

Paragraph 3.61

The Department of Defence ensure that when it undertakes complex procurements with high levels of tenderer interaction, it seeks appropriate specialist advice, including from the Department of Finance as necessary.

Department of Defence response: Agreed.

Recommendation no. 5

Paragraph 3.94

The Department of Defence ensure compliance with the Defence Records Management Policy and statutory record keeping requirements over the life of the 2020–30 Strategic Domestic Manufacturing contract, including capturing the rationale for key decisions, maintaining records, and ensuring that records remain accessible over time.

Department of Defence response: Agreed.

Recommendation no. 6

Paragraph 3.112

The Department of Defence ensure, for complex procurements, that there is traceability between request for tender (RFT) requirements, the risks and issues identified during the tender assessment process, and the negotiated outcomes.

Department of Defence response: Agreed.

Recommendation no. 7

Paragraph 4.10

The Department of Defence develop procurement-specific probity advice for complex procurements at the time that procurement planning begins and develop probity guidance for:

  • complex procurements involving high levels of tenderer interaction; and
  • managing engagement risks in the context of long-term strategic partnership arrangements.

Department of Defence response: Agreed.

Recommendation no. 8

Paragraph 4.25

The Department of Defence make appointment of external probity advisers mandatory for all complex procurements with high probity risks, such as procurements with high levels of tenderer interaction.

Department of Defence response: Agreed.

Summary of entity response

29. The proposed audit report was provided to Defence. Defence’s summary response is reproduced below. The full response from Defence is at Appendix 1. Improvements observed by the ANAO during the course of this audit are listed in Appendix 2.

Department of Defence

Defence acknowledges the findings contained in the audit report on Defence’s Management of Contracts for the Supply of Munitions, which assessed the effectiveness of the procurement and contract establishment for the Department’s Strategic Domestic Munitions Manufacturing contracting arrangement.

The Mulwala and Benalla munition factories underpin Australia’s ability to develop critical propellants, explosives and munitions for the Australian Defence Force and are recognised as a world-class capability. Since this procurement activity, the strategic landscape has changed, as outlined in the Defence Strategic Update of 2020 and the Defence Strategic Review in 2023. The National Defence Strategy further prioritises these factories as critical and foundational industrial capabilities for Australian domestic manufacturing, supporting sovereign resilience and our allies.

Defence welcomes collaborative engagement with our industry partners in delivering unique capability outcomes. Defence acknowledges and understands the need to ensure that such engagement is appropriately managed, and will strengthen the guidance in relation to identifying and managing procurement and probity risks early in the process as well as maintaining these records for the life of the procurement activity. Defence is continually improving and updating the Defence frameworks that underpin the issues raised.

Key messages from this audit for all Australian Government entities

30. Below is a summary of key messages, including instances of good practice, which have been identified in this audit and may be relevant for the operations of other Australian Government entities.

Procurement

  • Entities can demonstrate compliant, transparent, and accountable procurement processes through the creation and retention of appropriate records, including for: decisions on the procurement approach; assessment against selection criteria; engagement with tenderers; and the rationale for proceeding to negotiations.
  • Procurements involving high levels of interaction with potential tenderers require active management of engagement probity risks, including ensuring that all relevant interactions are appropriately documented and visibility is maintained over key records such as conflict of interest declarations and probity advice.
  • Effective risk management during procurement processes is supported by identifying, assessing and treating procurement risks early in the process and thereafter on an ongoing basis, including developing and maintaining a risk management plan, risk registers, and mitigation strategies for all risks.
Type: Performance audit
Report number: 45 of 2023-24
Portfolios: Defence
Entities: Department of Defence
Date tabled:
Audit Summary : show

Summary and recommendations

Background

1. Australian Defence Force (ADF) recruitment advertising campaigns are typically the largest conducted by Australian Government entities each year. The Department of Finance reported that the Department of Defence’s (Defence’s) recruitment advertising expenditure was $60.2 million for 2022–23, representing approximately 33.6 per cent of total Australian Government advertising expenditure of $179.3 million.1 In conjunction with a range of contracted suppliers, Defence designs and administers advertising campaigns aimed at particular target audiences.

2. Australian Government entities are required to comply with a framework established by the Australian Government Guidelines on Information and Advertising Campaigns by non-corporate Commonwealth entities (the Guidelines).2

3. The Guidelines state that they ‘operate on the underpinning premise that’:

  1. members of the public have equal rights to access comprehensive information about government policies, programs and services which affect their entitlements, rights and obligations; and
  2. governments may legitimately use public funds to explain government policies, programs or services, to inform members of the public of their obligations, rights and entitlements, to encourage informed consideration of issues or to change behaviour.

4. The Guidelines are a government policy and entities subject to them must be able to demonstrate compliance with five overarching principles when planning, developing and implementing publicly-funded information and advertising campaigns. The principles require that campaigns are:

  • relevant to government responsibilities;
  • presented in an objective, fair and accessible manner;
  • objective and not directed at promoting party political interests;
  • justified and undertaken in an efficient, effective and relevant manner; and
  • compliant with legal requirements and procurement policies and procedures.

Rationale for undertaking the audit

5. Campaign advertising for ADF recruitment is an ongoing Defence activity and represents a material component of all Australian Government campaign advertising. In meeting its outcomes, Defence has identified ‘investing in the growth and retention of a highly skilled workforce to meet Australia’s defence and national security requirements’ as one of its seven key activities.3 This audit provides independent assurance to the Parliament on Defence’s management of selected ADF recruitment advertising campaigns.

Audit objective and criterion

6. The audit objective was to assess the effectiveness of Defence’s management of advertising campaigns for Australian Defence Force recruitment.

7. To form a conclusion against the audit objective, the ANAO adopted the following high-level criterion.

  • Were the selected campaigns compliant with the Australian Government’s campaign advertising framework?

8. The ANAO selected three campaigns for review, which were launched in 2022–23:

  • Take a Closer Look — launched on 21 August 2022;
  • Where It All Begins — launched on 6 February 2023; and
  • Live a Story Worth Telling — launched on 19 March 2023.

Conclusion

9. The Department of Defence’s management of the three selected advertising campaigns for Australian Defence Force recruitment was largely effective.

10. For the selected campaigns, Defence largely complied with the review, certification and publication requirements of the Australian Government’s campaign advertising framework and complied with the requirements of Principles 1 to 3 of the Guidelines.

11. Defence largely complied with Principle 4 except that it could not provide the ANAO with supporting evidence to verify the accuracy of cost information for each campaign.

12. With respect to Principle 5, Defence did not clearly document the substantive basis for its advice that there were no legal concerns with respect to the campaign materials.

13. Defence does not evaluate the overall effectiveness of its recruitment advertising campaigns after they have ended. The extent to which Defence’s recruitment advertising activities have contributed towards increasing the number of applications to join the ADF has therefore not been assessed by Defence.

14. There is scope for Defence to improve the transparency of its public reporting on individual advertising campaigns and to strengthen the assurance provided to the Secretary of Defence on compliance with the principles of the campaign advertising framework.

Supporting findings

Defence campaigns — compliance with requirements

15. For the three selected campaigns, Defence complied with most of the review, certification and publication requirements of the campaign advertising framework.

16. Each campaign received government approvals in accordance with the framework requirements applying at the time they were considered. (See paragraphs 2.8 to 2.33)

17. The Defence Secretary completed certifications that the campaigns complied with the five ‘overarching principles’ of the Guidelines and the certifications were published on Defence’s website. The Secretary’s certifications were informed by a third-party certification from the Independent Communications Committee (ICC), as required by the Guidelines, and by Defence advice on compliance. (See paragraphs 2.12, 2.23 and 2.31)

18. As required by the framework, Defence developed a 2022–23 Media Strategy that was reviewed by the ICC. The ICC provided a report to the Defence Secretary, which was published on the Department of Finance’s website as required. (See paragraphs 2.34 to 2.40)

19. Defence did not publish research reports for the selected campaigns on its website and did not document why it was not appropriate to do so. (See paragraphs 2.17, 2.25 and 2.32)

20. While Defence’s annual report includes information on overall campaign expenditure, it does not specify the individual advertising campaigns conducted by Defence, as required by the Public Governance, Performance and Accountability Rule 2014 (PGPA Rule). (See paragraphs 2.14, 2.25 and 2.33)

21. Defence complied with the requirements of Principles 1 to 3 of the Guidelines.

22. Defence largely complied with Principle 4 except that it could not provide the ANAO with supporting evidence to verify the accuracy of cost information that it provided. In the absence of this information, no assurance can be provided on the accuracy or completeness of the campaign advertising expenditure as advised by Defence. (See paragraphs 2.102 to 2.107)

23. With respect to Principle 5 (compliance with legal requirements and procurement policies and procedures), Defence did not clearly document the substantive basis for its advice that there were no legal concerns with respect to the campaigns. (See paragraphs 2.72 to 2.75)

24. Defence uses quarterly Communications Tracking reports to monitor the performance of its active campaigns. Defence does not evaluate the overall effectiveness of its recruitment advertising campaigns after they have ended. The extent to which Defence’s recruitment advertising activities have contributed towards increasing the number of applications to join the ADF has therefore not been assessed by Defence. (See paragraphs 2.102 to 2.107)

Recommendations

Recommendation no. 1

Paragraph 2.15

The Department of Defence comply with the requirement of the Public Governance, Performance and Accountability Rule 2014 to include a statement, in its annual report, on the specific advertising campaigns conducted by Defence.

Department of Defence response: Agreed.

Recommendation no. 2

Paragraph 2.100

The Department of Defence provide the Department of Finance with details of expenditure on individual Defence advertising campaigns for inclusion in Finance’s annual report on Campaign Advertising by Australian Government Departments and Entities.

Department of Defence response: Agreed.

Recommendation no. 3

Paragraph 2.108

To meet the requirements of the Australian Government Guidelines on Information and Advertising Campaigns by non-corporate Commonwealth entities and the Commonwealth Evaluation Policy, for future advertising campaigns, the Department of Defence:

  1. establish clear objectives for each campaign prior to the development of the campaign;
  2. document an evaluation plan; and
  3. at the conclusion of each campaign, prepare a final evaluation report.

Department of Defence response: Agreed.

Summary of entity response

Department of Defence

Defence acknowledges the Auditor-General’s assessment that Defence has mostly complied with the requirements of the Government’s advertising framework and related guidelines, and its management of three selected ADF advertising campaigns has been largely effective as a result.

Defence notes that each of its campaigns are subject to rigorous and comprehensive quarterly evaluations over the life of a campaign, a period of typically four to six years, to regularly assess the audience’s resonance with, recollection of, and reaction to, the subject campaign. However, Defence accepts the finding that it has not conducted a final evaluation of campaigns it has elected to remove from market.

Defence agrees with the recommendations regarding the improvements to transparency in formal reporting. While Defence has reported expenditure connected to all of its advertising campaigns in annual reports authored by the Department of Defence and the Department of Finance, Defence has not provided details relating to expenditure by campaign, an action it will undertake in formal reporting in the future.

Key messages from this audit for all Australian Government entities

25. Below is a summary of key messages, including instances of good practice, which have been identified in this audit and may be relevant for the operations of other Australian Government entities.

Governance and risk management

  • Entities demonstrate transparency and accountability to the Parliament by complying with all annual reporting requirements specified in the Public Governance, Performance and Accountability Rule 2014.
  • Establishing clear objectives at the outset for projects such as campaign advertising supports the conduct of an effective evaluation process at their completion.
Type: Financial statement audit
Report number: 42 of 2023-24
Portfolios: Across Entities
Entities: Across Entities
Date tabled:
Audit Summary : show
Type: Performance audit
Report number: 39 of 2023-24
Portfolios: Cross entity; Health and Aged Care; Veterans’ Affairs; Home Affairs
Entities: Department of Health and Aged Care; Department of Veterans’ Affairs; Department of Home Affairs
Date tabled:
Audit Summary : show

Summary and recommendations

Background

1. Evaluation is a structured assessment of the value of government programs or activities, aimed at supporting improvement, accountability, and decision-making throughout the policy cycle. Pilot programs are small-scale tests or trials of programs with the aim of informing future decision-making.

2. The Public Governance, Performance and Accountability Act 2013 (the PGPA Act) requires the accountable authority of a Commonwealth entity to measure and assess the performance of the entity in achieving its purposes1, and provides that a minister must not approve expenditure unless satisfied that the expenditure would be an efficient, effective, economical and ethical use of public money.2

3. In 2019, the Australian Government released the Independent Review of the Australian Public Service3, which recommended that the APS embed a culture of evaluation and learning from experience to underpin evidence-based policy and delivery (Recommendation 26). The Australian Government agreed in part to this recommendation.4 The Minister for Finance endorsed a Commonwealth Evaluation Policy5 and Resource Management Guide 130 Commonwealth Evaluation Toolkit6 (the Toolkit) on 1 December 2021. The Toolkit provides a principles-based approach for the conduct of evaluations. It applies to all Commonwealth entities and companies subject to the PGPA Act.

Rationale for undertaking the audit

4. Pilot programs are trial programs of limited size that are used to decide whether a proposed policy should be adopted, and what adjustments should be made before adoption. Monitoring and evaluation are critical components of a pilot to support an assessment of the program or activity’s impact and efficiency.

5. The audit involved the examination of five Australian Government pilot programs across the Department of Health and Aged Care (Health), the Department of Home Affairs (Home Affairs), and the Department of Veterans’ Affairs (DVA). The pilots ranged in length from two to three years. The audit provides assurance to the Parliament over the appropriateness of frameworks for evaluation, and the adequacy of evaluation of pilot programs.

Audit objective and criteria

6. The objective of the audit was to assess the effectiveness of the evaluation of selected Australian Government pilot programs.

7. To form a conclusion against this objective, the following high-level criteria were adopted:

  • Do the selected entities have governance arrangements in place to support effective program evaluation?
  • Was the evaluation approach for the selected pilot programs robust?
  • Was pilot program reporting and advice to government appropriate?

Conclusion

8. The evaluation of the selected Australian Government pilot programs was mixed. Health’s evaluation of the Take Home Naloxone pilot was largely effective, and the evaluation of the Kava pilot was partly effective. DVA’s evaluation of the Wellbeing and Support Program pilot was largely effective, and the evaluation of the Non-Liability Rehabilitation pilot was partly effective. Home Affairs’ evaluation of the Skilled Refugee Labour Agreement pilot was partly effective.

9. Health and DVA have largely effective governance arrangements to support the evaluation of pilot programs. Home Affairs has partly effective arrangements. Health and DVA have strengthened their governance arrangements through the updating or development of entity-specific frameworks, guidance, and training on what, when and how to conduct an evaluation. Home Affairs does not have entity-specific evaluation guidance. Evaluation culture is maturing within Health and DVA, and is immature at Home Affairs. Pilot programs are only referenced in DVA’s entity-specific guidance.

10. The evaluation planning and approach for Health’s Take Home Naloxone pilot and DVA’s Wellbeing and Support Program pilot were largely robust, including appropriate stakeholder engagement and relevant ethics approvals. Planning for the evaluation of Health’s Kava pilot did not identify the risk that ethics approval may not be granted for one of the planned qualitative analysis methods, and there was a lack of baseline evidence to support the planned evaluation methodology. The effectiveness of planning for the evaluation of DVA’s Non-Liability Rehabilitation pilot was reduced as the analytical methodologies were not documented, and no external stakeholders were consulted. Home Affairs did not complete its planning for, or undertake, a robust evaluation for the Skilled Refugee Labour Agreement pilot. All evaluation plans and approaches could have been enhanced by a greater focus on the availability of data and an assessment of the proper use of public money.

11. Health’s evaluation reporting and advice to the Australian Government for the Take Home Naloxone pilot was largely effective, with the recommendations made to expand the naloxone pilot largely informed by the lessons learnt from the evaluation. Health’s evaluation reporting and advice to the Australian Government for the Kava pilot was partly effective as neither the evaluation report nor recommendations on the continuation of the pilot have been provided to the Australian Government. The evaluation report for DVA’s Wellbeing and Support Program pilot was largely effective. There was no evidence of DVA advising the Australian Government on the evaluation findings and impact on future program design. The evaluation for the Non-Liability Rehabilitation pilot has not yet commenced, and reporting and advice to the Australian Government on the mid-pilot review was partly effective. Home Affairs’ evaluation reporting and advice to the Australian Government for the Skilled Refugee Labour Agreement pilot was partly effective, with outputs rather than pilot outcomes analysed and reported to the minister.

Supporting findings

Governance arrangements

12. The Commonwealth Evaluation Toolkit provides appropriate high-level guidance to support entities in determining what programs or policies should be evaluated and when. It provides limited guidance on conducting an economic evaluation, including any assessment of cost effectiveness of implementation, and does not include a requirement for all pilots to be evaluated.

  • With the exception of DVA’s Non-Liability Rehabilitation pilot, only Health had established internal evaluation guidance at the time the examined pilots commenced.
  • In November 2023, Health published a revised evaluation strategy which specifies roles and responsibilities and includes a tiered system for identifying evaluation priorities across the department.
  • Since the commencement of the Wellbeing and Support Program pilot, DVA developed a framework which supports when and what to evaluate based on program characteristics, timing and capability. In August 2023, DVA introduced a framework for the planning, monitoring and evaluation of its health and wellbeing programs, which includes roles and responsibilities.
  • Home Affairs does not have an entity-specific approach to determining when and what to evaluate.
  • Each entity has an internal evaluation team to provide guidance and support on evaluation practice.

(See paragraphs 2.7 to 2.58)

13. Health and DVA have policies and guidance materials for how to conduct program evaluations. Only Health has guidance on when economic evaluation should be undertaken, and that guidance is limited. Training on evaluation practices is provided at Health and DVA, although attendance is not consistently monitored. Home Affairs has no entity-specific guidance on conducting evaluations, and no training programs available to staff. (See paragraphs 2.59 to 2.69)

Evaluation approach

14. Planning for evaluation, including stakeholder engagement, was completed for Health’s Take Home Naloxone pilot and DVA’s Wellbeing and Support Program pilot. Planning for stakeholder engagement for evaluation of Health’s Kava pilot did not account for the risk that ethics approval may not be granted and the resulting impact on the planned analysis and evaluation methodology. The effectiveness of planning for the evaluation of DVA’s Non-Liability Rehabilitation pilot was reduced as the methodologies to be used were not documented, and no external stakeholders were consulted. Home Affairs did not complete its planning for the evaluation for the Skilled Refugee Labour Agreement pilot. While data sources were identified within the evaluation plans that were developed, one or more planned data sources within each pilot were not available for the evaluation, and this risk had not been identified. (See paragraphs 3.6 to 3.52)

15. The evaluation methodologies used for three out of the five pilots examined were largely consistent with the Toolkit. For the evaluations conducted, all could have been strengthened with a greater focus on baseline data, control group outcomes, and an assessment of the proper use of public money. Ethics approvals were obtained for Health’s Take Home Naloxone pilot and DVA’s Wellbeing and Support Program pilot. The ethics approval sought for Health’s Kava pilot was not granted and no alternative strategy was developed to obtain information that was critical to the evaluation. DVA’s Non-Liability Rehabilitation pilot evaluation plan did not include a consideration of ethics approval and the post-implementation review has not yet been undertaken. As Home Affairs did not conduct an evaluation of its pilot, there was no methodology applied, or consideration of the need for ethics approval. (See paragraphs 3.53 to 3.80)

Reporting and advice to the Australian Government

16. The analyses of pilot evaluation outcomes for Health’s pilots and DVA’s Wellbeing and Support Program pilot were largely fit for purpose, with the evaluation reports documenting the application of statistical methods to provide defensible findings and make recommendations on the basis of the analysis completed. The evaluation of DVA’s Non-Liability Rehabilitation pilot has not yet commenced. Home Affairs’ reporting of outputs of the Skilled Refugee Labour Agreement pilot did not contain fit-for-purpose analysis and did not satisfy the requirements of evaluation reporting in the Commonwealth Evaluation Toolkit. (See paragraphs 4.5 to 4.33)

17. Advice provided by Health to the Australian Government in relation to the Take Home Naloxone pilot was appropriate, including the lessons learnt from the pilot. The recommendation to expand the pilot into different environments was partly informed by evaluation. Health has not provided advice to the Australian Government on the findings of the evaluation or lessons learnt in relation to the Kava pilot. DVA did not advise the Minister for Veterans’ Affairs on the evaluation findings or lessons learnt for future program delivery for the Wellbeing and Support Program pilot. Home Affairs’ advice to the Australian Government for the continuation of the Skilled Refugee Labour Agreement pilot was not informed by an evaluation. (See paragraphs 4.34 to 4.52)

Recommendations

Recommendation no. 1

Paragraph 2.16

The Department of the Treasury update the Commonwealth Evaluation Policy and Toolkit to include:

  1. a definition of a ‘pilot’;
  2. guidance on how to conduct an economic evaluation and other methods for considering whether spending represents an appropriate use of public money;
  3. a recommendation that evaluations of pilot programs be undertaken;
  4. a recommendation for evaluation planning to be conducted alongside pilot design; and
  5. guidance on governance arrangements for cross-entity evaluations to minimise duplication and maximise coordination and learnings across entities.

Department of the Treasury’s response: Agreed.

Recommendation no. 2

Paragraph 2.32

The Departments of Health and Aged Care and Veterans’ Affairs include in their entity-specific evaluation policies:

  1. decision-making criteria for the appropriate style of evaluation to be completed by reference to the activity’s risk, objective and outcomes;
  2. guidance on how to demonstrate whether a program represented a proper use of public money, including the cost-effectiveness of its implementation, and how to undertake an economic evaluation where appropriate; and
  3. guidance related to evaluation of pilot programs.

Department of Health and Aged Care’s response: Agreed.

Department of Veterans’ Affairs’ response: Agreed.

Recommendation no. 3

Paragraph 2.35

The Department of Home Affairs develop entity-specific policies for evaluation, including:

  1. decision-making criteria as to when an evaluation is required and the appropriate style of evaluation by reference to the activity’s risk, objective and outcomes;
  2. guidance on how to demonstrate whether a program represented a proper use of public money, including the cost-effectiveness of its implementation, and how to undertake an economic evaluation where appropriate; and
  3. guidance related to evaluation of pilot programs.

Department of Home Affairs’ response: Agreed.

Recommendation no. 4

Paragraph 2.55

The Departments of Health and Aged Care, Veterans’ Affairs and Home Affairs develop and implement explicit guidance to support early engagement with central evaluation teams to improve evaluation strategy and planning.

Department of Health and Aged Care’s response: Agreed.

Department of Veterans’ Affairs’ response: Agreed.

Department of Home Affairs’ response: Agreed.

Recommendation no. 5

Paragraph 3.25

The Departments of Health and Aged Care, Veterans’ Affairs and Home Affairs ensure evaluation plans are prepared for policies or programs subject to evaluation requirements and that they are approved prior to the implementation of the policy or program. Consistent with the Commonwealth Evaluation Toolkit, evaluation plans should incorporate a proportionate and risk-based level of information, including:

  1. methods for measuring or capturing baseline evidence, and attributing changes to the pilot, policy or program; and
  2. a method of economic evaluation or other means of assessing the proper use of public money.

Department of Health and Aged Care’s response: Agreed.

Department of Veterans’ Affairs’ response: Agreed.

Department of Home Affairs’ response: Agreed.

Recommendation no. 6

Paragraph 4.50

The Departments of Veterans’ Affairs and Home Affairs ensure that advice to government on the cessation, continuation or scaling up of a pilot draws on evidence and learnings from the evaluation, including limitations on the robustness of the evaluation undertaken.

Department of Veterans’ Affairs’ response: Agreed.

Department of Home Affairs’ response: Agreed.

Summary of entity responses

18. The proposed audit report was provided to Health, DVA, Home Affairs and the Department of the Treasury. Letters of response from each entity are included at Appendix 1, and their summary responses are reproduced below. The improvements observed by the ANAO during the course of this audit are at Appendix 2.

Department of Health and Aged Care

The Department of Health and Aged Care welcomes the findings in the report and accepts the recommendation directed to the department. The department is committed to implementing the recommendations effectively and has already taken steps to address issues identified in this audit.

The ANAO found the department has largely effective governance arrangements to support evaluation. The audit also found the department’s evaluation culture is maturing, including:

  • updating our guidance and training on what, when and how to conduct an evaluation.
  • establishing the role of Chief Evaluation Officer to provide strategic oversight of evaluation activities and to engage with other Senior Executive to champion evaluation as part of policy design and program management.

The department notes the finding on the need to develop better guidance on conducting economic evaluations or other means of assessing the proper use of public money.

Since the audit was conducted, the department has launched its Strategic Investment Framework, which makes sure our policy and program officers embed evaluation and evidence within all programs. The Framework will ensure investments are supported by robust, evidence-based program evaluation and target funding to high-value programs aligned with priority areas.

The department notes that the audit on the Kava Pilot Program was undertaken while the pilot period was still under way, and certain aspects of the pilot, including recommendations to Government on the future of the Program, are yet to be finalised.

The department is building its in-house evaluation capability through a range of initiatives including:

  • implementing the new Evaluation Strategy 2023-26
  • developing a suite of departmental-specific tools and resources to support high-quality evaluation.
  • partnering with Australian Centre for Evaluation in Treasury and leveraging opportunities to showcase in-house impact evaluation capability.

Department of Veterans’ Affairs

The Department of Veterans’ Affairs (DVA) welcome the ANAO recommendations. The ANAO report acknowledges that DVA has established policies and processes that largely support compliance with the Commonwealth Evaluation Policy (the Policy).

The Department acknowledge and agree with the ANAO’s recommendations. Work is planned for 2024 to review and update the relevant policies and protocols to enhance maturity with the Commonwealth Evaluation Policy requirements, and work has already commenced to implement these enhancements.

Department of Home Affairs

The department agree with the recommendations, and as part of its ongoing efforts to strengthen evaluation, acknowledge the benefits of a more robust evaluation culture to inform Government decision-making.

The department continues to leverage Commonwealth resources and materials to assist in guiding staff on how an evaluation should be carried out. To supplement the Commonwealth Evaluation Toolkit, the department is developing additional resources to assist staff in determining when, and to what extent, an evaluation should be conducted.

The department is monitoring the outcomes of the Skilled Refugee Labour Agreement to build a sufficient evidence base to assess the viability and future scalability of the program. The department’s advice to Government on the future of the Skilled Refugee Labour Agreement Pilot will be informed by an evaluation consistent with the Commonwealth Evaluation Policy.

Department of the Treasury

Treasury welcomes the report and agrees with the recommendation to update guidance in the Commonwealth Evaluation Toolkit (the Toolkit). Specifically, Treasury will update the Toolkit to include a definition of a ‘pilot’, and provide guidance on: economic evaluation, evaluation of pilots, and governance arrangements for cross-entity evaluations.

Treasury’s guidance on whether spending represents an appropriate use of public money will focus on (and be limited to) guidance on economic evaluation methods, and other fit-for-purpose evaluation approaches. The broader importance of appropriately using public money is well addressed through the suite of guidance administered by the Department of Finance to support resource management and therefore will not be duplicated through Treasury materials.

Treasury will recommend, but not mandate, that all pilots are subject to evaluation consistent with the principles-based Commonwealth Evaluation Policy, which recommends that responsible managers need to determine robust, proportional evaluation approaches for specific pilots or programs.

The Department of the Treasury is committed to continuous improvement of the Evaluation Toolkit. Planned enhancements will include more practical guidance on analytical methods, including economic evaluation, and effective governance arrangements that can help to improve the way Commonwealth entities assess implementation, measure the impact of government programs, and frame policy decisions.

Key messages from this audit for all Australian Government entities

19. Below is a summary of key messages, including instances of good practice, which have been identified in this audit and may be relevant for the operations of other Australian Government entities.

Performance and impact measurement

  • Pilots provide an opportunity to effectively evaluate new or amended policy and program design and activities to ensure expenditure is an efficient, effective, economical, and ethical use of public money.
  • Strong evaluation culture is needed to build effective evaluation capacity. This includes senior leaders prioritising evaluation activities, accessible and tailored guidance and tools for staff, transparently sharing lessons learnt, and acting on evidence-based outcomes and recommendations.
  • Early engagement with evaluation expertise is needed to determine the appropriate type of evaluation for the policy or program, to identify and manage evaluation risks, and ensure the collection, and appropriate assessment, of data and information to draw robust policy conclusions.
  • Early planning to identify and capture baseline and relevant data and information will help to support robust data analysis during the evaluation.