Summary and recommendations
Background
1. The Australian Government Crisis Management Framework (AGCMF) outlines the Australian Government’s approach to preparing for, responding to, and recovering from crises.1 The AGCMF describes an ‘all-hazards’ approach that includes mitigation, planning, and assisting states and territories to manage emergencies resulting from natural events.2
2. The AGCMF has been used to respond to a variety of crises between 2020 and 2023 including:
- the COVID-19 pandemic;
- natural disasters such as prolonged flood events across Australia and tropical cyclone events;
- cyber security incidents including data breaches involving Medibank and Latitude Financial, and the security breach affecting the email gateway system supporting some ACT Government systems; and
- the Turkiye and Syria earthquake for which the Australian Government committed humanitarian assistance.
3. In March 2023, the Australian Government agreed to conduct a review of the AGCMF. The Department of the Prime Minister and Cabinet (PM&C) conducted this review. Following the 2023 AGCMF Review, a revised AGCMF was released at the 2024–25 Higher Risk Weather Season National Preparedness Summit in Canberra on 18–19 September 2024.3
Rationale for undertaking the audit
4. The AGCMF is the basis for the Australian Government’s response to crises including pandemics, natural disasters, terrorism, and cyber incidents. This audit provides assurance to the Parliament on whether Australian Government entities have identified and applied lessons from crises between 2020 and 2023, including the COVID-19 pandemic, to the AGCMF in preparation for future severe to catastrophic crises.
5. In its report on the Department of Foreign Affairs and Trade’s crisis management approaches, the Joint Committee of Public Accounts and Audit (JCPAA) recommended that the Auditor-General consider undertaking a performance audit of the AGCMF, and include within the audit scope whether the updated framework adequately reflects lessons learned from the COVID-19 pandemic.4 The JCPAA also identified an audit of the AGCMF as an audit priority of the Parliament in 2022–23.
Audit objective and criteria
6. The audit objective was to assess whether the Australian Government has established an appropriate framework for responding to crises.
7. To form a conclusion against the objective, the following high-level criteria were adopted:
- Has the readiness of systems and processes to respond to crises been assessed?
- Is the AGCMF fit for purpose to respond to a changing threat environment?
8. The audit examined whole-of-government crisis coordination arrangements established through seven versions of the AGCMF between 2020 and 2023, and the 2023 review of the AGCMF undertaken by PM&C. The audit focussed on whole-of-government crisis coordination arrangements between 2020 and 2023 including the supporting mechanisms to convene key committees under the AGCMF.
9. The audit did not examine:
- the application of the framework to the response to the COVID-19 pandemic or other crises;
- the adherence to individual national plans required under the AGCMF;
- agency specific crisis coordination arrangements; or
- operational responses to crises.
Conclusion
10. In establishing the revised AGCMF, PM&C has developed a largely appropriate framework for responding to crises. The revised AGCMF incorporates lessons from prior crises and provides increased guidance for all-hazards responses, including complex and concurrent crises. The increased oversight and additional continuous improvement activities established in the revised AGCMF will be important to ensure the framework remains appropriate for responding to crises over time as threats and the environment continue to evolve. The revised AGCMF represents a shift in approach from previous versions of the AGCMF and will require sustained effort to build and maintain appropriate capability.
11. A structured assessment of the readiness of systems and processes contained in the AGCMF was not undertaken prior to the 2023 Review. Updates to the AGCMF between 2020 and 2023 were administrative in nature and reflected changes that had already been operationalised. The roles and responsibilities set out under previous versions of the AGCMF were not clearly defined. The 2023 AGCMF Review was guided by a project plan which captured evidence from a range of inputs including comprehensive stakeholder engagement and testing of recommendations and proposed actions. Clarifying arrangements for annual updates and future comprehensive reviews is important to ensure these activities adequately capture and address required changes in a timely manner. The lessons management capability and associated processes are evolving. Formal lessons activities are not conducted for all crises. Thresholds for conducting a lessons process had not been defined or documented prior to 2024.
12. The revised AGCMF released in September 2024 incorporates an increased emphasis on continuous improvement and improved oversight. These amendments, if effectively implemented, should position the framework to respond to a changing threat environment. Activities that informed the 2023 AGCMF Review, such as ‘futures workshops’, would continue to provide value as they offer an opportunity to examine whether the framework is strategically positioned to adapt to future challenges. The revised AGCMF introduces several new roles. The responsibilities of these roles are largely clear. Until 2024, there was a lack of oversight of national level plans to ensure they are reviewed and updated. The annual national exercise program conducted by the National Emergency Management Agency (NEMA) has primarily focussed on natural disaster scenarios. Compounding non-natural disaster specific impacts are now being integrated into natural disaster scenario-based exercises within the program. There is scope to improve the transparency and currency of national plans and risk planning in relation to shared risks and key management personnel risks.
Supporting findings
Readiness of systems and processes
13. Within the AGCMF, specific hazards are identified with lead ministers and entities assigned to these hazards. The emergence of newly identified hazards has led to updates in the AGCMF. Space weather events were added as a specific hazard as they were identified as posing a risk to critical infrastructure. Cyber incidents were added as a specific hazard following a review of crises that indicated roles and responsibilities were not clearly defined. Under previous versions of the AGCMF, triggers and thresholds for activation of whole-of-government crisis coordination were broad and did not provide clear guidance to entities. There are multiple mechanisms that support crisis coordination and response. Some of these mechanisms were not defined in the AGCMF. The roles of, and interactions between, the various crisis mechanisms could have been more clearly defined. The National Coordination Mechanism (NCM) was introduced as a means to provide broader engagement than previously existing arrangements. The NCM was embedded in the AGCMF after it became a regularly used mechanism during the COVID-19 pandemic response. (See paragraphs 2.3 to 2.33)
14. Updates undertaken annually between 2020 and 2023 were largely limited to documenting machinery of government changes. These updates varied in approach and in the extent of stakeholder engagement. There was no engagement with states and territories as part of the administrative updates in 2020, the second update in 2021, or in 2022. More significant comments relating to the framework were held over in anticipation of a future review, which was conducted in 2023, although that review had not been approved at the time. The approach to the 2023 AGCMF Review was guided by a project plan which captured evidence from a range of inputs including comprehensive stakeholder engagement and testing of recommendations and proposed actions. There are minor gaps in documentation relating to the analysis of some of this evidence base. Lessons management, including the lessons management capability that informs continuous improvement activities, is evolving. (See paragraphs 2.34 to 2.79)
15. There are gaps in lessons management at the whole-of-government level. As the lessons management capability matures, implementation of actions to address identified lessons is improving. An APS Surge Reserve was established in response to lessons about APS-wide capability identified during crises between 2019 and 2023. While intended to provide additional personnel capacity in the event of a crisis, the APS Surge Reserve provides staff with generalist skills. The 2023 AGCMF Review identified a gap in suitably qualified staff for crisis management roles. NEMA has sought opportunities to utilise the Centres for National Resilience for certain crises; however, an agreement to utilise the Department of Finance-managed centres has not yet been established. NEMA has established the National Emergency Management Stockpile to enable the rapid deployment of resources. (See paragraphs 2.80 to 2.97)
Responding to a changing threat environment
16. Risk assessments do not include potential key management personnel risks. The 2023 AGCMF Review incorporated strategic risk consideration, including future scenario planning, which had not previously been conducted. The Crisis Appreciation and Strategic Planning (CASP) methodology has been embedded in NEMA’s approach to operational response activities; however, the methodology has not yet been established as a consistent planning tool across the range of entities involved in crisis management, or in horizon scanning activities to detect emerging threats. When fully embedded, the CASP methodology has the potential to provide a robust approach to planning and preparedness as well as recovery. (See paragraphs 3.3 to 3.37)
17. The revised AGCMF provides increased clarity on roles and responsibilities. This includes the introduction of a tiered crisis coordination model intended to provide greater flexibility as crises evolve. The revised AGCMF groups key information relating to roles and responsibilities together for easier reference. The Handbook provides additional guidance to senior officials. The revised AGCMF has largely addressed feedback obtained during the 2023 AGCMF Review to improve the clarity of the arrangements for the available crisis mechanisms. PM&C has identified that ongoing activities are required to support the implementation of the revised AGCMF, including by improving capability. (See paragraphs 3.38 to 3.67)
18. Previous versions of the AGCMF did not establish oversight arrangements for the full suite of national level plans to ensure they are reviewed and updated to respond to future events. The September 2024 version of the AGCMF establishes oversight arrangements. As at July 2024, thirty-two per cent of the publicly available plans had not been updated in the preceding three years. (See paragraphs 3.68 to 3.80)
19. NEMA delivers two annual national-level exercises primarily focussed on multi-jurisdictional natural disasters. Since 2022, compounding non-natural disaster specific impacts such as mass power outages and supply chain issues have been included in NEMA-led exercises. Prior to 2024, there were gaps in the arrangements to identify and prioritise whole-of-government exercises. There are limitations with arrangements to capture information relating to exercises led by other entities, reducing the ability to advise government on the preparedness of Australian Government entities to respond to crises. The expanded role of the Crisis Arrangements Committee under the revised AGCMF provides coverage of these gaps. Higher Risk Weather Season (HRWS) preparedness has evolved with the addition of ministerial exercises and the HRWS National Preparedness Summit. (See paragraphs 3.81 to 3.111)
Recommendations
Recommendation no. 1
Paragraph 2.44
The Department of the Prime Minister and Cabinet:
- document a process for annual administrative updates that provides a consistent approach including ensuring appropriate records of engagement and input are maintained; and
- ensure significant issues are documented to be considered in comprehensive reviews of the AGCMF.
Department of the Prime Minister and Cabinet response: Agreed.
Recommendation no. 2
Paragraph 2.75
The Department of the Prime Minister and Cabinet:
- provide stronger guidance to entities in their development and updating of entity level and relevant national level crisis management policies and plans; and
- provide a formal response to the Joint Committee of Public Accounts and Audit that outlines actions taken to address recommendation three from Report 494: Inquiry into the Department of Foreign Affairs and Trade’s crisis management arrangements.
Department of the Prime Minister and Cabinet response: Agreed.
Recommendation no. 3
Paragraph 3.23
The Department of the Prime Minister and Cabinet embed arrangements for future scenario planning into ongoing review and update arrangements for the AGCMF. These should be appropriately documented to ensure lessons are captured and can be learned.
Department of the Prime Minister and Cabinet response: Agreed.
Recommendation no. 4
Paragraph 3.79
The Department of the Prime Minister and Cabinet include in the Australian Government Crisis Management Handbook criteria for the publication of plans to appropriately inform stakeholders of crisis arrangements.
Department of the Prime Minister and Cabinet response: Agreed.
Recommendation no. 5
Paragraph 3.97
The National Emergency Management Agency document its consideration of Crisis Arrangements Committee advice on gaps and priorities for whole-of-government exercising, as well as the annual analysis undertaken to review and update the list of identified hazards under the AGCMF, to inform the development of the annual national exercise program. This should include ensuring that exercises consider both natural and all-hazard scenarios.
National Emergency Management Agency response: Agreed.
Summary of entity responses
20. The proposed audit report was provided to the Department of the Prime Minister and Cabinet and the National Emergency Management Agency. Letters of response provided by each entity are included at Appendix 1. The summary responses provided are included below. The improvements observed by the ANAO during the course of this audit are at Appendix 2.
Department of the Prime Minister and Cabinet
The Department of the Prime Minister and Cabinet (PM&C) welcomes the proposed report on the Australian Government Crisis Management Framework (AGCMF). PM&C accepts the key findings and recommendations, and has commenced steps to address these matters.
PM&C is committed to strengthening the Australian Government’s crisis management arrangements and preparedness in partnership with other Australian Government agencies. It has undertaken a comprehensive review of the AGCMF, resulting in the development of a new and enhanced Framework, supporting Handbook and more robust continuous improvement processes. It will continue to enhance guidance under these products to guide the publication of plans, assessment of staffing capacities and the development of surge arrangements.
PM&C will also continue to work with other relevant agencies, including the National Emergency Management Agency (NEMA), to enhance guidance on national planning and preparedness activities, including human rights considerations, and consider options to clarify crisis responsibilities following machinery of government changes. It will establish improved guidance and repeatable processes for the annual review of the AGCMF, as well as for future comprehensive reviews, to ensure lessons from future scenario planning and exercises are captured. PM&C will also assess its senior staffing capacities in the context of crisis response.
National Emergency Management Agency
The National Emergency Management Agency (NEMA) welcomes the findings of the ANAO Performance Audit of the Australian Government Crisis Management Framework (AGCMF) and is committed to preparing Australia for all hazard crisis events, now and into the future. The Performance Audit complements the recent review of the AGCMF. NEMA will continue to work with the Department of the Prime Minister and Cabinet (PM&C), the Australian Government, jurisdictions, industry and non-government organisations for continuous improvement in crisis management preparedness.
NEMA will work with PM&C to ensure whole-of-government crisis exercising aligns to the priorities identified by the Crisis Arrangements Committee, including consideration of natural and all-hazard impacts and consequences.
Acknowledging the current and future risk of consecutive, compounding and concurrent crises, NEMA will continue building crisis capability within the agency and across the Australian Government. NEMA will work alongside PM&C to assess crisis workforce planning needs and increase crisis workforce capability.
NEMA is committed to building the Australian Government’s strategic crisis planning capability through the Crisis Appreciation and Strategic Planning (CASP) methodology. We will continue to support a nationally-consistent approach to planning and preparedness activities through CASP, ensuring Australians and their communities are supported before, during and after crisis events.
Key messages from this audit for all Australian Government entities
21. Below is a summary of key messages, including instances of good practice, which have been identified in this audit and may be relevant for the operations of other Australian Government entities.
Governance and risk management
Records management
Summary and recommendations
Background
1. The Department of Finance’s Resource Management Guide 206 defines a ‘corporate credit card’ as a credit card used by Commonwealth entities to obtain goods and services on credit.1 Credit cards are used by Commonwealth entities to support timely and efficient payment of suppliers for goods and services.2 For the purposes of the Public Governance, Performance and Accountability Act 2013 (PGPA Act), credit cards include charge cards (such as VISA, Mastercard, Diners and American Express cards) and vendor cards (such as travel cards and fuel cards).
2. The Federal Court of Australia (FCA) uses corporate credit cards for official purchases under $10,000 and CabCharge cards for domestic taxi fares. For 2021–22 and 2022–23, the FCA’s total credit card expenditure was approximately $2.1 million, comprising 13,393 transactions. Credit card expenditure represented 16 per cent of the FCA’s supplier expenses across the two years.3
Rationale for undertaking the audit
3. The misuse of corporate credit cards, whether deliberate or not, has the potential for financial losses and reputational damage to government entities and the Australian Public Service. The Australian Public Service Commission (APSC) states that:
establishing a pro-integrity culture at the institutional level means setting a culture that values, acknowledges and champions proactively doing the right thing, rather than purely a compliance-driven approach which focuses exclusively on avoidance of wrongdoing.4
4. In describing the role of Senior Executive Service (SES) officers, the APSC states that the SES ‘set the tone for workplace culture and expectations’, they ‘are viewed as role models of integrity’ and ‘are expected to foster a culture that makes it safe and straightforward for employees to do the right thing’.5 The New South Wales Independent Commission Against Corruption identifies organisational culture and expectations as a key element in preventing corruption and states:
[T]he way that an agency’s senior executives, middle managers and supervisors behave directly influences the conduct of staff by conveying expectations of how staff ought to act. This is something that affects an agency’s culture.6
5. Deliberate misuse of a corporate credit card is fraud. The National Anti-Corruption Commission’s Integrity Outlook 2022/23 identifies fraud, which includes the misuse of credit cards, as a key corruption and integrity vulnerability.7 The Commonwealth Fraud Risk Profile indicates that credit cards are a common source of internal fraud risk. Previous audits have identified issues in other entities relating to positional authority for approving credit card transactions8 and ineffective controls to manage the use of credit cards.9 This audit was conducted to provide the Parliament with assurance that the FCA is effectively managing corporate credit cards in accordance with legislative and entity requirements.
6. This audit is one of a series of audits of compliance with credit card requirements that apply a standard methodology. The four entities included in the ANAO’s 2023–24 compliance with credit card requirements series are the:
- Federal Court of Australia (FCA);
- Australian Research Council;
- National Disability Insurance Agency; and
- Productivity Commission.
Audit objective and criteria
7. The objective of the audit was to assess the effectiveness of the FCA’s management of the use of corporate credit cards for official purposes in accordance with legislative and entity requirements.
8. To form a conclusion against the objective, the ANAO examined:
- whether the FCA has effective arrangements in place to manage the issue, return and use of corporate credit cards; and
- whether the FCA has implemented effective controls and processes for corporate credit cards in accordance with its policies and procedures.
Conclusion
9. The FCA’s management of the use of corporate credit cards for official purposes in accordance with legislative and entity requirements has been partly effective, as there were weaknesses in its implementation of preventive and detective controls.
10. The FCA’s arrangements for managing the issue, return and use of corporate credit cards were largely effective. The FCA had considered risks associated with the use of corporate credit cards within its overarching risk framework and identified relevant controls. Policies and procedures included core requirements but lacked detail on eligibility criteria for issuing cards and requirements for using CabCharge cards. No structured training and education arrangements were in place to promote compliance with policy and procedural requirements. While arrangements had been established for monitoring and reporting on credit card issue, return and use, detailed reporting was not provided to the FCA’s executive management on credit card non-compliance. The FCA did not respond to Parliamentary questions on notice with accurate reporting on credit card use.
11. The FCA’s implementation of controls and processes for corporate credit cards in accordance with its policies and procedures was partly effective. There were weaknesses in its preventive controls relating to assessing and recording business needs for issuing credit cards and documenting pre-approval and rationales for purchases. The implementation of detective controls was partly effective, with no managerial review process for CabCharge card transactions and no use of data analytics across credit card transactions to detect potential instances of purchase splitting. These deficiencies heighten the risk that instances of credit card non-compliance could go undetected. Where misuse was detected, the FCA used established escalation processes and mechanisms to deal with non-compliant transactions.
Supporting findings
Arrangements for managing corporate credit cards
12. The FCA had considered risks associated with the use of corporate credit cards within its overarching risk framework and identified relevant controls. Its enterprise-level fraud risk register identified misuse or unauthorised use of corporate credit cards as a low risk. The register was last updated in June 2021 and the FCA had not recently tested the effectiveness of its credit card controls. Risk monitoring and reporting arrangements were undergoing change and yet to be formalised. (See paragraphs 2.4 to 2.21)
13. The FCA’s policies and procedures for the issue, return and use of corporate credit cards included core requirements, which were covered within the FCA’s accountable authority instructions and other policies. Eligibility requirements for issuing credit cards could be improved by defining business need criteria for card issuance in policies and procedures. More guidance could be provided on using and acquitting CabCharge cards. (See paragraphs 2.22 to 2.44)
14. While the FCA had published relevant policies and procedures on its intranet, it did not provide structured training and education to promote compliance with corporate credit card policy and procedural requirements. Support was provided by the FCA’s finance team for cardholders and managers upon card issuance and when requested. (See paragraphs 2.45 to 2.48)
15. The FCA had arrangements in place for monitoring and reporting on the issue, return and use of corporate credit cards. Credit card usage was monitored by the FCA’s finance team. Detailed reporting on credit card non-compliance was not provided to the FCA’s executive management, diminishing its understanding of fraud, risk and integrity implications arising from non-compliance. While the FCA reported on credit card usage and expenditure when requested by the Parliament, there were errors in its reporting. (See paragraphs 2.49 to 2.58)
Controls and processes for corporate credit cards
16. The FCA’s implementation of preventive controls did not include a systematic approach to assessing and recording that staff have valid business needs prior to issuing credit cards. There were control weaknesses in documenting pre-approvals and rationales for entertainment purchases and purchases covered by whole-of-government arrangements. There were also control weaknesses in documenting pre-approvals for CabCharge card transactions that fell outside approved domestic travel budgets. The card returns process relied on the relevant manager both to identify that the employee had a card to return and to ensure the card was either returned to the finance team or destroyed. (See paragraphs 3.4 to 3.35)
17. The FCA has implemented detective controls to acquit, verify and review transactions. The monthly acquittal process for corporate credit card transactions could be improved by ensuring CabCharge card transactions are acquitted by cardholders and signed off by responsible managers. The FCA could make greater use of data analytics to identify potential non-compliance, such as purchases that have been split to avoid transaction limits. (See paragraphs 3.37 to 3.58)
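For illustration, the sketch below shows the kind of data analytics referred to in paragraph 17: grouping same-day transactions by cardholder and merchant and flagging groups whose combined value exceeds the single-transaction limit. The record fields and grouping rule are assumptions made for the purposes of the example and do not reflect the FCA’s actual systems or data; the $10,000 figure simply mirrors the purchase limit noted in the background. Flagged groups are candidates for manual review, not evidence of non-compliance.

```python
from collections import defaultdict
from datetime import date

# Illustrative transaction records; field names are assumed for this sketch.
transactions = [
    {"cardholder": "A", "merchant": "Supplier X", "date": date(2023, 3, 1), "amount": 6000.0},
    {"cardholder": "A", "merchant": "Supplier X", "date": date(2023, 3, 1), "amount": 5500.0},
    {"cardholder": "B", "merchant": "Supplier Y", "date": date(2023, 3, 2), "amount": 1200.0},
]

TRANSACTION_LIMIT = 10_000.0  # assumed single-transaction limit for the example


def flag_possible_splitting(records, limit=TRANSACTION_LIMIT):
    """Group same-day transactions by cardholder and merchant, and flag groups
    of two or more transactions whose combined value exceeds the limit."""
    groups = defaultdict(list)
    for rec in records:
        groups[(rec["cardholder"], rec["merchant"], rec["date"])].append(rec)
    flagged = []
    for key, recs in groups.items():
        total = sum(r["amount"] for r in recs)
        if len(recs) > 1 and total > limit:
            flagged.append({"group": key, "count": len(recs), "total": total})
    return flagged


for alert in flag_possible_splitting(transactions):
    print(alert)  # each alert is a candidate for manual review
```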
18. Deficiencies in the FCA’s credit card control framework heighten the risk that instances of credit card non-compliance could go undetected. The FCA detected four instances of non-compliance in 2022–23 that triggered the escalation protocols in its credit card policy. This has led to the recovery of funds from cardholders and merchants where accidental misuse and fraudulent transactions were identified. (See paragraphs 3.60 to 3.63)
Recommendations
Recommendation no. 1
Paragraph 2.26
The Federal Court of Australia update its policies and procedures for issuing credit cards and CabCharge cards to provide guidance on eligibility criteria and accurately reflect current processes.
Federal Court of Australia response: Agreed.
Recommendation no. 2
Paragraph 2.38
The Federal Court of Australia update its policies and procedures for credit card use to provide additional guidance on receipt and approval requirements for CabCharge cards.
Federal Court of Australia response: Agreed.
Recommendation no. 3
Paragraph 3.19
The Federal Court of Australia establish a process to confirm that evidence of pre-approval by the designated official and the rationale for spending are documented when acquitting credit card transactions for official hospitality and entertainment purchases and in instances where whole-of-government arrangements are not being utilised.
Federal Court of Australia response: Agreed.
Recommendation no. 4
Paragraph 3.28
The Federal Court of Australia establish a process to ensure evidence of pre-approval and receipts are recorded for all CabCharge card transactions.
Federal Court of Australia response: Agreed.
Recommendation no. 5
Paragraph 3.45
The Federal Court of Australia establish a process to ensure CabCharge card transactions are acquitted by cardholders and approved by their responsible managers on a monthly basis.
Federal Court of Australia response: Agreed.
Recommendation no. 6
Paragraph 3.57
The Federal Court of Australia formalise and document its process for conducting periodic analysis of credit card transactions targeting key areas of risk, including purchase splitting, and update its policies and procedures to prohibit purchase splitting.
Federal Court of Australia response: Agreed.
Summary of entity response
19. The proposed audit report was provided to the FCA. The FCA’s summary response is reproduced below. Its full response is included at Appendix 1. Improvements observed by the ANAO during the course of the audit are listed at Appendix 2.
The Federal Court of Australia (the Court) acknowledges and agrees the recommendations of the Australian National Audit Office and accepts the identified areas where the Court has opportunity to improve.
The Court will continue to focus on strengthening the current processes and guidance that are necessary to reduce risks associated with the potential inappropriate use of credit cards.
Key messages from this audit for all Australian Government entities
20. This audit is part of a series of audits that apply a standard methodology to corporate credit card management in Commonwealth entities. The four entities included in the ANAO’s 2023–24 corporate credit card management series are the:
- Federal Court of Australia;
- Australian Research Council;
- National Disability Insurance Agency; and
- Productivity Commission.
21. Key messages from the ANAO’s series of credit card management audits will be outlined in an Insights product available on the ANAO website.
Summary and recommendations
Background
1. Evaluation is a structured assessment of the value of government programs or activities, aimed at supporting improvement, accountability, and decision-making throughout the policy cycle. Pilot programs are small-scale tests or trials of programs with the aim of informing future decision-making.
2. The Public Governance, Performance and Accountability Act 2013 (the PGPA Act) requires the accountable authority of a Commonwealth entity to measure and assess the performance of the entity in achieving its purposes1, and provides that a minister must not approve expenditure unless satisfied that the expenditure would be an efficient, effective, economical and ethical use of public money.2
3. In 2019, the Australian Government released the Independent Review of the Australian Public Service3, which recommended that the APS embed a culture of evaluation and learning from experience to underpin evidence-based policy and delivery (Recommendation 26). The Australian Government agreed in part to this recommendation.4 The Minister for Finance endorsed a Commonwealth Evaluation Policy5 and Resource Management Guide 130 Commonwealth Evaluation Toolkit6 (the Toolkit) on 1 December 2021. The Toolkit provides a principles-based approach for the conduct of evaluations. It applies to all Commonwealth entities and companies subject to the PGPA Act.
Rationale for undertaking the audit
4. Pilot programs are trial programs of limited size that are used to decide whether a proposed policy should be adopted, and what adjustments should be made before adoption. Monitoring and evaluation are critical components of a pilot to support an assessment of the program or activity’s impact and efficiency.
5. The audit involved the examination of five Australian Government pilot programs across the Department of Health and Aged Care (Health), the Department of Home Affairs (Home Affairs), and the Department of Veterans’ Affairs (DVA). The pilots ranged in length from two to three years. The audit provides assurance to the Parliament over the appropriateness of frameworks for evaluation, and the adequacy of evaluation of pilot programs.
Audit objective and criteria
6. The objective of the audit was to assess the effectiveness of the evaluation of selected Australian Government pilot programs.
7. To form a conclusion against this objective, the following high-level criteria were adopted:
- Do the selected entities have governance arrangements in place to support effective program evaluation?
- Was the evaluation approach for the selected pilot programs robust?
- Was pilot program reporting and advice to government appropriate?
Conclusion
8. The evaluation of the selected Australian Government pilot programs was mixed. Health’s evaluation of the Take Home Naloxone pilot was largely effective, and the evaluation of the Kava pilot was partly effective. DVA’s evaluation of the Wellbeing and Support Program pilot was largely effective, and the evaluation of the Non-Liability Rehabilitation pilot was partly effective. Home Affairs’ evaluation of the Skilled Refugee Labour Agreement pilot was partly effective.
9. Health and DVA have largely effective governance arrangements to support the evaluation of pilot programs. Home Affairs has partly effective arrangements. Health and DVA have strengthened their governance arrangements through the updating or development of entity-specific frameworks, guidance, and training on what, when and how to conduct an evaluation. Home Affairs does not have entity-specific evaluation guidance. Evaluation culture is maturing within Health and DVA, and is immature at Home Affairs. Pilot programs are only referenced in DVA’s entity-specific guidance.
10. The evaluation planning and approach for Health’s Take Home Naloxone pilot and DVA’s Wellbeing and Support Program pilot were largely robust, including appropriate stakeholder engagement and relevant ethics approvals. Planning for the evaluation of Health’s Kava pilot did not identify the risk that ethics approval may not be granted for one of the planned qualitative analysis methods, and there was a lack of baseline evidence to support the planned evaluation methodology. The effectiveness of planning for the evaluation of DVA’s Non-Liability Rehabilitation pilot was reduced as the analytical methodologies were not documented, and no external stakeholders were consulted. Home Affairs did not complete its planning for, or undertake, a robust evaluation for the Skilled Refugee Labour Agreement pilot. All evaluation plans and approaches could have been enhanced by a greater focus on the availability of data and an assessment of the proper use of public money.
11. Health’s evaluation reporting and advice to the Australian Government for the Take Home Naloxone pilot was largely effective, with the recommendations made to expand the naloxone pilot largely informed by the lessons learnt from the evaluation. Health’s evaluation reporting and advice to the Australian Government for the Kava pilot was partly effective as neither the evaluation report nor recommendations on the continuation of the pilot have been provided to the Australian Government. The evaluation report for DVA’s Wellbeing and Support Program pilot was largely effective. There was no evidence of DVA advising the Australian Government on the evaluation findings and impact on future program design. The evaluation for the Non-Liability Rehabilitation pilot has not yet commenced, and reporting and advice to the Australian Government on the mid-pilot review was partly effective. Home Affairs’ evaluation reporting and advice to the Australian Government for the Skilled Refugee Labour Agreement pilot was partly effective, with outputs rather than pilot outcomes analysed and reported to the minister.
Supporting findings
Governance arrangements
12. The Commonwealth Evaluation Toolkit provides appropriate high-level guidance to support entities in determining what programs or policies should be evaluated and when. It provides limited guidance on conducting an economic evaluation, including any assessment of cost effectiveness of implementation, and does not include a requirement for all pilots to be evaluated.
- With the exception of DVA’s Non-Liability Rehabilitation pilot, at the time the other examined pilots commenced, only Health had established internal evaluation guidance.
- In November 2023, Health published a revised evaluation strategy which specifies roles and responsibilities and includes a tiered system for identifying evaluation priorities across the department.
- Since the commencement of the Wellbeing and Support Program pilot, DVA developed a framework which supports when and what to evaluate based on program characteristics, timing and capability. In August 2023, DVA introduced a framework for the planning, monitoring and evaluation of its health and wellbeing programs, which includes roles and responsibilities.
- Home Affairs does not have an entity-specific approach to determining when and what to evaluate.
- Each entity has an internal evaluation team to provide guidance and support on evaluation practice.
13. Health and DVA have policies and guidance materials for how to conduct program evaluations. Only Health has guidance on when economic evaluation should be undertaken and the guidance is limited. Training on evaluation practices is provided at Health and DVA. Attendance is not consistently monitored. Home Affairs has no entity-specific guidance on conducting evaluations, and no training programs available to staff. (See paragraphs 2.59 to 2.69)
Evaluation approach
14. Planning for evaluation, including stakeholder engagement, was completed for Health’s Take Home Naloxone pilot and DVA’s Wellbeing and Support Program pilot. Planning for stakeholder engagement for evaluation of Health’s Kava pilot did not account for the risk that ethics approval may not be granted and the resulting impact on the planned analysis and evaluation methodology. The effectiveness of planning for the evaluation of DVA’s Non-Liability Rehabilitation pilot was reduced as the methodologies to be used were not documented, and no external stakeholders were consulted. Home Affairs did not complete its planning for the evaluation for the Skilled Refugee Labour Agreement pilot. While data sources were identified within the evaluation plans that were developed, one or more planned data sources within each pilot were not available for the evaluation, and this risk had not been identified. (See paragraphs 3.6 to 3.52)
15. The evaluation methodologies used for three out of the five pilots examined were largely consistent with the Toolkit. For the evaluations conducted, all could have been strengthened with a greater focus on baseline data, control group outcomes, and an assessment of the proper use of public money. Ethics approvals were obtained for Health’s Take Home Naloxone pilot and DVA’s Wellbeing and Support Program pilot. The ethics approval sought for Health’s Kava pilot was not granted and no alternative strategy was developed to obtain information that was critical to the evaluation. DVA’s Non-Liability Rehabilitation pilot evaluation plan did not include a consideration of ethics approval and the post-implementation review has not yet been undertaken. As Home Affairs did not conduct an evaluation of its pilot, there was no methodology applied, or consideration of the need for ethics approval. (See paragraphs 3.53 to 3.80)
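To illustrate why baseline data and control group outcomes strengthen an evaluation, the sketch below shows a simple difference-in-differences calculation: the change observed in a pilot group is compared with the change in a comparable control group over the same period, so that background trends are not attributed to the pilot. The group names and figures are hypothetical and are not drawn from any of the audited pilots; the technique is offered only as an example of the kind of analysis the Toolkit’s guidance could support.

```python
# Hypothetical outcome means at baseline and follow-up for a pilot group and a
# comparable control group. None of these figures come from the audited pilots.
baseline = {"pilot": 42.0, "control": 41.5}
followup = {"pilot": 55.0, "control": 47.0}


def difference_in_differences(baseline, followup):
    """Change in the pilot group minus change in the control group.

    A positive value suggests an effect beyond the background trend, subject to
    the usual assumptions (comparable groups, parallel trends)."""
    pilot_change = followup["pilot"] - baseline["pilot"]
    control_change = followup["control"] - baseline["control"]
    return pilot_change - control_change


print(difference_in_differences(baseline, followup))  # 13.0 - 5.5 = 7.5
```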
Reporting and advice to the Australian Government
16. The analyses of pilot evaluation outcomes for Health’s pilots and DVA’s Wellbeing and Support Program pilot were largely fit for purpose, with the evaluation reports documenting the application of statistical methods to provide defensible findings and make recommendations on the basis of the analysis completed. The evaluation of DVA’s Non-Liability Rehabilitation pilot has not yet commenced. Home Affairs’ reporting of outputs of the Skilled Refugee Labour Agreement pilot did not contain fit-for-purpose analysis and does not satisfy the requirements of evaluation reporting in the Commonwealth Evaluation Toolkit. (See paragraphs 4.5 to 4.33)
17. Advice provided by Health to the Australian Government in relation to the Take Home Naloxone pilot was appropriate, including the lessons learnt from the pilot. The recommendation to expand the pilot into different environments was partly informed by evaluation. Health has not provided advice to the Australian Government on the findings of the evaluation or lessons learnt in relation to the Kava pilot. DVA did not advise the Minister for Veterans’ Affairs on the evaluation findings or lessons learnt for future program delivery for the Wellbeing and Support Program pilot. Home Affairs’ advice to the Australian Government for the continuation of the Skilled Refugee Labour Agreement pilot was not informed by an evaluation. (See paragraphs 4.34 to 4.52)
Recommendations
Recommendation no. 1
Paragraph 2.16
The Department of the Treasury update the Commonwealth Evaluation Policy and Toolkit to include:
- a definition of a ‘pilot’;
- guidance on how to conduct an economic evaluation and other methods for considering whether spending represents an appropriate use of public money;
- a recommendation that evaluations of pilot programs be undertaken;
- a recommendation for evaluation planning to be conducted alongside pilot design; and
- guidance on governance arrangements for cross-entity evaluations to minimise duplication and maximise coordination and learnings across entities.
Department of the Treasury’s response: Agreed.
Recommendation no. 2
Paragraph 2.32
The Departments of Health and Aged Care and Veterans’ Affairs include in their entity-specific evaluation policies:
- decision-making criteria for the appropriate style of evaluation to be completed by reference to the activity’s risk, objective and outcomes;
- guidance on how to demonstrate whether a program represented a proper use of public money, including the cost-effectiveness of its implementation, and how to undertake an economic evaluation where appropriate; and
- guidance related to evaluation of pilot programs.
Department of Health and Aged Care’s response: Agreed.
Department of Veterans’ Affairs’ response: Agreed.
Recommendation no. 3
Paragraph 2.35
The Department of Home Affairs develop entity-specific policies for evaluation, including:
- decision-making criteria as to when an evaluation is required and the appropriate style of evaluation by reference to the activity’s risk, objective and outcomes;
- guidance on how to demonstrate whether a program represented a proper use of public money, including the cost-effectiveness of its implementation, and how to undertake an economic evaluation where appropriate; and
- guidance related to evaluation of pilot programs.
Department of Home Affairs’ response: Agreed.
Recommendation no. 4
Paragraph 2.55
The Departments of Health and Aged Care, Veterans’ Affairs and Home Affairs develop and implement explicit guidance to support early engagement with central evaluation teams to improve evaluation strategy and planning.
Department of Health and Aged Care’s response: Agreed.
Department of Veterans’ Affairs’ response: Agreed.
Department of Home Affairs’ response: Agreed.
Recommendation no. 5
Paragraph 3.25
The Departments of Health and Aged Care, Veterans’ Affairs and Home Affairs ensure evaluation plans are prepared for policies or programs subject to evaluation requirements and that they be approved prior to the implementation of the policy or program. Consistent with the Commonwealth Evaluation Toolkit, evaluation plans should incorporate a proportionate and risk-based level of information, including:
- methods for measuring or capturing baseline evidence, and attributing changes to the pilot, policy or program; and
- a method of economic evaluation or other means of assessing the proper use of public money.
Department of Health and Aged Care’s response: Agreed.
Department of Veterans’ Affairs’ response: Agreed.
Department of Home Affairs’ response: Agreed.
Recommendation no. 6
Paragraph 4.50
The Departments of Veterans’ Affairs and Home Affairs ensure that advice to government on the cessation, continuation or scaling up of a pilot draws on evidence and learnings from the evaluation, including limitations on the robustness of the evaluation undertaken.
Department of Veterans’ Affairs’ response: Agreed.
Department of Home Affairs’ response: Agreed.
Summary of entity responses
18. The proposed audit report was provided to Health, DVA, Home Affairs and the Department of the Treasury. Letters of response provided by each entity are included at Appendix 1. The summary responses provided are included below. The improvements observed by the ANAO during the course of this audit are at Appendix 2.
Department of Health and Aged Care
The Department of Health and Aged Care welcome the findings in the report and accept the recommendations directed to the department. The department is committed to implementing the recommendations effectively and has already taken steps to address issues identified in this audit.
The ANAO found the department has largely effective governance arrangements to support evaluation. The audit also found the department’s evaluation culture is maturing, including:
- updating our guidance and training on what, when and how to conduct an evaluation.
- establishing the role of Chief Evaluation Officer to provide strategic oversight of evaluation activities and to engage with other Senior Executive to champion evaluation as part of policy design and program management.
The department notes the finding on the need to develop better guidance on conducting economic evaluations or other means of assessing the proper use of public money.
Since the audit was conducted, the department has launched its Strategic Investment Framework, which makes sure our policy and program officers embed evaluation and evidence within all programs. The Framework will ensure investments are supported by robust, evidence-based program evaluation and target funding to high-value programs aligned with priority areas.
The department notes that the audit on the Kava Pilot Program was undertaken while the pilot period was still under way, and certain aspects of the pilot, including recommendations to Government on the future of the Program, are yet to be finalised.
The department is building its in-house evaluation capability through a range of initiatives including:
- implementing the new Evaluation Strategy 2023-26
- developing a suite of departmental-specific tools and resources to support high-quality evaluation.
- partnering with Australian Centre for Evaluation in Treasury and leveraging opportunities to showcase in-house impact evaluation capability.
Department of Veterans’ Affairs
The Department of Veterans’ Affairs (DVA) welcome the ANAO recommendations. The ANAO report acknowledges that DVA has established policies and processes that largely support compliance with the Commonwealth Evaluation Policy (the Policy).
The Department acknowledge and agree with the ANAO’s recommendations. Work is planned for 2024 to review and update the relevant policies and protocols to enhance maturity with the Commonwealth Evaluation Policy requirements, and work has already commenced to implement these enhancements.
Department of Home Affairs
The department agree with the recommendations, and as part of its ongoing efforts to strengthen evaluation, acknowledge the benefits of a more robust evaluation culture to inform Government decision-making.
The department continues to leverage Commonwealth resources and materials to assist in guiding staff on how an evaluation should be carried out. To supplement the Commonwealth Evaluation Toolkit, the department is developing additional resources to assist staff in determining when, and to what extent, an evaluation should be conducted.
The department is monitoring the outcomes of the Skilled Refugee Labour Agreement to build a sufficient evidence base to assess the viability and future scalability of the program. The department’s advice to Government on the future of the Skilled Refugee Labour Agreement Pilot will be informed by an evaluation consistent with the Commonwealth Evaluation Policy.
Department of the Treasury
Treasury welcomes the report and agrees with the recommendation to update guidance in the Commonwealth Evaluation Toolkit (the Toolkit). Specifically, Treasury will update the Toolkit to include a definition of a ‘pilot’, and provide guidance on: economic evaluation, evaluation of pilots, and governance arrangements for cross-entity evaluations.
Treasury’s guidance on whether spending represents an appropriate use of public money will focus on (and be limited to) guidance on economic evaluation methods, and other fit-for-purpose evaluation approaches. The broader importance of appropriately using public money is well addressed through the suite of guidance administered by the Department of Finance to support resource management and therefore will not be duplicated through Treasury materials.
Treasury will recommend, but not mandate, that all pilots are subject to evaluation consistent with the principles-based Commonwealth Evaluation Policy, which recommends that responsible managers need to determine robust, proportional evaluation approaches for specific pilots or programs.
The Department of the Treasury is committed to continuous improvement of the Evaluation Toolkit. Planned enhancements will include more practical guidance on analytical methods, including economic evaluation, and effective governance arrangements that can help to improve the way Commonwealth entities assess implementation, measure the impact of government programs, and frame policy decisions.
Key messages from this audit for all Australian Government entities
19. Below is a summary of key messages, including instances of good practice, which have been identified in this audit and may be relevant for the operations of other Australian Government entities.
Performance and impact measurement
Summary and recommendations
Background
1. New and emerging technologies play an important role in delivering digital services for Australian Government entities. As the development, integration and use of technology increases, so does the number of possible entry or weak points that malicious cyber actors can exploit. This is commonly referred to as the ‘attack surface’.1 It is important that Australian Government entities continue to uplift their cyber security maturity and implement arrangements to manage cyber security incidents2 effectively. The ability to maintain business continuity following a cyber security incident is critical to ensuring the continued provision of government services.
2. Australian Government entities are attractive, high-value targets for a range of malicious cybercriminals because they hold the personal and financial information of Australians.3 In 2022–23, approximately 31 per cent of cyber security incidents reported to the Australian Signals Directorate (ASD) were from non-corporate Commonwealth entities. Over 40 per cent of these cyber security incidents were coordinated, low-level malicious cyberattacks directed specifically at the Australian Government, government shared services, or regulated critical infrastructure.4 Ransomware was the most destructive cybercrime threat in 2022–235 and continues to pose considerable risk to Australian Government entities, businesses and individuals.
Rationale for undertaking the audit
3. On 22 November 2023, the Australian Government released the 2023–30 Australian Cyber Security Strategy which outlines a forecast approach towards uplifting Australia’s cyber resilience as well as ‘[building] … national cyber readiness [and] proactively identifying and closing gaps in … cyber defences and incident response plans’.
4. Australian Government entities are expected to be ‘cyber exemplars’, as they receive, process and store some of Australia’s most sensitive data to support the delivery of essential public services.6 Whilst there were reported improvements from 2022, ASD’s 2023 Cyber Security Posture Report highlighted that the overall maturity level across entities remained low in 2023.7
5. Previous audits conducted by the ANAO identified low levels of cyber resilience in entities. Low levels of cyber resilience continue to make entities susceptible to cyberattack and reduce business continuity and recovery prospects following a cyber security incident. An entity’s preparedness to respond to and recover from a cyberattack is a key part of cyber resilience. This audit was conducted to provide assurance to Parliament about the effectiveness of the selected entities’ implementation of arrangements for managing cyber security incidents.
Audit objective, criteria and scope
6. The objective of this audit was to assess the effectiveness of the selected entities’ implementation of arrangements for managing cyber security incidents in accordance with the Protective Security Policy Framework (PSPF) and relevant ASD Cyber Security Guidelines.
7. To form a conclusion against the audit objective, the following high-level criteria were adopted:
- Do the Australian Transaction Reports and Analysis Centre (AUSTRAC) and Services Australia have appropriately designed and implemented cyber security incident management procedures?
- Have AUSTRAC and Services Australia effectively implemented cyber security incident management processes for investigating, monitoring and responding to cyber security incidents?
- Have AUSTRAC and Services Australia effectively implemented recovery processes that mitigate disruptions during and after cyber security incidents?
Engagement with the Australian Signals Directorate
8. Independent timely reporting on the implementation of the cyber security policy framework supports public accountability by providing an evidence base for the Parliament to hold the executive government and individual entities to account. Previous ANAO reports on cyber security have drawn to the attention of Parliament and relevant entities the need for change in entity implementation of mandatory cyber security requirements, at both the individual entity and framework levels.
9. In preparing audit reports to the Parliament on cyber security in Australian Government entities, the interests of accountability and transparency must be balanced with the need to manage cyber security risks. ASD has advised the ANAO that adversaries use publicly available information about cyber vulnerabilities to more effectively target their malicious activities.
10. The extent to which this report details the cyber security vulnerabilities of individual entities was a matter of careful consideration during the course of this audit. To assist in appropriately balancing the interests of accountability and potential risk exposure through transparent audit reporting, the ANAO engaged with ASD to better understand the evolving nature and extent of risk exposure that may arise through the disclosure of technical information in the audit report. This report therefore focusses on matters material to the audit findings against the objective and criteria and contains less detailed technical information than previous audits. Detailed technical information flowing from the audit was provided to the relevant accountable authorities during the audit process to assist them to gain their own assurance that their remediation plans are focussed on improving cyber resilience as required and support reliable reporting through the existing cyber security policy framework.
Conclusion
11. The implementation of arrangements by AUSTRAC and Services Australia to manage cyber security incidents has been partly effective. Neither entity is well placed to ensure business continuity or disaster recovery in the event of a significant or reportable cyber security incident.
AUSTRAC
12. AUSTRAC has partly effective cyber security incident management procedures for investigating, monitoring and responding to cyber security incidents. It has established management structures and a framework of procedures to support these processes. It has not detailed the responsibilities of its Chief Information Security Officer (CISO) or its approach to continuous monitoring and improvement reporting, and has not defined timeframes for reporting to stakeholders.
13. AUSTRAC has partly implemented effective response processes that mitigate disruptions during and after cyber security incidents. It has established a Security Information and Event Management (SIEM) solution and processes for reporting cyber security incidents. The coverage of log events is not in accordance with ASD’s Cyber Security Guidelines. AUSTRAC does not have an event logging policy and does not document its analysis of all cyber security events.
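The kind of coverage gap described above can be checked programmatically. The following minimal sketch (in Python) compares the event sources being forwarded to a SIEM against a required coverage list. The source names and the required list are hypothetical illustrations only; they are not drawn from ASD's Cyber Security Guidelines or from either audited entity's environment.

```python
# Illustrative only: hypothetical event source names, not drawn from ASD's
# Cyber Security Guidelines or from either audited entity's environment.

REQUIRED_SOURCES = {
    "authentication_events",
    "privileged_account_use",
    "security_product_alerts",
    "network_gateway_logs",
    "database_access_logs",
}


def coverage_gaps(forwarded_sources: set[str]) -> set[str]:
    """Return required event sources that are not being forwarded to the SIEM."""
    return REQUIRED_SOURCES - forwarded_sources


if __name__ == "__main__":
    forwarded = {"authentication_events", "security_product_alerts"}
    print("Log sources missing from SIEM coverage:", sorted(coverage_gaps(forwarded)))
```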
14. AUSTRAC has procedures to support its cyber security incident recovery processes. These procedures do not include the security and testing of backup solutions, nor detail the systems, applications and servers supporting critical business processes. AUSTRAC performs recovery of backups in response to business area requests. It does not test the restoration of backups for disaster recovery purposes.
Services Australia
15. Services Australia is partly effective in its design of cyber security incident management procedures. It has established a framework of procedures and an incident response plan. It has not documented an approach to threat and vulnerability assessments. Services Australia does not have a policy covering the management of cyber security incidents.
16. Services Australia has partly effective cyber security incident response procedures for investigating and responding to cyber security incidents. It has procedures for managing data spills, malicious code infections and intrusions. It has implemented a Security Information and Event Management (SIEM) solution and a systematic approach to monitoring and prioritisation of alerts. Services Australia has not established a timeframe for triage and escalation activities, nor a process for analysing archived SIEM data. Services Australia has not defined an approach for cyber security investigations.
17. Services Australia has partly implemented effective recovery processes that mitigate disruptions during and after cyber security incidents. It has developed business continuity and disaster recovery plans and implemented regular backups. Its plans do not include all systems and applications supporting critical business processes and it does not test the recoverability of backups.
Supporting findings
AUSTRAC
18. AUSTRAC has established management structures and responsibilities for managing cyber security incidents. However, it has not documented the assigned responsibilities for its CISO, although the CISO is empowered to make decisions. AUSTRAC has documented a framework of procedures for cyber security risk and incident management. However, it does not detail a process for reviewing, updating and testing its cyber security incident management procedures, nor has it implemented a security maturity monitoring plan that defines a continuous improvement cycle and reporting to management. AUSTRAC has developed reporting processes for significant or reportable cyber security incidents. AUSTRAC does not document cyber security incident meetings, nor has it defined timeframes for reporting to relevant stakeholders. (See paragraphs 2.6 to 2.32)
19. AUSTRAC has processes for reporting significant or reportable cyber security incidents to internal and external stakeholders. These processes do not include the engagement of relevant expertise in other business areas, such as legal advisors, and do not ensure the integrity of evidence supporting cyber security investigations. AUSTRAC has documented cyber security incident monitoring and response procedures. It has not developed an event log policy for handling and containing malicious code infections or intrusions, nor an approach for containment actions in the event of a data spill. AUSTRAC has implemented a Security Information and Event Management (SIEM) solution. Its coverage of event logs is not in accordance with ASD’s Cyber Security Guidelines. It undertakes an analysis of event logs and escalates significant or reportable cyber security incidents to management and relevant external stakeholders. It does not record or document its analysis of non-significant cyber security events, nor has it defined timeframes for triage and escalation activities. While AUSTRAC is able to analyse data within its SIEM solution, it does not have a process for retrieving and analysing production and archived SIEM data. (See paragraphs 2.33 to 2.65)
20. AUSTRAC has documented procedures to support its cyber security incident recovery processes. These procedures do not include the security and testing of backup solutions, nor detail the systems, applications and servers supporting critical business processes. AUSTRAC has not tested the recoverability of its systems and applications supporting critical business processes. It has not included all relevant systems, including the tools used for managing backups, within disaster recovery testing schedules and security policies. AUSTRAC is not well placed to ensure business continuity or disaster recovery in the event of a significant or reportable cyber security incident. AUSTRAC has primary and secondary data centres to support its approach to regular backups. AUSTRAC performs recovery of backups in response to business area requests. It does not test the restoration of backups for disaster recovery purposes. It does not have a process for extracting and analysing production and archived backup data. AUSTRAC’s incident reports include post-incident learning and post-remediation analysis. These reports are not used to review or update existing cyber security recovery procedures, and potential improvements highlighted in them are not considered for incorporation into existing cyber security documentation. (See paragraphs 2.66 to 2.93)
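To illustrate what testing the restoration of backups could involve, the sketch below (in Python) verifies files restored during a test restoration against checksums recorded at backup time. The manifest format, directory layout and file names are assumptions for illustration only and do not describe either entity's backup tooling.

```python
# Illustrative sketch of verifying a test restoration against recorded checksums.
# Paths, file names and the manifest format are hypothetical examples only.

import hashlib
from pathlib import Path


def sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def verify_restore(restore_dir: Path, manifest: dict[str, str]) -> dict[str, bool]:
    """Compare each restored file against the checksum recorded at backup time.

    The manifest maps relative file names to SHA-256 digests; a missing or
    altered file is reported as False.
    """
    results = {}
    for name, expected_digest in manifest.items():
        restored_file = restore_dir / name
        results[name] = restored_file.exists() and sha256(restored_file) == expected_digest
    return results


if __name__ == "__main__":
    # restore_dir would be the target of a test restoration performed with the
    # organisation's own backup tooling; values here are placeholders.
    outcome = verify_restore(Path("/tmp/restore-test"), {"finance/ledger.db": "0" * 64})
    print(outcome)
```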
Services Australia
21. Services Australia has established management structures and responsibilities for its management of cyber security incidents. It has not documented an approach to threat and vulnerability assessments and does not have a policy covering the management of cyber security incidents. It does have a security maturity monitoring plan, although this does not define a continuous improvement cycle or reporting to management. Services Australia has developed a cyber security incident response plan and a trusted insider program. However, its trusted insider program has not considered input from other business areas, such as its legal function. Services Australia’s critical asset and data registers do not have complete information on critical systems and data assets. Services Australia has documented a framework of procedures for cyber security risk and incident management. However, it does not detail a process for reviewing, updating and testing its cyber security incident management procedures. Services Australia has reporting processes that provide regular reporting of cyber security incidents, including significant or reportable cyber security incidents, to internal and external stakeholders. It has not defined timeframes for reporting to relevant stakeholders, nor considered the engagement of other relevant expertise, such as legal advisors, in its reporting processes. (See paragraphs 3.6 to 3.44)
22. Services Australia has documented its approach for managing data spills, malicious code infections and intrusions. It has not established processes for reviewing, updating and testing these cyber security incident response procedures. Services Australia has implemented a Security Information and Event Management (SIEM) solution and developed a systematic approach to the monitoring and prioritisation of security alerts. Services Australia has an Event Logging and Monitoring Policy. It has not established processes for extracting, retrieving and analysing archived SIEM data, nor has it defined the timeframe requirements for triage and escalation activities. Services Australia has not defined an approach for cyber security investigations. (See paragraphs 3.45 to 3.73)
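Defined timeframe requirements for triage and escalation can be expressed in a form that is simple to monitor. The minimal sketch below (in Python) checks whether an alert was triaged within a target for its priority; the priority levels and target timeframes are illustrative assumptions only and do not reflect either entity's actual requirements.

```python
# Illustrative sketch: check triage against defined timeframes by priority.
# Priority levels and targets are assumed values for illustration only.

from datetime import datetime, timedelta

TRIAGE_TARGETS = {
    "high": timedelta(hours=1),
    "medium": timedelta(hours=4),
    "low": timedelta(hours=24),
}


def triage_within_target(priority: str, raised: datetime, triaged: datetime) -> bool:
    """Return True if the alert was triaged within the target for its priority."""
    return (triaged - raised) <= TRIAGE_TARGETS[priority]


if __name__ == "__main__":
    raised = datetime(2024, 5, 1, 9, 0)
    triaged = datetime(2024, 5, 1, 9, 45)
    print(triage_within_target("high", raised, triaged))  # True: within one hour
```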
23. Services Australia has not defined an approach to digital preservation related to cyber security incidents and regular backups, nor does it have business continuity or disaster recovery plans that address all systems, including the systems which support critical recovery processes. It is not well placed to ensure business continuity or disaster recovery in the event of a significant or reportable cyber security incident. Services Australia has processes for performing regular backups. These processes do not include all platforms, and Services Australia does not test the restoration of data, applications and settings from backups as part of disaster recovery exercises. Services Australia has not appropriately documented an embedded post-incident learning approach following a cyber security incident. Services Australia has not established a process that leverages post-incident learnings to review and improve the effective implementation of arrangements to manage cyber security incidents. (See paragraphs 3.74 to 3.103)
Recommendations
Recommendation no. 1
Paragraph 2.24
Australian Transaction Reports and Analysis Centre develops and implements:
- policies that define the responsibilities of the Chief Information Security Officer in accordance with the Protective Security Policy Framework requirements; and
- a security maturity monitoring plan that defines a continuous improvement cycle as well as reporting to management, including documenting the determination of reporting frequency and escalation.
Australian Transaction Reports and Analysis Centre response: Agreed.
Recommendation no. 2
Paragraph 2.31
Australian Transaction Reports and Analysis Centre develops and implements:
- processes for ensuring cyber security incident meetings are documented;
- timeframes for reporting to relevant external stakeholders; and
- processes that ensure regular risk reporting to its portfolio minister and the Department of Home Affairs.
Australian Transaction Reports and Analysis Centre response: Agreed.
Recommendation no. 3
Paragraph 2.41
Australian Transaction Reports and Analysis Centre develops and implements:
- procedures that define assigned security roles and responsibilities for coordinating responses, including engagement of relevant expertise; and
- processes for managing and maintaining evidence during and after cyber security investigations.
Australian Transaction Reports and Analysis Centre response: Agreed.
Recommendation no. 4
Paragraph 2.47
Australian Transaction Reports and Analysis Centre develops and implements:
- an approach for containment actions that restrict access to data, systems and networks in the event of a data spill; and
- an event log policy for handling and containing malicious code infections or intrusions.
Australian Transaction Reports and Analysis Centre response: Agreed.
Recommendation no. 5
Paragraph 2.57
Australian Transaction Reports and Analysis Centre implements a strategy for Security Information and Event Management (SIEM) solution coverage that is in accordance with Australian Signals Directorate’s Guidelines for System Monitoring and performs a risk assessment to support any deviations from the guideline’s recommendations.
Australian Transaction Reports and Analysis Centre response: Agreed.
Recommendation no. 6
Paragraph 2.63
Australian Transaction Reports and Analysis Centre establishes:
- a process for retrieving and analysing production Security Information and Event Management (SIEM) data held within its SIEM solution, and archived SIEM data;
- record keeping requirements for triage and escalation activities over non-significant cyber security events to ensure completeness of activities; and
- timeframe requirements for triage and escalation activities.
Australian Transaction Reports and Analysis Centre response: Agreed.
Recommendation no. 7
Paragraph 2.78
Australian Transaction Reports and Analysis Centre develops and implements:
- disaster recovery testing schedules that include backup solutions;
- business continuity planning processes that incorporate the systems, applications and servers which support critical business processes; and
- processes that test the recoverability of its systems and applications supporting critical business processes, including implementing any lessons learned into future testing schedules.
Australian Transaction Reports and Analysis Centre response: Agreed.
Recommendation no. 8
Paragraph 2.88
Australian Transaction Reports and Analysis Centre establishes a program that assesses the effectiveness of recovery processes for all production and archived backup data.
Australian Transaction Reports and Analysis Centre response: Agreed.
Recommendation no. 9
Paragraph 2.92
Australian Transaction Reports and Analysis Centre leverages its post-incident learning approaches following a cyber security incident to inform a process that reviews, updates and tests all of the relevant security documentation for the effective management of cyber security incidents. That is:
- supporting security documentation to its security plans;
- framework of procedures for cyber security incident management;
- associated guidance for cyber security incident response; and
- associated guidance for cyber security incident recovery.
Australian Transaction Reports and Analysis Centre response: Agreed.
Recommendation no. 10
Paragraph 3.18
Services Australia updates its trusted insider program with the support of legal advice and other relevant expertise and ensures it is fit for purpose across the organisation.
Services Australia response: Agreed.
Recommendation no. 11
Paragraph 3.23
Services Australia updates its systems criticality assessments and data registers with the necessary information to confirm the criticality of each system and data asset.
Services Australia response: Agreed.
Recommendation no. 12
Paragraph 3.29
Services Australia establishes a Cyber Security Incident Management Policy or includes ‘cyber security incidents’ as part of the scope of the Incident Management and Escalation Policy.
Services Australia response: Agreed.
Recommendation no. 13
Paragraph 3.35
Services Australia develops and implements an approach that ensures continuous monitoring and improvement reporting is provided to management, including documenting the determination of reporting frequency and escalation.
Services Australia response: Agreed.
Recommendation no. 14
Paragraph 3.43
Services Australia designs and implements procedures detailing:
- the timeframes for reporting to internal and external stakeholders; and
- roles and responsibilities for coordinating responses, including engagement of relevant expertise.
Services Australia response: Agreed.
Recommendation no. 15
Paragraph 3.59
Services Australia develops and implements procedures detailing:
- the process for performing cyber security investigations in accordance with the Australian Government Investigations Standard; and
- the process for managing and maintaining evidence during and after cyber security investigations.
Services Australia response: Agreed.
Recommendation no. 16
Paragraph 3.71
Services Australia develops and implements:
- a process for retrieving and analysing archived Security Information and Event Management (SIEM) solution data; and
- timeframe requirements for triage and escalation activities.
Services Australia response: Agreed.
Recommendation no. 17
Paragraph 3.87
Services Australia develops and implements:
- a policy for digital preservation;
- a policy for regular backups;
- business continuity and disaster recovery plans that include the systems, applications and servers which support its critical recovery processes; and
- processes that test the recoverability of its systems and applications supporting critical business processes, including implementing any lessons learned into future testing plans.
Services Australia response: Agreed.
Recommendation no. 18
Paragraph 3.96
Services Australia establishes a program that assesses the effectiveness of recovery processes for all production and archived backup data.
Services Australia response: Agreed.
Recommendation no. 19
Paragraph 3.101
Services Australia develops its post-incident learning approaches following a cyber security incident to inform a process that reviews, updates and tests all of the relevant security documentation for the effective management of cyber security incidents. That is:
- supporting security documentation to its security plans;
- framework of procedures for cyber security incident management;
- associated guidance for cyber security incident response; and
- associated guidance for cyber security incident recovery.
Services Australia response: Agreed.
Summary of entity responses
24. The proposed audit report was provided to AUSTRAC and Services Australia. The entities’ summary responses are reproduced below. Their full responses are included at Appendix 1. Improvements observed by the ANAO during the course of this audit are listed at Appendix 2.
AUSTRAC
AUSTRAC welcomes the review and the opportunity to reflect on its processes and procedures for managing cybersecurity incidents. AUSTRAC maintains that our processes to date have enabled effective management of cyber security incidents if and as they occur, involving prioritisation, escalation and seeking internal and external expertise to inform AUSTRAC’s effective cyber security incident response. AUSTRAC welcomes the ANAO’s recommendations, which will support AUSTRAC to strengthen our approach to cybersecurity incident management through greater clarity and certainty provided by documenting much of our existing approach and enhancing it where gaps have been identified. In response to the recommendations, AUSTRAC will update key incident response plans and documents, as well as develop testing schedules consistent with our risk profile and appetite and operational requirements.
Services Australia
Services Australia (the Agency) notes the audit findings and the recommendations for the Agency associated with improving the management of cyber security. The Agency agrees with the recommendations, and will work towards further strengthening controls in the identified areas.
The Agency takes its responsibility to safeguard the personal information and data of its customers very seriously, as well as the need to ensure continuity of the essential services and payments that the Agency provides. I consider that the implementation of the recommendations contained in the report will support the Agency in achieving those outcomes.
Key messages from this audit for all Australian Government entities
Below is a summary of key messages, including instances of good practice, which have been identified in this audit and may be relevant for the operations of other Australian Government entities.
Summary and recommendations
Background
1. Established in 1948, the Adult Migrant English Program (AMEP) provides free English tuition to eligible migrants and humanitarian entrants with no or low English levels.1 The program supports an average of 53,000 participants annually. The design of the program recognises that learning English can help migrant settlement.
2. Contracted delivery of the AMEP has been in place for 25 years.2 The current contracts commenced in 2017 and deliver English language lessons in over 300 locations and online, to clients in metropolitan, regional and remote locations in Australia. The 15 contracts3 were initially valued at $1.22 billion, which had increased by 75 per cent to more than $2.153 billion as at April 2024, representing an average annual contract value of $287 million. There is also a contract for quality assurance, for which the reported value increased from $6.15 million to $22.52 million. The October 2022 Federal Budget included $20 million to provide more flexible delivery options for the program and to increase case management support for students (to deliver on an election commitment).
3. All contracts were due to end on 30 June 2023. These contract arrangements were extended in December 2022, to 30 June 2024 (with work orders under the contracts not due to expire until 31 December 2024), after a government decision to delay implementation of a further new AMEP business model.4 The request for tender for new AMEP contracts was to be issued in September 2023, with new arrangements to commence on 1 January 2025. On 30 November 2023, the Minister for Immigration, Citizenship and Multicultural Affairs approved a further delay5, with no new release date advised. In February 2024 the department advised the ANAO that advice to the minister and a new policy proposal were in development, and that the release date of the request for tender was dependent on government agreement to a new model and implementation date.
Rationale for undertaking the audit
4. Contracts for the delivery and quality assurance of delivery of AMEP are valued at over $2 billion and will have been in place for at least seven and a half years by the time they are replaced. The ANAO has audited the management of AMEP contracts once previously (in 20016), with the audit report including six recommendations, all of which were agreed to. Those recommendations related to improving program performance management and reporting; strategic management and coordination; management of financial risk; and monitoring of contractor performance.
5. The audit provides assurance to the Parliament that the department is appropriately administering the Adult Migrant English Program contracts.
Audit objective and criteria
6. The objective of the audit was to assess whether the design and administration of AMEP is effective.
7. To form a conclusion against the objective, the following high-level criteria were applied:
- Are appropriate contractual arrangements in place?
- Are the service provider contracts appropriately managed?
- Are contracted quality assurance services being delivered to an appropriate standard?
8. In addition to auditing the management of contracts by the Department of Home Affairs, the audit used the follow the money power provided under paragraph 18B(1)(b) of the Auditor-General Act 1997 to examine the performance of Linda Wyse and Associates (LWA), the quality assurance provider for the AMEP.
Conclusion
9. The design and administration of the Adult Migrant English Program contracts has not been effective.
10. Appropriate contractual arrangements are not in place with the 13 general service providers. The contracts are continuing to operate past their stated completion date, despite there being no extension options in the contracts. While the contracts, and associated instructions, clearly outline the contracted deliverables:
- there have been significant variations made to each of the contracts, with insufficient documentation to evidence that each variation represented value for money to the Australian Government, and the records of the variations are inadequate;
- there are deficiencies in the processes by which the department has engaged advisers and contracted the existing service providers to identify areas that could benefit from adaptation of new ideas and innovative service delivery to enhance client outcomes (referred to as innovative projects); and
- there is no probity plan for the management of the contracts, and inadequate departmental transition planning for the end of the contracts.
11. The general service provider contracts have not been appropriately managed. A comprehensive set of contracted performance indicators was in place when the contracts were first signed, but that framework has been amended over time such that it no longer addresses the educational outcomes being achieved by students, the accuracy of provider assessments of student educational outcomes or the timeliness of service provider provision of data to the department. The only indicator that has remained relates to the extent to which eligible students commence in the program. In addition:
- invoice verification processes have not been sufficiently robust; and
- the department has not implemented a previously agreed recommendation that it would use complaints data from providers to inform and improve service delivery for students.
12. In its administration of the contract with the firm engaged as quality assurance provider, the Department of Home Affairs (Home Affairs) has not obtained appropriate assurance over the work of the 13 contracted general service providers. Key factors that led to this result include:
- the contractual performance framework was diminished after the quality assurance provider was selected and the contract signed, and key performance indicators have not been met notwithstanding that the targeted quantity of work has been reduced over time by the department;
- the approach to planning quality assurance work is not risk based; and
- Home Affairs has significantly changed the nature of services provided away from quality assurance over the work of the general service providers. For 2023–24, the department decided that 15 per cent of the budget for the provider would be spent on quality assurance work, down from 78 per cent in the first year of the contract.
Supporting findings
Contractual arrangements
13. The service provider contracts, and associated Service Provider Instructions, clearly outlined the contracted deliverables. (See paragraphs 2.2 to 2.7)
14. Significant variations have been made to each of the contracts since they were signed such that the terms and conditions are now different in important respects from the procurement opportunity that was presented to the market. The department has not kept adequate records of the contract variations that have occurred. Home Affairs’ records of decisions to vary the contracts do not adequately address value for money considerations and therefore do not demonstrate that each of the variations has been appropriate. (See paragraphs 2.8 to 2.30)
15. Contracted advisers have not been engaged through appropriate procurement processes. In addition to engaging advisers, the contracts with service providers were amended to enable the department to engage them to deliver ‘innovative projects’ that are additional to the services they were contracted to deliver at the conclusion of the 2017 procurement process. (See paragraphs 2.33 to 2.42)
16. There is no probity plan for the management of the AMEP contracts. As a result, there are no conflict of interest declaration or other probity risk management requirements in place for the AMEP contracts. (See paragraphs 2.47 to 2.49)
17. Appropriate transition management plans are not in place. The contracts were due to end on 30 June 2023 and did not include any extension options; yet, due to delays with the procurement process to replace them, they are continuing to operate past the stated completion date. As of December 2023, the department had not finalised and approved a transition out plan for the existing contracts, notwithstanding that those contracts were originally due to expire in June 2023 (they are now due to expire on 30 June 2024, with work orders under the contracts due to expire on 31 December 2024). Further, the draft transition in plan is substantively incomplete, reflecting the uncertainty about the future contractual arrangements (the tender process for replacement contracts has been subject to delays). (See paragraphs 2.53 to 2.63)
Service provider contract management
18. The introduction of an information technology system to support the department’s oversight of contractor service delivery has not proceeded. The department’s continued use of a system that was not replaced as planned has not provided a sound basis for monitoring service provider performance or for supporting the payment of invoiced amounts. The failure to introduce the planned new system also required the department to make additional payments to the service providers to recognise the additional administrative burden placed on them, and has meant that one of the four key performance indicators for the general service provider contracts has not been applied. (See paragraphs 3.2 to 3.9)
19. The contracts, when first signed, established a performance measurement and management framework for the general service providers, focused on four key performance indicators (KPIs). The request for tender that led to the contracts being signed had stated that the four KPIs represented ‘a minimum performance standard that service providers will be expected to meet and it is an expectation of the department that service providers will strive to deliver above these standards’. The department has amended the framework over time such that the suite of four KPIs has not been used to inform contract management:
- none of the four KPIs were applied for the first 12 months of the contract term;
- the KPI relating to data timeliness has not been applied at all;
- the target for the KPI relating to the accuracy of service provider assessments of client learning outcomes was first reduced and later paused in November 2021 for the remaining term of the contracts; and
- the English attainment progress KPI has been removed. (See paragraphs 3.10 to 3.20)
20. The invoices for general service providers have not been appropriately verified. Invoicing and payments to the 13 AMEP general service providers have not consistently adhered to the contracts, with issues identified in a number of areas including the application of goods and services tax, fee indexation and the backdating of fee increases. (See paragraphs 3.24 to 3.28)
21. The Department of Home Affairs does not have appropriate complaint resolution processes for the delivery of services under the Adult Migrant English Program, and has not implemented an ANAO recommendation from a 2000–01 audit report7 that it had agreed to. While the contractual framework includes appropriate arrangements to enable the department to monitor the number and nature of complaints being received by the 13 general service providers, the department has not effectively administered those arrangements. As a result, the department is unable to assure itself that service providers are meeting their obligations for the timely and effective handling of complaints, and the department does not analyse complaints data to identify opportunities to improve service delivery across the program. (See paragraphs 3.29 to 3.34)
Quality assurance services
22. The contracted performance management framework has not been appropriately implemented by Home Affairs. The KPI framework was changed after the completion of the procurement process to select the provider of quality assurance services, and does not reflect the full scope of services expected of the contractor. Further, notwithstanding that the department has reduced over time the amount of quality assurance reviews required to be undertaken8, in only two years has the provider reported undertaking the (reduced) number of client file verifications specified in the annual plan (a shortfall of 27 per cent in the first six years) and has only undertaken the (reduced) number of onsite quality assurance reviews in 2022–23 (a shortfall of 20 per cent in the first six years). (See paragraphs 4.2 to 4.10)
23. Performance of AMEP service providers has not been a direct input into the development of quality assurance work. The department has not consistently implemented a risk-based approach to quality assurance work. The department decided to cease using a risk-based approach in 2020–21 and a proportional approach, based on student populations, was instead implemented from 2021–22. (See paragraphs 4.15 to 4.27)
24. The budgeting for, and tasking of, the quality assurance provider, by Home Affairs, has significantly changed the nature of services provided under this contract. The contracted provider has identified that the changes have redirected services from the intended purpose of the quality assurance role. As a result of the department refocusing the work of the contracted quality assurance provider to the delivery of professional development and development of ‘program delivery documents’, the quality assurance activities planned and delivered have not appropriately monitored the performance of the contracted general service providers. (See paragraphs 4.31 to 4.68)
25. The contractual arrangements in place do not allow an evidence-based assessment of whether the work of the quality assurance provider has improved the performance of the general service providers. (See paragraphs 4.73 to 4.77)
26. The invoices for the AMEP quality assurance provider have not been appropriately verified or paid in accordance with the AMEP quality assurance contract. (See paragraphs 4.78 to 4.79)
Recommendations
Recommendation no. 1
Paragraph 2.18
To meet its record keeping obligations and ensure appropriate performance management of contracts, the Department of Home Affairs develop a complete record of all contract variations, including those variations agreed through correspondence, together with a master version of the contracts that incorporates all variations.
Department of Home Affairs response: Agreed.
Linda Wyse and Associates response: Agreed.
Recommendation no. 2
Paragraph 2.31
When considering potential contract variations for the Adult Migrant English Program, the Department of Home Affairs make a decision-making record that addresses whether the proposed changes represent value for money, including by reference to the value for money assessment that underpinned the procurement decision-making prior to the contract being awarded.
Department of Home Affairs response: Agreed.
Linda Wyse and Associates response: Agreed.
Recommendation no. 3
Paragraph 2.43
The Department of Home Affairs introduce stronger governance arrangements over the process by which it engages service providers under the Adult Migrant English Program to identify areas that could benefit from adaptation of new ideas and innovative service delivery to enhance client outcomes, including opportunities to offer this work to open competition.
Department of Home Affairs response: Agreed.
Linda Wyse and Associates response: Agreed.
Recommendation no. 4
Paragraph 2.50
The Department of Home Affairs develop a probity plan to govern the management of contracts for the Adult Migrant English Program.
Department of Home Affairs response: Agreed.
Linda Wyse and Associates response: Agreed.
Recommendation no. 5
Paragraph 2.64
The Department of Home Affairs improve its transition planning for the Adult Migrant English Program by:
- finalising the transition out plan for the current contracts and, for future contracts, preparing the transition out plan early in the new contract period; and
- aligning the development of the transition in plan for the replacement contracts with the preparation of the approach to market documentation.
Department of Home Affairs response: Agreed.
Linda Wyse and Associates response: Agreed.
Recommendation no. 6
Paragraph 3.21
The Department of Home Affairs establish a comprehensive suite of performance indicators and targets in the service provider contracts for the Adult Migrant English Program, require that service providers report performance against the indicators and targets and take appropriate contract management action where performance is below requirements.
Department of Home Affairs response: Agreed.
Linda Wyse and Associates response: Agreed.
Recommendation no. 7
Paragraph 3.35
The Department of Home Affairs analyse and review complaints data from the general service providers for the Adult Migrant English Program to inform and improve service delivery to students.
Department of Home Affairs response: Agreed.
Linda Wyse and Associates response: Agreed.
Recommendation no. 8
Paragraph 4.11
The Department of Home Affairs strengthen the contractual performance management framework for the provision of quality assurance services for the Adult Migrant English Program.
Department of Home Affairs response: Agreed.
Linda Wyse and Associates response: Agreed.
Recommendation no. 9
Paragraph 4.28
The Department of Home Affairs undertake a systematic, documented, evidence-based approach to determining and targeting quality assurance activities based on general service provider performance and other risk information known to the department.
Department of Home Affairs response: Agreed.
Linda Wyse and Associates response: Agreed.
Recommendation no. 10
Paragraph 4.69
The Department of Home Affairs give greater emphasis to monitoring the quality of services being delivered to students by the contracted general service providers.
Department of Home Affairs response: Agreed.
Linda Wyse and Associates response: Agreed.
Summary of entity response
27. The proposed audit report was provided to Home Affairs and Linda Wyse and Associates (LWA), the quality assurance provider for the AMEP. The letters of response are included in Appendix 1. Summary responses from Home Affairs and LWA are reproduced below.
Department of Home Affairs
The department has agreed to the recommendations made by the ANAO. While acknowledging room for improvement, the Department does not consider the ANAO’s findings, listed below, reflect and recognise the environment in which the AMEP Agreements were being delivered:
- The design and administration of the current Agreements have not been effective, and
- The appropriate contractual arrangements are not in place.a
The AMEP has successfully delivered English language tuition to eligible migrants and humanitarian entrants, including during a period of unprecedented disruption due to the impact of COVID-19.b
Since implementation of Administrative Orders, transferring the administration of the AMEP to the department in July 2019, the department has sought to strengthen processes, procedures and the technology that support the management of the Agreements.c Several recommendations made have previously been identified by the department as opportunities for improvement in the design of the future contract/s.d The procurement process for this future contract cycle has been delayed due to the change of Government and subsequent program setting reviews. Through the future contract/s, the department will implement an enhanced performance management framework, including key performance indicators supporting the strategic intent of the AMEP and effective performance management, and deliver a new IT system supporting the administration of the contract/s.
ANAO comments on Department of Home Affairs summary response
28. ANAO comments regarding Home Affairs’ summary response are included below, with rejoinders to the letter of response included within Appendix 1.
- An important consideration in the ANAO’s conclusions that the design and administration of the contracts has not been effective relates to the Key Performance Indicators (KPIs) for service providers. Of the four KPIs included in the contracts that commenced in July 2017 to establish ‘a minimum performance standard’ that service providers were expected to meet, three are no longer in place, including the KPI relating to the desired program outcome of progressive English attainment by students (see paragraphs 3.11 to 3.20).
- The impact of the COVID-19 pandemic on the program, and/or how the impact was addressed or not addressed by the department, is discussed throughout the audit report (paragraphs 1.4, 2.7, 2.26 and 2.27, 3.15, 4.25, 4.31, 4.36, 4.57, 4.60 to 4.68 and Table 3.1).
- Administration of the AMEP contracts transferred into the Department of Home Affairs nearly five years ago, in July 2019. Relevant records and some key staff moved with the program to the Department of Home Affairs (see footnote 28). Consistent with the ‘Collaborative Agreement’ signed by the two departments in October 2019, changes made since 2019 by the Department of Employment and Workplace Relations to the parts of the head contracts that apply to both the AMEP and the Skills for Education and Employment (SEE) program have occurred through engagement with, and input from, the Department of Home Affairs. Home Affairs agreed that the department responsible for the SEE program would be the lead agency, and that any variations to the general clauses in the contract must be agreed by both departments.
- The ANAO notes the scope of its work has been on the current contracts and as such the findings relate to the administration of the current contracts. The ANAO has not audited the design of future contract/s, for which no request for tender has yet been issued.
Linda Wyse and Associates
LWA welcomes the report and its recognition that the Adult Migrant English Program (AMEP) contract faced many challenges in its tenure. The disruption caused by COVID-19 cannot be underestimated; it was the impetus for the change in work carried out by LWA to support AMEP providers facing the challenges of moving from face-to-face classes to online delivery for a diverse cohort of clients, who personified the digital divide.
LWA agrees in full with the 10 recommendations given in the report, recognising the importance of assuring quality and ensuring documentation accurately captures quality assurance activities. We feel strongly that monitoring the quality of services being delivered to the student is an important duty of the quality assurance provider and welcome the ANAO recommendation to give more emphasis to this activity.
LWA is committed to working with the Department to implement the recommendations and is initiating steps, as noted against the relevant recommendation, to address the areas identified for improvement.
Key messages from this audit for all Australian Government entities
29. Below is a summary of key messages, including instances of good practice, which have been identified in this audit and may be relevant for the operations of other Australian Government entities.