The objective of this audit was to examine the extent to which the Department of Infrastructure and Regional Development, now the Department of Home Affairs (the Department), has implemented the recommendations made by the ANAO in Audit Report No.5 2016–17, Passenger Security Screening at Domestic Airports.

Summary and recommendations

Background

1. The aim of domestic passenger screening is to prevent prohibited items, such as weapons and explosives, from being carried onto aircraft. Specialised equipment and screening personnel are used to detect and control prohibited items at 62 security controlled airports across Australia.1 Since October 2012, the number of domestic aircraft passengers has trended upwards, with 62.13 million domestic passengers carried on 639 400 regular public transport and charter aircraft trips during the year ending October 2017.2

2. On 31 August 2016, the Australian National Audit Office (ANAO) tabled Audit Report No.5 2016–17, Passenger Security Screening at Domestic Airports, in the Parliament. In that report, the ANAO found that the Department (then the Department of Infrastructure and Regional Development) was unable to provide assurance that passenger screening was effective, or to what extent screening authorities had complied with the Regulations, due to poor data and inadequate records.3 The ANAO also found that the Department did not have meaningful passenger screening performance targets or enforcement strategies and did not direct resources to areas with a higher risk of non-compliance.4 The ANAO made five recommendations aimed at improving the Department’s regulatory performance.5 The Department accepted all five recommendations and advised the ANAO that:

The Department notes that following significant review work in 2015 it is investing in the broad reform of its transport security regulatory operations. This is to ensure that the Office of Transport Security’s regulatory activities are well positioned to respond to changing threats and risks, future industry growth and diversification, and that its approvals and compliance operations are efficient. This reform program comprises three elements: redesign of the transport security operating model, the establishment of a competency-based learning and development framework and improvements to the regulatory management system (RMS).6

Audit objective and criteria

3. The objective of this audit was to examine the extent to which the Department of Infrastructure and Regional Development, now the Department of Home Affairs (the Department), has implemented the recommendations from ANAO Report No.5 2016–17, Passenger Security Screening at Domestic Airports.

4. To form a conclusion against the audit objective, the ANAO adopted the following high-level criteria:

  • To what extent has the Department implemented an effective compliance monitoring program?
  • To what extent has the Department implemented an appropriate learning and development framework?
  • To what extent has the Department implemented effective performance monitoring and reporting arrangements?

Rationale

5. Due to the significance of the findings from the previous audit, the associated recommendations, and the Department’s advice that a number of initiatives to address the identified shortcomings were already underway, it was expected that the Department would act quickly to remediate the issues identified. Timely implementation is necessary if the audited entity is to achieve full value from the agreed recommendations. On this basis, the ANAO decided to commence a follow-up audit in the 2017–18 financial year to determine whether the Department is implementing the recommendations from the previous audit in a timely manner, and whether it is now better placed to provide assurance of the effectiveness of passenger screening.

Conclusion

6. As at March 2018, the Department has implemented one and partially implemented four of the five recommendations made in ANAO Audit Report No.5 2016–17, Passenger Security Screening at Domestic Airports (see Table S.1 below). Consequently, while the Department has made progress, it is not yet well placed to provide assurance that passenger screening is effective and that screening authorities comply with the Regulations.

Table S.1: Assessment of the extent to which the recommendations made in ANAO Audit Report No.5 2016–17 have been implemented

| Recommendation | Status |
|---|---|
| ANAO Recommendation No.1: Set a date at which the grandfathering provisions for passenger screening equipment requirements will cease, and amend the Aviation Screening Notices accordingly. | Implemented |
| ANAO Recommendation No.2: (a) establish an analysis function to identify non-compliance trends based on accurate, reliable compliance activity data; and (b) incorporate the results of the analysis into the compliance program, focusing on areas at risk of non-compliance. | Partially implemented (a) |
| ANAO Recommendation No.3: Develop performance measures for passenger screening that are practical, achievable and measurable. | Partially implemented (b) |
| ANAO Recommendation No.4: Conduct a training needs analysis for users of the regulatory management system, deliver appropriate training, and monitor its effectiveness. | Partially implemented (c) |
| ANAO Recommendation No.5: Provide targeted reporting to its stakeholders, based on accurate data, which enables an assessment of the effectiveness of passenger screening, and promotes improved passenger screening effectiveness. | Partially implemented |

Note a: Recommendation No.2 will be implemented through a broader OTS data strategy.

Note b: Recommendation No.3 will be implemented as part of a broader Enhanced Mandatory Reporting Project.

Note c: The Department decided to broaden the application of the training needs analysis and associated learning and development framework to all staff within the Transport Security Operations branch.

Source: ANAO analysis of Departmental documentation.

7. The extent to which the Department has implemented an effective compliance monitoring program has been constrained by the quality of data captured in the Regulatory Management System. Consequently, the ability of the Department to conduct meaningful analysis of compliance activity data and identify non-compliance trends in passenger screening is limited. While the Department has developed a data analysis function to work around the limitations of the Regulatory Management System, its initial analysis was not used to inform planning. The Department has further work to do to be able to identify non-compliance trends and incorporate the results of the analysis into the annual compliance program as recommended in the previous audit.

8. The Department has developed and approved a learning and development framework. However, the plan to implement the framework has not yet been approved, and a key element yet to be finalised is the approach to monitoring and evaluating the framework. Delivery of appropriate training was delayed by the Department’s decision to broaden the application of the training needs analysis beyond the initial Regulatory Management System focused training recommended by the ANAO (Recommendation No.4) in the previous audit. While steps were taken to address short-term training needs, the first training courses outlined in the learning and development framework commenced in February 2018.

9. The Department has developed, but not yet implemented, performance monitoring arrangements despite numerous reports,7 including ANAO Audit Report No.26 2002–03, Aviation Security in Australia, recommending that performance measures be implemented. Most recently, the timely implementation of these performance measures has been affected by the Department’s decision to implement the measures in July 2018 as part of the broader Enhanced Mandatory Reporting Project. While the Department has made progress in improving the reporting provided to its stakeholders, its ability to accurately assess the effectiveness of passenger screening is limited by the quality of the data captured in the Regulatory Management System, the lack of an associated reporting function, and the fact that performance measures have not yet been implemented.

Supporting findings

Compliance monitoring

10. The Department has established a data analysis function, as recommended by the ANAO (Recommendation No.2a) in the previous report and delivered its initial (and only) analysis in July 2017. The analysis provided a breakdown of compliance activities conducted for each category of airport and aircraft operator, the number of findings by type for each state, and the number of findings per security mitigation category. However, the analysis could not be used to identify trends in non-compliance and issues in relation to the quality of compliance activity data remain.

11. The results of the July 2017 analysis were not incorporated into the 2017–18 National Compliance Plan as intended by the ANAO (Recommendation No. 2b). Instead, the plan was informed by separate compliance activity data extracted from the Regulatory Management System and the Microsoft Excel regional office work plans. The analysis used to inform the plan did not include a comprehensive analysis of compliance outcomes or identify areas at risk of non-compliance.

12. The Department directs compliance resources to categories of airports and airline operators that present a higher security risk. The Department has not yet established an effective approach to direct compliance resources within categories. As a result, compliance resources are not targeted towards individual airports and airline operators where the risk of non-compliance is higher or additional support is needed to comply with regulatory requirements.

13. The Department has developed a compliance and enforcement framework, which identifies the need for clear guidance outlining the various enforcement options and the escalation process to assist in managing non-compliance. However, the framework has not yet been implemented.

Learning and Development

14. An analysis of training needs was completed in February 2017. It took longer to complete than expected because the Department decided to conduct a broader analysis than had been recommended by the ANAO (Recommendation No.4) in the previous audit report.

15. The Department has developed a learning and development framework based on the training needs analysis. However, one element of the framework remains incomplete—the framework does not include a monitoring and evaluation component.

16. Delivery of the training identified in the learning and development framework commenced in late February 2018. Prior to the formal commencement of the training modules identified in the learning and development framework, short-term training needs were met through the development of quick reference guides and the provision of procedural fairness and administrative decision making training.

Performance Monitoring and Reporting

17. The Department has developed performance measures in consultation with stakeholders informed by the performance measures outlined in the 2009 Aviation Security Screening Review. However, implementation has been delayed due to the Department’s decision to incorporate implementation into the Enhanced Mandatory Reporting Project. As a result, the extent to which the measures are practical, achievable and measurable cannot be currently assessed.

18. The Department provides a range of activity and information reports to its various stakeholders. While this information is useful and helps some stakeholders assess specific elements of their screening operations, it does not enable the Department to assess the effectiveness of passenger screening or the extent to which the compliance monitoring program promotes improved passenger screening effectiveness.

Recommendations

Recommendation no.1

Paragraph 2.12

In implementing Recommendation No.2 from the previous audit, the Department should ensure that its approach delivers a meaningful analysis of passenger screening compliance activities and outcomes. The analysis should: be capable of accurately identifying non-compliance trends; generate results that are used to inform the development of the risk and compliance prioritisation ratings; and be incorporated into subsequent compliance monitoring programs.

Department of Home Affairs response: Agreed.

Recommendation no.2

Paragraph 3.9

In implementing Recommendation No.4 from the previous audit, the Department should develop and implement a monitoring and evaluation strategy, so it can assess to what extent the objectives of the learning and development framework are being met.

Department of Home Affairs response: Agreed.

Recommendation no.3

Paragraph 4.12

In implementing Recommendation No.3 from the previous audit, the Department should ensure that performance measures are established in a timely manner alongside an effective monitoring and review mechanism to provide assurance that the performance measures developed for passenger screening are practical, achievable and measurable.

Department of Home Affairs response: Agreed.

Summary of entity response

19. The Department of Home Affairs was provided with a copy of the proposed audit report for comment. A summary of the response received from the Department is provided below. The full response provided by the Department of Home Affairs is at Appendix 1.

Department of Home Affairs

The Department agrees with all recommendations in the audit.

The Department is continuing to work on the implementation of the recommendations from the original audit of passenger security screening at domestic airports and has made significant progress. The Department is strengthening ICT systems and staff skills; improving compliance planning, data quality and analytical capabilities; and working with industry to share key data.

Key learnings for all Australian Government entities

Below is a summary of key learnings and areas for improvement identified in this audit report that may be considered by other Commonwealth entities when administering regulatory programs and implementing recommendations.

Governance and risk management

  • To provide assurance that risks are being effectively managed, entities should not depend on qualitative assessments alone. To assess the effectiveness of critical risk controls, qualitative assessments should be supported by quantitative data and incorporated into a feedback mechanism capable of identifying whether the approach to delivering government policy is appropriate and supports achievement of the stated policy objective.

Implementation of recommendations

  • Entities should ensure that agreed timeframes to implement recommendations are determined in accordance with the assessed risk. Clearly defining roles and responsibilities, priority, resourcing and desired outcomes, and including appropriate monitoring and reporting arrangements, will lead to better results.

Performance and impact measurement

  • Entities should ensure that performance measures are included when designing an approach to deliver government policy. Performance measures should be implemented in a timely manner, and should enable the entity to provide assurance to stakeholders that the outcomes of the program support delivery of the policy objectives.

1. Background

Screening of domestic aircraft passengers

1.1 Everyone entering the secure area of an Australian airport terminal, and their carry-on baggage, is screened to ensure the safety and security of all travellers. Screening utilises a combination of specialised equipment—walk-through metal detectors, x-ray observation, and explosive trace detectors—as well as the judgement of screening personnel to prevent prohibited items, such as weapons and explosives, from being carried onto aircraft.8 Across Australia, there are 62 security controlled airports9 where screening is undertaken to mitigate the risk of: the unlawful seizure of an aircraft; the intrusion on board an aircraft of a weapon or other material capable of threatening the integrity of the airframe; and the use of the aircraft as a weapon.

1.2 The Office of Transport Security (OTS), within the Department of Home Affairs (the Department),10 is responsible for ensuring that passenger screening is effective. Powers to conduct screening are provided under the Aviation Transport Security Act 2004 (the Act) and the Aviation Transport Security Regulations 2005 (the Regulations).

1.3 As well as being effective, so as to meet legislative and regulatory requirements, screening needs to be conducted efficiently to facilitate air travel. Since October 2012, the number of domestic aircraft passengers has trended upwards, with 62.13 million domestic passengers carried on 639 400 regular public transport and charter aircraft trips during the year ending October 2017.11

Implementation of increased security measures

1.4 On 29 July 2017, counter-terrorism raids were conducted in Sydney in response to a plan to attack an aircraft using an improvised device. Further to the raids, on 30 July 2017, the Government announced that increased security measures had been put in place at Sydney, Melbourne, Brisbane, Darwin, Perth, Adelaide, Canberra, Cairns, Gold Coast, and Hobart airports to:

complement security arrangements already in place and applied as an extra precaution, in coordination with counter terrorism raids in Sydney conducted on 29 July 2017.12

1.5 Subsequently, the Inspector of Transport Security was tasked with conducting a review of aviation security at regulated airports. The review was instigated on 8 August 2017 to ensure that security measures remain appropriate and are adapting to new and emerging issues. The report was delivered on 24 November 2017.

Monitoring screening authorities’ compliance

1.6 In most instances, an airport or airline will act as the authorised screening authority, and may sub-contract day-to-day screening operations to a specialist screening provider. Airport and aircraft operators that undertake screening must be authorised by the Department and are responsible for:

  • meeting the minimum legislated security requirements outlined in the Act and the Regulations;
  • delivering security on a day-to-day basis; and
  • operating in accordance with the relevant Aviation Screening Notice.13

1.7 The Department is responsible for establishing an effective compliance monitoring program to: provide assurance that industry participants comply with regulatory requirements; and accurately assess the effectiveness of passenger screening. The National Compliance Plan (NCP) developed by the Department documents the number and type of compliance monitoring activities the Department plans to conduct on an annual basis. The plan outlines the allocation of Departmental resources to ‘core’14 and ‘non-core’15 compliance monitoring activities.16 Table 1.1 outlines the types of ‘core’ and ‘non-core’ compliance monitoring activities conducted by the Department.

Table 1.1: Types of compliance monitoring activities

| Type | Description |
|---|---|
| Core activities | |
| Audits | Audits are used to determine the extent to which an industry participant is compliant with the Act and the Regulations. They are defined as an ‘in depth compliance examination of all aspects of the implementation of the national civil aviation security program’ (a) and are intended to provide assurance that the Transport Security Program (b) addresses regulatory requirements and that the associated measures outlined in the program have been met. |
| Inspections | Inspections focus on a specific area or areas of an industry participant’s security regime and allow for an in-depth investigation of that element. They are designed to critically examine a process, procedure, or aspect of an industry participant’s business. |
| Systems tests | A systems test assesses the effectiveness of a specific security element, such as a screening or access procedure. Systems tests are used to determine whether the operations of industry participants adhere to the relevant regulatory requirements. |
| Non-core activities | |
| Campaigns | Campaigns are short-term activities to improve industry participants’ understanding of current threats and risks, and to provide information and guidance. |
| Response activities | The NCP includes an allowance for response activities, which introduces flexibility to undertake additional activities as needed. (c) Examples of response activities conducted in 2017 include trials at Category 1 airports to support the re-introduction of the Glock 17P systems test piece, and monitoring of industry’s compliance with the enhanced aviation screening measures following a security incident. |

Note a: The term ‘audit’ is defined by the Department as per Annex 17 to the Convention on International Civil Aviation, issued by the International Civil Aviation Organisation (ICAO).

Note b: A Transport Security Program sets out the physical places within an airport that are subject to regulation, and how the airport operator will manage security for its operations, including for passenger screening. Adherence to the Transport Security Program is monitored through compliance activities.

Note c: This can include additional targeted activities on industry participants (for example, where a security failure has been identified).

Source: ANAO analysis of Departmental documentation.

1.8 Since the previous audit was tabled, the Department has adjusted the allocation of resources to ‘core’ compliance activities to increase the number of inspections and systems tests and reduce the number of audits conducted (see Table 1.2). The Department has also revised the systems test regime. In January 2017, the Department released a systems test strategy (co-designed with industry) which introduced a tiered approach to systems testing, re-introduced the Glock 17P test piece, and introduced a range of new test pieces designed to test the ability of the system to detect contemporary threats. In January 2018, the Department adjusted the NCP to increase the usage rate of the new test pieces.

Table 1.2: Allocation of resources to core compliance activities (2015–16 to 2017–18)

| Compliance activity | 2015–16 (Actual) | 2016–17 (Actual) | 2017–18 (Planned) | 2017–18 (Actual) (a) |
|---|---|---|---|---|
| Audit | 63 | 68 | 52 | 26 (50 per cent) |
| Inspection | 141 | 190 | 374 | 214 (57 per cent) |
| Systems test | 349 | 310 | 738 | 386 (52 per cent) |
| Total activities | 553 | 568 | 1164 | 626 (54 per cent) |

Note a: As at January 2018.

Source: ANAO analysis of Departmental documentation.

The ANAO’s previous audit

1.9 In August 2016, the Australian National Audit Office (ANAO) tabled Audit Report No.5 2016–17, Passenger Security Screening at Domestic Airports, in the Parliament. In that audit, the ANAO found that the Department (then the Department of Infrastructure and Regional Development) was unable to provide assurance that passenger screening was effective, or to what extent screening authorities had complied with the Regulations, due to poor data and inadequate records. Further, the ANAO found that the Department did not have meaningful passenger screening performance targets or enforcement strategies and did not direct resources to areas with a higher risk of non-compliance.17

1.10 The ANAO made five recommendations (see Box 1) aimed at improving the Department’s regulatory performance.18 The Department agreed to implement all five recommendations.

Recommendations of the previous ANAO audit

ANAO Recommendation No.1

That, independently of other projects being conducted, the Department sets a date at which the grandfathering provisions for the 2011 passenger screening equipment requirements will cease, and amends the Aviation Screening Notices accordingly.

ANAO Recommendation No.2

That the Department:

(a) establishes an analysis function to identify non-compliance trends based on accurate, reliable compliance activity data; and

(b) incorporates the results of the analysis into the compliance program, focusing on areas at risk of non-compliance.

ANAO Recommendation No.3

That the Department, in consultation with stakeholders, develops performance measures for passenger screening that are practical, achievable and measurable.

ANAO Recommendation No.4

That the Department conducts a training needs analysis for users of the regulatory management system, delivers appropriate training, and monitors its effectiveness.

ANAO Recommendation No.5

That the Department provides targeted reporting to its stakeholders, based on accurate data, which enables an assessment of the effectiveness of passenger screening, and promotes improved passenger screening effectiveness.

1.11 In its response, the Department advised19 the ANAO that:

The Department notes that following significant review work in 2015 it is investing in the broad reform of its transport security regulatory operations[…] This reform program comprises three elements: broad reform of Departmental transport security regulatory operations to ensure the Department is well positioned to respond to changing threats and risks; improving the Department’s collection and analysis of data pertaining to passenger screening, including incorporating non-compliance risk into its planning; and establishing a working group to develop a framework to measure the effectiveness and extent that screening authorities are complying with passenger screening regulations.

Progress in implementing the ANAO’s recommendations

1.12 Implementation of the ANAO’s recommendations has been monitored by the Department’s Audit Committee. The operational area responsible for implementing each recommendation established the target date to implement each of the five ANAO recommendations. The Department’s advice to its Audit Committee on the status of each recommendation is outlined in Table 1.3.

Table 1.3: Target date for implementation and the status of each recommendation

| Recommendation no. (summary) | Target date for implementation (months) (a) | Status |
|---|---|---|
| 1. (cease grandfathering provisions) | 1 July 2018 (24 months) | Implemented. The Department amended the Aviation Screening Notice on 23 June 2017, setting the date at which the grandfathering arrangements were to cease as 1 July 2018. Subsequently, the Department’s Audit Committee agreed to close the recommendation on 10 August 2017. |
| 2. (establish capability to identify and incorporate non-compliance trends into compliance programs) | 1 July 2017 (12 months) | In progress. The Department’s Audit Committee agreed to close the recommendation on 10 August 2017. In late February 2018, the Department sought the agreement of the Audit Committee to have the recommendation re-opened. |
| 3. (develop performance measures for passenger screening) | 1 July 2018 (24 months) | In progress. The Audit Committee was advised on 8 November 2017 that the Department was ‘on-target’ to implement by 1 July 2018. |
| 4. (train officers to use the regulatory management system) | 31 December 2016 (six months) | In progress. The Audit Committee was advised on 8 November 2017 that implementation had been delayed by 18 months and provided a revised date of 30 June 2018. |
| 5. (report on the effectiveness of passenger screening) | 1 July 2018 (24 months) | In progress. The Audit Committee was advised on 8 November 2017 that the Department was ‘on-target’ to implement by 1 July 2018. |

Note a: Calculated from 12 July 2016, the date of the Department’s response to the previous ANAO audit.

Source: ANAO analysis of Departmental documentation.

Recent Departmental and external reviews

1.13 Subsequent to the ANAO’s previous audit, there have been a further three external reviews into Australia’s aviation security regime (see Table 1.4).

Table 1.4: Aviation Security Reviews

| Year | Review conducted by | Description |
|---|---|---|
| 2016 | International Civil Aviation Organisation | The audit resulted in 22 recommendations aimed at improving Australia’s aviation security. |
| 2017 | Rural and Regional Affairs and Transport References Committee | Inquiry and report on apparent breaches in airport and aviation security; responses to the reports; and consideration of the appropriateness of airport security settings. The Committee made nine recommendations in its report—Airport and Aviation Security. |
| 2017 | Inspector of Transport Security | Review of security at Australian security regulated airports to ensure that security measures remain appropriate, and are adapting to new and emerging issues. |

Source: ANAO analysis of Departmental documentation.

Audit approach

1.14 The objective of this audit was to examine the extent to which the Department of Infrastructure and Regional Development, now the Department of Home Affairs (the Department), has implemented the five recommendations from ANAO Report No.5 2016–17, Passenger Security Screening at Domestic Airports.

1.15 To form a conclusion against the audit objective, the ANAO adopted the following high-level criteria:

  • To what extent has the Department implemented an effective compliance monitoring program?
  • To what extent has the Department implemented an appropriate learning and development framework?
  • To what extent has the Department implemented effective performance monitoring and reporting arrangements?

1.16 The audit team has: reviewed, examined and analysed relevant documentation held by the Department; interviewed relevant Departmental staff and internal stakeholders; sought input and commentary from authorised screening authorities in relation to the Department’s compliance planning, monitoring, and enforcement activities, focusing on the changes made since the previous audit was tabled; observed screening procedures at Gold Coast, Brisbane, Melbourne and Avalon airports; and examined the results from systems testing undertaken by the Department.

1.17 The audit scope has not included the examination of air cargo, checked baggage or the screening of international passengers.

1.18 The audit was conducted in accordance with ANAO Auditing Standards at a cost to the ANAO of approximately $297 350.

1.19 Team members for this audit were Joyce Knight, Michael Commens and Sally Ramsey.

2. Compliance monitoring

Areas examined

This chapter examines the extent to which the Department of Home Affairs (the Department) has implemented an effective compliance monitoring program including: establishing a data analysis function to identify trends in non-compliance; using the results from the analysis of compliance activities to inform future compliance plans; directing resources to areas of higher risk; and accurately identifying and managing non-compliance (ANAO Recommendation Nos.1 and 2).

Conclusion

The extent to which the Department has implemented an effective compliance monitoring program has been constrained by the quality of data captured in the Regulatory Management System. Consequently, the ability of the Department to conduct meaningful analysis of compliance activity data and identify non-compliance trends in passenger screening is limited. While the Department has developed a data analysis function to work around the limitations of the Regulatory Management System, its initial analysis was not used to inform planning. The Department has further work to do to be able to identify non-compliance trends and incorporate the results of the analysis into the annual compliance program as recommended in the previous audit.

Recommendation

The ANAO has recommended that, in implementing Recommendation No.2 from the previous audit, the Department ensure its approach is capable of delivering a meaningful analysis of passenger screening compliance activities and outcomes.

Has the Department established a data analysis function that can identify trends in non-compliance based on accurate, reliable compliance activity data?

The Department has established a data analysis function which delivered its initial (and only) analysis in July 2017. The analysis provided a breakdown of compliance activities conducted for each category of airport and aircraft operator, the number of findings by type for each state, and the number of findings per security mitigation category. However, the analysis could not be used to identify trends in non-compliance, and issues in relation to the quality of compliance activity data remain.

2.1 ANAO Audit Report No.5 2016–17 found that the Department had limited information about the compliance history of individual industry participants and lacked reliable data and analysis to identify systemic issues, or compare performance across industry participants.20 To improve the Department’s ability to target compliance activities at areas where there was a higher risk of non-compliance, the previous audit recommended (ANAO Recommendation No. 2a) that the Department establish an analysis function capable of identifying trends in non-compliance based on accurate, reliable compliance activity data.21

2.2 The data analysis function that was subsequently established by the Department has produced a single analysis of compliance activity data based on data extracted from the Regulatory Management System (RMS)22 for the period July 2015 to December 2016. The analysis was delivered in July 2017 and provided a breakdown of compliance activities conducted for each category of airport23 and aircraft operator24, the number of findings by type25 for each state, and the number of findings per security mitigation category.26 The analysis did not identify trends in non-compliance or identify industry participants that required a higher level of support to comply with the regulated requirements (such as those providers discussed in paragraph 2.9).

2.3 The ANAO analysed compliance activity data extracted from the RMS between July 2016 and October 2017 and found that while the quality and consistency of the data is improving, inconsistent and incomplete data is still being entered into the system. Further, the Department’s ability to retrieve data recording the outcome of compliance activities from the RMS is limited, as the data is stored in free text fields which do not facilitate easy retrieval for reporting and analysis.
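The retrieval problem described above can be illustrated with a short sketch; the record layouts, field names and outcome labels below are hypothetical, not drawn from the RMS.

```python
import re
from collections import Counter

# Hypothetical records in the style described above: the outcome of each
# compliance activity is buried in a free text field, so aggregation
# depends on brittle pattern matching.
free_text_records = [
    "Inspection 14/03 - screening point 2, issued non-compliance notice (ETD logs)",
    "Site visit: no issues observed",
    "NCN raised re walk-through metal detector daily testing",
]

def count_non_compliance(records: list[str]) -> int:
    # Misses records that phrase the same outcome differently ("NCN raised").
    return sum(bool(re.search(r"non-compliance", r, re.IGNORECASE)) for r in records)

# The same outcomes captured in structured fields support direct, reliable
# aggregation for reporting and trend analysis.
structured_records = [
    {"activity": "inspection", "outcome": "non_compliance_notice", "category": "ETD"},
    {"activity": "site_visit", "outcome": "no_finding", "category": None},
    {"activity": "inspection", "outcome": "non_compliance_notice", "category": "WTMD"},
]

outcomes = Counter(r["outcome"] for r in structured_records)
```

In this sketch the free text search undercounts non-compliance (it finds only one of the two notices), while the structured records capture both; this is the kind of limitation that structured outcome fields in a data product suite would address.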

2.4 As part of its 2017–20 strategic plan, the Office of Transport Security identified a number of priorities, including Strategic Priority No.1: Improve our data and information collection, use and management.27 In line with this strategic priority, the Department has acknowledged the limitations on its ability to obtain meaningful, comprehensive reporting and analysis from its systems, and on 10 November 2017 approval was given to develop a data product suite to meet business needs. The business case outlining the rationale for the development of the data product suite identified a number of limitations with the current systems, including:

  • the number of disparate business systems used to store data;
  • the quality and continuity of data sets;
  • limited data extraction and reporting capabilities; and
  • different data structures and definitions used across the various business systems in use.

When approving the business case, the Department noted that it will adopt the data and information framework applied in the Department of Home Affairs to develop the data product suite. However, as at March 2018, the Department had not yet developed a plan outlining how it intends to develop the data product suite and address the limitations of its systems, in order to provide meaningful analysis of compliance activity and outcomes and to facilitate comprehensive reporting.

Are the results from previous compliance activities being incorporated into the compliance monitoring program?

The results of the July 2017 analysis were not incorporated into the 2017–18 National Compliance Plan as intended by the ANAO (Recommendation No.2b). Instead, the plan was informed by separate compliance activity source data extracted from the Regulatory Management System and the Microsoft Excel regional office work plans. The analysis used to inform the plan did not include a comprehensive analysis of compliance outcomes or identify areas at risk of non-compliance.

2.5 ANAO Audit Report No.5 2016–17 found that, without effective data collection, storage and retrieval to enable analysis of individual industry participant performance, the Department is unable to assess the risk of non-compliance or identify systemic non-compliance and apply resources where the risk is higher.28 It also observed that there was no feedback loop in the compliance cycle to indicate whether compliance is improving or deteriorating and where additional monitoring may be required. Accordingly, the ANAO recommended that the results of previous compliance activities be incorporated into future compliance plans (ANAO Recommendation No. 2b) to focus compliance resources on areas where there is a higher risk of non-compliance.

2.6 The July 2017 analysis (as discussed in paragraph 2.2) was not used by the Department to develop the National Compliance Plan for 2017–18. Instead, separate source data was extracted from the RMS and the Microsoft Excel regional office work plans to track progress against the targets established in the 2016–17 National Compliance Plan. The data included a summary of the compliance activities the Department had conducted, the number of findings per security mitigation category and the results of systems tests undertaken across Category 1 to 3 airports.

2.7 As a result, the development of the 2017–18 National Compliance Plan was not informed by an analysis of trends in non-compliance, and did not identify industry participants exposed to a higher risk of non-compliance.29

Does the Department direct compliance resources to areas with a higher risk of non–compliance or that require higher levels of support and/or monitoring?

The Department directs compliance resources to categories of airports and airline operators that present a higher security risk. The Department has not yet established an effective approach to direct compliance resources within categories. As a result, compliance resources are not targeted towards individual airports and airline operators where the risk of non-compliance is higher or additional support is needed to comply with regulatory requirements.

Allocating compliance resources and activities

2.8 The number and type of compliance activities planned for each category of airport and airline operator is based on the risk and compliance prioritisation rating applied. As shown in Figure 2.1 (below), more resources are applied to categories of airports and aircraft operators with higher risk and compliance prioritisation ratings. The Department determines the risk and compliance prioritisation ratings30 by:

  • calculating a risk priority rating for each category of airport and airline operator using risk scenarios and an attack path methodology to assess the likelihood and consequence of a risk event occurring; and
  • calculating an adjustment score using ten adjustment score questions for each category of airport and airline operator which is then applied to the risk rating to determine a compliance prioritisation rating.31

Figure 2.1: Proportion of compliance activities planned per risk rating

 
 
 
A graph showing that over 90 per cent of planned compliance activities are directed towards categories of airports and aircraft operators with a risk rating of ‘Very High’ or ‘High’.

Source: ANAO analysis of Departmental documentation.
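The two-step rating calculation described in paragraph 2.8 can be sketched as follows; the scoring scales, band thresholds and adjustment weighting are illustrative assumptions, not the Department’s actual methodology.

```python
# Illustrative sketch only: scales, thresholds and weights are assumptions,
# not the Department's risk and compliance prioritisation methodology.
RATING_BANDS = [
    (80, "Very High"),
    (60, "High"),
    (40, "Medium"),
    (20, "Low"),
    (0, "Very Low"),
]

def risk_priority_score(likelihood: int, consequence: int) -> int:
    """Combine likelihood and consequence (each assumed 1-10) into a 0-100 risk score."""
    return likelihood * consequence

def compliance_prioritisation(risk_score: int, adjustment_answers: list[bool]) -> str:
    """Apply an adjustment score (assumed here to be one point per 'yes' answer
    to the ten adjustment questions, weighted by two) to the risk score, then
    map the adjusted score to a prioritisation band."""
    adjustment = sum(adjustment_answers)              # 0-10
    adjusted = min(100, risk_score + adjustment * 2)  # cap at 100
    for threshold, band in RATING_BANDS:
        if adjusted >= threshold:
            return band
    return "Very Low"
```

More compliance activities would then be planned for categories landing in the ‘Very High’ and ‘High’ bands, consistent with the allocation shown in Figure 2.1.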

2.9 The process used by the Department to determine the risk priority rating is effective at identifying categories of airports exposed to a higher security risk—for instance, where an attempt to unlawfully interfere with an aircraft is more likely to occur. However, the Department has not yet established an effective approach to identify individual industry participants that present a higher risk of non-compliance and does not direct resources accordingly within categories. For example, the Department does not direct resources to industry participants within a category that present a higher risk of non-compliance due to:

  • a poor record of resolving non-compliance notices;
  • frequent non-compliance notices32 or observations33 of potential security vulnerabilities issued in relation to the same security mitigation category over an extended period of time;
  • the operation of screening equipment that is subject to an extension of the grandfathering arrangements; or
  • a routine need for additional support to comply with regulated requirements.

Case study 1. Ongoing use of grandfathering arrangements for screening equipment

To implement ANAO Recommendation No.1 from the previous audit the Department advised industry participants on 24 June 2016 that their equipment must comply with Aviation Screening Notice (ASN) requirements from:

  • 1 July 2017 for explosive trace detection and walk-through metal detection equipment; and
  • 1 July 2018 for X-ray observation and checked baggage screening equipment.

The ASN was amended on 23 June 2017. The Department also approved six Australian airports to continue to use 24 pieces of explosive trace detection and walk-through metal detection screening equipment past the date specified in the amended ASN. In December 2017, the Department advised the ANAO that, across five Australian airports, 15 pieces of screening equipment remain in use and further extensions to the grandfathering arrangements for two of the airports have been provided, allowing 12 of the 15 pieces of explosive trace detection and walk-through metal detection screening equipment to remain in use until 1 July 2018. The Department advised that the additional extensions were granted as the Government was considering introducing enhanced screening equipment requirements at domestic airports.34

Prior to granting the extensions, the Department undertook an assessment of the requests, including the likely impact on the detection performance of the equipment at the airports that had been deemed ‘high risk’.a The assessment tools viewed by the ANAO identified that in some cases the detection performance of the equipment could not be verified, and specified that use of the equipment would be limited, which was considered a sufficient safeguard. However, the approval documents did not apply any conditions, such as additional systems testing or more frequent compliance monitoring activity. Further, the Department did not request that the screening authority provide any performance information to give assurance that the screening equipment operating under the grandfathering arrangements complied with the agreed restrictions.

Regional Offices are advised when screening equipment operating under extended grandfathering arrangements is in use. However, there is no evidence that this has been taken into account when determining the risk and compliance prioritisation rating assigned, and the Department has not amended the focus, frequency or type of compliance activities planned for the airports where screening equipment operates under extended grandfathering arrangements.

Note a: The ‘high’ risk airports were a Category 1 and 2 airport.

2.10 The process currently used by the Department to determine the compliance rating is a qualitative one, applied to each category of airport and aircraft operator rather than to individual industry participants within the categories. Regional offices are to consider the compliance attitude, history, size and complexity of individual industry participants as part of planning compliance activities. However, there is no evidence that the data compiled by the regional offices is analysed to determine whether compliance is improving or deteriorating, or that it forms part of a feedback loop to inform the national compliance planning process.

2.11 The national compliance planning process could be strengthened considerably by including accurate quantitative data from the analysis of previous compliance activities, to provide more granular information about the compliance history of individual industry participants. By doing so, the Department would be better placed to implement a nuanced approach to its compliance monitoring activities within each category, and to effectively target compliance resources. The difficulties the Department has had in obtaining from its systems the accurate data that would support this type of analysis were examined at paragraphs 2.2 and 2.3.

Recommendation no.1

2.12 In implementing Recommendation No.2 from the previous audit, the Department should ensure that its approach delivers a meaningful analysis of passenger screening compliance activities and outcomes. The analysis should: be capable of accurately identifying non-compliance trends; generate results that are used to inform the development of the risk and compliance prioritisation ratings; and be incorporated into subsequent compliance monitoring programs.

Department of Home Affairs response: Agreed

2.13 Strong progress has been made by the Department to undertake analysis of compliance data to inform both policy settings and compliance activities. An analytical unit has been established to do this.

2.14 The value of compliance findings is well understood and non-compliance trends will be used to inform the development of the Division’s 2018–19 National Compliance Plan.

Are non-compliances accurately identified, monitored and managed in accordance with a clear enforcement framework?

The Department has developed a compliance and enforcement framework, which identifies the need to develop clear guidance outlining the various enforcement options, and the escalation process to assist in the management of non-compliance. However, the framework has not yet been implemented.

2.15 ANAO Audit Report No. 5 2016–17 found that the Department did not have an enforcement policy and did not utilise the full suite of enforcement actions available to it. Paragraph 3.1835 of that report suggested the Department develop a clear enforcement strategy, and communicate its strategy to relevant stakeholders.

2.16 The Department’s strategic plan for 2017–20 broadly outlines its compliance approach in terms of a regulatory pyramid (see Figure 2.2), and the industry version of the National Compliance Plan released in March 2017 informed relevant stakeholders, at a high level, about the Department’s regulatory operations.

Figure 2.2: The Office of Transport Security’s Regulatory Pyramid

 

A diagram that illustrates the Department’s compliance and enforcement approach. At a high level, the diagram illustrates how the compliance attitude of the industry participant is used to determine the type of enforcement action/s that can be utilised by the Department.

 

Source: Departmental documentation.
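A regulatory pyramid of the kind shown in Figure 2.2 can be sketched as a mapping from compliance attitude to progressively stronger enforcement options; the attitude labels and orderings below are assumptions based on the pyramid concept, not the Department’s actual framework.

```python
# Hypothetical escalation ladder: engaged participants attract educative
# responses, while disengaged participants attract stronger sanctions.
ESCALATION = {
    "engaged":      ["education", "observation"],
    "reactive":     ["observation", "non_compliance_notice"],
    "disengaged":   ["non_compliance_notice", "infringement_notice"],
    "recalcitrant": ["infringement_notice", "prosecution"],
}

def enforcement_options(attitude: str) -> list[str]:
    """Return the enforcement options available for a given compliance
    attitude, defaulting to the strongest tier for unknown attitudes."""
    return ESCALATION.get(attitude, ESCALATION["recalcitrant"])
```

Guidance of the kind the implementation plan calls for would specify when a participant moves between tiers, which is the escalation process the previous audit found to be missing.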

2.17 The Department’s strategic plan advises that it intends to build its capacity to undertake enforcement action over the course of the next three years.36 In addition, the Department has developed a compliance and enforcement framework, and an implementation plan. The objectives of the framework include ‘an enforcement capability to address seriously disengaged or non-compliant industry participants and/or breaches’. The implementation plan—which has not yet been approved—outlines how enforcement activities will be managed and the delivery model that will be used. It includes the establishment of an enforcement team, proposes training to improve the enforcement and investigative skills of Transport Security Inspectors (TSIs), and identifies the need to develop clear guidance outlining the application of the various enforcement options available and the escalation process to assist in managing non-compliance, as identified in the previous audit.37

2.18 The previous audit found38 that the majority of enforcement action undertaken by the Department involved issuing observations39 and non-compliance notices.40 Where a non-compliance notice or observation requires the industry participant to submit a corrective action plan, the plan must specify a timeframe for resolution and the actions the industry participant will complete to resolve the non-compliance.41

2.19 Non-compliances and observations are identified through the compliance monitoring activities undertaken. Where non-compliances are identified they are monitored and managed at the regional office level via weekly regional office committee (ROC) meetings where proposed, draft, and issued observations and non-compliance notices are addressed. Regional directors42 also have the power to compel industry participants to provide evidence such as CCTV footage, daily testing logs for screening equipment, training records and other relevant information where an alleged breach of the legislation or regulations is being examined.43

2.20 Since the previous audit was tabled, the Department has sought to improve consistency when issuing an observation or non-compliance notice by providing administrative decision making and procedural fairness training to all TSIs.44 The Department has also started to make greater use of additional enforcement options such as infringement notices.45 However, as identified in the implementation plan, clear guidance outlining the escalation process has yet to be made available to TSIs.

2.21 Interviews with regional TSIs indicated that the process of issuing and managing observation or non-compliance notices was well understood. Notwithstanding this, a clear escalation process for serious or systemic non-compliance, and guidance detailing how the various sanctions are to be applied, are not currently available.

3. Learning and Development

Areas examined

This chapter examines the extent to which the Department of Home Affairs (the Department) has implemented an appropriate learning and development framework including by: conducting a training needs analysis for users of the regulatory management system; delivering and monitoring a learning and development framework; and delivering appropriate training (ANAO Recommendation No.4).

Conclusion

The Department has developed and approved a learning and development framework. However, the plan to implement the framework has not yet been approved, and a key element yet to be finalised is the approach to monitoring and evaluating the framework. Delivery of appropriate training was delayed by the Department’s decision to broaden the application of the training needs analysis beyond the initial Regulatory Management System focused training recommended by the ANAO (Recommendation No.4) in the previous audit. While steps were taken to address short-term training needs, the first training courses outlined in the learning and development framework commenced in February 2018.

Recommendation

The ANAO has made one recommendation aimed at supporting the Department to obtain assurance that the objectives of the learning and development framework are being met.

Has the Department completed a training needs analysis?

An analysis of training needs was completed in February 2017. It took longer to complete the training needs analysis than expected because the Department decided to conduct a broader analysis than had been recommended by the ANAO (Recommendation No.4) in the previous audit.

3.1 ANAO Audit Report No.5 2016–17 identified significant issues with the reliability (namely, completeness and accuracy) of data captured in the Regulatory Management System (RMS), which arose because users were not trained in the use of the system. The ANAO recommended (ANAO Recommendation No.4) that the Department conduct a training needs analysis for users of the RMS, deliver appropriate training and monitor its effectiveness.

3.2 The Department accepted the recommendation and responded that the recommendation would be implemented through the development of a broader learning and development framework46, specifically that:

… a needs analysis will be undertaken to ensure staff are trained and supported to undertake all of their regulatory responsibilities. This will include an assessment of training requirements for the regulatory management system. Once the skills, knowledge and capabilities of staff have been assessed against their roles, a plan to address gaps will be developed and implemented. The learning and development framework will include a process for reviewing and maintaining skills.

3.3 The Department originally advised that a training needs analysis would be complete by December 2016; however, it was not until November 2016 that the Department entered into a contract to develop and deliver a training needs analysis. The analysis developed by the contractor was accepted by the Department on 28 February 2017 (two months later than planned). It identified skill and knowledge gaps that could be addressed through learning and development, and found that:

  • a formal approach to learning and development is required;
  • a nationally consistent qualification that incorporates role specific and generalist training modules would provide staff with formal recognition of their skills as well as enhancing career mobility47;
  • ownership of the learning and development framework should be established and implementation should be managed by a learning and development specialist;
  • the Department should utilise a 70:20:10 (70 per cent on the job, 20 per cent coaching/mentoring, and 10 per cent formal training) approach to learning;
  • the Department should identify a partner organisation to develop and implement the learning and development framework and deliver associated courses and tailored training modules; and
  • measures of success, and a timeframe for review and evaluation of the framework should be included to assess if the framework is meeting ongoing needs.

Has the Department developed a learning and development framework which is continuously monitored, evaluated, and revised where necessary?

The Department has developed a learning and development framework based on the training needs analysis. However, one element remains incomplete: the framework does not include a monitoring and evaluation component.

3.4 In June 2017, the Department commenced translating the recommendations from the training needs analysis into a learning and development framework. The framework was approved in November 2017, and breaks down the training and development into three phases:

  • induction—intended to leverage existing training, on the job coaching, and demonstrations that address role requirements, business processes and IT systems;
  • core skills and capabilities—delivery of general training including writing skills, stakeholder engagement, critical thinking, judgement and evidence based decision making; and
  • specialist skills and professional capabilities—delivery of role specific training relating to key job functions within the regulatory assessment, compliance monitoring and administrative support roles.

3.5 The Department’s framework broadly aligns with six of the seven principles for better practice in learning and development identified by the Australian Public Service Commission (see Figure 3.1 below).48 The exception is a monitoring and evaluation component.

Figure 3.1: A framework for managing learning and development in the APS

 

A diagram that illustrates the Australian Public Service Commission’s framework for managing learning and development in the APS, outlining seven principles for better practice.

 

Source: Building capability: A framework for managing learning and development in the APS.

3.6 During the course of the audit, the Department advised the ANAO that work to develop a monitoring and evaluation component had commenced. As at March 2018, the Department has not yet developed a plan for monitoring and evaluating the implementation of the framework to determine if it is meeting objectives and continues to be appropriate in terms of quality, time, costs and alignment with strategic priorities.

3.7 The service provider engaged to design and deliver the training modules identified in the learning and development framework is contractually required to obtain feedback and evaluate whether training is meeting user needs (with oversight from the Department). This information, while useful, will not enable the Department to determine if the implementation strategy outlined in the learning and development framework continues to meet the needs of the Department as intended by the recommendation made in the ANAO’s previous audit.

3.8 Implementing a monitoring and evaluation strategy early in the delivery of the new framework, using the training needs analysis as baseline data, would better position the Department to assess the extent to which the objectives of the learning and development framework are being met.

Recommendation no.2

3.9 In implementing Recommendation No.4 from the previous audit, the Department should develop and implement a monitoring and evaluation strategy, so it can assess to what extent the objectives of the learning and development framework are being met.

Department of Home Affairs response: Agreed

3.10 An evaluation mechanism will be implemented for the broad learning and development framework that has been developed. The Department recognises the importance of being able to measure whether its investment in development is delivering the training needed by the workforce to function efficiently and effectively.

Is the Department delivering appropriate training?

Delivery of the training identified in the learning and development framework commenced in late February 2018. Prior to the formal commencement of the training modules identified in the learning and development framework, short-term training needs were met through the development of quick reference guides and the provision of procedural fairness and administrative decision making training.

3.11 In December 2017, the Department entered into a contract for the delivery of the training modules outlined in the learning and development framework. Training commenced in late February 2018, with the Department advising the ANAO that specific training needs (including training on the Regulatory Management System, as originally recommended by the ANAO (Recommendation No.4) in the previous audit) will be incorporated into tailored, specialist modules scheduled to be delivered in June 2018.

3.12 Prior to the formal commencement of the training identified in the learning and development framework, staff had access to ‘principles of decision making’ and ‘administrative law and legislation’ training courses and ‘non-compliance notices’ training workshops. This training was delivered between June and September 2016. Specific material for users of the Regulatory Management System was delivered through the development and release of in-house quick reference guides.

4. Performance Monitoring and Reporting

Areas examined

This chapter examines the extent to which the Department of Home Affairs (the Department) has implemented effective performance monitoring and reporting arrangements including: developing performance measures for passenger screening in consultation with stakeholders, that are practical, achievable and measurable; and improved reporting to its stakeholders (ANAO Recommendation Nos.3 and 5).

Conclusion

The Department has developed, but not yet implemented, performance monitoring and reporting arrangements, despite numerous reports, including ANAO Audit Report No.26 2002–03, Aviation Security in Australia, recommending that performance measures be implemented. Most recently, the timely implementation of these performance measures has been affected by the Department’s decision to implement the measures in July 2018 as part of the broader Enhanced Mandatory Reporting Project. While the Department has made progress in improving the reporting provided to its stakeholders, its ability to accurately assess the effectiveness of passenger screening is limited due to the quality of the data captured in the Regulatory Management System, the lack of an associated reporting function, and because performance measures have not yet been implemented.

Recommendation

The ANAO has made one recommendation aimed at ensuring that performance measures are implemented in a timely manner and an effective monitoring and review mechanism is included to provide assurance that the performance measures for passenger screening are practical, achievable and measurable.

Has the Department consulted with stakeholders to develop performance measures that are practical, achievable and measurable?

The Department has developed performance measures in consultation with stakeholders, informed by the performance measures outlined in the 2009 Aviation Security Screening Review. However, implementation has been delayed by the Department’s decision to incorporate it into the Enhanced Mandatory Reporting Project. As a result, the extent to which the measures are practical, achievable and measurable cannot currently be assessed.

4.1 The previous ANAO audit report found that the Department did not have appropriate performance measures in place to determine whether passenger screening is effective.49 Subsequently, the ANAO recommended (ANAO Recommendation No.3) that the Department, in consultation with stakeholders, develop performance measures for passenger screening that are practical, achievable and measurable.50 The Department agreed to implement the recommendation, and advised the ANAO that:

This began in June 2016 with the systems test working group, which involves industry stakeholders, and has been established to review and update the systems test regime. As part of its review of the regime, the working group will develop performance measures for some aspects of passenger screening. This work will be broadened once the systems testing regime has been updated.

4.2 In January 2017, the systems test working group delivered the strategy to guide the future development of the systems test program. While the working group has a role in improving the Department’s ability to produce accurate systems test results and provide necessary performance information, it was not responsible for developing the performance measures.

4.3 To develop the performance measures, the Department established a separate working group, the Screening Innovation Working Group (the working group), and in August 2017 members were asked to provide feedback on a draft set of proposed performance measures. The proposed measures were approved by the Department in October 2017 and subsequently tabled at the Aviation Security Advisory Forum (the forum) and the Regional Industry Consultation Meeting (RICM).

Development of the performance measures

4.4 Numerous reviews51, including ANAO Audit Report No.26 2002–03, Aviation Security in Australia, have recommended that the Department develop and implement performance measures that are practical, achievable and measurable.52 Subsequently, in 200953 the Department released the Review of Aviation Security Screening Report54, which included 27 recommendations to improve screening. One of the recommendations was to develop a new performance measurement framework in partnership with industry. Suggested key performance indicators and performance targets were also outlined.

4.5 In response to the 2009 aviation security screening report, in May 2012 the Department engaged a consultant to develop key performance measures to assess the effectiveness of security screening checkpoints. This activity was a proof of concept exercise and was completed in June 2013; however, the subsequent phase of the project55 did not progress.56

4.6 In October 2016, the Department recommended that the development of a performance measurement framework be restarted. In August 2017, a series of draft measures, informed by the measures recommended in the 2009 Review, was presented to the working group for consideration. These draft performance measures were subsequently approved by the Department and circulated to industry participants in October 2017 (as discussed at paragraph 4.3).

The performance measures

4.7 The performance measures developed by the Department are designed to assess the effectiveness of the screening system and screeners at detecting weapons, and the use of the technology intended to detect the threats of most concern. Successful implementation of the performance measures relies on screening authorities providing the Department with accurate data on the performance of screening providers, combined with the results of systems tests conducted by the Department through its compliance monitoring program. Together, the data from these external and internal sources should enable the Department to accurately assess the effectiveness of passenger screening.

4.8 Interviews with screening authorities, along with discussions documented in the forum and RICM meeting minutes, confirm that the measures selected, and the proposed reporting frequency and timeframes, are practical and achievable from an industry perspective. The industry participants interviewed during the fieldwork phase of this audit also advised the ANAO that most industry participants already capture the required performance data.

Implementation of the performance measures

4.9 The Department advised the ANAO in November 2017 that implementation of the performance measures is scheduled to commence by 1 July 2018 as part of the Enhanced Mandatory Reporting Project. By making it mandatory for industry participants to provide performance and security incident data, the project aims to provide improved evidence that the Government can use to assess the efficacy of Australia's transport security regime, target appropriate and proportionate security measures, and address emerging issues. As at December 2017, initial consultations had been held with selected industry participants to discuss information requirements; however, the scope, implementation strategy, deliverables and timeframes for the project had not been finalised.

4.10 The Department plans to initially receive data from a small subset of industry participants through a pilot program conducted during the preparation stage of the project. This stage is scheduled to commence in March 2018, with wider implementation to commence in July 2018 and completion due by February 2019, almost 10 years after the 2009 aviation screening review was released.

4.11 Without performance measures it will be difficult to: assess the effectiveness of the compliance monitoring program; determine which screening authorities are performing passenger screening effectively and require less monitoring; and identify those which are performing poorly, determine the cause, and implement corrective action. Only through implementation of the measures can their practicality, measurability and achievability be assessed.

Recommendation no.3

4.12 In implementing Recommendation No.3 from the previous audit, the Department should ensure that performance measures are established in a timely manner alongside an effective monitoring and review mechanism to provide assurance that the performance measures developed for passenger screening are practical, achievable and measurable.

Department of Home Affairs response: Agreed.

Does the Department provide performance reporting that meets the business requirements of its various stakeholders?

The Department provides a range of activity and information reports to its various stakeholders. While this information is useful and helps some stakeholders to assess specific elements of their screening operations, it does not enable the Department to assess the effectiveness of passenger screening or the extent to which the compliance monitoring program promotes improved passenger screening effectiveness.

4.13 In November 2017, the Department reported that it was on track to complete the implementation of ANAO Recommendation No.5 by 1 July 2018, almost two years after the previous audit was tabled. The Department originally advised that:

since the commissioning of the regulatory management system, the Department has continued to invest in the development of its reporting capability and remediate data holdings. It is anticipated that this work will begin to deliver the foundations of a reporting capability by the end of 2016. The Department’s intention remains to provide useful reports on passenger screening compliance trends and patterns to stakeholders once this capability is available.

4.14 Since the previous audit was tabled, the Department has improved and expanded the range of reporting that it provides to its stakeholders; however, the improved reports are produced using resource-intensive manual processes to extract data from the Regulatory Management System (RMS) and the Microsoft Excel-based regional office work plans.

4.15 The previous audit report noted that the RMS was released in July 2014, and identified issues with training and data quality, specifically noting the lack of a reporting function. To address this, the Department endorsed a list of 15 operational reports in October 2016, six of which were identified as high priority. However, in March 2017 the Department decided not to proceed, due to the estimated cost of $160 000 to produce the six high-priority reports, and instead decided to build the reporting capability in-house.

4.16 The in-house reporting function currently requires data to be manually extracted from the RMS. Requests for reports are submitted by the user to a dedicated internal mailbox and must be endorsed before the report is produced and distributed. As at December 2017, the lack of a reporting function persisted: the RMS remained unable to produce accurate standard reports, such as industry participant contact details and the compliance histories of individual industry participants.

Compliance activity reporting

4.17 The various types of reporting provided by the Department to its stakeholders are outlined below at Table 4.1.

Table 4.1: Reports provided to stakeholders

OTS Executive

Monthly Summary Reports: Provide a summary of the number of compliance activities, response activities and systems tests completed for the period; the results of the systems tests with some high-level commentary (the reason for any failure); and a summary of findings and systems testing for the year to date.

End of Year Report: Provides a high-level analysis of the compliance activity conducted over the year. In 2016–17, findings were categorised into security mitigation categories, and included the number of findings per industry participant.

Department Secretary

OTS Divisional Report: Provides the outcomes from the National Compliance Plan, findings from systems tests, and a high-level analysis of security incidents on a quarterly basis. The data provided is useful; however, it is limited to a summary of the compliance activities completed against the requirements outlined in the Portfolio Budget Statements, and the number, type and high-level results of the systems tests conducted.

Minister

Ministerial Submissions: Provide progress reports and updates to the Minister, on an as-required basis, on matters relevant to passenger screening and the compliance monitoring program. Recent submissions have covered the outcome of response activitiesa and the introduction of the enhanced systems test regime.

Departmental Annual Report: Details how the Department has performed against the required outcomes and key performance indicators identified in the Portfolio Budget Statements.b

Regulatory Performance Report: An annual report detailing how the Department has performed as an industry regulator. It provides a self-assessment against the key performance indicators outlined in the regulatory performance framework, including those relevant to compliance.

Industry Participants

Systems Test Results: Provide de-identified systems test results, and outcomes from trials of new systems tests and test pieces.c

Note a: Ministerial submissions provided include advice and updates in relation to the systems tests conducted as part of the suite of enhanced aviation security measures, and the results of those tests.

Note b: In 2017–18, the performance targets in the Portfolio Budget Statements were amended, replacing the target of completing 95 per cent of 'high risk' cases with a target of completing 100 per cent of the compliance activities within the National Compliance Plan.

Note c: The Department also provided an update to industry participants outlining the results of systems tests conducted to determine levels of achievement against the targets required as outlined in the enhanced aviation security measures.

Source: ANAO Analysis of Departmental documentation.

4.18 The Department has improved its ability to provide reports which meet the business needs of its various stakeholders. The reports outline progress achieved against the National Compliance Plan and the outcomes of compliance monitoring activities undertaken, focusing on systems test results. However, they remain resource intensive to produce and are dependent on the accuracy of data stored in the Microsoft Excel-based regional office work plans and the RMS. Further, the data relating to the outcomes of compliance monitoring activities is not recorded in a format that facilitates easy retrieval for analysis and reporting.

Appendices

Appendix 1 Entity response

 

Department of Home Affairs response

 

a. The ANAO notes that the Department has disagreed with our assessment that it is not yet well placed to provide assurance that passenger screening is effective and that screening authorities comply with the Regulations. As discussed at paragraphs 7, 2.2, 2.3 and 2.4, there continue to be constraints on the Department's ability to extract and analyse accurate and reliable compliance activity data. Such data, and the ability to effectively analyse it, is necessary to provide assurance that passenger screening is effective and that screening authorities comply with the Regulations.

b. The report does not recommend that additional compliance activities should be imposed on airports granted grandfathering extensions for screening equipment. The report (see paragraphs 2.9 and 2.11, and Case Study No.1) highlights that the Department does not effectively utilise the available information (including the existence of grandfathering arrangements) when determining risk and compliance ratings. As a result, compliance resources are not targeted towards the individual airports and airline operators where the risk of non-compliance is higher or where additional support is needed to comply with regulatory requirements.

Footnotes

1 Regulation 4.17 of the Aviation Transport Security Regulations 2005 establishes the methods, techniques and equipment to be used for screening.

2 Department of Infrastructure and Regional Development, Statistical Report: Aviation: Domestic aviation activity, October 2017, p. 9.

3 ANAO Report No.5 2016–17, Passenger Security Screening at Domestic Airports, p. 7.

4 ibid.

5 ibid., pp. 9–10.

6 ibid., p. 10.

7 Since 2002, 10 reviews into Aviation Security have been conducted. A full list of these reviews was included at Appendix 2 of the previous audit. Of the 10 reviews, four have specifically noted the need for the Department to develop and implement performance measures.

8 The Government announced in the 2018–19 Budget that new advanced screening technology requirements will be rolled out to include the use of body scanners and advanced X-ray equipment at major and regional Australian airports. The Government also announced that improved training and accreditation of all screening staff at airports would be implemented.

9 Security controlled airports are airports from which screened air services operate. Screened air services are defined under Regulation 4.02 of the Aviation Transport Security Regulations 2005 as those services where an aircraft must be cleared before departure. An aircraft must be cleared before departure if it is operating a regular public transport or open charter operation and has a maximum take-off weight of at least 20 000 kilograms.

10 On 20 December 2017, the Office of Transport Security was transferred from the Department of Infrastructure and Regional Development to the newly established Department of Home Affairs. The term ‘the Department’ has been used throughout the report to refer to either.

11 Department of Infrastructure and Regional Development, Statistical Report: Aviation: Domestic aviation activity, October 2017, p. 9.

12 D Chester, (Minister for Infrastructure and Transport), Additional Aviation Security Measures, media release DC225/2017, 30 July 2017, p. 1.

13 Aviation Screening Notices are issued by the Department under Regulation 4.17, and specify the methods, techniques, and equipment to be used for screening.

14 70 per cent of compliance resources are allocated to ‘core’ compliance activities.

15 30 per cent of compliance resources are allocated to ‘non-core’ compliance activities.

16 In 2016–17 'response' activities were known as 'discretionary' activities.

17 ANAO Report No.5, 2016–17, Passenger Security Screening at Domestic Airports, p. 7.

18 ibid., pp. 9–10.

19 ibid., p. 10.

20 ANAO Report No.5 2016–17, Passenger Security Screening at Domestic Airports, p. 30.

21 ibid., p. 9.

22 The Regulatory Management System (RMS) is the Information Technology system used by the Department to record compliance activities, findings and system test results.

23 Airports are categorised into one of six categories. Categories are determined using a combination of maximum take-off weight of the aircraft and the number of passengers that pass through.

24 Aircraft operators are broken down into Australian screened and unscreened operators, International operators of aircraft used to undertake regular public transport, high capacity charter operators, and air freight operators.

25 Findings are categorised as either ‘non-compliance notice’ or ‘observations’. Non-compliance notices are issued where there has been a breach of transport security regulations, and must be followed up. Observations represent potential security vulnerabilities. It is not mandatory for the Industry Participant to respond to the Observation notice.

26 There are eight security mitigation categories: Access Control; Barriers and Signage; Screening and Clearing; Detection and Monitoring; Incident Management; Identity Security; Security Management; and Operational.

27 Department of Infrastructure and Regional Development, Our Strategy 2017–20, p. 12.

28 ANAO Report No.5 2016–17, Passenger Security Screening at Domestic Airports, p. 30.

29 The ANAO has examined the quality and availability of data recording compliance outcomes in more detail at paragraph 2.3.

30 The category of airports and type of airlines that are grouped together are known as sub-segments within a mode. A mode refers to the sector (Aviation, Maritime, Air Cargo, Supply Chains and Issuing Bodies). A segment refers to the type of industry participant (Airline or Airport) and Sub-Segments refer to a category of airport or type of aircraft operator. Sub-Segments are comprised of individual industry participants.

31 The process used to calculate an adjustment score is a qualitative process relying on expertise and professional judgement.

32 Non-compliance notices are issued to industry participants where there has been a failure to comply with Australia's transport security legislation.

33 Observations are findings that may not involve a breach of legislation, but describe potential security vulnerabilities or circumstances that may lead to a breach of Australia’s transport security legislation if left unaddressed. Observations are provided to the industry participant in the form of an Observation Notice. It is not mandatory for the Industry Participant to respond to the Observation notice.

34 In May 2018, the Government announced that sophisticated new screening technology for passengers, baggage and cargo, including the use of body scanners and advanced X-ray equipment, will be required at major and regional Australian airports.

35 ANAO Report No.5 2016–17, Passenger Security Screening at Domestic Airports, p. 32.

36 Department of Infrastructure and Regional Development, Our Strategy 2017–20, p. 11.

37 ANAO Report No.5 2016–17, Passenger Security Screening at Domestic Airports, p. 31.

38 ibid.

39 Observations are findings that may not involve a breach of legislation, but describe potential security vulnerabilities or circumstances that may lead to a breach of Australia’s transport security legislation if left unaddressed. Observations are provided to the industry participant in the form of an Observation Notice. It is not mandatory for the Industry Participant to respond to the Observation notice.

40 Non-compliance notices are issued to industry participants where there has been a failure to comply with Australia's transport security legislation.

41 A corrective action plan is only required after the procedural fairness process has concluded, the delegate has decided to issue the non-compliance notice, and a corrective action plan has been deemed necessary to return the industry participant to full compliance.

42 Regional Directors are Executive Level 2 officers.

43 These powers are available to Regional Directors under section 109 of the Aviation Transport Security Act 2004.

44 The procedural fairness training conducted at each Regional Office between July and September 2016 was comprised of principles of administrative decision making and non-compliance notice training.

45 An infringement notice includes a financial penalty, whereas a non-compliance notice or observation does not.

46 The learning and development framework applies to all roles within the Transport Security Operations Branch of the Office of Transport Security.

47 APS staff will be required to complete a Certificate IV in Government and Executive Level staff will be required to complete the higher level Diploma.

48 Australian Public Service Commission, Building Capacity: A framework for managing learning and development in the APS, April 2003, last updated 03 October 2013 <http://www.apsc.gov.au/publications-and-media/current-publications/buil…; [accessed 29 Jan 2018].

49 ANAO Report No.5, 2016–17, Passenger Security Screening at Domestic Airports, p. 33.

50 ibid., p. 35.

51 Since 2002, 10 reviews into Aviation Security have been conducted. A full list of these reviews was included at Appendix 2 of the previous audit. Of the 10 reviews, four have specifically noted the need for the Department to develop and implement performance measures.

52 ANAO Report No.26 2002–03, Aviation Security in Australia, p. 15.

53 This report was developed in 2008 by the Department with support provided by an external advisory group appointed by the then Minister for Infrastructure, Transport, Regional Development and Local Government, the Hon Anthony Albanese MP.

54 Department of Infrastructure, Transport, Regional Development and Local Government, Review of Aviation Security Screening Report, Canberra, 2009, p. ii.

55 This phase of the project was intended to take the proof of concept exercise and use the outcomes from it to design an IT solution capable of capturing, analysing and reporting performance data and sharing the data with industry.

56 The cost to complete the next phase of the project was estimated at $300 000.