The audit objective was to assess the effectiveness of the Department of Infrastructure and Regional Development's regulation of passenger security screening at Australian domestic airports.

Summary and recommendations

Background

1. The aim of passenger screening is to prevent prohibited items such as weapons and explosives from being carried onto aircraft. Passenger screening is required at 62 security controlled airports across Australia. It involves the use of specialised equipment and screening personnel to detect and control prohibited items.

2. The Department of Infrastructure and Regional Development (the Department) regulates passenger screening through its administration of the Aviation Transport Security Act 2004 (the Act), and the Aviation Transport Security Regulations 2005 (the Regulations). The Act and Regulations establish a framework for aviation security, mandate minimum security standards for passenger screening and provide for the Department to undertake compliance activities to ensure legislated requirements are met.

3. Airport operators and screening authorities are responsible for delivering security on a day-to-day basis and must meet the minimum legislated security requirements outlined in the Act and the Regulations. Additionally, airport operators are required to operate in accordance with a Transport Security Program1 that has been approved by the Department, and screening authorities must be authorised by the Department to carry out screening and must operate in accordance with relevant Aviation Screening Notices. An airport operator may also be a screening authority. Throughout this report, the term industry participant is used when referring to both airport operators and screening authorities, unless a distinction is necessary.

4. The Department is responsible for ensuring that passenger screening is effective in detecting and controlling prohibited items, and for monitoring industry participants’ compliance with security requirements. The audit objective was to assess the effectiveness of the Department’s regulation of passenger security screening at Australian domestic airports.

Conclusion

5. The Department has implemented a regulatory framework that establishes minimum standards for passenger screening and a program of compliance activities at security controlled airports. However, the Department is unable to provide assurance that passenger screening is effective, or to what extent screening authorities comply with the Regulations, due to poor data and inadequate records. The Department does not have meaningful passenger screening performance targets or enforcement strategies and does not direct resources to areas with a higher risk of non-compliance.

Supporting findings

Regulating domestic passenger security screening

6. A sound regulatory framework has been established through the Act, Regulations and Aviation Screening Notices. The airport categorisation model identifies high risk airport types and applies specific regulatory requirements to those airports. The appointment of a screening authority and the approval of a Transport Security Program by the Department form the basis of the aviation security management arrangements between the Department and the aviation industry. Screening authorities are responsible for managing passenger screening in accordance with the requirements of the regulatory framework. The Department is responsible for administering the framework and monitoring compliance with regulatory requirements.

7. The Department’s approach to managing regulatory risk is based on the airport categorisation model that identifies airports with specific risk and operating profiles. The Department engages with industry participants to promote understanding of security obligations and ensure appropriate risk mitigation measures are in place. Engagement occurs through the regulatory approval process, industry forums and compliance activities.

8. The Department has not addressed a number of systemic issues that hamper its ability to implement a risk-based regulatory regime and provide assurance as to the effectiveness of passenger screening. The need to develop performance measures, analyse compliance data, implement an enforcement policy and provide adequate training has been identified in successive reviews, but solutions are yet to be delivered.

9. A number of specific activities have been initiated to improve regulatory capability but progress has been delayed, which may reduce the effectiveness of passenger screening regulation.

Monitoring compliance

10. The Department conducts a planned program of compliance activities including audits, inspections and tests, which aim to assess the effectiveness of aviation security, including passenger screening. The compliance program is based on the airport categorisation model, which incorporates risk factors such as the maximum take-off weight of aircraft operating from the airport and the number of passengers departing the airport each year.

11. The compliance program does not incorporate the performance levels of individual airport operators or screening authorities, and non-compliance trends are not taken into account when prioritising compliance activities. The result is that non-compliance risk is not incorporated into the compliance program, and staff resources are not effectively deployed to areas that may require additional support or monitoring.

12. The Department does not have an enforcement policy for managing non-compliance with regulated requirements. In practice, non-compliance is managed through the corrective action plan process, which involves the development, approval and monitoring of a plan to rectify non-compliance. However, there is no clear escalation process for serious or systemic non-compliance, and no guidance on the application of the various sanctions that are available to the Department. Clear guidance on available enforcement options that are proportionate to the risks, and guidance on their application, would assist in the management of non-compliance.

Measuring performance

13. The Department has not established effective performance measures for passenger screening. The relevant key performance indicator in Programme 2.1 of the Department’s Portfolio Budget Statements relates to the number of compliance activities conducted by the Department and does not address the results of those activities. It does not include a quality measure that indicates whether aviation security meets a prescribed level of effectiveness. Similarly, the key performance indicators to be reported in 2015–16 as part of the Government’s Regulatory Performance Framework do not measure the effectiveness of aviation security generally, or passenger screening specifically.

14. The Department does not manage compliance data effectively. A new information management system, implemented in April 2014, does not meet the Department’s business requirements. The data is unreliable, there is inadequate reporting functionality, and training does not meet the needs of users. The Department does not have a robust system to collect, store and retrieve information about industry participants and their compliance with legislated security requirements. The project established to deliver the new information management system was closed in January 2016, and the Department reported that outstanding capability is to be addressed through other activities.

15. The Department has conducted limited analysis of compliance data. Poor data quality and lack of reporting capability in the current and previous systems have made analysis of data difficult. The reports produced from analysis activities reflect an amalgamation of available data, rather than a comprehensive analysis of compliance trends or systematic evaluation of compliance performance.

16. The Department does not report passenger screening compliance results to the Secretary, the Minister or to industry participants. Compliance activity results have been reported to the Office of Transport Security executive on a monthly or quarterly basis. These reports provided compliance results for the relevant reporting period, but did not support comparison of the data over time or identify trends.

Recommendations

Recommendation No. 1

Paragraph 2.15

That, independently of other projects being conducted, the Department sets a date at which the grandfathering provisions for the 2011 passenger screening equipment requirements will cease, and amends the Aviation Screening Notices accordingly.

The Department’s response: Agreed.

Recommendation No. 2

Paragraph 3.10

That the Department:

  1. establishes an analysis function to identify non-compliance trends based on accurate, reliable compliance activity data; and
  2. incorporates the results of the analysis into the compliance program, focusing on areas at risk of non-compliance.

The Department’s response: Agreed.

Recommendation No. 3

Paragraph 4.6

That the Department, in consultation with stakeholders, develops performance measures for passenger screening that are practical, achievable and measurable.

The Department’s response: Agreed.

Recommendation No. 4

Paragraph 4.14

That the Department conducts a training needs analysis for users of the regulatory management system, delivers appropriate training, and monitors its effectiveness.

The Department’s response: Agreed.

Recommendation No. 5

Paragraph 4.38

That the Department provides targeted reporting to its stakeholders, based on accurate data, which enables an assessment of the effectiveness of passenger screening, and promotes improved passenger screening effectiveness.

The Department’s response: Agreed.

Summary of entity response

17. The Department provided formal comments on the proposed audit report, which are included at Appendix 1. Its summary response is set out below.

The Department’s summary response

The Department thanks the ANAO for the audit and agrees with all the recommendations.

The Department notes that following significant review work in 2015 it is investing in the broad reform of its transport security regulatory operations. This is to ensure that the Office of Transport Security’s regulatory activities are well positioned to respond to changing threats and risks, future industry growth and diversification, and that its approvals and compliance operations are efficient. This reform program comprises three elements: redesign of the transport security operating model, the establishment of a competency based learning and development framework and improvements to the regulatory management system (RMS). The reform program will drive key changes to the way the Department regulates domestic passenger screening.

The Department notes the ANAO’s conclusion that it is unable to provide assurance that passenger screening is effective, or the extent to which screening authorities comply with the Regulations, due to poor data and inadequate records. A number of initiatives are underway to improve the quality of our data and record keeping, with dedicated resources to remediate and improve the current data in RMS; this activity is due to be completed in early 2017. The Department has developed guidance to assist staff on how to use RMS, which has improved the quality of new data being collected.

A Working Group has been established to develop a framework to measure the effectiveness of passenger screening and the extent to which screening authorities are complying with passenger screening regulations. The regular inspections and audits that are undertaken to monitor an airport’s compliance with passenger security screening requirements will be a key component of the framework. This includes testing their ability to detect and control the entry of prohibited items and weapons into the sterile area.

The Department notes that it has already put in place mechanisms to improve its capacity to use compliance data and is gradually building this capacity. It is also currently revising its compliance approach to better incorporate non-compliance risk into its planning. A schedule for revoking the grandfathering provisions for security screening equipment has already been developed by the Department, and industry stakeholders were advised of the changed requirements in early July.

1. Background

Aviation security screening

1.1 Security screening in the aviation context is defined as:

The application of human, technical and/or other means to identify and/or detect weapons, explosives or other dangerous devices, articles, substances, or other prohibited items or behaviours which may be used to commit, or indicate an intention to commit, an act of unlawful interference against aviation.2

1.2 An unlawful interference is an act or attempted act that jeopardises the safety of civil aviation, including the safety of aircraft, airports or people. The expected outcomes of screening are to mitigate the risk of:

  • unlawful seizure of an aircraft;
  • hostage taking on board an aircraft;
  • intrusion on board an aircraft of a weapon or other material capable of threatening the integrity of an airframe; and
  • the use of the aircraft as a weapon.

1.3 The screening of passengers and their carry-on baggage is the most publicly visible part of the aviation security regime. It utilises a combination of specialist equipment and the judgement of screening personnel to detect and control prohibited items and identify certain behaviours. This security function needs to be performed effectively, to meet legislated requirements, and efficiently, to facilitate air travel. The ultimate aim of Australia’s screening regime is to ensure that no prohibited items, prescribed weapons, or explosives are carried onto an aircraft.

1.4 Passengers travelling on Australian domestic air services departing from security controlled airports3 are subject to three forms of screening prior to boarding:

  • walk-through metal detection for all passengers;
  • x-ray of all carry-on baggage; and
  • explosive trace detection for randomly selected passengers and their carry-on baggage.

1.5 Passenger security screening, which is required at all security controlled airports from which a screened air service operates, is conducted by a screening authority that has been assessed as meeting specified requirements and authorised by the Department to provide screening services. In most instances, an airport or airline will act as the authorised screening authority, and may sub-contract day-to-day operations to a specialist screening provider.

Aviation security regulatory framework

1.6 Globally, aviation security has been strengthened considerably following the hijacking of four passenger planes used to attack targets in the United States of America on 11 September 2001. In response to the attacks, Australia implemented the Aviation Transport Security Act 2004 (the Act), and the Aviation Transport Security Regulations 2005 (the Regulations). At the time the legislation was introduced, the Government supported explicit regulation, rather than self-regulation or co-regulation, to ensure universal compliance with mandated security standards.

1.7 Australia’s aviation security framework has its origins in the Convention on International Civil Aviation, which is administered by the International Civil Aviation Organization. As a signatory to the Convention, Australia is obliged to regulate its aviation industry to safeguard against acts of unlawful interference with aviation.

1.8 The Department regulates aviation security through its administration of the Act and the Regulations. The legislation establishes a framework for preventive aviation security by mandating minimum security standards that industry participants are expected to meet. It also provides for the Department to undertake compliance activities to ensure industry participants comply with legislated requirements.

1.9 Airports are commercial in nature, and the cost of operations, the smooth movement of passengers, and safety and security are all priorities for industry participants. The Department is responsible for ensuring industry participants meet legislated requirements and that aviation security is maintained in a way that is cost effective for the Australian Government, industry and the travelling public.

1.10 On 4 December 2014, the Senate referred the matter of airport and aviation security to the Rural and Regional Affairs and Transport References Committee for inquiry and reporting. Submissions to the inquiry were received in January 2015, and the Committee was due to report on 19 May 2016.4 However, the inquiry lapsed on 9 May 2016 with the dissolution of the Senate and the House of Representatives and reporting was not completed.

Roles and responsibilities

1.11 The Department advises the Government on the policy and regulatory framework for Australian airports and the aviation industry. Within the Department, the Office of Transport Security is responsible for administering the Act and the Regulations and for regulating the aviation industry. The Office of Transport Security’s organisational structure is shown at Table 1.1.

Table 1.1: The Office of Transport Security structure and responsibilities

Aviation Security: Develop aviation security policy. Oversee programs related to new policy settings and legislative amendments.

Transport Security Operations: Plan, deliver and monitor regulatory activities (approvals, compliance and enforcement) through the National Quality Control Programme. Engage with industry participants.

Transport Security Operations Reform (established in January 2016): Improve the operating model, capability development framework and the capacity of business systems in the Transport Security Operations branch. The scope of the work to be undertaken by the reform branch has not been formalised.

Risk and International: Develop and distribute intelligence and transport security products. Operate the Transport Security Coordination Centre. International engagement.

Maritime, Identity and Surface Security: Undertake regulatory reform projects on maritime, offshore oil and gas infrastructure, identity and surface transport matters. Administer the Aviation and Maritime Security Identification card schemes.

Air Cargo Security Taskforce: Security regulation of the air cargo supply chain.

Note: Branches relevant to aviation security are shaded.

Source: ANAO analysis of the Department’s documents.

1.12 As the regulator of security controlled airports5, the Department is responsible for ensuring that passenger screening is effective in detecting prohibited items and meeting expected outcomes (see paragraph 1.2). The Department administers aviation security legislation by:

  • maintaining a regulatory framework to safeguard against unlawful interference with civil aviation;
  • establishing minimum security requirements for civil aviation;
  • regulating industry participants; and
  • meeting Australia’s obligations under the Convention on International Civil Aviation.

1.13 Airport operators and screening authorities are responsible for delivering security on a day-to-day basis and must meet the minimum legislated security requirements outlined in the Act and the Regulations. Airport operators are required to develop and operate in accordance with a Transport Security Program that has been approved by the Department. Screening authorities must be authorised by the Department and operate in accordance with relevant Aviation Screening Notices.

Audit approach

1.14 The audit objective was to assess the effectiveness of the Department’s regulation of passenger security screening at Australian domestic airports. To form a conclusion against the audit objective, the ANAO adopted the following high level criteria:

  • an appropriate framework for assessing and mitigating risks and an effective plan for monitoring compliance have been established;
  • an effective risk-based compliance program to communicate regulatory requirements and to monitor compliance has been implemented; and
  • sound arrangements have been established to manage non-compliance with agreed security screening requirements.

1.15 The audit team examined departmental records, observed screening activities and consulted with departmental staff and a range of key stakeholders.

1.16 The audit was conducted in accordance with ANAO auditing standards at a cost to the ANAO of approximately $521 345.

2. Regulating domestic passenger security screening

Areas examined

This chapter examines the Department’s administration of the regulatory framework, the application of a risk-based approach to passenger screening, and the identification and implementation of improvements by the Department.

Conclusion

A sound regulatory framework has been established that identifies high risk airport types and imposes minimum security standards at those airports. Screening authorities are responsible for managing passenger screening, and the Department is responsible for monitoring compliance with regulatory requirements. The Department’s approach is characterised by a strong relationship with the industry participants responsible for implementing the practical aspects of security, including passenger screening.

There are a number of gaps in the Department’s regulatory capability that have been identified in successive reviews since 2009. The Department has not established suitable performance measures, an enforcement policy, a compliance analysis capability or effective training.

Area for improvement

There is one recommendation aimed at ensuring that the Department set a date at which the grandfathering provisions for the 2011 passenger screening equipment requirements will cease.

Is a sound regulatory framework in place?

A sound regulatory framework has been established through the Act, Regulations and Aviation Screening Notices. The airport categorisation model identifies high risk airport types and applies specific regulatory requirements to those airports. The appointment of a screening authority and the approval of a Transport Security Program by the Department form the basis of the aviation security management arrangements between the Department and the aviation industry. Screening authorities are responsible for managing passenger screening in accordance with the requirements of the regulatory framework. The Department is responsible for administering the framework and monitoring compliance with regulatory requirements.

Regulatory approach

2.1 The Department’s regulatory approach is characterised by a strong relationship between industry and government, where minimum security requirements are supported by legislation and government has a role in monitoring and enforcing compliance. This arrangement recognises that industry participants have specialist capability and expertise to manage key infrastructure, and allows some discretion as to how they implement the Regulations.

Aviation security legislation

2.2 The Act and Regulations provide the overarching guidance for aviation security. The main purpose of the legislation is to establish a regulatory framework to safeguard against unlawful interference with aviation and to specify minimum security requirements for civil aviation related activities. An airport categorisation model6 identifies security controlled airport types on the basis of inherent risk.7 It is designed to differentiate between airport types for the purpose of applying security measures, including measures related to passenger screening, and subjects those airports to regulation by the Department.

Aviation Screening Notices

2.3 Aviation Screening Notices (ASNs) are issued by the Department under Regulation 4.17, and specify the methods, techniques and equipment to be used for screening. They give operational effect to the Act and Regulations and prescribe how aviation security screening is to be undertaken. These notices acknowledge the operational differences between the airport categories. For example, ASN 2012 contains procedures for screening liquid, aerosol and gel products at international screening points; these procedures are not included in the ASN for category 4 and 5 airports, as these airports do not operate international flights.

Transport Security Program

2.4 Operators of security controlled airports must submit, hold and maintain an approved Transport Security Program (TSP) that sets out the measures and procedures that the operator will implement to reduce the risk of unlawful interference with aviation. A TSP sets out the physical places within an airport that are subject to regulation, and how the airport operator will manage security for its operations, including for passenger screening. The TSP is developed by the airport operator, assessed and approved by the Department, and adherence is monitored through compliance activities.

Screening authorities

2.5 A screening authority is a body corporate—usually an airport operator or aircraft operator—that is authorised to conduct screening by a delegate of the Department. Applications to become a screening authority are assessed by the Department according to specific criteria, including knowledge and experience in managing security risks, screening equipment and security incidents. Screening authorities may contract the provision of screening services to other security related entities, known as screening service providers, which may, in turn, sub-contract to other entities. The screening authority remains legally responsible for demonstrating compliance with regulatory requirements, regardless of whether a third party is contracted to provide screening services.

Does the Department adopt a risk-based approach to regulating passenger screening?

The Department’s approach to managing regulatory risk is based on the airport categorisation model that identifies airports with specific risk and operating profiles. The Department engages with industry participants to promote understanding of security obligations and ensure appropriate risk mitigation measures are in place. Engagement occurs through the regulatory approval process, industry forums and compliance activities.

Airport categorisation

2.6 The airport categorisation model identifies airports with specific risk and operating profiles for the purposes of applying a range of security measures, including those related to passenger screening. The Regulations specify criteria to consider when identifying security controlled airports and categorise them according to: the maximum take-off weight of aircraft operating from the airport; the number of passengers departing the airport each year; and international air service or regular open charter operations. All airports that operate aircraft with a maximum take-off weight of 5 700 kilograms or more are included in the categorisation model and are referred to as security controlled airports. Airports with larger aircraft and higher passenger numbers represent a higher risk. As at 16 June 2016, there were 174 security controlled airports classified in categories 1 to 6 (see Table 2.1). The Department monitors compliance of category 1 to 5 airports.

Table 2.1: Security controlled airport categories and risk ratings

Category 1: 9 airports; risk rating very high
Category 2: 7 airports; risk rating very high
Category 3: 43 airports; risk rating medium
Category 4: 2 airports; risk rating low
Category 5: 1 airport; risk rating low
Category 6: 112 airports; risk rating very low

Source: ANAO analysis of the Department’s documents.

Regulatory approvals

2.7 The Department is responsible for assessing and approving requests to specify screening authorities, and for assessing and approving TSPs. The regulatory approval process, depicted in simplified form in Figure 2.1, involves regular communication between industry participants and the Department. This helps to ensure that industry participants understand their security obligations and reduces the risk of non-compliance.

Figure 2.1: The regulatory approval process

 


 

Note: The shaded boxes are undertaken by the Department, and unshaded boxes by the industry participant.

Source: ANAO analysis of the Department’s documents.

2.8 The ANAO reviewed the regulatory approval process by assessing guidance material and a sample of regulatory approvals and found that the:

  • guidance is clear and aligns with the legislation; and
  • engagement process is effective in assisting industry participants to understand their security obligations.

Aviation security forums

2.9 The Department facilitates two stakeholder forums—the Aviation Security Advisory Forum (ASAF)8 and the Regional Industry Consultative Meeting (RICM). Two separate forums recognise the different security challenges and regulatory cost sensitivities faced by regional airport operators and those operating larger, international airports. The forums provide an opportunity for the Department to work constructively with key stakeholders to help manage risk. Meeting minutes from these forums indicate attendance by a broad range of stakeholders (see Table 2.2).

Table 2.2: Aviation security industry forums facilitated by the Department

ASAF
Purpose: Facilitate constructive industry-government exchange of views on aviation security issues. Enable industry-government liaison on strategic issues.
Required frequency: 3 per year. Frequency in 2014 and 2015: 3 per year.
Membership: Major aviation industry stakeholder organisations; government agencies involved in aviation security activities; at least one representative from the RICM.

RICM
Purpose: Facilitate constructive industry-government exchange of views on regional aviation security issues.
Required frequency: 4 per year. Frequency in 2014 and 2015: 3 per year.
Membership: Every industry participant with a TSP; local and Federal police; industry representative bodies.

Source: ANAO analysis of the Department’s documents.

2.10 The forums have an information sharing and education focus. For example, in November and December 2014 two workshops were held as part of the ASAF and RICM after the National Terrorism Alert Level was raised in September 2014. The purpose was to explore the potential implications of a heightened threat specific to the Australian aviation sector. Outcomes included sharing information on the raising of the alert level, discussion of plausible risk scenarios that could arise in a heightened threat environment, and agreement on additional security measures that could be applied to prevent these risk scenarios or reduce the likelihood of them occurring.

2.11 Industry participants informed the ANAO that they value the opportunity to obtain and provide feedback on industry issues related to security in a constructive setting, and consider it an important tool for networking within the industry. The RICM terms of reference specify that meetings should be held four times per year; during 2014 and 2015, only three were conducted each year. The Department consulted with industry participants at the April 2016 RICM to determine the appropriate frequency of these forums and agreed to reduce the number of annual meetings to two, with options for extraordinary meetings and teleconferences. The terms of reference are to be updated accordingly.

Does the Department address gaps in regulatory capability?

The Department has not addressed a number of systemic issues that hamper its ability to implement a risk-based regulatory regime and provide assurance as to the effectiveness of passenger screening. The need to develop performance measures, analyse compliance data, implement an enforcement policy and provide adequate training has been identified in successive reviews, but solutions are yet to be delivered.

A number of specific activities have been initiated to improve regulatory capability but progress has been delayed, which may reduce the effectiveness of passenger screening regulation.

2.12 The Department has commissioned several reviews relating to various aspects of aviation security in recent years.9 These reviews made over 100 recommendations aimed at improving specific policies and processes. The reviews highlighted a number of gaps in regulatory capability relevant to passenger screening, where the need for improvement has been repeatedly identified but solutions are yet to be achieved, including:

  • implementation of an enforcement policy (see paragraphs 3.16 to 3.18);
  • development of performance measures (see paragraphs 4.1 to 4.5);
  • training and guidance for staff (see paragraphs 4.11 to 4.13); and
  • analysis of compliance trends based on accurate, reliable data (see paragraphs 4.26 to 4.29).

2.13 A number of specific activities that are relevant to effective passenger screening regulation have been initiated by the Department but not completed. The following case studies highlight three such activities where completion has been delayed, which may reduce the effectiveness of passenger screening regulation.

2.14 The Department has recently established the Transport Security Operations Reform branch, which is responsible for improving the Department’s regulatory capability. Even so, not all improvements fall within that branch’s responsibility, and a more integrated improvement approach is necessary.

Case study 1. Grandfathering arrangements for passenger screening equipment

The use of modern passenger screening equipment is an important risk mitigation tool in the aviation sector. In February 2011, the requirements for security screening equipment were revised and incorporated into relevant Aviation Screening Notices in January 2013. The notices allow for equipment that was purchased prior to February 2011 to continue to be used. This concession was intended to allow some time for all operators to update their equipment through normal commercial planning cycles, and was to be repealed at an unspecified time in the future. In December 2013, the Department informed the Minister that a proposal had been developed to end these ‘grandfathering’ provisions for passenger screening equipment by the end of 2015.

In October 2014, the Department decided that no further action would be taken regarding removal of the ‘grandfathering’ provisions, pending the outcomes of other work being done in the Aviation Security branch to amend the categorisation model. On 24 June 2016, after receiving the proposed report for this audit, the Department wrote to industry participants proposing the following timeline for removal of the ‘grandfathering’ provisions:

  • 1 July 2017, for explosive trace detection and walk-through metal detector equipment; and
  • 1 July 2018, for x-ray screening equipment.

At the time of publication, the relevant Aviation Screening Notices had not been amended.

Recommendation No. 1

2.15 That, independently of other projects being conducted, the Department sets a date at which the grandfathering provisions for the 2011 passenger screening equipment requirements will cease, and amends the Aviation Screening Notices accordingly.

The Department’s response: Agreed.

2.16 A schedule for the revoking of grandfathering provisions for security screening equipment has been developed. Under this proposal, use of explosive trace detection (ETD) and walk-through metal detector (WTMD) equipment that does not meet requirements outlined in Aviation Screening Notices (ASN) will be required to cease by 1 July 2017, and x-ray and checked baggage screening (CBS) equipment by 1 July 2018. This will allow industry the opportunity to manage any equipment transitions that are needed. The Department is consulting with the CEOs of all relevant airports to advise them of the proposed schedule and to request, by 1 July 2016, their comments and plans for equipment replacement.

2.17 In the interim, all passenger security screening equipment currently in use is from respected, major international manufacturers and must pass daily operational tests to ensure it is operating to specification.

Case study 2. The suspension of the system test for detection of firearms

System tests are a specialist covert activity conducted in busy public places where multiple law enforcement and intelligence agencies operate. The tests aim to simulate realistic security threats in order to test the effectiveness of the passenger screening system.

The Department has developed procedures for a number of system tests, including one involving a replica gun used to test the screening process and its ability to detect firearms in carry-on baggage. This test was suspended in March 2014 due to administrative issues relating to firearms licensing and work health and safety concerns. In January 2016 a draft risk mitigation plan identified a number of safety and operational risks associated with the conduct of this test. The plan indicated that additional risk controls and strengthened operating procedures could reduce the risk rating to an acceptable level. Nevertheless, a decision on the future of this test is yet to be made, two years after it was suspended.

The Department informed the ANAO that carry-on baggage screening is tested using other test pieces, including one that has the same metallic content as a small handgun. Given the intent of system testing to simulate realistic threat scenarios, the ANAO suggests that the issues which led to the suspension of testing using a replica gun be addressed and a schedule for reinstatement of the firearm test be implemented.

Case study 3. Inconsistent guidance material

Six out of nine reviews conducted between 2009 and 2015 made recommendations relating to training and guidance for Transport Security Inspectors. In 2013, the Department conducted a scan of guidance material and identified widespread duplication, conflicting information and information gaps among 295 internal-use guidance documents. Acknowledging the issues in the guidance material, the Department initiated a number of regulatory guidance material improvements during 2015, including:

  • the Document Governance Framework;
  • the Transport Security Operations Guidance Strategy;
  • the Transport Security Operations Guidance Documentation Status document;
  • an upgraded Regulatory Library; and
  • revision of key regulatory guidance documents.

However, the ANAO found numerous instances of undated, inconsistent documents in use in the Department. For example: six procedural documents relating to the application of risk to the compliance program were provided to the ANAO during the course of the audit.a All of the documents were undated and contained inconsistencies, primarily with regard to the number of applicable security mitigation categories in the compliance program. Some documents referred to eight categories and some referred to seven.b Some documents listed eight categories in one part of the document and seven in another.

The Department informed the ANAO that these documents were current, and that many were not centrally managed in the regulatory library, but held electronically in one regional office. This suggests that the systematic approach mandated in the Department’s Document Governance Framework is not always being followed. Inconsistencies in the guidance documentation for the application of security mitigation categories could impact the integrity of data inputs.c

Note a: The documents related to the application of security mitigation categories in the compliance program.

Note b: Those that referred to seven security mitigation categories did not include the category ‘screening and clearing’.

Note c: Inadequate guidance and training was identified as one factor in the flawed migration of data to the new information management system in 2014.

3. Monitoring Compliance

Areas examined

This chapter examines the Department’s arrangements for monitoring industry participants’ compliance with legislated requirements and departmental policies, and managing non-compliance.

Conclusion

The Department conducts a planned program of compliance activities based on the airport categorisation model. Past performance of individual industry participants has not been taken into account in the development of the compliance program, and compliance activities are not directed towards those industry participants that are at higher risk of non-compliance.

The Department does not have an enforcement policy and does not utilise the full suite of enforcement actions available to it. Voluntary compliance is promoted through education, counselling and the management of corrective action plans.

Area for improvement

There is one recommendation aimed at incorporating past performance into the compliance program to enable resources to be directed towards those industry participants with a higher risk of non-compliance.

Does the Department monitor compliance with regulated security requirements?

The Department conducts a planned program of compliance activities including audits, inspections and tests, which aim to assess the effectiveness of aviation security, including passenger screening. The compliance program is based on the airport categorisation model, which incorporates risk factors such as the maximum take-off weight of aircraft operating from the airport and the number of passengers departing the airport each year.

Compliance activities

3.1 The Department has established a plan for monitoring compliance with regulated requirements through the National Quality Control Programme (NQCP). The NQCP includes regular audits, inspections, tests and capacity building activities, which are conducted by transport security inspectors from the Department’s regional offices.

Table 3.1: Types of compliance activities

Audits
Purpose: to determine the extent to which an industry participant is compliant with Australia’s transport security legislation, and identify the root cause of non-compliances. Audit activities seek to provide assurance that the Transport Security Program (TSP) addresses regulatory requirements, and that the measures set out in the TSP have been implemented.
Audits consist of a set of activities designed to systematically examine the extent of compliance with transport security requirements. Examples of audit activities include: inspection of passenger screening equipment; assessment of training records; and verification of access controls.

Inspections
Purpose: to confirm that an industry participant remains compliant with transport security legislation.
Inspections are designed to critically examine a process, procedure or an aspect of an industry participant’s business to determine their compliance with Australia’s transport security legislation. For example, inspections of training records.

Equipment tests
Purpose: to ensure the accuracy and effectiveness of equipment used for the screening of people, vehicles or goods at airport screening points.
The Department mandates testing requirements for metal detectors and x-ray equipment located at all airport screening points. Equipment tests are conducted daily by screening staff at the opening of each screening point, and also by the Department’s transport security inspectors as required by the NQCP.

System tests
Purpose: to test the effectiveness of the passenger screening system.a
System tests simulate a realistic threat of unlawful interference with, or penetration of, a security system. They are conducted covertly (without the knowledge of the screening operator) and involve a departmental transport security inspector attempting to pass through a screening point while carrying a prohibited itemb in order to test whether the item is detected, identified and controlled. System tests return a single pass or fail result. They are a simple way to test how effective the passenger screening system is at detecting and controlling prohibited items.

Capacity building activities
Purpose: to enhance industry’s attitude to, awareness and understanding of, and conformity with, security requirements.
Capacity building activities can be conducted face-to-face, by telephone or by correspondence. For example, a telephone discussion with an industry participant about the content of their TSP.

Note a: The passenger screening system comprises the people, processes and procedures, and equipment used to detect and control prohibited items at the screening point.

Note b: In some cases, this may be a simulation of a prohibited item.

Source: ANAO analysis of the Department’s documents.

3.2 In 2014–15 the Department reported that it conducted 1216 compliance activities across a range of transport sectors. Of these, 905 activities (75 per cent) were conducted in the aviation sector. The majority of compliance activity was conducted at category 1 and 2 airports. Detailed analysis of these activities was not possible due to data integrity issues, which are discussed in Chapter 4.

3.3 Compliance activity findings are categorised as compliant, non-compliant or observation. Non-compliance represents a breach of transport security regulations and results in a non-compliance notice that is subject to a formal follow-up and acquittal process. Where non-compliance is identified, the extent of the non-compliance should be assessed and a corrective action plan developed, approved and monitored to ensure it is addressed. Observations represent potential security vulnerabilities and are provided to the industry participant as an observation notice that is not subject to formal follow-up arrangements.

3.4 Detailed guidance for managing corrective action plans is available, and discussions with transport security inspectors indicate sound knowledge of the process. However, as with most of the Department’s compliance activities, data on the number and status of non-compliances is difficult to extract from the information management system and the data is unreliable (see Chapter 4).

Does the Department apply a risk-based approach to compliance activity?

The compliance program does not incorporate the performance levels of individual airport operators or screening authorities, and non-compliance trends are not taken into account when prioritising compliance activities. The result is that non-compliance risk is not incorporated into the compliance program, and staff resources are not effectively deployed to areas that may require additional support or monitoring.

3.5 The NQCP team was established in 2014–15 and is responsible for developing a nationally consistent program of compliance activity in consultation with each regional office, and for reporting on compliance activities. The NQCP ‘Compliance Touch’ document details the minimum number and type of compliance activities required to be conducted at airports in each category annually.10

3.6 Additional NQCP activities are prioritised based on the airport categorisation model and the risk rating for each category (see Table 2.1). The process for determining the risk rating in the 2015–16 NQCP involves:

  • prioritising those airport categories with more security mitigation measures in place, such as those operating international flights, to determine a risk mitigation score;
  • applying ten adjustment score questions to each category to determine a quality control priority score; and
  • applying the risk mitigation score and the quality control priority scores to the airport categories to calculate a revised risk rating.

3.7 The compliance history of airport categories and individual industry participants is also relevant in determining compliance risk. Past performance information can be used to identify systemic issues in: individual operations; regional or seasonal factors; or specific security functions. This information may indicate areas where additional monitoring or support is required to improve performance.

3.8 The Department has stated that it uses compliance activity results to identify trends and areas of industry participant operations that may require additional compliance activity.11 However, the Department has limited information about the compliance history of individual industry participants. Likewise, the Department’s ability to identify systemic issues, understand seasonal variations or compare performance across industry is restricted by the absence of reliable data and analysis.

3.9 Experienced transport security inspectors, who have developed considerable knowledge of industry capability over time, can provide information about airport compliance based on their own learnings and by manually checking records of past compliance activity. However, without effective data collection, storage and retrieval to enable analysis of individual industry participant performance or the identification of systemic non-compliance, the Department is unable to assess the risk of non-compliance and apply resources where that risk is higher. Further, there is no feedback loop in the compliance cycle to indicate whether compliance is improving, or where additional monitoring may be required.

Recommendation No. 2

3.10 That the Department:

  1. establishes an analysis function to identify non-compliance trends based on accurate, reliable compliance activity data; and
  2. incorporates the results of the analysis into the compliance program, focusing on areas at risk of non-compliance.

The Department’s response: Agreed.

3.11 The Department is already working to improve its capacity to use compliance data. In late 2014 a dedicated data and analytical team was established in the Office of Transport Security (OTS) to build the capability needed to undertake data analysis. Recruitment for this team has largely been completed, training is underway and its capacity to analyse data to inform the Department’s compliance program is developing.

3.12 For 2016–17, the Department is also changing its compliance approach to better incorporate individual non-compliance risk into its compliance planning. The revised approach will be based on current risk and threat assessments and informed by past compliance findings. Using historical data to inform the compliance plan will be contingent on ongoing data remediation work and progress in building the regulatory management system reporting capacity.

Does the Department manage non-compliance with regulated security requirements?

The Department does not have an enforcement policy for managing non-compliance with regulated requirements. In practice, non-compliance is managed through the corrective action plan process, which involves the development, approval and monitoring of a plan to rectify non-compliance. However, there is no clear escalation process for serious or systemic non-compliance, and no guidance on the application of the various sanctions that are available to the Department. Clear guidance on available enforcement options that are proportionate to the risks, and guidance on their application, would assist in the management of non-compliance.

3.13 The Department has a range of responses available to manage non-compliance including:

  • education;
  • counselling;
  • cautions;
  • infringements;
  • enforceable undertakings;
  • powers to direct actions;
  • control directions;
  • injunctions;
  • cancellation of a TSP; and
  • prosecution.

3.14 In practice, most of these responses are not applied; the Department encourages voluntary compliance through education, engagement and compliance activities.

3.15 When non-compliance is identified, a corrective action plan, developed by the industry participant to address the non-compliance, is approved and monitored by the Department. This approach is appropriate for most instances of non-compliance, where the industry participant is willing and able to implement appropriate actions to address the non-compliance in a timely manner. Where this is not the case, the Transport Security Compliance Manual advises that the matter may be escalated as a serious non-compliance12 and should be referred to the National Investigations Unit for further action. However, the National Investigations Unit is no longer in operation and an alternative arrangement has not been documented.13

3.16 The Department has recognised the need to develop a clearly articulated enforcement policy, and provided an undated paper to the ANAO that details options for enforcement approaches, and states:

[The Office of Transport Security’s] preferred approach to enforcement is to facilitate compliance through education and awareness … there is a need to strengthen its enforcement capability to ensure [the Office of Transport Security] can identify non-compliance, apply sanction and deter future non-compliance.14

3.17 In response to a review in May 2013, the Department undertook to establish a compliance and enforcement policy. In November 2015 the Department informed the ANAO that the compliance and enforcement policy had been put on hold pending the development of a Regulatory Strategy. Work on the Regulatory Strategy has also been put on hold pending implementation of a new Transport Security Operations operating model.15 The operating model is scheduled for implementation in early 2017. This suggests that an enforcement policy will not be considered before 2017 at the earliest, four years after the need for a policy was identified.

3.18 The Department should develop a clear enforcement strategy, independent of other operational projects such as the Transport Security Operations operating model, which is communicated to relevant stakeholders, including industry participants. An appropriate range of enforcement options that are proportionate to the risks would assist in the effective management of non-compliance.

4. Measuring Performance

Areas examined

This chapter examines the Department’s ability to measure the effectiveness of passenger screening, analyse performance and report to stakeholders.

Conclusion

The Department does not have appropriate performance measures in place to determine whether security screening is effective. Analysis and reporting of compliance findings have been limited and do not enable assessment of the performance of individual industry participants or facilitate trend analysis. The Department is unable to provide assurance to the Parliament that passenger screening is effective or to what extent industry complies with the regulations related to domestic passenger security screening.

The Department could do more to understand the information requirements of different stakeholders, and in what format that information should be delivered. This knowledge could be used to inform the development of its reporting capability.

Areas for improvement

There are three recommendations aimed at: developing performance measures; delivering targeted reporting to stakeholders; and providing appropriate training for regulatory management system users.

Does the Department measure the effectiveness of passenger screening?

The Department has not established effective performance measures for passenger screening. The relevant key performance indicator in Programme 2.1 of the Department’s Portfolio Budget Statements relates to the number of compliance activities conducted by the Department and does not address the results of those activities. It does not include a quality measure that indicates whether aviation security meets a prescribed level of effectiveness. Similarly, the key performance indicators to be reported in 2015–16 as part of the Government’s Regulatory Performance Framework do not measure the effectiveness of aviation security generally, or passenger screening specifically.

Key performance indicators

4.1 The Department’s obligations for managing transport security are reflected in Programme 2.1–Transport Security in its Portfolio Budget Statements. The Programme 2.1 objective states that the aim is to ensure flexible and proportionate regulation that delivers measurable benefits. Programme 2.1 includes one deliverable, as shown in Table 4.1.

Table 4.1: Portfolio Budget Statements key performance indicator

Deliverable: Percentage of high risk cases subject to compliance activity within 12 monthsa
Target: 95% in each year from 2014–15 to 2018–19

Note a: Prior to 2015–16, to meet the key performance indicator, the compliance activity was defined as an audit, inspection or system test. From 2015–16, the compliance activity must be a full audit.

Source: Infrastructure and Regional Development Portfolio, Portfolio Budget Statement 2015–16, p. 41.

4.2 The Department reports that it has consistently met this performance indicator, with over 95 per cent of high risk industry participants being subject to a compliance activity every year. However, the performance indicator is not meaningful and is set below the Department’s usual level of activity. The indicator lacks sufficient detail to allow for an informed assessment of industry participants’ regulatory performance or to demonstrate changes over time. The deliverable is a quantitative measure and, in isolation, does not provide an effective means for Parliament to assess the quality of regulatory outcomes, only that a compliance activity has been undertaken.

4.3 The Department has established performance indicators, measures and evidence under the Government’s Regulatory Performance Framework. The framework includes five performance indicators (outlined in Appendix 3) against which the Department will report for the first time in 2016. Like the Portfolio Budget Statement’s performance indicator, the Regulatory Performance Framework indicators do not include a measure which, when reported against, would provide insight into the effectiveness of passenger screening.

4.4 In 2003, the ANAO recommended that the Department establish ‘… specific, practical, achievable and measurable performance requirements for aviation security …’.16 The Department agreed to consider the introduction of performance targets, noting that ‘development of a positive security culture within the aviation industry requires encouragement of a continuous improvement process through effective and comprehensive education and regulation’.17

4.5 The process of passenger screening, and its effectiveness at detecting and controlling prohibited items and weapons, is measurable through the system testing regime currently operated by the Department. Industry participants have indicated their support for the introduction of benchmarks for passenger screening to assist them to understand their performance in comparison with other operators, identify weaknesses and promote continuous improvement. Without performance measures, it is difficult to determine which screening authorities are performing passenger screening effectively and require less monitoring, and which are performing poorly and may require additional support to determine the cause of problems and implement corrective actions.

Recommendation No.3

4.6 That the Department, in consultation with stakeholders, develops performance measures for passenger screening that are practical, achievable and measurable.

The Department’s response: Agreed

4.7 The Department is consulting with stakeholders to develop evidence-based, measurable key performance indicators. This began in June 2016 with the systems test working group, which involves industry stakeholders and has been established to review and update the systems test regime. As part of its review of the regime, the working group will develop performance measures for some aspects of passenger screening. This work will be broadened once the systems testing regime has been updated.

Does the Department manage compliance data effectively?

The Department does not manage compliance data effectively. A new information management system, implemented in April 2014, does not meet the Department’s business requirements. The data is unreliable, there is inadequate reporting functionality, and training does not meet the needs of users. The Department does not have a robust system to collect, store and retrieve information about industry participants and their compliance with legislated security requirements. The project established to deliver the new information management system was closed in January 2016, and the Department reported that outstanding capability is to be addressed through other activities.

4.8 In 2009, the Review of Aviation Security Screening Report described the Department’s security screening compliance data as ‘insufficient either to provide assurance about effectiveness or to direct targeted support …’.18 At that time the Department’s management of passenger screening information was characterised by a range of disparate systems, with difficulties in maintaining data fidelity and extracting meaningful reports. In 2011, the Regulatory Capacity Building Program was established to address functional gaps, replace unsuitable systems and, through the introduction of a regulatory management solution, provide new tools to support staff.

4.9 Priority was given to replacing existing information systems with a new information management system, to be known as the Regulatory Management System (RMS). In April 2013, funding of $3.6 million was approved to contract the development of RMS and implementation was scheduled for July 2014. The proposed solution was expected to provide a single source of truth, allow for accurate, regular and repeatable reporting across a range of activities in the Office of Transport Security, and enable analysis of multiple data sets and their relationships.

4.10 RMS was first released in April 2014, when user acceptance testing revealed a number of unexpected system problems that delayed training for regional office staff. Since then, the Department has identified ongoing issues with training, data quality and a lack of reporting functionality. These issues persist and affect the Department’s ability to provide assurance that passenger screening is effective.

Regulatory Management System deficiencies

Training for Regulatory Management System users

4.11 RMS user training and guidance were to be delivered in 2013–14 as part of the project, and funds were allocated for this purpose.19 However, specific details about which staff members would receive what training were not recorded in project documentation. The Department has not delivered instructor-led training for RMS users, and informed the ANAO that, since the implementation of RMS in April 2014, all training has been delivered on an ad hoc basis by internal staff.

4.12 Several times between March and September 2015 the Department acknowledged, in internal reports, that training was not meeting user needs and that an instructor-led training course for all RMS users was to be delivered. In January 2016, the Department reported that training would be provided ‘after the [Office of Transport Security] reforms are completed.’

4.13 The use of consistent practices by trained system users is required to facilitate data quality and reporting. Inadequate training has resulted in a lack of user confidence and has contributed to data quality issues.

Recommendation No.4

4.14 That the Department conducts a training needs analysis for users of the regulatory management system, delivers appropriate training, and monitors its effectiveness.

The Department’s response: Agreed.

4.15 As part of the broader development of a learning and development framework for the Transport Security Operations Reform Program, a needs analysis will be undertaken to ensure staff are trained and supported to undertake all of their regulatory responsibilities. This will include an assessment of training requirements for the regulatory management system. Once the skills, knowledge and capabilities of staff have been assessed against their roles, a plan to address gaps will be developed and implemented. The learning and development framework will include a process for reviewing and maintaining skills.

4.16 While the needs analysis is undertaken and the framework is developed, training outreach is being provided to all users of the regulatory management system.

Data quality

4.17 The data in RMS is unreliable and a range of anomalies has been identified in all aspects of data held within the system. Data quality was identified by the Department as a risk in March 2014 and recognised as an issue in October 2014. The problems apply to the completeness and correctness of all aspects of RMS data, and stem from incomplete data migrated from legacy systems and inconsistent processes adopted by users who were not trained in the use of the system.

4.18 The Department commenced planning a data remediation exercise in March 2015 and has developed a detailed schedule of activities to be completed, including estimated durations and required resources. Completion is scheduled for October 2016. Schedule data provided by the Department indicates that the estimated duration of the first remediation activity was eight weeks; however, at the current rate of progress, this task will take considerably longer than estimated.

4.19 The Department informed the ANAO that many of the resources identified to conduct the remediation work are transport security inspectors. This will have a negative impact on the compliance program, reducing the number of compliance activities completed. The Department has not calculated the extent of the impact. Given the current rate of progress, it is likely to be significant.

Reporting capability

4.20 In March 2015, one year after RMS was implemented, the reporting functionality was found to be ‘not working to specification’.20 Specialist contractors have been engaged since June 2015 to analyse and develop detailed reporting requirements. The Department informed the ANAO that the contractors’ role is to: deliver a set of eight reports; propose and implement a final reporting solution; and support the data remediation project.

4.21 The eight reports were originally due for completion in September 2015. Each month since then, the Department has estimated that the reports will be completed in the following month. In April 2016 the Department informed the ANAO that four of the eight reports, described as complete in March, require ‘further adjustment and review’ and that one report is ‘not at all useful’. The Department has not developed a detailed schedule for the work required to complete the eight reports, but informed the ANAO that three contractors are expected to be retained until at least June, August and November 2016 respectively. As at 6 May 2016, the estimated cost for this work is around $977 000.

4.22 Two years after its implementation, RMS remains unable to produce standard reports such as industry participant contact details, regulatory approval status or compliance activity data.

Closure of the Regulatory Management System project

4.23 The Department informed the ANAO that the contractor responsible for developing and implementing RMS has met its contracted obligations, but the RMS project’s business requirements were not appropriately articulated in the contract specification. Consequently, the project has not delivered the expected benefits of providing: a single source of truth; accurate, regular and repeatable reporting; and analysis of multiple data sets and their relationships.

4.24 The RMS project was officially closed on 14 January 2016, with the budget of $3.6 million fully expended. However, the project has not delivered against its objectives, and significant capabilities such as reliable data, basic reporting and user training remain outstanding. The RMS End Project Report notes that reporting capability, data remediation and training are to be delivered outside the RMS project.

4.25 The lack of data quality and reporting functionality indicates that there is a large body of work to be completed before RMS can deliver any tangible benefits to the Department. As a regulator, the Department should be able to report accurate information about the entities it regulates. With the closure of the RMS project, the Department should ensure appropriate resources are allocated to deliver the required capability.

Does the Department analyse compliance data?

The Department has conducted limited analysis of compliance data. Poor data quality and lack of reporting capability in the current and previous systems have made analysis of data difficult. The reports produced from analysis activities reflect an amalgamation of available data, rather than a comprehensive analysis of compliance trends or systematic evaluation of compliance performance.

4.26 The Department has conducted some analysis of compliance data. For example, limited analysis of system test results has been conducted and the results included in internal reports on a monthly or quarterly basis. The analysis shows the number of system tests conducted and the failure rate for each airport category for the period of the report. However, it was not possible from the data provided to determine whether system test failures were increasing or decreasing or whether certain screening authorities were performing better than others.21
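To illustrate the kind of trend analysis that the available data did not support, the following sketch aggregates individual system test records into failure rates by airport category and quarter. It is a hypothetical, minimal example only: the use of Python with pandas and the field names (test_date, airport_category, screening_authority, passed) are assumptions for illustration and do not describe the Department’s systems or data.

```python
# Hypothetical sketch: aggregating system test records into failure-rate trends.
# Field names and data are illustrative only, not drawn from departmental systems.
import pandas as pd

# Example per-test records: one row per system test conducted.
tests = pd.DataFrame({
    "test_date": pd.to_datetime(
        ["2014-08-12", "2014-09-03", "2014-11-20", "2015-02-14", "2015-03-30", "2015-05-18"]),
    "airport_category": ["1", "1", "3", "1", "3", "3"],
    "screening_authority": ["A", "A", "B", "A", "B", "B"],
    "passed": [True, False, True, True, False, False],
})

# Group by quarter and airport category, then compute tests conducted and failure rate.
tests["quarter"] = tests["test_date"].dt.to_period("Q")
summary = (
    tests.groupby(["quarter", "airport_category"])
    .agg(tests_conducted=("passed", "size"),
         failure_rate=("passed", lambda s: 1 - s.mean()))
    .reset_index()
)
print(summary)

# The same grouping keyed on screening_authority would allow comparison of
# authorities' performance over time, which the paragraph above notes was not possible.
```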

4.27 The Department also produced six Regulatory Analysis Products (RAPs) in the period July 2014 to August 2015 that were intended to reflect analysis of aviation operations and identify trends. The reports were developed for internal use, with broad themes from the analysis discussed at industry forums. Two of the six reports provided a breakdown of compliance activity conducted at category 3 airports for a specified period and identified themes such as the rate of non-compliance in comparison with all other airport categories and the areas in which the non-compliance was found. The two reports were similar in presentation, allowing comparison of information, although statistics such as the percentage of non-compliance and failed system tests were contained in the text and difficult to locate. The remaining four RAPs related to category 1 airports. The content and format of these reports varied markedly, precluding any comparison of the information contained in them. No further RAPs have been produced.

4.28 In September 2015, the Department conducted a pilot study to assess historical compliance data, captured in the legacy information management systems, for use in developing security mitigation categories. It utilised compliance results from the 2013–14 financial year and categorised them according to the security mitigation categories by state.22 The analysis identified that the most common non-compliances during the period were in the categories of security management, screening and clearing, and access control. The Department reported a number of issues identified during this activity including poor data quality and inconsistent recording of non-compliances.

4.29 The Department’s ability to analyse compliance data has been hampered by unreliable data and system limitations. Collection of data relies largely on manual processes to extract and verify information. Verified data is then manually consolidated in spreadsheets and filtered to produce reports. Analysis conducted by the Department does not enable comparison of compliance trends over time and it is not possible to determine if industry compliance is at an acceptable level, or whether it is improving or declining.

Does the Department report on the effectiveness of passenger screening?

The Department does not report passenger screening compliance results to the Secretary, the Minister or industry participants. Compliance activity results have been reported to the Office of Transport Security executive on a monthly or quarterly basis. These reports provided compliance results for the relevant reporting period, but did not support comparison of the data over time or identify trends.

Compliance activity reporting to the Office of Transport Security executive

4.30 Reports on compliance activity to the Office of Transport Security executive included system test results and failure rates for passenger screening tests conducted during the relevant reporting period. The reports did not include historical data and the way information was categorised in different reports made it difficult to identify trends over time.

4.31 Monthly reports between July 2015 and November 2015 attributed each non-compliance and system test failure to a specific airport. Over time, this information would enable trend analysis by airport, airport category and region. However, a new format introduced in December 2015 groups all non-compliances and system test failures into security mitigation categories and airport categories. While this is a positive step in terms of identifying the type of non-compliance, the absence of airport identification compromises the Department’s ability to effectively identify trends or systemic issues.

Compliance activity reporting to the Secretary and the Minister

4.32 Reporting outside of the Office of Transport Security consists of: a Weekly Issues Report to the Secretary that includes planned compliance activity; and annual reporting to the Minister against the Portfolio Budget Statement key performance indicator. Information provided to the Secretary and the Minister is insufficient to provide assurance about the effectiveness of passenger screening.

Compliance activity reporting to industry participants

4.33 The Department reported system test results for category 1 airports during 2013–14 and 2014–15. The reports compared the results for individual category 1 airports with overall category 1 results. The resulting reports clearly indicated individual system test failures compared with national results, expressed as a percentage of the total tests conducted. This reporting ceased in March 2015 due to data quality issues that are yet to be resolved. Industry participants have indicated support for this kind of comparative analysis, which provides clear, outcomes-focused results.

Reporting requirements

4.34 Different stakeholders have different information requirements. Internally, the Department requires information about areas of non-compliance that would benefit from additional monitoring and support. This would allow the Department to direct its resources to areas of high risk, and reduce the burden of compliance activity for industry participants with a high rate of compliance.

4.35 Reporting to the Parliament enables transparency and accountability. It can provide information about the effectiveness of aviation security generally and passenger screening specifically. It can also enable assessment of security policy to ensure the level of regulation being applied is appropriate to provide effective security outcomes.

4.36 Industry requires practical analysis of non-compliance data to determine the cause and identify solutions to improve passenger screening performance. Better reporting of compliance results would help screening authorities manage passenger screening more effectively and encourage ongoing improvement.

4.37 However, an assessment of stakeholder information requirements and the resolution of data quality issues are required before meaningful, accurate reporting can be delivered.

Recommendation No.5

4.38 That the Department provides targeted reporting to its stakeholders, based on accurate data, which enables an assessment of the effectiveness of passenger screening, and promotes improved passenger screening effectiveness.

The Department’s response: Agreed.

4.39 Since the commissioning of the regulatory management system, the Department has continued to invest in the development of its reporting capability and to remediate its data holdings. It is anticipated that this work will begin to deliver the foundations of a reporting capability by the end of 2016. The Department’s intention remains to provide useful reports on passenger screening compliance trends and patterns to stakeholders once this capability is available.

Appendices

Appendix 1: Entity response

 

[Image: entity response letter (two pages)]

 

Appendix 2: Aviation Security Reviews

| Review | Year | Description of Review | Number of Recommendations | Cost |
|---|---|---|---|---|
| Review of Aviation Security Screening | 2009 | Examination of a wide range of aviation security screening issues including purpose, regulation and measuring performance. | 27 | N/A (a) |
| Screened Airport Vulnerability Assessment | 2010 | Conduct vulnerability assessments of perimeter security, access control and Front of House/mass gathering arrangements at eight regional screened airports. | 29 | N/A (a) |
| Management of Compliance Findings by the Office of Transport Security | 2011 | Review of the controls in place for assessing, monitoring, reporting on and closing compliance findings. | 4 (b) | $29 158 |
| Transport Security Operations, Review of Operations | 2013 | Review of Transport Security Operations’ operating structures and processes. | 20 | $42 446 |
| Regulatory and Compliance Activities | 2013 | Review of the regulatory and compliance activities undertaken by the Office of Transport Security. | 8 | $56 771 |
| Environmental Scan on Building Analytical Capability | 2014 | Identification of the current data and information sources, along with data gaps and reporting requirements. | 19 | N/A (a) |
| National Compliance Program | 2014 | Review of the Transport Security Operations’ compliance program planning and processes. | 6 | $49 530 |
| Data and Analytics Capability | 2015 | Identification of a blueprint to guide the use of data across the Office of Transport Security. | 18 (c) | $79 207 |
| Regulatory Approvals – Efficiency and Effectiveness Review | 2015 | Review concentrated on reforming the management and resourcing around Transport Security Programme approvals. | 6 | $15 136 |

Note a: These reviews were conducted internally or by other Government entities at no cost to the Department.

Note b: The review report described these findings as ‘focus areas’.

Note c: The review report referred to ‘improvement opportunities’ and listed 4 high priority, 4 medium priority, and 10 low priority improvements.

Source: Analysis of documentation provided by the Department.

Appendix 3: Regulatory Performance Framework Key Performance Indicators, Measures and Evidence

The framework table sets out the Program 2.1 deliverables, the key performance indicators, performance measures (examples of some measures to be applied) and examples of evidence. The deliverables are listed first, followed by each key performance indicator with its associated measures and examples of evidence.

Deliverables (Program 2.1):

  • Collaborating with industry to improve the efficiency and effectiveness of the transport security frameworks.
  • Working closely with industry and government partners to identify emerging and new risks and ensure that strategies are developed to mitigate them.
  • Targeting our capacity building activities to highest risks offshore.
  • Collaborating with international and industry partners to deliver effective programmes that sustainably strengthen security in the region.
  • Efficient and lawful administration of approval processes that meet statutory timeframes.
  • Working closely with the intelligence community and other stakeholders to ensure information is accessible, and shared, about threat and risk.
  • To partner with industry and government, in multilateral fora and bilaterally, to influence and inform the development of policy and standards.
  • Facilitating business-government partnerships through provision of secretariat support to the Maritime and Surface sub-groups of the Transport Sector Group within the Australian Government’s Trusted Information Sharing Network for Critical Infrastructure Resilience.

1. Action is taken by the Government to mitigate against new or emerging risks, where they are identified.

  • Conduct robust security risk assessments to inform OTS policy responses and compliance activities.
  • Monitoring compliance by industry partners with regulatory obligations and on a risk basis.
  • Strong connections with key intelligence agencies and good information sharing processes.
  • OTS Risk Framework and methodology.
  • OTS Security Risk Course developed and delivered to all staff.
  • Risk assessments are conducted as part of the policy development cycle to fully understand any risks when reviewing regulations and policies.
  • Outcomes of compliance activities.
  • Products about issues and threats are issued to inform industry views and risk assessments.

2. Collaboration with industry ensures that policy and regulatory frameworks are efficient and effective.

  • New, and revisions to, policies and regulations are developed in consultation with industry and government partners.
  • Regular industry forums are held and include consultation on policy and regulatory initiatives and issues.
  • Regulatory issues are systematically identified, reviewed and resolved, with timely advice provided to staff and industry.
  • Continuous improvement ensures that regulatory administration is fit- for-purpose, efficient and supports accountability.
  • Findings from QA audits, legal advice and other investigations are incorporated into operational guidance and advice to industry (consistent with the regulatory governance framework).
  • Regular industry consultative forums are used to collaborate with industry on new policy and regulation.
  • Ad hoc workshops to work through specific policy and regulatory issues are held with industry.
  • Industry representatives participate in ICAO Av Sec Panel to understand international frameworks.
  • Guidance is issued to staff and industry as regulatory issues are resolved and/or clarified.
  • Implementation of 2015/16 quality assurance audit program.

3. International and domestic engagement influences and shapes policy and future developments.

  • Industry is consulted on new, and revisions to, policy on both international and domestic issues.
  • Effective coordination and participation with international partners on security issues at multilateral meetings and bilaterally.
  • Development and delivery of the annual Last Ports of Call (LPOC) Program.
  • Development and delivery of the annual Capacity Building Program, focuses on areas of higher risk to Australia.
  • OTS participation in major international meetings such as IMO, ICAO AvSec Panel and working groups and QUAD. Positions and joint work agreed on key issues.
  • OTS forums ASAF, RICM, MISCF, OGSF and various working groups provide opportunity for industry to advise on policy issues and future trends of concern.
  • LPOC program effectively covers Australia’s high risk LPOCs and ensures reasonable understanding of all LPOC risks.
  • Capacity building program closely mirrors highest LPOC and other risks to Australia offshore.
  • OTS International Engagement strategy.

4. Communication with regulated entities is clear, targeted and effective.

  • Regulatory decisions are made in accordance with the principles of administrative law and consistent with the powers provided under the transport security legislation.
  • Regulatory advice is consistent, transparent and promotes accountability.
  • Decisions and associated advice are provided in a timely manner and clearly articulate reasons for decisions.
  • Industry is well informed in advance of proposed changes to transport security obligations.
  • All major changes to transport security obligations are implemented in accordance with well-developed communications strategies that are clear about:
      • changes to obligations for industry participants;
      • timings of those changes; and
      • information about changes to monitoring activities.
  • The Decision Making Framework outlines the principles of decision-making founded in administrative law, clarifies roles and responsibilities for decision makers, and provides advice on constructing statements of reasons.
  • Refusals under the transport security legislation are accompanied by a statement of reasons and advice on review and appeal mechanisms.
  • Consultative forums are used to raise issues early and consult on approaches where obligations are likely to change.

5. Compliance and monitoring approaches are streamlined and co-ordinated.

  • Compliance monitoring is based on risk and takes into account the operational requirements of the regulated entities.
  • Continuous improvement ensures that regulatory administration is fit- for-purpose, efficient and supports accountability.
  • Enforcement is undertaken in conjunction with partner agencies.
  • Clear communication about OTS’ regulatory approach that supports development and enforcement of clear, consistent transport security regulation and compliance activities.
  • Compliance and enforcement policies and operational arrangements are consistent with the agreed published OTS regulatory approach.
  • Infringement Notice campaigns to be conducted jointly with the AFP (airports) and Customs (ports).

Source: Department of Infrastructure and Regional Development, Office of Transport Security Key Performance Indicators, Measures and Evidence, DIRD internet site, available from <https://infrastructure.gov.au/department/deregulation/files/Office_of_Transport_Security_RPF_Metrics.pdf>, accessed 10 May 2016.

Footnotes

1 An airport operator’s Transport Security Program must cover all airport security related activities. For example: security risk context; airport security procedures; quality control procedures; physical security and access controls; and screening and clearing.

2 Department of Infrastructure and Regional Development, Aviation Security Screening Manual: Guidance for Screening Authorities, April 2012, p. 1.

3 Security controlled airport categories and risk ratings are listed in table 2.1.

4 The original reporting date of 26 April 2015 was extended to 19 May 2016.

5 Security controlled airports are defined at paragraph 2.6.

6 The model is based on aircraft maximum take-off weight and number of passengers transiting the airport. Maximum take-off weight is the maximum gross weight (including cargo and passengers) that the manufacturer of the aircraft, or a person authorised by the Civil Aviation Safety Authority, certifies for structural safety or control of the aircraft at take-off.

7 Inherent risk refers to the likelihood that an airport will be targeted for an act of unlawful interference and the probable consequences of a successful attack.

8 The establishment of the ASAF allows Australia to meet the requirements set out by the International Civil Aviation Organization to establish a forum to perform the function of a National Aviation Security Committee.

9 Appendix 2 lists reviews relevant to passenger screening since 2009.

10 For example, all category 1 to 5 airports are subject to at least an annual audit and a specified number of other compliance activities.

11 Department of Infrastructure and Regional Development, Australian Aviation Security Quality Control Manual, April 2014, p. 26.

12 The Transport Security Compliance Manual defines serious non-compliance as a breach of Australia’s transport security legislation where a failure of cooperative approaches to remedy indicates a blatant, serious or systemic failure, or may be a more routine matter that cannot be resolved through a collaborative and cooperative approach.

13 The Department advised the ANAO that it is able to outsource investigations where serious non-compliance is identified but that no investigations have been outsourced.

14 Department of Infrastructure and Regional Development, Draft Options Paper – approaches to Enforcement Capability – Office of Transport Security, undated, p. 2.

15 Department of Infrastructure and Regional Development, Office of Transport Security 2015-16 Business Plan Review 1, Quarter 1, 2015-16, p. 3.

16 ANAO Audit Report No.26 2002-03, Aviation Security in Australia, p. 15.

17 ibid.

18 Department of Infrastructure, Transport, Regional Development and Local Government, Review of Aviation Security Screening: Report, April 2009, p. 33.

19 The Department’s Capital Investment Proposal allocated $90 000 for training.

20 Department of Infrastructure and Regional Development, Office of Transport Security, Regulatory Capacity Building Programme Status Report, 6 March 2015.

21 In February 2016, the NQCP Activity Summary Report 30 January – 26 February 2016 included year-to-date results for non-compliances, observations and system test results for the first time.

22 The Department informed the ANAO that 2013–14 data from a legacy system was used due to integrity issues with data currently held in RMS.