Audit snapshot

Why did we do this audit?

  • Auditor-General Report No.1 2018–19 found that while the Department of Social Services (DSS) largely established appropriate arrangements to implement the Cashless Debit Card (CDC) Trial, its approach to monitoring and evaluation was inadequate.
  • The 2018–19 audit made six recommendations related to risk management, procurement, contract management, performance monitoring, post-implementation review, cost–benefit analysis and evaluation.
  • This audit examined the effectiveness of DSS’ administration of the CDC program, including implementation of the previous audit’s recommendations.

Key facts

  • The CDC Trial commenced in 2016. It is operating in Ceduna, East Kimberley, Goldfields, Bundaberg, Hervey Bay, Cape York and the Northern Territory.
  • The objective of the CDC Trial is to encourage socially responsible behaviour by restricting cash available to participants to spend on alcohol, drugs or gambling.

What did we find?

  • DSS’ administrative oversight of the CDC program is largely effective; however, DSS has not demonstrated that the CDC program is meeting its intended objectives.
  • DSS has implemented the recommendations from Auditor-General Report No.1 2018–19 relating to risk management, procurement and contract management.
  • DSS has partly implemented the recommendations relating to performance monitoring.
  • DSS has not effectively implemented the recommendations relating to cost–benefit analysis, post-implementation reviews and evaluation.

What did we recommend?

  • There were two recommendations to DSS. The recommendations relate to the development of internal performance monitoring, and the conduct of an external review of the evaluation.
  • DSS agreed to the first recommendation and disagreed with the second recommendation.

  • 16,685: active CDC participants as at February 2022
  • $36.5 million: cost of the CDC program in 2020–21
  • 591: CDC participants who had exited the program as at February 2022

Summary and recommendations

Background

1. The Australian Government introduced the Cashless Debit Card (CDC) Trial, later known as the CDC program, in 2016.

2. Under the CDC program, a portion of a participant’s income support payment is allocated to a restricted bank account, accessed by a debit card (the CDC). The CDC does not allow cash withdrawals or the purchase of alcohol, gambling or cash-like products. The objective of the CDC program is to assist people receiving income support to better manage their finances and encourage socially responsible behaviour.

3. The CDC is enabled under the Social Security (Administration) Act 1999. The Department of Social Services (DSS) and Services Australia are responsible for the CDC program. There are two card providers: Indue Ltd (Indue) and the Traditional Credit Union (TCU).

4. The CDC program has been implemented in Ceduna in South Australia; the East Kimberley region in Western Australia; the Goldfields region in Western Australia; the Bundaberg, Hervey Bay, and Cape York regions in Queensland; and the Northern Territory.

Rationale for undertaking the audit

5. Auditor-General Report No.1 2018–19 The Implementation and Performance of the Cashless Debit Card Trial found that while DSS largely established appropriate arrangements to implement the CDC Trial, its approach to monitoring and evaluation was inadequate. It was therefore difficult to conclude whether the CDC Trial was effective in achieving its objective of reducing social harm and whether the card was a lower-cost welfare quarantining approach compared to other components of Income Management such as the BasicsCard.

6. The report made six recommendations relating to risk management, procurement, contract management, performance monitoring, cost–benefit analysis, post-implementation reviews and evaluation.

7. This follow-on audit provides the Parliament with assurance as to whether:

  • DSS has addressed the agreed 2018–19 Auditor-General recommendations;
  • DSS’ management of the extended CDC program is effective; and
  • the extended CDC program was suitably informed by a second impact evaluation of the CDC Trial.

Audit objective and criteria

8. The objective of the audit was to examine the effectiveness of DSS’ administration of the Cashless Debit Card program, including implementation of the recommendations made in Auditor-General Report No.1 2018–19, The Implementation and Performance of the Cashless Debit Card Trial.

9. To form a conclusion against the audit objective, the following criteria were applied:

  • Do DSS and Services Australia have effective risk management, procurement and contract management processes in place for the CDC program?
  • Has DSS implemented effective performance measurement and monitoring processes for the CDC program?
  • Was the expansion of the CDC program informed by findings and lessons learned from an effective evaluation, cost–benefit analysis and post-implementation review of the CDC Trial?

Conclusion

10. DSS’ administrative oversight of the CDC program is largely effective; however, DSS has not demonstrated that the CDC program is meeting its intended objectives. DSS implemented the recommendations from Auditor-General Report No.1 2018–19 relating to risk management, procurement and contract management, partly implemented the recommendations relating to performance monitoring, and did not effectively implement the recommendations relating to cost–benefit analysis, post-implementation review and evaluation.

11. DSS and Services Australia have effective risk management processes in place for the CDC program, although DSS has not yet developed a risk-based compliance framework. DSS’ limited tender procurement processes were undertaken in accordance with the Commonwealth Procurement Rules; however, DSS’ due diligence over its procurement of the Traditional Credit Union could have been more thorough. Contract management arrangements with the card providers are effective. A service level agreement between DSS and Services Australia was finalised in April 2022. Recommendations from Auditor-General Report No.1 2018–19 relating to risk management, procurement and contract management were implemented.

12. Internal performance measurement and monitoring processes for the CDC program are not effective. Monitoring data exists, but it is not used to provide a clear view of program performance because performance measures are limited and there are no targets. DSS established external performance measures for the CDC program. These were found to be related to DSS’ purpose and key activities, but one performance indicator was not fully measurable. External public performance reporting was accurate. Recommendations from Auditor-General Report No.1 2018–19 relating to performance measurement and monitoring were partly implemented.

13. The CDC program extension and expansion was not informed by an effective second impact evaluation, cost–benefit analysis or post-implementation review. Although DSS evaluated the CDC Trial, a second impact evaluation was delivered late in the implementation of the CDC program, had similar methodological limitations to the first impact evaluation and was not independently reviewed. A cost–benefit analysis and post-implementation review on the CDC program were undertaken but not used. The recommendations from Auditor-General Report No.1 2018–19 relating to evaluation, cost–benefit analysis and post-implementation review were not effectively implemented.

Supporting findings

Risk management, procurement and contract management

14. There are fit-for-purpose risk management approaches in place in DSS and Services Australia for the CDC program, although there is no risk-based compliance strategy. Risks are identified and treatments are established. DSS’ CDC risk management processes are aligned with the DSS enterprise risk framework. Services Australia has appropriate risk documentation and processes in place. Documentation of shared risk between DSS and Services Australia is developing. (Paragraphs 2.4 to 2.37)

15. The limited tender procurements for the extension and expansion of CDC services and for an additional card issuer in the Northern Territory were undertaken in a manner that is consistent with the Commonwealth Procurement Rules. Procurement processes, procurement decisions and conflict of interest declarations were documented. The conduct of procurements through limited tender was justified in reference to appropriate provisions within the Commonwealth Procurement Rules. The procurement of Indue for expanded card services was largely effective. A value for money assessment for one of the Indue procurements was not fully completed. The procurement of a second card provider (TCU) was meant to be informed by a scoping study. The scoping study, which was contracted to TCU, did not fully inform the subsequent limited tender procurement. There was limited due diligence into TCU’s ability to deliver the services. A value for money assessment was conducted during contract negotiations. (Paragraphs 2.38 to 2.67)

16. There are appropriate contract management and service delivery oversight arrangements in place for the CDC program. Effective contract management plans are in place for the contracts with the card providers. Documentation supporting the monitoring of Indue service delivery risk could be more regularly reviewed. In April 2022, DSS and Services Australia finalised a CDC service level agreement. This was established late in the relationship, which commenced in 2016. (Paragraphs 2.68 to 2.90)

Performance measurement and reporting

17. Reports of internal performance measures are not produced as required under the CDC data monitoring strategy. There are a number of data reports provided to and considered by DSS on a regular basis. These reports include some performance measures and no performance targets, and provide limited insight into program performance or impact. DSS has not implemented the recommendation from Auditor-General Report No.1 2018–19 that it fully utilise all available data to measure performance. (Paragraphs 3.5 to 3.19)

18. In 2020–21, DSS developed two external performance indicators for the CDC program. An ANAO audit of DSS’ 2020–21 Annual Performance Statement found that the two indicators were directly related to a key activity (the CDC program), that one of the performance indicators was measurable, and that part of the second indicator was not measurable because it was not verifiable and was at risk of bias. A minor finding was raised. DSS reports annually against the two CDC performance measures. (Paragraphs 3.22 to 3.32)

Evaluation, cost–benefit analysis and post-implementation review

19. DSS’ management of the second impact evaluation of the CDC Trial was ineffective. Results from a second impact evaluation were delivered 18 months after the original agreed timeframe and there is limited evidence the evaluation informed policy development. The commissioned design of the second impact evaluation did not require the evaluators to address the methodological limitations that had been identified in the first impact evaluation. DSS did not undertake a legislated review of the evaluation. (Paragraphs 4.7 to 4.49)

20. A cost–benefit analysis and post-implementation review were undertaken on the CDC. Due to significant delays and methodological limitations, this work has not clearly informed the extension of the CDC or its expansion to other regions. (Paragraphs 4.50 to 4.73)

Recommendations

Recommendation no. 1

Paragraph 3.20

Department of Social Services develops internal performance measures and targets to better monitor CDC program implementation and impact.

Department of Social Services’ response: Agreed.

Recommendation no. 2

Paragraph 4.39

Department of Social Services undertakes an external review of the second impact evaluation of the CDC.

Department of Social Services’ response: Disagreed.

Summary of entity responses

Department of Social Services

The Department of Social Services (the department) acknowledges the insights and opportunities for improvement outlined in the Australian National Audit Office (ANAO) report on Implementation and performance of the Cashless Debit Card (CDC) Trial — Follow-on.

We acknowledge the ANAO’s overall conclusion that the department’s administrative oversight of the program is largely effective. The department accepts the conclusion relating to internal performance measures. We acknowledge the rationale and supporting evidence that the second impact evaluation and the cost-benefit analysis were constrained by limitations to available data.

The department accepts Recommendation 1 and acknowledges the suggested opportunities for improvement. The department has taken steps to address these and actions are either underway or already complete. This will strengthen the department’s oversight of the operation and effectiveness of the CDC program.

The department does not agree with Recommendation 2. Limitations of the second impact evaluation are openly acknowledged. An external review will not generate additional evidence or insights and would only reiterate data availability and accessibility constraints. This would not constitute value for money to the taxpayer.

The department is supportive of the independent review process and commits to undertaking a review of any future evaluations.

ANAO comment on Department of Social Services response

21. In relation to Recommendation 2, the Social Services Legislation Amendment (Cashless Debit Card Trial Expansion) Act 2018 included a requirement that any review or evaluation of the CDC Trial must be reviewed by an independent expert within six months of the Minister receiving the final report (refer note ‘c’ to Table 1.1). Contrary to the legislative requirement, the evaluation of the CDC program was never reviewed by an independent expert. Although the Department of Social Services (DSS) describes the 2021 evaluation as having limitations, the results were used by DSS to conclude that it met one of two externally reported CDC program performance measures: ‘Extent to which the CDC supports a reduction in social harm in communities’ (refer paragraph 3.30). Parliament’s clear intent in making the legislative amendment was to obtain independent assurance that the cashless welfare arrangements are effective, in order to inform expansion of the arrangements beyond the trial areas. A review of the evaluation methodology would, moreover, help ensure that the design of future evaluation work is fit for purpose and represents an appropriate use of public resources.

Services Australia

Services Australia (the agency) welcomes this report and notes that there are no recommendations directed at the agency. Recognising that the audit concluded that the agency had effective risk management processes in place for the Cashless Debit Card (CDC) program, we will ensure that the improvement opportunity identified for risk treatments to be reviewed regularly is incorporated into these processes.

In respect to the service level agreement between the agency and the Department of Social Services, this was finalised and acknowledged by the ANAO during the report comment period in April 2022. We will take into consideration the broader audit findings, and incorporate any lessons where appropriate as we work with the Department of Social Services to implement the CDC program policies and deliver services.

Key messages from this audit for all Australian Government entities

Below is a summary of key messages, including instances of good practice, identified in this audit that may be relevant to the operations of other Australian Government entities.

Policy/program design

  • An evaluation strategy should be developed during the design and implementation of a program, including the identification of effective baseline data. Undertaking effective evaluations will build a base to inform future policy changes and provide evidence as to whether the program is meeting its intended objectives.

Performance and impact measurement

  • If data is to be used to monitor program performance, it needs to have corresponding performance measures and targets applied to ensure that it is meaningful and can provide a clear view of the program’s progress in achieving its stated objectives.

Policy/program implementation

  • When the service delivery of a program is provided through another Australian Government entity, there are dual accountabilities. Ensuring there is an effective service agreement in place can address shared risks, clarify roles and responsibilities, and establish clear service delivery expectations through service level standards.

1. Background

Introduction

1.1 The Australian Government introduced the Cashless Debit Card (CDC) Trial, later known as the CDC program, in 2016.

1.2 Under the CDC program, a portion of a participant’s income support payment is allocated to a restricted bank account, accessed by a debit card (the CDC). The CDC does not allow cash withdrawals or the purchase of alcohol, gambling or cash-like products. The objective of the CDC program is to assist people receiving income support to better manage their finances and encourage socially responsible behaviour.

1.3 The Department of Social Services (DSS) and Services Australia are responsible for the CDC program. Service delivery functions were originally delivered by DSS and transitioned to Services Australia from March 2020.1 There are two card providers: Indue Ltd (Indue) and the Traditional Credit Union (TCU). Other organisations support the CDC program including local partners and Australia Post outlets (contracted by Indue), and community panels.2 The roles and responsibilities of DSS, Services Australia, Indue, TCU and local partners are set out in Appendix 3.

1.4 In 2020–21 the cost of the CDC program was $36.5 million.

1.5 A number of supporting services and programs have been introduced and funded by the Australian Government as part of the CDC program to provide direct assistance to CDC participants and communities. These include employment programs and training; mobile outreach services for vulnerable people; child, parent and youth support services; family safety programs to reduce violence against women and children; alcohol and drug services; allied health and mental health programs; and assistance to manage and build personal financial capability. These programs are delivered by non-government organisations in the CDC regions. Funding for these programs totalled $22.1 million across 2020–21 and 2021–22.

Cashless Debit Card enabling legislation

1.6 There have been several amendments to the Social Security (Administration) Act 1999 (the SSA Act) to establish, expand and extend the CDC Trial and program (Table 1.1).

Table 1.1: CDC legislative changes, 2015 to 2020

Social Security Legislation Amendment (Debit Card Trial) Act 2015 (passed 12 November 2015)

  • Enabled a trial of the CDC beginning 1 February 2016 and ending 30 June 2018 in up to three locations specified by the Minister as ‘trial areas’, involving up to 10,000 participants

Social Security Legislation Amendment (Cashless Debit Card) Act 2018 (passed 20 February 2018)

  • Extended the CDC Trial to 30 June 2019
  • Removed the restriction to three CDC Trial areas
  • Specified the Goldfields region in Western Australia as the third Trial region

Social Services Legislation Amendment (Cashless Debit Card Trial Expansion) Act 2018 (passed 21 September 2018)

  • Added the Bundaberg and Hervey Bay Trial regions, with a Trial end date of 30 June 2020
  • Increased the cap on participant numbers to 15,000
  • Added cash-like products to the list of restricted goods
  • Moved participant criteria from legislative instrument to primary legislation (note a)
  • Moved Ministerial power to authorise a community body and determine a trial area into notifiable instruments (note b)
  • Included an amendment to require an evaluation of any review of the CDC Trial (note c)

Social Security (Administration) Amendment (Income Management and Cashless Welfare) Act 2019 (passed 5 April 2019)

  • Extended the end date of the CDC Trial in the original three regions to 30 June 2020
  • Included a new exit pathway for participants who could demonstrate reasonable and responsible management of financial affairs

Social Security (Administration) Amendment (Cashless Welfare) Act 2019 (passed 12 August 2019)

  • Allowed the Secretary of DSS to be the decision-maker for all exit applications (note d)
  • Broadened the criteria for the exit pathway to include the person’s ability to manage their financial affairs

Social Security (Administration) Amendment (Continuation of Cashless Welfare) Act 2020 (passed 17 December 2020)

  • Established the CDC as a program
  • Included a clause causing the CDC program to cease on 31 December 2022 unless further legislation is passed to extend the program
  • Amended the objectives (refer Table 1.2)
  • Provided for participants in Income Management (note e) in the Northern Territory and Cape York to transition to the CDC program voluntarily

Note a: Legislative instruments are laws on matters of detail made by a person or body authorised to do so by the relevant enabling (or primary) legislation.

Note b: Notifiable instruments differ from legislative instruments in that they are not subject to parliamentary scrutiny or automatic repeal (sun-setting).

Note c: The amendment was that ‘(1) If the Minister or the Secretary causes a review of the trial of the cashless welfare arrangements mentioned in section 124PF to be conducted, the Minister must cause the review to be evaluated. (2) The evaluation must: (a) be completed within 6 months from the time the Minister receives the review report; and (b) be conducted by an independent evaluation expert with significant expertise in the social and economic aspects of welfare policy. (3) The independent expert must: (a) consult trial participants; and (b) make recommendations as to (i) whether cashless welfare arrangements are effective; and (ii) whether such arrangements should be implemented outside of a trial area. (4) The Minister must cause a written report about the evaluation to be prepared. (5) The Minister must cause a copy of the report to be laid before each House of Parliament within 15 days after the completion of the report.’

Note d: Participants can apply to exit the CDC program if they meet the eligibility criteria and provide all necessary documentary evidence.

Note e: Income Management, or welfare quarantining, is designed to assist income support recipients to budget their income support payments to ensure they have the basic essentials, such as food, education, housing and electricity. Income Management was introduced in 2007 and is in place in 14 regions across Australia.

Source: ANAO analysis of relevant legislation.

1.7 In the December 2020 legislative amendment, the objectives of the CDC were changed as set out in Table 1.2.

Table 1.2: Changes to the CDC program objectives

SSA Act — November 2015

The objects of this Part are to trial cashless welfare arrangements so as to:

  1. reduce the amount of certain restrictable payments available to be spent on alcoholic beverages, gambling and illegal drugs; and
  2. determine whether such a reduction decreases violence or harm in trial areas; and
  3. determine whether such arrangements are more effective when community bodies are involved; and
  4. encourage socially responsible behaviour.

SSA Act — December 2020

The objects of this Part are to administer cashless welfare arrangements so as to:

  1. reduce the amount of certain restrictable payments available to be spent on alcoholic beverages, gambling and illegal drugs; and
  2. support program participants and voluntary participants with their budgeting strategies; and
  3. encourage socially responsible behaviour.

Source: SSA Act, section 124PC.

Location, scope and size of the Cashless Debit Card Trial and program

1.8 The CDC Trial commenced in Ceduna in South Australia in March 2016 and the East Kimberley region in Western Australia in April 2016. The CDC Trial was expanded to: the Goldfields region in Western Australia in March 2018; the Bundaberg and Hervey Bay region in Queensland in January 2019; and the Cape York region in Queensland and the Northern Territory in March 2021 (Figure 1.1).

Figure 1.1: CDC locations

[Image: Map of Australia showing the locations where the CDC is in place.]

Source: ANAO analysis of CDC eligibility.

1.9 Table 1.3 provides an overview of the CDC regions, date of commencement, eligible participants and split of income support payments on the CDC.

Table 1.3: CDC program eligibility and conditions

Ceduna, South Australia (commenced March 2016); East Kimberley, Western Australia (commenced April 2016); Goldfields, Western Australia (commenced March 2018)

  • Eligible participants: all people who receive a ‘trigger’ payment (note a) and are not at pension age.
  • Split of income support payments: participants receive 20 per cent of their welfare payment in their regular bank account, which can be spent as they wish and can be accessed as cash, and 80 per cent of their welfare payment on to the CDC.

Bundaberg and Hervey Bay, Queensland (commenced December 2018)

  • Eligible participants: people aged under 36 years who receive JobSeeker Payment, Youth Allowance (JobSeeker), Parenting Payment (Partnered) and Parenting Payment (Single).
  • Split of income support payments: as for the regions above, 20 per cent of the welfare payment is paid into the regular bank account and 80 per cent on to the CDC.

Cape York, Queensland (commenced March 2021)

  • Eligible participants: people who have been referred to the Queensland Family Responsibilities Commission (note b).
  • Split of income support payments: the Queensland Family Responsibilities Commission determines the CDC payment split.

Northern Territory (commenced March 2021)

  • Eligible participants: existing Income Management (BasicsCard) (note c) participants in the Northern Territory who volunteer to transition to the CDC.
  • Split of income support payments: participants receive the same payment split they received under Income Management. In most cases, they receive 50 per cent of their welfare payment in their regular bank account and 50 per cent of their welfare payment on to the CDC. For participants who have been referred under the Child Protection Measure or by a recognised authority of the Northern Territory (the Banned Drinkers Register), 70 per cent is placed on to the CDC.

Note a: Subsection 124PD(1) of the SSA Act defines a trigger payment as a social security benefit or a social security pension of the following kinds: Carer Payment, Disability Support Pension, Parenting Payment (Single), or ABSTUDY that includes an amount identified as living allowance.

Note b: The Queensland Family Responsibilities Commission is an independent statutory body that aims to support welfare reform, and community members and their families; restore socially responsible standards of behaviour; and establish local authority.

Note c: The BasicsCard was introduced in 2007 to support the Australian Government’s Income Management initiative. It can only be used at approved merchants. It differs from the CDC in that it cannot be used to purchase cigarettes or pornographic material, in addition to the common restrictions on alcohol purchase, gambling and cash withdrawals.

Source: ANAO analysis of CDC eligibility and DSS advice to the ANAO.

1.10 In addition to the payment splits shown in Table 1.3, 100 per cent of any lump sum payments, for example the Newborn Upfront Payment3, are paid on to the participant’s CDC.
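
The payment splits set out in Table 1.3 and the lump sum rule in paragraph 1.10 can be illustrated with a short sketch. This is a minimal illustration only, assuming a simple percentage split; the function and parameter names are hypothetical and do not reflect DSS or card provider systems.

```python
# Minimal sketch of the payment splits described in Table 1.3 and paragraph 1.10.
# Hypothetical names; not DSS or card provider code.

def split_payment(amount, cdc_share=0.8, lump_sum=False):
    """Return the (CDC account, regular bank account) portions of a payment.

    cdc_share is 0.8 in the original trial regions, typically 0.5 in the
    Northern Territory, or 0.7 for referred participants. Lump sum payments
    are placed on to the CDC in full.
    """
    if lump_sum:
        return round(amount, 2), 0.0
    cdc_portion = round(amount * cdc_share, 2)
    return cdc_portion, round(amount - cdc_portion, 2)

# A $500 payment under the 80/20 split used in Ceduna, East Kimberley and Goldfields.
print(split_payment(500.00))                  # (400.0, 100.0)

# A Northern Territory participant on the 50/50 split.
print(split_payment(500.00, cdc_share=0.5))   # (250.0, 250.0)

# A lump sum payment (paragraph 1.10) is quarantined in full.
print(split_payment(300.00, lump_sum=True))   # (300.0, 0.0)
```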

1.11 Participants remain on the CDC program if they move to a different region of Australia, unless they apply to exit the CDC or are granted a wellbeing exemption. Exit applications are managed by Services Australia. The Services Australia delegate may determine that a participant can exit the CDC if the participant can demonstrate reasonable and responsible management of their affairs, including finances. Wellbeing exemptions are available if the Secretary of DSS is satisfied that participation in the CDC would pose a serious risk to a person’s mental, physical or emotional wellbeing.

1.12 Figure 1.2 below shows the number of CDC participants by region. As at February 2022, the total number of active participants across all regions was 16,685.

Figure 1.2: Number of CDC participants, January 2020 to February 2022

[Image: Line graph showing the numbers of CDC participants in each region — Ceduna, Goldfields, Cape York, East Kimberley, Bundaberg and Hervey Bay, and the Northern Territory.]

Note: The decrease in active participants in the Bundaberg and Hervey Bay and Goldfields regions between June 2020 and April 2021 was partly due to the coronavirus disease 2019 (COVID-19) pandemic. The Minister agreed to pause the commencement of new participants on to the CDC due to the high workload that Services Australia was experiencing in processing income support payments. The decrease was also due to CDC participants moving off the CDC for various reasons such as cessation of income support.

Source: ANAO analysis of CDC reports available from data.gov.au.

CDC technology

1.13 The CDC looks and operates like a regular financial institution transaction card (Figure 1.3), except that it cannot be used to buy alcohol, gambling products or cash-like products; or to withdraw cash. The CDC has contactless ‘tap and pay’ functionality. The participant’s CDC bank account accrues interest. The CDC can be used for online shopping, payment of bills online and payment for goods and services at most merchants nationwide that have eftpos or Visa facilities.4

Figure 1.3: Cashless Debit Cards

[Image: The Cashless Debit Cards provided by Indue and the Traditional Credit Union.]

Source: DSS documentation.

1.14 For the purposes of the CDC, merchants are categorised into three groups:

  • unrestricted merchants, which sell only unrestricted items (the majority of merchants);
  • restricted merchants, which primarily sell restricted items; and
  • mixed merchants, which sell both restricted and unrestricted items.

1.15 The card cannot be used at restricted merchants. Restricted merchants are those where the primary business is the sale of restricted products such as alcohol or gambling services. This includes online merchants that primarily sell restricted products.

1.16 Product level blocking assists CDC participants to shop at mixed merchants. Product level blocking prevents the purchase of restricted items at mixed merchants by automatically detecting restricted items at the point of sale. It aims to simplify operations for small businesses that wish to accept the CDC, and to provide more choice for participants.

1.17 Product level blocking involves three elements: a card payment terminal (‘PIN pad’), a point-of-sale system and a payment integrator that connects the PIN pad to the point-of-sale system. When items are scanned at the checkout point, the point-of-sale system checks for restricted items and sends a restricted item flag to the PIN pad. When the customer presents the CDC to the PIN pad, the PIN pad recognises the CDC and checks if there are any restricted item flags. If restricted items are flagged, the PIN pad cancels the transaction and displays the message ‘Cancelled, Restricted Item’ to the cardholder. The cardholder can remove the restricted items, purchase the restricted items with another payment method or cancel the sale altogether.
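
To illustrate the product level blocking flow described in paragraph 1.17, the sketch below models the point-of-sale check and the PIN pad decision as two steps. It is a minimal sketch under assumed data structures and names; it does not represent the actual point-of-sale, payment integrator or card provider software.

```python
# Minimal sketch of the product level blocking flow in paragraph 1.17.
# Hypothetical names and data structures; not the actual POS, payment
# integrator or PIN pad software.

RESTRICTED_CATEGORIES = {"alcohol", "gambling", "cash_like"}  # assumed category labels

def scan_basket(items):
    """Point-of-sale step: return the restricted item flags for a basket."""
    return [item["name"] for item in items if item["category"] in RESTRICTED_CATEGORIES]

def pin_pad_decision(card_type, restricted_flags):
    """PIN pad step: cancel the transaction when a CDC is presented with flagged items."""
    if card_type == "CDC" and restricted_flags:
        return "Cancelled, Restricted Item"   # message displayed to the cardholder
    return "Approved"

# A mixed basket presented with a CDC is declined; the cardholder can remove
# the restricted items, pay for them another way, or cancel the sale.
basket = [
    {"name": "bread", "category": "grocery"},
    {"name": "beer", "category": "alcohol"},
]
print(pin_pad_decision("CDC", scan_basket(basket)))      # Cancelled, Restricted Item
print(pin_pad_decision("CDC", scan_basket(basket[:1])))  # Approved
```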

1.18 Product level blocking was initially implemented in 2018 by three major retailers — Australia Post, Woolworths and Coles. In February 2019 DSS commenced a product level blocking project targeted at small to medium-sized merchants. From June 2020 onwards, product level blocking was extended to more merchants. DSS advised that as at September 2021 there were 30 participating small to medium-sized merchants across the CDC regions, and work to increase the numbers was ongoing.

Rationale for undertaking the audit

1.19 Auditor-General Report No.1 2018–19 The Implementation and Performance of the Cashless Debit Card Trial found that while DSS largely established appropriate arrangements to implement the CDC Trial, its approach to monitoring and evaluation was inadequate. It was therefore difficult to conclude whether the CDC Trial was effective in achieving its objective of reducing social harm and whether the card was a lower-cost welfare quarantining approach compared to other components of Income Management such as the BasicsCard.

1.20 The report made six recommendations relating to risk management, procurement, contract management, performance monitoring, cost–benefit analysis, post-implementation reviews and evaluation.

1.21 This follow-on audit provides the Parliament with assurance as to whether:

  • DSS has addressed the agreed 2018–19 Auditor-General recommendations;
  • DSS’ management of the extended CDC program is effective; and
  • the extended CDC program was suitably informed by a second impact evaluation of the CDC Trial.

Audit approach

Audit objective, criteria and scope

1.22 The objective of the audit was to examine the effectiveness of DSS’ administration of the Cashless Debit Card program, including implementation of the recommendations made in Auditor-General Report No.1 2018–19 The Implementation and Performance of the Cashless Debit Card Trial.

1.23 To form a conclusion against the audit objective, the following criteria were applied:

  • Do DSS and Services Australia have effective risk management, procurement and contract management processes in place for the CDC program?
  • Has DSS implemented effective performance measurement and monitoring processes for the CDC program?
  • Was the expansion of the CDC program informed by findings and lessons learned from an effective evaluation, cost–benefit analysis and post-implementation review of the CDC Trial?

Audit methodology

1.24 The audit involved:

  • reviewing contracts with the card providers and the bilateral arrangements between DSS and Services Australia, including reporting against deliverables;
  • reviewing procurement activity in DSS;
  • examining an evaluation report, post-implementation review and cost–benefit analysis on the CDC Trial, including the underpinning methodologies and application of the findings;
  • reviewing DSS’ performance monitoring and reporting;
  • reviewing the end-to-end business process for the extraction, provision and usage of CDC participant data;
  • holding meetings with staff in DSS and Services Australia; and
  • considering four submissions received through the ANAO citizen contribution facility.

1.25 The audit was conducted in accordance with ANAO Auditing Standards at a cost to the ANAO of approximately $399,100.

1.26 The team members for this audit were Renina Boyd, Sonya Carter, Supriya Benjamin, Ji Young Kim, Peta Martyn and Christine Chalmers.

2. Risk management, procurement and contract management

Areas examined

This chapter examines whether the Department of Social Services (DSS) and Services Australia have effective risk management, procurement and contract management processes in place to support the administration of the Cashless Debit Card (CDC) program.

Conclusion

DSS and Services Australia have effective risk management processes in place for the CDC program, although DSS has not yet developed a risk-based compliance framework. DSS’ limited tender procurement processes were undertaken in accordance with the Commonwealth Procurement Rules; however, DSS’ due diligence over its procurement of the Traditional Credit Union could have been more thorough. Contract management arrangements with the card providers are effective. A service level agreement between DSS and Services Australia was finalised in April 2022. Recommendations from Auditor-General Report No.1 2018–19 relating to risk management, procurement and contract management were implemented.

Areas for improvement

The ANAO suggested that DSS promptly develop a compliance framework for the CDC program and that Services Australia ensure that risk treatments are reviewed regularly.

2.1 Auditor-General Report No.1 2018–19 The Implementation and Performance of the Cashless Debit Card Trial found that DSS did not actively monitor risks identified in risk plans, that there were some deficiencies in elements of the procurement process, and that there were weaknesses in its approach to contract management.

2.2 In 2018–19 the ANAO recommended that DSS: confirm risks are rated according to its risk management framework requirements; ensure risk mitigation strategies and treatments are appropriate and regularly reviewed; implement a consistent and transparent approach when assessing tenders; fully document procurement decisions; and employ appropriate contract management practices.5

2.3 The ANAO examined the:

  • risk management arrangements including treatment of shared and fraud risks, issues escalation processes and establishment of a risk-based compliance strategy;
  • procurement processes in place for the CDC program; and
  • contract and service management processes.

Are fit-for-purpose risk management arrangements in place?

There are fit-for-purpose risk management approaches in place in DSS and Services Australia for the CDC program, although there is no risk-based compliance strategy. Risks are identified and treatments are established. DSS’ CDC risk management processes are aligned with the DSS enterprise risk framework. Services Australia has appropriate risk documentation and processes in place. Documentation of shared risk between DSS and Services Australia is developing.

2.4 As non-corporate Commonwealth entities, DSS and Services Australia must manage risk in accordance with the Commonwealth Risk Management Policy. The goal of the Commonwealth Risk Management Policy is to embed risk management into the culture of Commonwealth entities where a shared understanding of risk leads to well-informed decision making.6

DSS’ CDC risk management arrangements

2.5 The DSS enterprise risk management framework (the DSS risk framework) includes a risk assessment toolkit, risk matrix and templates for risk and issues logs. The DSS risk framework is generally consistent with the Commonwealth Risk Management Policy. DSS’ overall enterprise risk appetite is low.

CDC risk management plans

2.6 In accordance with the DSS risk framework, DSS has developed a risk management plan for the CDC program. The CDC risk management plan is reviewed monthly and endorsed by the managers of the two DSS branches involved in administering the CDC program: the Cashless Welfare Policy and Technology Branch and the Cashless Welfare Engagement and Support Services Branch.

2.7 The August 2021 CDC risk management plan outlines 11 risks, comprising eight ‘medium’ rated risks and three ‘high’ rated risks. The high rated risks are:

  • evaluation [of the CDC program] is not perceived as being independent, methodologically sound and/or does not extend and enhance the evidence base of the CDC;
  • failure to maintain stakeholder support for the CDC program; and
  • the transition of Income Management participants in the Northern Territory and Cape York to the CDC does not occur seamlessly for participants.

2.8 The risk ‘that a second issuer of the CDC in the Northern Territory is not implemented in a timely manner’ was rated incorrectly — according to the DSS risk matrix it should have been rated high whereas it was rated medium. This means that it did not receive the required risk escalation discussed at paragraph 2.12. This risk is now closed as a second card issuer in the Northern Territory has been contracted.

2.9 The CDC risk management plan contains the elements of an effective risk plan: risks, controls and treatments are identified. No risk tolerances or appetites are prescribed for CDC risks and the DSS enterprise risk framework does not require individual programs to establish a tolerance. Although the CDC program area did not use DSS’ Risk-E tool, which automatically calculates risk ratings7, it received support from the DSS enterprise risk area for an alternative approach using a different risk management template. Analysis of the risk management plan found that: 10 of the 11 risks are rated correctly; controls and their effectiveness are identified; and treatments are in place for mitigating seven of eight medium rated risks and all high rated risks.
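
As an illustration of how a rating is derived from a risk matrix of the kind referred to in paragraphs 2.8 and 2.9, the sketch below uses a generic likelihood and consequence lookup. The matrix values are assumptions for illustration only; they are not the DSS risk matrix or the Risk-E tool.

```python
# Illustrative sketch of a likelihood/consequence risk matrix lookup.
# The matrix values below are assumed; they are not the DSS risk matrix
# or the Risk-E tool referred to in the report.

RISK_MATRIX = {
    ("unlikely", "minor"): "low",    ("unlikely", "moderate"): "medium", ("unlikely", "major"): "medium",
    ("possible", "minor"): "medium", ("possible", "moderate"): "medium", ("possible", "major"): "high",
    ("likely",   "minor"): "medium", ("likely",   "moderate"): "high",   ("likely",   "major"): "high",
}

def rate_risk(likelihood, consequence):
    """Return the overall rating; escalation requirements (paragraph 2.12) key off this rating."""
    return RISK_MATRIX[(likelihood, consequence)]

# A risk assessed as likely with major consequences rates 'high', which would
# require reporting to branch and group managers and the Chief Risk Officer.
print(rate_risk("likely", "major"))   # high
```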

2.10 As found in Auditor-General Report No.1 2018–19, most treatments are listed as ongoing rather than having specific due dates. Some treatments had specific due dates but these dates were obsolete, indicating a lack of review of the treatments. For example, the August 2021 risk management plan contained one treatment for an open risk that was still listed as due for completion in May 2021 and two treatments that were still listed as due for completion in July 2021.

Risk reporting

2.11 Project status reports for the CDC program cover issues and high risks, and are provided to DSS’ enterprise Implementation Committee and Executive Management Group quarterly. One incorrectly rated risk was not included in the status reports but was included in a risk management plan which accompanies the status reports.

2.12 According to the DSS risk framework, high and extreme risks must be reported to the relevant branch and group managers (for awareness, treatment authorisation or acceptance) and the Chief Risk Officer (for awareness and consolidation). The Group Manager, Communities reviews the CDC risk management plan and issues log (refer paragraph 2.33) before it is provided to the Executive Management Group. DSS advised that the Chief Risk Officer is provided with a copy of the CDC risk management plan and issues log on a quarterly basis as a member of the Executive Management Group.

2.13 Risk is a standing item at the CDC Steering Committee.8 DSS and Services Australia risk registers are tabled and discussed at the meetings. Steering Committee discussions cover risks shared by DSS and Services Australia as well as the development of a shared risk register. Risks are also discussed regularly at the other CDC governance committees, comprising the:

  • Northern Territory Transition Implementation Committee;
  • Bundaberg and Hervey Bay Community Reference Group;
  • Cashless Welfare in Cape York Meeting;
  • Cashless Welfare Working Group (refer paragraph 2.90); and
  • Cashless Debit Card Technology Working Group.

Services Australia’s CDC risk management arrangements

2.14 Services Australia has a risk management framework, risk management policy, enterprise risk management model dated May 2021 and a suite of risk management tools including a risk matrix and risk management process guidelines.

2.15 Services Australia has two risk plans in place to assist in the management of CDC risks. The first plan (for CDC extension and expansion) comprises two low and five medium rated risks. The second plan documents the risks associated with the transition from Income Management to the CDC in the Northern Territory and Cape York. The second plan has one low and 11 medium rated risks. The open risks at February 2022 were rated in line with Services Australia’s risk matrix.9

2.16 The Services Australia risk management plans contain the required elements of an effective risk plan, including risks and controls with assigned owners and treatments. The two risk plans were approved by the Deduction and Confirmation Branch National Manager.

2.17 The plan for the transition to the CDC in the Northern Territory and Cape York states that it should be reviewed monthly or when significant change occurs. Services Australia advised the ANAO that the plans are reviewed monthly and senior executive endorsement of the plans generally occurs on a quarterly basis or when a significant change is made. In both of the latest plans, the treatments for open risks are overdue. Services Australia should ensure that risk treatments are reviewed regularly, so that risk ratings and specified treatments remain appropriate.

Management of CDC shared risks

2.18 Entities have a responsibility under the Commonwealth Risk Management Policy to implement arrangements to understand and contribute to the management of shared risks.10 The bilateral management arrangement between DSS and Services Australia (refer paragraph 2.85) requires the entities to work cooperatively to identify and manage existing and emerging risks, and to comply with Commonwealth risk management policies.11 The DSS risk framework states that in order to manage shared risks, DSS must develop joint risk management plans that determine the governance arrangements in overseeing and monitoring shared risks.

2.19 Prior to October 2021, where a risk control was the joint responsibility of DSS and Services Australia, DSS stated this in its CDC risk management plan. Services Australia also had DSS listed as a joint risk owner in both of its risk management plans against relevant risks and controls. For example, a risk that ‘customer communication is ineffective’ is jointly owned by DSS and Services Australia in the Services Australia CDC risk management plan. Joint risks and controls were communicated through sharing respective CDC risk management plans at CDC Steering Committee meetings.

2.20 DSS and Services Australia did not have an approved shared risk register in place for the CDC program until October 2021, when a shared risk register was endorsed by the CDC Steering Committee. From October 2021, the content of the shared risk register is agreed by the relevant branch manager of each entity. The DSS and Services Australia shared risk register contains five risks that are all rated medium using the Services Australia risk matrix. The minutes of the 21 October 2021 meeting of the CDC Steering Committee note that the joint risk register is supported by the entity risk plans which provide further detail on the risks. The risks in the shared risk register focus on the transition of participants from Income Management to the CDC in the Northern Territory rather than on the CDC more broadly.

2.21 The shared risk register does not contain a risk review date, risk control owners or treatment owners, or due dates for treatments. Services Australia advised the ANAO that the detail of risk controls and treatments, including their due dates, appears in the entities’ respective risk plans rather than the shared risk register. Services Australia also advised that further detail, such as the risk identification number, would be added to the shared risk register to allow for clearer linkage between the shared risk register and the respective DSS and Services Australia risk plans. All risks in the shared risk register could be mapped by the ANAO to the Services Australia risk register. However, approximately 50 per cent of the treatments and controls in the shared risk register could not be mapped to the Services Australia register and there was no linkage to shared risks in the DSS risk register. It is unclear which entity owns the unmapped treatments and controls for shared risks.

Fraud and compliance risk

2.22 The kinds of fraud and compliance risks that could affect the CDC program include:

  • participant fraud — CDC cardholders circumvent the program;
  • merchant fraud — merchants circumvent the program;
  • third party fraud — fraud or theft by family, friends or other members of the public; and
  • service provider fraud — fraud committed by the card provider (for example, insider knowledge or abuse of market position) or the support service providers (for example, service not provided).

2.23 A CDC program logic developed in 2016 outlines some potential specific circumventions (refer Appendix 4).

2.24 The DSS CDC risk management plan includes two fraud and compliance risks, covering participant, merchant and third party fraud:

  • Participant/merchant fraud — program integrity is jeopardised and participants and/or merchants significantly circumvent the program.
  • Third party fraud — participants experience fraud or theft by family members, friends, other members of the public and online shopping.

2.25 The CDC risk management plan includes treatments for the two fraud and compliance risks.

  • The risk controls for participant and third party fraud are to be managed primarily by Services Australia and Indue Ltd (Indue).
  • Merchant fraud is to be managed primarily by DSS through product level blocking, data monitoring and merchant engagement. DSS advised the ANAO that Indue provides fraud protection services, including real time fraud detection, in line with standard banking practices.

Management of fraud by Services Australia, Indue and Traditional Credit Union

2.26 The DSS CDC risk management plan specifies the controls and treatments to be applied by Services Australia and Indue for fraud risks. The second card provider, Traditional Credit Union (TCU), has a responsibility to implement a fraud control plan and to report to DSS any instances of fraud against the Australian Government committed by TCU staff. TCU relies on Indue to apply fraud controls and treatments because it uses Indue’s systems.

2.27 Controls and treatments specified in the CDC risk management plan include provision of a ‘hotline’ and customer service centre, giving card security advice to participants, staff training, card blocking and fraud monitoring software. The ANAO did not confirm whether these fraud controls and treatments were implemented by Services Australia and Indue. DSS advised the ANAO that it monitors Services Australia and Indue’s fraud control activities through monthly operational meetings. A standing agenda item to discuss risks outlined in the CDC risk management plan was introduced in the 22 March 2022 meeting agenda and discussion was evidenced in the minutes. DSS advised that the CDC Compliance Section, established in November 2021, also reviews Indue’s performance against agreed service levels on a monthly basis. Finally, DSS advised that Indue has integrated new fraud monitoring software, IBM Safer Payments, into its existing system to improve fraud detection. The ANAO did not confirm the application of the new software.

Management of fraud by DSS

2.28 DSS has a fraud control plan for the CDC dated February 2021. The plan includes actions such as the development of a compliance framework, fraud awareness training, analysis of CDC fraud risk, undertaking compliance checks and data analytics, and operation of a fraud hotline. DSS advised that it monitors social media12 relating to the functionality of the CDC at specific merchants. The ANAO did not confirm whether all of the actions outlined in the fraud control plan, or otherwise advised to the ANAO, have been implemented by DSS. Indue analytics software is used to identify suspicious merchant activity, which is reported to DSS each month. DSS conducts some merchant investigations based on this data.

2.29 DSS has two standard operating procedures in place to assist with compliance activities. These are about generating merchant investigation reports and conducting social media monitoring. DSS advised the ANAO that the CDC Compliance Section is working with the CDC Data Section to develop improved ways of identifying patterns in compliance data to identify any instances of possible circumvention by merchants or participants and whether further investigation is required.

Compliance framework

2.30 A compliance framework guides compliance activities to ensure that they are managed effectively and are risk-based, and establishes thresholds for triggering investigations and other compliance actions.

2.31 Although DSS undertakes compliance activities, thresholds for triggering compliance activity are not applied.

2.32 DSS advised the ANAO that CDC compliance is guided by the DSS 2021–23 fraud control plan and supported by an overarching Enterprise Compliance Framework dated December 2018. Although it is listed in the fraud control plan as an activity, DSS does not have a compliance framework or strategy for the CDC. In February 2022 DSS advised the ANAO that the CDC Compliance Section established within the Cashless Welfare Policy and Technology Branch intends to develop a compliance framework. This should be undertaken as a priority to ensure that compliance activities are appropriately risk-based.

CDC issues management and escalation

2.33 DSS maintains an issues log for the CDC that is reviewed monthly by the CDC branch managers. The issues log is largely consistent with the DSS departmental issues log template, except that it does not include an issues consequence column. As at August 2021, the DSS CDC issues log had seven open issues — one was rated ‘low’ and six were rated ‘medium’. It was unclear whether the issues were rated based on consequence or another consideration.

2.34 Services Australia maintains two CDC issues logs that together contain 10 active issues — five minor, four moderate and one major. The major issue is about the requirement to undertake manual processing in order to transition participants on to the CDC in the Northern Territory.

2.35 DSS and Services Australia manage issues together primarily through the CDC governance committees. The ANAO was advised that time-sensitive issues are raised directly with senior managers. For example, the issue of manual transition of customers in the Northern Territory was managed by executive staff in DSS and Services Australia, and noted in the shared risk register and both entities’ risk management plans and issues logs.

2.36 Issues with Indue are managed by Services Australia through weekly meetings. DSS holds monthly meetings with Indue and an action register is kept to track operational issues arising from these meetings. DSS advised that these meetings have been held since March 2016. The register was started in late September 2021.

2.37 There is no documented issues escalation process for the CDC in DSS or Services Australia. In DSS, issues escalation occurs through the issues log update that is provided to the DSS Executive on a quarterly basis through the CDC project status report. General issues are discussed at monthly CDC Steering Committee meetings and at weekly Cashless Welfare Working Group meetings. DSS advised that they are discussed at weekly branch meetings and that the relevant Group Manager and Deputy Secretary also meet weekly to discuss the CDC program.

Were transparent procurement processes undertaken, consistent with the Commonwealth Procurement Rules?

The limited tender procurements for the extension and expansion of CDC services and for an additional card issuer in the Northern Territory were undertaken in a manner that is consistent with the Commonwealth Procurement Rules. Procurement processes, procurement decisions and conflict of interest declarations were documented. The conduct of procurements through limited tender was justified in reference to appropriate provisions within the Commonwealth Procurement Rules.

The procurement of Indue for expanded card services was largely effective. A value for money assessment for one of the Indue procurements was not fully completed.

The procurement of a second card provider (TCU) was meant to be informed by a scoping study. The scoping study, which was contracted to TCU, did not fully inform the subsequent limited tender procurement. There was limited due diligence into TCU’s ability to deliver the services. A value for money assessment was conducted during contract negotiations.

Procurement of Indue

2.38 DSS’ contractual relationship with Indue commenced in August 2015 when DSS engaged Indue to deliver the CDC under an IT build contract for delivery of systems technology and infrastructure for the CDC. A contract was then required for Indue to deliver operational aspects of the CDC Trial, commencing April 2016, including a debit card product, spending restrictions and ongoing support to participants.

2.39 The operational contract was varied 10 times between April 2016 and February 2022. Variations were to extend the duration of the CDC program, expand the CDC to additional regions and implement functionality improvements.

2.40 The operational contract between DSS and Indue states that DSS can extend the contract more than once, provided that it does not extend the commitment beyond five years after the start date of the contract (that is, April 2021). As amendments to the Social Security (Administration) Act 1999 (SSA Act) extended the CDC program beyond April 2021, DSS was required to extend CDC service provision in existing sites and newly introduced sites to 31 December 2022 through a new procurement.

Procurement to extend services in existing CDC sites

2.41 On 15 December 2020 the DSS delegate approved a limited tender procurement, with the sole tenderer being Indue, for the continued delivery of CDC card and account services in existing sites.13 A request for quote was issued to Indue for the provision of CDC and associated services to existing sites for the period 1 January 2021 to 31 December 2022. These services were to be in line with the terms and conditions of the existing Indue operational contract.

2.42 Extending the Indue contract was assessed as low risk due to the ongoing relationship and Indue’s proven capacity to deliver the services.

2.43 DSS’ evaluation of Indue’s tender found that Indue represented value for money and that there was likely to be an opportunity to realise savings as part of commercial negotiations. The value for money assessment included a comparative analysis of the pricing between the quote, the then contract and the proposed costs for 2020–21 and found there was no significant difference.

Procurement to expand CDC services in new sites and for shared services

2.44 On 29 January 2021, the delegate approved a limited tender procurement, again with Indue, for the expansion of CDC card and account provider services in the Northern Territory and Cape York region for the period 17 March 2021 to 31 December 2022. A request for quote was issued on 9 February 2021.

2.45 Indue responded in two parts: Part A was to secure a provider for the CDC and related services for the Northern Territory and Cape York; and Part B was for the provision of shared services for the Additional Issuer Trial.14

  • Part A — the tender evaluation committee found that the pricing assessment was consistent with requirements and in line with existing Indue contract prices.
  • Part B — the tender evaluation committee noted that although most of the required information was presented, some pricing information had been omitted and the required pricing template was not used. Initially, a partial value for money assessment was undertaken and DSS was not able to gather a full understanding of the costs and risks. The tender evaluation committee decided that an updated value for money and risk assessment be undertaken during contract negotiations. In December 2021, the response was determined to represent value for money.

2.46 The tender evaluation committee assessed Indue’s tender to be medium risk based on the considerations set out below.

  • Services Australia would need to be involved in negotiations to ensure compliance with Services Australia’s service delivery processes.
  • Indue would not provide information, correspondence or advice to participants in the Northern Territory about the CDC until Indue, DSS and Services Australia were satisfied it met all regulatory requirements. This was resolved during contract negotiations, as Indue is required by regulation to meet participant consent requirements before providing advice on the CDC.
  • The short timeframe for commercial negotiations might impact DSS’ ability to attain savings on pricing.

2.47 A timeline for the Indue procurements is depicted in Figure 2.1.

Figure 2.1: Procurement of Indue

A timeline of the key dates in the procurement process to contract Indue as the primary card provider, from December 2020 through to December 2021.

Source: ANAO analysis.

2.48 DSS justified the sole provider approach for both procurements under Division 2, clause 10.3(g) of the Commonwealth Procurement Rules.15 This clause allows a limited tender in the case of a limited trial. The justification was that the CDC program was a limited trial to test the suitability of the service concept and its ability to achieve the program’s goals. After the passage of the amended CDC sections of the SSA Act in December 2020, which removed the term ‘trial’ and replaced it with ‘program’, DSS continued to justify a limited tender under clause 10.3(g) by stating that ‘the suitability of technology that underpins the CDC and its ability to achieve the program objectives will continue to be tested until 31 December 2022’.

2.49 The members of the tender evaluation committee identified no conflicts of interest with any part of the tenders and all declarations were appropriately documented.

Procurement of the Traditional Credit Union

2.50 In November 2017 the Minderoo Foundation16, in consultation with retail, banking, and payment organisations, compiled the Cashless Debit Card Technology Report with the aim of advising the Australian Government on ways to improve the technology model behind the CDC. The report found that there would be benefits in introducing additional CDC card issuers.

Feasibility study

2.51 In May 2020 DSS commissioned eftpos Australia to conduct a feasibility study into the potential enhancements to CDC participant experiences through having a choice of financial institutions, and into the lowest cost and effort alternatives for financial institutions to implement the program.

2.52 eftpos Australia initially consulted five banking institutions (including Indue) to understand how best the industry could meet the requirements of the CDC. eftpos later engaged with three other banking institutions (including TCU).

2.53 The feasibility study presented three models for an additional card issuer:

  • in-house solution — financial institutions would modify their own banking infrastructure to provide the CDC;
  • modified basic banking — financial institutions would utilise their own infrastructure, with complex CDC-specific functions outsourced; and
  • shared service model — shared service providers would process CDC accounts on behalf of other financial institutions.

2.54 The study recommended that DSS develop a shared service model in the short term as it requires less implementation time than creating new card provider infrastructure. The shared service model would involve adding one or more card issuers onto Indue’s existing systems infrastructure. The report did not recommend a specific financial institution but focused on the feasibility of and options for a second card issuer.

2.55 The results of the feasibility study were sent to the Minister for Social Services in October 2020, with DSS recommending the Minister agree ‘to explore adding TCU as a CDC issuer’ in the Northern Territory. Although the briefing to the Minister noted that DSS was conducting market research with potential providers to look at options and costs for adding additional card issuers in the future, TCU was the only provider considered and recommended for the Northern Territory. The briefing did not mention plans to commission TCU to undertake a scoping study.

Scoping study

2.56 In November 2020 the Secretary of DSS approved a direct request for quote with TCU to undertake a multiple issuer scoping study. The purpose of the scoping study was to examine how best to add additional card issuers and deliver CDC banking services in regional and remote communities. The request for quote required ‘a comprehensive review of how a financial institution or issuer could work with the customer, Services Australia and the selected card provider to provide support to CDC participants’.

2.57 The final TCU report was provided to DSS on 1 February 2021 and met the requirements set out by DSS in the scoping study contract. The scoping study found that the delivery of the CDC by an additional issuer was feasible and would improve services to participants in regional and remote communities. The study also stated that a CDC and associated banking account could be developed as a product by small to medium-sized authorised deposit-taking institutions and would provide CDC participants with a greater choice of providers.

2.58 TCU stated in its final report that the study had limitations when discussing small to medium-sized authorised deposit-taking institutions, as the majority of the examples, experiences and learnings in the report were based specifically on TCU as the potential CDC issuer. Although DSS anticipated that the scoping study would identify other potential card issuers, this was not a requirement of the scoping study contract with TCU.

2.59 The delegate approved a limited tender procurement of TCU as a second card issuer on 29 January 2021, before the final scoping study report was provided on 1 February 2021. The DSS Secretary was advised on 5 February 2021 that ‘Traditional Credit Union has scoped this work and has concluded it is able to deliver this product’.

Limited tender for Additional Issuer Trial

2.60 On 10 February 2021, a request for quote was issued to TCU for the services of an additional card issuer to deliver the CDC and associated operational and support services to participants in the Northern Territory. DSS justified a limited tender approach for the Additional Issuer Trial in the Northern Territory because:

  • of the trial nature of the activity (refer to paragraph 2.48); and
  • it was procuring services from a small to medium-sized enterprise with at least 50 per cent Indigenous ownership17, noting TCU was the only Indigenous-led and owned provider in the Northern Territory with experience in delivering banking services to Indigenous customers in regional and remote communities.

2.61 DSS also noted that TCU had an existing relationship with Indue through the provision of banking systems infrastructure.

2.62 The procurement of an Indigenous enterprise for this activity was in line with the Indigenous Procurement Policy and whole-of-government efforts to increase procurements from Indigenous small to medium-sized enterprises. TCU’s Indigenous Participation Plan18 stated that TCU met the mandatory minimum requirements of the Indigenous Procurement Policy.

2.63 The procurement process for the second card issuer for the CDC, following the release of the request for quote, is set out below and summarised in Figure 2.2.

  • A draft contract was provided to TCU as part of the request for tender. TCU expressed concern about meeting the tender requirements and timeframes. The tender closing date was extended and DSS withdrew the draft contract from the tender documentation with a view to negotiating the contract terms with TCU based on TCU’s capacity and capability to deliver the service.
  • The DSS Procurement Branch advised the CDC program area that ‘a comprehensive value for money assessment could not be undertaken without an understanding of how the [TCU] proposal interacted with the likely conditions of the contract’. The tender evaluation report stated the following:

    As the Committee considered that the impact on government policy objectives of delaying the procurement (e.g. by re-issuing documentation) was unacceptable, the Committee has instead adopted a recommendation proposing that the department proceed to negotiations with the Traditional Credit Union, and that once these negotiations are substantively complete and contract positions agreed, a value for money assessment will be made using the agreed contract terms and a further recommendation made to the delegate.

  • The evaluation committee’s recommendation was approved by the delegate in March 2021.
  • Upon receiving TCU’s response to the tender, the evaluation committee identified multiple weaknesses including a finding that some aspects of TCU’s tender response ‘reflected a lack of understanding of government requirements or the scale of changes that may be required in order for the Traditional Credit Union to meet those requirements’. DSS advised that the evaluation committee was unable to make a value for money assessment.
  • Further negotiations led to TCU submitting a revised proposal. The price in the revised proposal was 20 per cent less than the first proposal, but 23 per cent more than the proposed budget approved by the delegate for the TCU procurement.19 (A worked illustration of these percentages, using hypothetical figures, follows this list.) After assessing the revised response, the evaluation committee recommended that the revised proposal represented value for money and DSS entered into contract negotiations. The value for money assessment was based on the draft contract rather than a costed proposal from TCU.
  • During contract negotiations, it was recommended that the maximum contract value be reduced to reflect the reduction in time available for TCU to implement the service and the reduction in the number of CDC accounts that TCU had the capability to deliver. The delegate approved this change to the contract terms on 9 September 2021.
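To make the relative prices in the preceding list easier to follow, the short sketch below uses purely hypothetical dollar figures (the actual proposal prices and approved budget are not disclosed in this report) to show how a revised price can be 20 per cent below the first proposal while still sitting 23 per cent above the approved budget.

```python
# Hypothetical figures only: the actual proposal prices and the approved
# budget for the TCU procurement are not disclosed in this report.
approved_budget = 1_000_000                # hypothetical delegate-approved budget ($)
revised_price = approved_budget * 1.23     # 23 per cent above the approved budget
first_price = revised_price / 0.80         # the revised price is 20 per cent below this

print(f"First proposal:   ${first_price:,.0f}")
print(f"Revised proposal: ${revised_price:,.0f}")
print(f"Approved budget:  ${approved_budget:,.0f}")

# Implied premium of the first proposal over the approved budget.
premium = first_price / approved_budget - 1
print(f"First proposal exceeded the approved budget by about {premium:.0%}")
```

Whatever the actual dollar amounts, the two reported percentages imply that the first proposal exceeded the approved budget by roughly 54 per cent (1.23 divided by 0.80).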

2.64 A timeline of the process for the procurement of TCU is depicted in Figure 2.2.

Figure 2.2: Procurement of TCU

A timeline of the key dates in the procurement process to contract the Traditional Credit Union as the second card provider, from November 2020 through to December 2021.

Source: ANAO analysis.

2.65 In accordance with the Commonwealth Procurement Rules, a risk assessment of the limited tender procurement of TCU was undertaken in January 2021, which did not identify any high risks. A revised risk assessment was undertaken in March 2021, which identified the following high risks:

  • TCU’s expectations of the contract process may not align with the Commonwealth’s expectations;
  • TCU may not be sufficiently mature or resourced at present to meet additional preferred requirements of the Commonwealth; and
  • the supplier may be unwilling to accept standard contract terms (as reflected by the withdrawal of the draft contract from the tender documentation).

2.66 These risks were addressed through mitigation strategies during contract negotiations. The mitigations primarily involved TCU engaging additional staff to meet the terms of the contract and the service delivery requirements.

2.67 Each member of the tender evaluation committee signed a deed of confidentiality, which warranted that no conflicts of interest existed or were likely to arise. The Deputy Secretary approved the limited tender procurement. Separation of roles was appropriately managed. Final approval was given by an officer who did not participate in the tender evaluation.

Are there appropriate contract and service schedule management arrangements in place?

There are appropriate contract management and service delivery oversight arrangements in place for the CDC program. Effective contract management plans are in place for the contracts with the card providers. Documentation supporting the monitoring of Indue service delivery risk could be more regularly reviewed. In April 2022, DSS and Services Australia finalised a CDC service level agreement. This was established late in the relationship, which commenced in 2016.

Management of Indue and Traditional Credit Union contracts

Indue contract management

2.68 The contract management arrangements between DSS and Indue are underpinned by a Services Agreement; the latest contract variation; and a contract management plan.

2.69 The Services Agreement is the base contract that was established in 2016 between DSS and Indue for the implementation and operation of the CDC Trial. This agreement provides comprehensive details of the terms and conditions of the relationship, such as the statement of work, required service levels, respective roles and responsibilities and risk management. DSS has maintained records of variations to the Services Agreement, with the latest contract variation dated 5 December 2021.

2.70 DSS developed a contract management plan in September 2019 in response to the recommendation of Auditor-General Report No.1 2018–19. The contract management plan aligns with DSS’ contract management guidance and addresses the essential requirements of the guidance.

2.71 The contract management plan has been reviewed after each material variation to the contract. According to the contract management plan, changes to the plan need to be approved by the Cashless Welfare Policy and Technology Branch Manager. The contract management plan was updated in August 2021 after the ninth contract variation; however, the updates did not receive the required approvals. DSS advised in January 2022 that the contract management plan is a live document, that the current version is being reviewed for amendments, and that it will be progressed for approval.

2.72 The ANAO reviewed DSS’ contract management approaches against its contract management plan for Indue. DSS’ approach aligned with the requirements in the plan. DSS developed risk plans and treatments, held progress meetings and assessed non-compliance with service levels. DSS’ risk register for its contract with Indue has been in place since May 2019. The contract management plan required DSS to review the risk register by March 2020. This review date was not appropriately updated in subsequent versions of the contract management plan. DSS advised the ANAO in January 2022 that the CDC Compliance Section was in the process of reviewing the register.

2.73 Indue is also required to maintain a risk register for the delivery of banking services for the CDC program. Indue provided DSS with its risk register in July 2020 and the contract management plan stated that an updated version was to be requested by DSS from Indue in August 2021. In January 2022, DSS requested that Indue provide an updated risk register which was received by DSS in February 2022.

2.74 There are 10 service levels in the Services Agreement relating to:

  • system availability;
  • account creation, card production and account funding;
  • call centre and disputed transactions;
  • cardholder support; and
  • reporting and publication.

2.75 The service levels include targets and the methodologies to measure achievement against the specified targets for each service level.

2.76 DSS monitors Indue’s performance against the service levels through a monthly program outcomes report provided by Indue. Indue’s results are assessed against the targets in the Services Agreement and are recorded as a ‘pass’ or ‘fail’ in a ‘service level tracker’. In accordance with the Services Agreement, when Indue fails to meet its service level targets DSS applies a penalty to the next month’s invoice. The ANAO observed examples of this occurring.
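As an illustration of the pass/fail assessment described in paragraph 2.76, the minimal sketch below shows how monthly results could be compared against contracted targets and how failed service levels could be flagged for a penalty on the next invoice. It is not DSS’ or Indue’s actual tracker; the service level names, targets and results are hypothetical.

```python
# Minimal sketch of a 'service level tracker' (hypothetical names and values;
# not DSS' or Indue's actual tracker or contracted service levels).
targets = {
    "system_availability_pct": 99.5,   # minimum acceptable monthly value
    "card_production_days": 5.0,       # maximum acceptable monthly value
}
higher_is_better = {"system_availability_pct": True, "card_production_days": False}

monthly_results = {"system_availability_pct": 99.7, "card_production_days": 6.2}

def assess(results: dict) -> dict:
    """Record a 'pass' or 'fail' for each service level against its target."""
    outcomes = {}
    for level, target in targets.items():
        actual = results[level]
        met = actual >= target if higher_is_better[level] else actual <= target
        outcomes[level] = "pass" if met else "fail"
    return outcomes

outcomes = assess(monthly_results)
penalty_levels = [level for level, result in outcomes.items() if result == "fail"]
print(outcomes)        # {'system_availability_pct': 'pass', 'card_production_days': 'fail'}
print(penalty_levels)  # service levels to which a penalty would apply on the next invoice
```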

2.77 The CDC Compliance Section was established in November 2021 and meets monthly with Indue to discuss operational issues. Prior to the establishment of the CDC Compliance Section, other areas of the CDC branches met with Indue on either a regular or ad hoc basis. The types of issues discussed include: participant behaviours; card operations; information and data sharing; the Northern Territory transition; and relationship matters between DSS, Services Australia, Indue and TCU. There are also weekly meetings with Indue to discuss infrastructure and IT-related matters.

TCU contract management

2.78 The contract management arrangements between DSS and TCU are underpinned by a Services Agreement; Term Sheet (Deed) for the Additional Issuer Trial20 (superseded by a Deed of Collaboration in March 2022); and a contract management plan.

2.79 The Services Agreement for implementation and operational services for the Additional Issuer Trial was established in September 2021 with a start date of 6 December 2021. This includes providing CDC card and account services for up to 12,500 participants in the Northern Territory.21 The Services Agreement specifies the roles and responsibilities of TCU that need to be undertaken in accordance with the terms and conditions, pricing provisions and statement of work.

2.80 DSS developed a contract management plan that provides background to the contractual relationship with TCU, information about the Services Agreement and guidance for managing the contract. The contract management plan is compliant with DSS’ contract management guidance. Overall, it is an effective baseline contract management plan that addresses the requirements of the contract.

2.81 The contract with TCU sets out 11 service levels relating to:

  • system availability and performance;
  • account creation, card production and account funding;
  • call centre and disputed transactions;
  • cardholder support; and
  • reporting.

2.82 Similar to the Indue service levels, the TCU contract establishes the methodologies to measure achievement against specified targets for each service level. TCU submits monthly reports to DSS on progress against the service levels. DSS and TCU hold weekly meetings with Indue, and a separate weekly meeting is held between DSS, TCU and Services Australia to discuss issues relating to the Additional Issuer Trial.

Service monitoring arrangements with Services Australia

2.83 Upon commencement of the CDC Trial in 2016, Services Australia was responsible for identification of participants; switching participants on to the CDC; calculation of the quarantined amount; directing funds to restricted and unrestricted accounts; managing changes of circumstances that may impact CDC eligibility; and provision of data to DSS.

2.84 From 16 March 2020, Services Australia assumed additional responsibilities for CDC service delivery. These included: provision of digital and telephony channels; face-to-face services through Centrelink offices and remote servicing arrangements; and managing CDC exit applications and wellbeing exemptions. Appendix 3 provides a list of Services Australia’s CDC responsibilities.

2.85 The bilateral management arrangement between Services Australia and DSS — established under the Services Australia Bilateral Agreement Framework22 — consists of an overarching Statement of Intent (April 2018) and a set of protocols and agreements.23 The Statement of Intent is intended to support interactions between the two entities and sets out the principles, governance and management information for the entire bilateral relationship.

2.86 Under the Statement of Intent, services schedules are required to be established which set out the specific elements of the service being delivered, such as service level standards and responsibilities of each entity.

2.87 Prior to October 2021, there was no services schedule in place for the delivery of the CDC by Services Australia. In October 2021, a draft CDC program Letter of Exchange was prepared and signed. Under the Bilateral Agreement Framework, a Letter of Exchange is meant to be used as an interim arrangement while a more detailed services schedule is being negotiated, or where the service is short term.

2.88 The ANAO reviewed the Letter of Exchange against the requirements of the Bilateral Agreement Framework. The Letter of Exchange included the majority of the required elements with the exception of service level standards. On 1 April 2022 a Service Level Agreement was finalised between DSS and Services Australia. This replaces the Letter of Exchange and contains specific service level standards for the CDC program’s service delivery.

2.89 In terms of service levels, DSS and Services Australia advised that two measures were already being externally reported for the CDC program.

  • The first measure is that determinations to exit the program be made no more than 60 days after the person submits a complete and correct exit application. This measure is legislated under subsection 124PHB(4A) of the SSA Act.24 High-level exit application information is published on data.gov.au.
  • The second measure relates to Services Australia’s enterprise-level telephony service level standard, which is the average length of time a customer waits to have a call answered. The service level target is under 16 minutes. CDC Telephony Reports are provided to DSS on a weekly basis. Services Australia’s overall performance on its telephony service level is published in its annual reports.

2.90 Operational matters for the CDC are overseen by the Cashless Welfare Working Group which comprises representatives from DSS and Services Australia. The group meets weekly to discuss CDC program-related matters, including service delivery and performance. The CDC Steering Committee oversees the relationship between DSS and Services Australia and receives updates from the Cashless Welfare Working Group, which primarily discusses operational issues.

3. Performance measurement and monitoring

Areas examined

This chapter examines whether the Department of Social Services (DSS) implemented effective performance measurement and monitoring processes for the Cashless Debit Card (CDC) program.

Conclusion

Internal performance measurement and monitoring processes for the CDC program are not effective. Monitoring data exists, but it is not used to provide a clear view of program performance due to limited performance measures and no targets. DSS established external performance measures for the CDC program. These were found to be related to DSS’ purpose and key activities, but one performance indicator was not fully measurable. External public performance reporting was accurate. Recommendations from Auditor-General Report No.1 2018–19 relating to performance measurement and monitoring were partly implemented.

Areas for improvement

The ANAO made one recommendation aimed at developing internal performance measures and targets that draw on CDC data reports. The ANAO also suggested that DSS review the methodology of one of the CDC public performance measures.

3.1 Auditor-General Report No.1 2018–19 The Implementation and Performance of the Cashless Debit Card Trial found that while key performance indicators (KPIs) were developed for the CDC Trial as part of an evaluation, there was a lack of ongoing reporting against the KPIs and there were no performance measures in place for the extension of the CDC.

3.2 The ANAO recommended in 2018–19 that DSS fully utilise all available data to measure performance and develop performance indicators for the CDC.25

3.3 Regular monitoring and reporting to DSS executive management, the Minister and more broadly to Parliament and the public contributes to a better understanding as to whether the CDC program is being implemented as intended and achieving its policy objectives. This performance reporting should be underpinned by relevant performance measures and corresponding targets where applicable.

3.4 To examine whether the CDC program is effectively monitored and reported, the ANAO assessed the CDC program’s:

  • internal performance monitoring and reporting arrangements; and
  • external performance measures and reporting.

Are there effective internal performance monitoring and reporting arrangements?

Reports of internal performance measures are not produced as required under the CDC data monitoring strategy. There are a number of data reports provided to and considered by DSS on a regular basis. These reports include some performance measures and no performance targets, and provide limited insight into program performance or impact. DSS has not implemented the recommendation from Auditor-General Report No.1 2018–19 that it fully utilise all available data to measure performance.

Program monitoring framework and strategies

3.5 CDC program performance should be monitored on an ongoing basis. The CDC Program Monitoring Framework, Data Monitoring Strategy for CDC Program Health and the Data Monitoring Strategy to Address CDC Circumvention aim to establish DSS’ approach to program monitoring.

The CDC program health and circumvention monitoring strategies are designed to improve the visibility of how the program is being implemented in all sites … the strategies utilise program data to confirm if components of the program are performing as intended thus minimising potential circumvention and unintended disruption to participants.

3.6 The Data Monitoring Strategy for CDC Program Health June 2021 states that:

The strategy is designed to provide the department with an overarching view that tracks and measures the effectiveness and operation of the CDC program, and that it is not causing any unnecessary disruption to participants’ day-to-day lives or increasing levels of harm in the community. To meet this objective, the strategy intends to develop monthly program health reports, which will include data that highlights key evidence of program health.

3.7 The Data Monitoring Strategy for CDC Program Health lists four CDC program outputs and corresponding internal performance measures (Table 3.1).

Table 3.1: CDC outputs and internal performance measures

Output: Participants have access to their income support payments in their CDC account
Performance measures / indicators:
  • Number of active accounts
  • Average account balances
  • Number of accounts overdrawn
  • Average debt balance on overdrawn accounts

Output: Participants spending on priority and unrestricted goods and have access to a range of merchants
Performance measures / indicators:
  • Totals and percentage breakdowns of expenditure on goods and services categories
  • Numbers of transactions at a diverse range of merchants (based on Merchant Category Code)a

Output: Participants cannot spend their income support payment on restricted items and blocked merchants
Performance measures / indicators:
  • Number of declined transactions
  • Merchant category code of declines
  • Total value of declines

Output: Ongoing adjustments made to the program where relevant
Performance measures / indicators:
  • Number of issues/risks identified
  • Number of issues/risks solved

Note a: Merchant Category Codes classify merchants and businesses by the type of goods or services provided.

Source: DSS Data Monitoring Strategy for Cashless Debit Card Program Health.
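The measures in Table 3.1 are essentially aggregations of account and transaction data. The sketch below illustrates, using hypothetical transaction records and Merchant Category Codes rather than DSS’ or the card providers’ actual data model, how expenditure breakdowns and declined-transaction counts of the kind listed in the table could be derived.

```python
# Hypothetical transaction records; field names, categories and MCC values are
# illustrative only and do not reflect DSS' or the card providers' data model.
from collections import defaultdict

transactions = [
    {"mcc": "5411", "category": "Groceries", "amount": 82.50, "declined": False},
    {"mcc": "5411", "category": "Groceries", "amount": 41.00, "declined": False},
    {"mcc": "4900", "category": "Utilities", "amount": 120.00, "declined": False},
    {"mcc": "5813", "category": "Drinking places", "amount": 60.00, "declined": True},
]

spend_by_category = defaultdict(float)
declines_by_mcc = defaultdict(lambda: {"count": 0, "value": 0.0})

for txn in transactions:
    if txn["declined"]:
        declines_by_mcc[txn["mcc"]]["count"] += 1
        declines_by_mcc[txn["mcc"]]["value"] += txn["amount"]
    else:
        spend_by_category[txn["category"]] += txn["amount"]

total_spend = sum(spend_by_category.values())
spend_share = {cat: amount / total_spend for cat, amount in spend_by_category.items()}

print(dict(spend_by_category))  # totals of expenditure by goods and services category
print(spend_share)              # percentage breakdowns of expenditure
print(dict(declines_by_mcc))    # number and value of declined transactions by MCC
```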

3.8 DSS advised the ANAO that these internal measures ‘provide effective monitoring of the program and its administration’.

3.9 The internal performance measures in Table 3.1 provide management-level information. Resource Management Guide (RMG) 131 Developing good performance information differentiates between management-level information and effectiveness / efficiency measures, noting that while the former may be useful for internal management purposes it should not be used as a substitute for the latter.26 RMG 131 further states that common issues when developing performance measures are that they are too focused on outputs and do not go to the impact made, value created or outcomes achieved; and that they are selected on the basis that they are the easiest to measure.27 Effective monitoring relies on clear performance expectations. The internal performance measures in Table 3.1 do not have targets, limiting their usefulness as a monitoring tool. DSS advised that ‘it is not meaningful to attach targets’ to some of the measures. Finally, the ANAO was unable to find evidence that the fourth output and associated performance measures are reported to program decision-makers (refer paragraph 3.18).

Performance reporting

3.10 As part of the contract and service requirements, Services Australia, Indue and Traditional Credit Union (TCU) provide DSS with regular data reports on the operation of the CDC and participants.

Services Australia reporting

3.11 Services Australia provides six data reports to DSS each week (Table 3.2).

Table 3.2: CDC data reports provided by Services Australia to DSS

Report type: CDC listings for DSS
Information included: Listing of CDC participants, including current and previous participants, and customer-level demographic and circumstance data
Uses: This report is used to calculate the number of participants with an active card for the second CDC external performance measure (refer Table 3.3)

Report type: CDC summary report
Information included: Summary of CDC participant numbers and characteristics at the national and CDC region level
Uses: Reporting to DSS executive, and Minister; and for planning, program monitoring, compliance and policy development

Report type: Income Management to CDC potential customer listing
Information included: List of Income Management customers who are eligible or potentially eligible for transition to the CDC in the Northern Territory

Report type: CDC service delivery
Information included: Overview of CDC service delivery, including volume of common requests and exemptions, categorised by region

Report type: CDC Hotline telephony
Information included: Telephony statistics covering the operation of the CDC Hotline

Report type: CDC exit report
Information included: Numbers of participants who have applied to exit, including exit application outcomes
Uses: Used for a CDC summary report published on data.gov.au

Report type: Weekly CDC implementation report
Information included: High level data on CDC exits and number of people transitioning to the CDC in the Northern Territory
Uses: Sent to the DSS Group Manager and Deputy Secretary to provide information on CDC progress

Source: ANAO analysis of Services Australia documentation.

3.12 In addition to the weekly data reports, there have been numerous ad hoc reports from Services Australia to DSS, prepared in response to formal requests for data by DSS.

3.13 The ANAO undertook a systems assurance review of Services Australia’s production of the data and extraction of the reports, and found that there were no control deficiencies and the process was sound.

3.14 The data presented in the reports are not linked to program objectives or service standards, do not involve targets, and do not provide information to executive management about program performance against expectations. These characteristics mean that the reporting provides limited insight into whether the CDC is making an impact in the participating regions and whether the responsible entities are administering the program well.

Indue and TCU reporting

3.15 Indue provides multiple data reports to DSS relating to: participants’ transactions; cancellation or loss of card; point-of-sale data, including product level blocking; investigation reports on suspicious activity; and merchant data, including merchant investigations. Information is provided either on a daily, weekly, monthly or ad hoc basis, depending on the data report.

3.16 TCU is required to provide three reports to DSS: an information systems report (daily); a trial outcomes report (initially weekly for the first 10 weeks of the roll-out and then monthly); and a monthly service level report.

3.17 DSS advised that it uses the data it receives from the two card providers to prepare ‘a range of reports and monitoring’. DSS provided the ANAO with two examples of emails with attached data from TCU and Indue. DSS advised that the data is also being used for long-term analysis under the CDC Data Infrastructure and Analytics Project (refer paragraphs 4.43 to 4.45). The data has also been used for evaluations of the CDC Trial and a cost–benefit analysis.

DSS reporting

3.18 The DSS Data Monitoring Strategy for CDC Program Health required monthly reports to the branch manager. The Data Monitoring Strategy to Address CDC Circumvention outlines the potential avoidance activities that CDC participants may attempt in order to increase their access to cash or purchase restricted items and states that monthly CDC circumvention reports would be provided to the branch manager and may be provided to the Minister’s Office upon request.

  • DSS advised that monthly reports required under the Data Monitoring Strategy for CDC Program Health were replaced with quarterly dashboards.
  • The quarterly dashboards indirectly address three of four outputs and some of the 11 performance measures in Table 3.1. For example, there is information on the number of active accounts (participants), account balances, expenditure by merchant category and blocked transactions. The dashboards do not address the fourth output. DSS advised that other regular reports contain the performance measures and provided two examples. ANAO analysis of the two examples found that they did not address the performance measures and were focused primarily on participant numbers.
  • DSS advised the ANAO that ad hoc CDC circumvention reports have been generated rather than a regular monthly circumvention report. DSS advised that monthly reports will commence once a CDC compliance strategy has been developed.

3.19 In addition to the above reports required under the strategies, several other reports are prepared for internal and external stakeholders.

  • DSS compiles ‘community dashboards’ for each CDC region (except for the Northern Territory). These dashboards draw on data provided by Services Australia and Indue and provide a point-in-time report including the numbers of participants in the region; type of income support paid; Indigenous identity; exit applications; numbers and value of card transactions; merchant information; and declined transactions. DSS advised the ANAO that the dashboards are presented at meetings with the community reference groups within the CDC regions. These meetings happen monthly or bi-monthly, depending on the area.
  • DSS advised that it provides reports to the Secretary of DSS during new CDC site implementation and at different intervals (daily, weekly or fortnightly) during different stages of the implementation. DSS provided examples of the reports to the Secretary during the Cape York roll-out.
  • DSS provides occasional briefings to the Australian Government on the CDC program. These briefings focus on the operation of the card and changes to the program. There is limited focus on performance information including program impact.

Recommendation no.1

3.20 Department of Social Services develops internal performance measures and targets to better monitor CDC program implementation and impact.

Department of Social Services’ response: Agreed.

3.21 The department has revised its internal performance measures and targets to better monitor CDC program implementation and impact. The department is committed to continuing to strengthen the governance and oversight arrangements established over performance monitoring and reporting.

Are there effective external performance measures and reporting for the CDC program?

In 2020–21, DSS developed two external performance indicators for the CDC program. An ANAO audit of DSS’ 2020–21 Annual Performance Statement found that the two indicators were directly related to a key activity (the CDC program), that one of the performance indicators was measurable, and that part of the second indicator was not measurable because it was not verifiable and was at risk of bias. A minor finding was raised. DSS reports annually against the two CDC performance measures.

3.22 Identifying relevant performance measures and establishing appropriate targets assists entities to measure and demonstrate whether program objectives are being met.28

3.23 The ANAO undertakes annual performance statements audits on selected entities’ performance information as set out in their Portfolio Budget Statements and corporate plans and reported on in their annual performance statements. It assesses the appropriateness of performance information against the requirements of the Public Governance, Performance and Accountability Act 2013 (PGPA Act) and the Public Governance, Performance and Accountability Rule 2014 (PGPA Rule).

3.24 PGPA Rule section 16EA requires an entity’s performance measures, in the context of the entity’s purposes or key activities, to:

  (a) relate directly to one or more of those purposes or key activities;
  (b) use sources of information and methodologies that are reliable and verifiable;
  (c) provide an unbiased basis for the measurement and assessment of the entity’s performance;
  (d) where reasonably practicable, comprise a mix of qualitative and quantitative measures;
  (e) include measures of the entity’s outputs, efficiency and effectiveness if those things are appropriate measures of the entity’s performance; and
  (f) provide a basis for an assessment of the entity’s performance over time.

3.25 Under subsection 16E(2) item 5 of the PGPA Rule, an entity’s corporate plan should specify targets for each of those performance measures for which it is reasonably practicable to set a target.

3.26 Prior to 2020–21, DSS reported on the number of CDC participants in its annual performance statement. In 2020–21 DSS implemented two performance measures with targets (Table 3.3).

Table 3.3: CDC external performance measures, 2020–21

Key activity (CDC): Aims to reduce the levels of harm associated with alcohol consumption, illicit drug use and gambling

Performance measure 1: Extent to which the CDC supports a reduction in social harm in communities
  • Target: Evaluation results show improvements in social outcomes
  • Method: Evaluation by an independent provider
  • Description: Evaluates the improvement in social outcomes of the relevant population

Performance measure 2: Extent to which participants are using their CDC to direct income support payments to essential goods and services, including to support the wellbeing of the participant
  • Target: 95 per cent of CDC participants have activated their card and are using their card to purchase non-restricted items
  • Method: Analysis of program administrative data on activation and use of the card
  • Description: Measures the number of participants having activated their cards, not the benefit of activating their card and accessing non-restricted items

Source: DSS Corporate Plan 2020–21 and ANAO analysis.

3.27 The 2020–21 Annual Performance Statements audit of DSS’ performance measures assessed whether the two CDC performance measures were related (refer to criterion ‘a’ in paragraph 3.24) and measurable (refer to criteria ‘b’, ‘c’ and ‘f’ in paragraph 3.24).

Table 3.4: Assessment of 2020–21 performance measures for the Cashless Debit Card

The table sets out the ANAO’s assessment of the two CDC performance measures (‘Extent to which the CDC supports a reduction in social harm in communities’ and ‘Extent to which participants are using their CDC to direct income support payments to essential goods and services, including to support the wellbeing of the participant’) against whether each measure is related, and whether it is measurable on the basis of reliable data, a verifiable method and freedom from bias. The rating scale is: fully and/or mostly meets requirements; partially meets requirements; does not meet requirements. The assessment results are described in paragraph 3.28.

Note a: Related refers to the requirement of subsection 16EA(a) of the PGPA Rule 2014, as amended. In applying the ‘related’ criterion, the ANAO assessed whether the entity’s performance measures:

  • related directly to one or more of the entity’s purposes or key activities;
  • provided a clear link between purposes, key activities and performance measures; and
  • were expressed in a consistent way.

Note b: In applying the ‘measurable’ criterion, the ANAO assessed whether the entity’s performance measures were:

  • reliable and verifiable: used sources of information and methodologies that are reliable and verifiable;
  • free from bias: provided an unbiased basis for the measurement and assessment of the entity’s performance; and
  • provided a basis for an assessment of the entity’s performance over time.

Source: ANAO analysis based on the PGPA Act and PGPA Rule.

3.28 The DSS 2020–21 Annual Performance Statements audit made the following findings.

  • Extent to which the CDC supports a reduction in social harm in communities — The performance indicator was found to be related and measurable. Although the outcome ‘to determine whether [a reduction in the amount of certain payments available to be spent on restricted items] decreases violence or harm in trial areas’ was removed from the legislation (refer Table 1.2), DSS has retained the outcome-related performance measure in its Portfolio Budget Statements for 2020–21 and 2021–22. The target was changed in 2021–22 to ‘Cashless Debit Card program data demonstrates improvement in social outcomes through analysis of card spending and use’.
  • Extent to which participants are using their CDC to direct income support payments to essential goods and services — The performance indicator was found to be related and partly measurable, depending on the target.
    • 95 per cent of CDC participants have activated their card — This part of the second indicator’s target was found to be related and measurable. As set out in the CDC Program Profile29, the target of 95 per cent for the second measure was selected on the basis of Indue operational data, which showed that in 2018–19 around 95 per cent of participants had an active CDC in place that was being used to purchase unrestricted items.30
    • 95 per cent of CDC participants are using their card to purchase non-restricted items — This part of the second indicator’s target was found to be related but not measurable. The target was found to be not verifiable and at risk of bias. This is because the card provider, Indue Ltd (Indue), tracked merchant-level data and did not track the purchase of individual non-restricted items or products, and there was potential for ‘operational’ and ‘exclusion’ bias.31 Once product level blocking (refer paragraph 1.16) is fully rolled out across all mixed merchants by July 2022, the finding will be addressed. Until then, the performance measure will continue to be at risk of operational and exclusion bias. A simplified sketch of how the activation and usage rate behind the 95 per cent target could be computed is included after this list.
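For context on the 95 per cent target discussed above, the sketch below shows how an activation-and-usage rate of this kind could be computed from participant-level administrative data. The records and field names are hypothetical and do not reflect Indue’s or DSS’ actual reporting.

```python
# Hypothetical participant-level records: whether each participant has an
# activated CDC and whether the card was used to purchase non-restricted items
# in the reporting period. Field names are illustrative only.
participants = [
    {"id": 1, "card_active": True,  "unrestricted_purchases": 12},
    {"id": 2, "card_active": True,  "unrestricted_purchases": 0},
    {"id": 3, "card_active": False, "unrestricted_purchases": 0},
    {"id": 4, "card_active": True,  "unrestricted_purchases": 3},
]

meeting_measure = [
    p for p in participants
    if p["card_active"] and p["unrestricted_purchases"] > 0
]

rate = len(meeting_measure) / len(participants)
print(f"{rate:.0%} of participants have an activated card being used "
      f"for non-restricted purchases (target: 95 per cent)")
# With these four hypothetical records the rate is 50 per cent; the actual
# measure would be computed over the full participant population.
```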

3.29 DSS should review the measurability of the second CDC performance measure.

3.30 In its 2020–21 Annual Performance Statement, DSS reported that it met both of the external CDC performance measure targets.

  • Extent to which the CDC supports a reduction in social harm in communities — DSS reported ‘Findings made in relation to improvements in social outcomes’.
  • Extent to which participants are using their CDC to direct income support payments to essential goods and services — DSS reported that 95 per cent of CDC participants had activated their card and were using it to purchase non-restricted items.

3.31 The DSS 2020–21 Annual Performance Statements audit found this reporting to be accurate.

3.32 DSS also publishes monthly CDC statistics on the data.gov.au website. The reports have been published since December 2019 and cover program outputs such as:

  • the number of CDC participants by location;
  • community panel decisions to change the restricted portion of an income support payment; and
  • wellbeing exemptions and exits.

4. Evaluation, cost–benefit analysis and post-implementation review

Areas examined

This chapter examines whether the extension and expansion of the Cashless Debit Card (CDC) program was informed by an effective second impact evaluation, cost–benefit analysis and post-implementation review.

Conclusion

The CDC program extension and expansion was not informed by an effective second impact evaluation, cost–benefit analysis or post-implementation review. Although the Department of Social Services (DSS) evaluated the CDC Trial, a second impact evaluation was delivered late in the implementation of the CDC program, had similar methodological limitations to the first impact evaluation and was not independently reviewed. A cost–benefit analysis and post-implementation review on the CDC program were undertaken but not used. The recommendations from Auditor-General Report No.1 2018–19 relating to evaluation, cost–benefit analysis and post-implementation review were not effectively implemented.

Areas for improvement

The ANAO made one recommendation for DSS to undertake an external review of the second impact evaluation of the CDC.

4.1 Auditor-General Report No.1 2018–19 The Implementation and Performance of the Cashless Debit Card Trial found that a strategy to monitor and analyse the CDC Trial was developed and approved by the Minister for Families and Social Services. The audit found that DSS did not complete all the activities identified in the strategy (including a cost–benefit analysis), did not undertake a post-implementation review of the CDC Trial despite its advice to the Minister that it would do so, and did not build evaluation into the CDC Trial design.

4.2 The ANAO recommended in 2018–19 that DSS should continue to monitor and evaluate the extension of the CDC in Ceduna, East Kimberley and any future locations to inform design and implementation. In addition, the ANAO recommended DSS review its arrangements for monitoring and evaluation, including the level of collaboration between the evaluation and CDC program areas, and build evaluation capability within DSS to facilitate the effective review of evaluation methodology. Finally, the ANAO recommended that DSS should undertake a cost–benefit analysis and a post-implementation review of the CDC Trial to inform the extension and further roll-out of the CDC.32

4.3 To examine whether DSS undertook an effective evaluation, cost–benefit analysis and post-implementation review, the ANAO examined:

  • the evaluation process, methodology and report;
  • the cost–benefit analysis process and report; and
  • the post-implementation review process and report.

Was the management of the 2021 evaluation effective?

DSS’ management of the second impact evaluation of the CDC Trial was ineffective. Results from the second impact evaluation were delivered 18 months after the originally agreed timeframe, and there is limited evidence the evaluation informed policy development. The commissioned design of the second impact evaluation did not require the evaluators to address the methodological limitations that had been identified in the first impact evaluation. DSS did not undertake a legislated review of the evaluation.

4.4 In December 2015 DSS developed the CDC Trial Monitoring and Evaluation Strategy, which indicated that DSS would evaluate the CDC Trial. The strategy, which was approved by the Assistant Minister for Social Services on 17 February 2016, required a monitoring project (including a cost–benefit analysis, discussed from paragraph 4.50) and an evaluation project.

4.5 Auditor-General Report No.1 2018–19 examined the first impact evaluation of the CDC, which was undertaken from February 2016 to August 2017 by ORIMA Research (ORIMA). Auditor-General Report No.1 2018–19 found that DSS developed high-level guidance to support its approach to evaluation; however, the guidance was not fully operationalised. DSS did not build evaluation into the CDC Trial design, nor did it collaborate and coordinate data collection to ensure an adequate baseline to enable impact measurement, including any change in social harm.

4.6 DSS agreed to the recommendation from Auditor-General Report No.1 2018–19 that it undertake a second evaluation of the CDC program. The focus of this audit is on DSS’ management of the second impact evaluation. The ANAO also examined DSS’ evaluation strategy and approach for the CDC program along with changes to DSS’ overall evaluation capability.

Evaluation strategy

4.7 In December 2018 the CDC Evaluation Steering Committee agreed that an evaluation strategy was a priority for development. The request for quote and the contract with the second evaluators — the University of Adelaide — guided the development of the project plan, which functioned as an evaluation strategy for the CDC. It included many of the required elements, including an evaluation objective; outcomes; questions; methodology and data; consultation process and peer review; deliverables; and timeframes.

4.8 DSS developed risk assessments for the second impact evaluation and for baseline data collection in the Bundaberg and Hervey Bay, and Goldfields regions. Similar to findings in Auditor-General Report No.1 2018–19 on the first impact evaluation:

  • the risk plans were developed as part of the procurement process and were not updated when issues arose, such as delays to the evaluation;
  • where a risk was considered outside of tolerance, treatments were not registered; and
  • there were four instances in each register where the risk rating was in error according to the DSS risk framework matrix and, as a result, these risks were not appropriately treated and escalated.

Second impact evaluation of the CDC

4.9 The University of Adelaide was commissioned in November 2018 to undertake the second impact evaluation of the CDC.

4.10 DSS established the methodology and requirements for the impact evaluation in the request for quote and the contract. The University of Adelaide advised the ANAO that core design project parameters (including one wave of survey data collection, the absence of a control group, and the use of administrative and community level data) were determined by the request for quote and the subsequent contract. DSS advised the ANAO that it ‘expected that potential suppliers apply their expertise when proposing an evaluation design as part of their response to a Request for Quotation [and that a] potential supplier may propose an evaluation design that exceeds the Customer’s minimum requirements’.

4.11 An overarching requirement was to ‘develop sound, rigorous evidence-based information and knowledge to inform policy decisions’. DSS also required the evaluators to consider the first evaluation, the first ANAO audit on the CDC and internal CDC framework documents. Other requirements included:

  • reviewing a program logic, theory of change and key performance indicators;
  • undertaking a single wave quantitative survey of CDC participants in the Ceduna, East Kimberley and Goldfields regions;
  • conducting qualitative interviews with CDC participants and stakeholders in the Ceduna and East Kimberley regions;
  • conducting qualitative interviews with CDC participants and stakeholders in the Goldfields region, as a follow-up to baseline interviews in the Goldfields region (refer paragraph 4.19); and
  • ongoing analysis of CDC participant administrative data and available community (state harm and support service) data.

4.12 Although baseline qualitative data was collected for the Bundaberg and Hervey Bay region, full impact evaluation coverage of that region did not proceed because DSS considered the impact evaluation would not be of value where the program had been operating for a relatively short time. Qualitative interviews conducted in this region as part of baseline studies enabled an assessment of the conditions within the Bundaberg and Hervey Bay region just before, and at the time of, the introduction of the CDC. They were also conducted for the purpose of developing a design and framework for a potential future impact evaluation of the CDC in the region.

4.13 The timeline for the various phases of the evaluations is shown in Figure 4.1.

Figure 4.1: CDC evaluation timing

A timeline of the Cashless Debit Card evaluation activities from February 2016 to September 2021, showing the commencement dates of the CDC program in each region, the timespan to complete the evaluation activities, and the baseline data collection fieldwork period and deliverables.

Note: Dates associated with the development of the quantitative survey instrument are as advised to the ANAO by the University of Adelaide. A Public Interest Certificate is issued when a research proposal meets the necessary legislative requirements for accessing data. The University of Adelaide advised the ANAO that the Public Interest Certificate allowing them to conduct the quantitative fieldwork was issued in September 2019. DSS advised that two other Public Interest Certificates were issued — one in March 2019 (which allowed access to data from the Bundaberg and Hervey Bay region) and another in June 2019 (which enabled the release of CDC participant income support payment data to the University of Adelaide).

Source: ANAO analysis.

Program logic and theory of change

4.14 A program logic and theory of change was developed in July 2016 by ORIMA (Appendix 4).33

4.15 The request for quote for the second impact evaluation stated that the program logic and theory of change would be reviewed and refined at the beginning and end of the evaluation. A workshop was undertaken in November 2018 between DSS and the University of Adelaide before the second evaluation, with no resulting changes to the program logic and theory of change. The program logic and theory of change were reworked and combined into a policy logic as part of the second impact evaluation (Figure 4.2).

Figure 4.2: CDC policy logic and expected transmission method, January 2021

Diagram depicting the Cashless Debit Card overarching policy objectives at the individual, family/household and community levels.

Source: Report on the Evaluation of the CDC in Ceduna, East Kimberley and the Goldfields Region, January 2021.

4.16 The evaluation questions in the request for quote and the contract covered outcomes, both intended and unintended, to be observed from the CDC in the three regions. These included assessing:

  • ‘first round’ effects — reduced spending on and consumption of alcohol, illegal drugs and gambling, and their interactions;
  • ‘second round’ effects — increased spending on groceries, basic household goods and paying bills; and
  • tertiary impacts — reduced violence and crime in CDC sites, increased economic participation from CDC participants and a reduced reliance on social security.

4.17 Reporting examined several first round effects, second round effects and tertiary impacts of the CDC on participants, their families and their communities. The evaluators advised the ANAO that ‘while some shorter term effects were clearly visible in the data, some of the longer-term effects were harder to trace within the short-term sampling framework for the evaluation’.

Baseline data

4.18 Auditor-General Report No 1 2018–19 identified that a significant limitation of the first evaluation was the lack of baseline data against which program impact (including reductions in social harm) could be assessed.34

4.19 The University of Adelaide was commissioned to undertake baseline data studies in the Goldfields and Bundaberg and Hervey Bay regions. The University of Adelaide advised the ANAO that the baseline data collection ‘aimed to understand the pre-conditions and expectations of the impact of the CDC’.

  • Goldfields baseline — Baseline studies included quantitative and qualitative data elements. The quantitative element focused on the availability and potential usefulness of administrative and external data, and on developing the sample, methodology and survey instrument for quantitative data collection to evaluate impact. The qualitative data collection was aimed at developing the design of the quantitative survey fieldwork, and refining the key evaluation questions and program logic. Qualitative interviews were held between June and September 2018, after the CDC Trial had commenced in March 2018. The results from both elements were provided to DSS by the evaluators, and the findings from the qualitative data collection were published in February 2019. Due in part to its primary purpose of informing the evaluation design, the quantitative element of the baseline study was not published.
  • Bundaberg and Hervey Bay baseline — Qualitative interviews were conducted from January to April 2019. The Cashless Debit Card Baseline Data Collection in the Bundaberg and Hervey Bay Region: Qualitative Findings baseline report was finalised in December 2019. Ministerial approval for publication was provided in May 2020.35

4.20 There was no baseline data collected or used for any of the regions prior to implementation of the CDC Trial to assist with establishing program impact. Although studies which were called ‘baseline’ were undertaken, the University of Adelaide advised that neither baseline study was designed to collect information prior to program commencement.

  • The CDC program commenced in the Bundaberg and Hervey Bay region in January 2019, and data was collected from January to April 2019. Similarly, baseline data in the Goldfields region was collected three to six months after the program had commenced in that region.
  • The second impact evaluation did not draw on data from the first impact evaluation as a point-in-time baseline as the University of Adelaide considered the first evaluation’s methodology to be flawed.
  • The University of Adelaide stated that a longitudinal methodology (where the same participant could be followed over time) was not possible, primarily because the first evaluation data did not retain contact details of survey participants that could be used for longitudinal follow-up.

4.21 The Cashless Debit Card Baseline Data Collection in the Bundaberg and Hervey Bay Region: Quantitative Data Snapshot May 2020 provided, in part, an overview of the social and economic characteristics of the population living in the area prior to the roll-out of the CDC. The study used publicly available data sources (such as hospital presentations, crime, gaming in hotels, labour force participation and employment, household characteristics and socio-economic status) and also drew on Australian Government administrative data to describe the characteristics of welfare benefit recipients in the regions. This information was collected to inform a future evaluation.

Impact evaluation methodology

4.22 The second evaluation methodology consisted of quantitative surveys, qualitative interviews, analysis of Australian Government administrative data, and use of community level data. Quantitative survey data was collected through one wave of surveys of CDC participants in the Ceduna, East Kimberley and Goldfields regions. Australian Government and state government administrative data was used primarily to describe the demographic profile of the regions (for example, participant and community characteristics) and to validate and weight the evaluation sample of CDC participants. The University of Adelaide advised the ANAO that the second evaluation explored community level data in order to determine the degree to which this type of data could be used, but found it would be of limited value for the evaluation.
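To illustrate what weighting a survey sample to administrative benchmarks typically involves (a generic sketch only; the evaluators’ specific weighting procedure is not described in this report), a simple post-stratification weight takes the form:

$$ w_g = \frac{N_g}{n_g} $$

where $N_g$ is the number of CDC participants in stratum $g$ (for example, an age-by-region group) according to administrative data, and $n_g$ is the number of survey respondents in that stratum. Each respondent in stratum $g$ is assigned the weight $w_g$, so that weighted survey estimates align with the known participant population.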

4.23 Table 4.1 shows the main characteristics of the first and second impact evaluations.

Table 4.1: Comparison of the first and second CDC evaluations

 

First impact evaluation: February 2016 to August 2017. Second impact evaluation and baseline studies: July 2018 to January 2021.

CDC region coverage

  • First evaluation: Ceduna and East Kimberley
  • Second evaluation: Ceduna, East Kimberley and the Goldfieldsa

Participant data collection

  • First evaluation: two waves of quantitative surveys involving 552 CDC Trial participants; qualitative interviews with participants and stakeholders; self-reported data on socially unacceptable behaviours
  • Second evaluation: one wave of quantitative surveys involving 1936 CDC Trial participantsb; qualitative interviews with participants and stakeholders; self-reported data on socially unacceptable behaviours

Community level and administrative data used for outcomes analysisc

  • First evaluation: gaming machine revenue in Ceduna (South Australian Attorney General’s Department); South Australian and Western Australian police crime data; South Australian and Western Australian government agency data regarding drug and alcohol related injuries and hospital admissions
  • Second evaluation: South Australian and Western Australian police crime data; the University of Adelaide included some descriptive statistics of other community level data but did not use this information to make statistical inferences regarding the impact of the CDC, deeming this to be methodologically unsuitabled

Baseline data and comparison sites

  • First evaluation: first and second wave quantitative survey data were compared; comparison site data was used for some measures; non-CDC participants were surveyed as a control group
  • Second evaluation: no comparison across waves; no control group or comparison sitee

Note a: Qualitative interviews were also conducted in this period in the Bundaberg and Hervey Bay region for the purpose of conducting a third evaluation.

Note b: The requirement for a single wave of data collection was in the request for quotation.

Note c: The University of Adelaide advised the ANAO that the method and extent of access to administrative data and community level data in the second evaluation was to be managed by DSS.

Note d: Both the first and second evaluation accessed administrative data but did not use it specifically for the purposes of outcomes analysis. The first evaluation used administrative data to profile the demographic characteristics of CDC participants in the first three CDC Trial regions. The second evaluation used CDC participants’ administrative data to describe the socioeconomic characteristics and patterns of CDC transactions of CDC program participants within the CDC Trial regions.

Note e: The University of Adelaide advised the ANAO that the decision not to use a control or comparison group was taken by DSS and was due to time and funding limitations. The request for quote required the evaluators to ‘Explore the feasibility of and provide advice on utilising a future comparison group, using existing administrative data on CDC participants to create a sampling frame for the comparison group’.

Source: ANAO analysis of first and second CDC impact evaluation reports.

4.24 The University of Adelaide undertook the evaluation largely in line with the evaluation plan and the methodology set by DSS. Exceptions were:

  • no matching of Goldfields baseline and follow-up data as proposed in the project plan36; and
  • limited integration of previous research material, including the evidence collected for the first evaluation.37

4.25 The University of Adelaide made reference in its evaluation report to methodological limitations affecting the ability to draw conclusions about impact.38 These included: behaviours subject to strong social acceptability reporting biases39; lack of experimental design in the trials; no quantitative survey of stakeholders and the broader non-CDC population; longer-term outcomes being assessed in a short study timeframe; community data limitations; concurrent CDC policies and initiatives making it difficult to identify whether it was the CDC that drove any observed changes; and lack of generalisability of the findings due to extensive use of qualitative methodologies. The evaluation report also stated that a challenge of the second evaluation was to gain the trust of CDC participants and stakeholders.

4.26 The ANAO also notes that in addition to social acceptability bias identified by the evaluators, participants were asked to recall their behaviours and attitudes prior to the introduction of the CDC, as well as reporting their current behaviours. This reliance on memory is subject to recall bias. A range of data sources including alcohol sales, emergency hospital presentations and wastewater monitoring were not used in the second evaluation at an aggregated level. The concurrent programs funded by the Australian Government through the CDC program are discussed at paragraph 1.5, noting that there are also additional programs funded by the states and territories.

4.27 In presenting these caveats, the University of Adelaide noted the multi-disciplinary, inclusive, multi-method and multi-faceted evidence base of the evaluation; that the limitations and caveats do not invalidate the findings of the evaluation; that it did not consider the research evidence to be flawed; and that empirical research of this nature contains inherent and unavoidable limitations. In the evaluation report, users of the evaluation were encouraged to avoid ‘misleading conclusions that are either too strong, or too weak, in any particular direction’.40

4.28 The University of Adelaide noted that ‘the [CDC] trials were not implemented with an evaluation design built into them’.41 The lack of an evaluation design made it difficult to develop statistically robust quantitative analysis measuring the impact of the CDC. In its response to this audit, the University of Adelaide further noted that:

In the reports, the University of Adelaide highlighted the methodological superiority of a rigorous experimental design for policy evaluation. It is recognised that the implementation lies in the hands of the client who introduces the time and funding parameters of such evaluations.

Evaluation findings

4.29 The findings from the second impact evaluation were presented in three parts: a consolidated report, a quantitative supplementary report and a qualitative supplementary report. For the consolidated report, evidence from the quantitative surveys and qualitative interviews was combined to come to an overall assessment.

4.30 In relation to the reduction of three social harms (first round effects), the second evaluation found the following.

  • Alcohol consumption — Alcohol consumption was reduced after the introduction of the CDC. The evaluators stated it was not possible to attribute these changes to the CDC alone.
  • Gambling — There was a 3.5 per cent short term reduction in the prevalence of gambling in each of the CDC Trial site areas.42 One in five CDC participants reported that the CDC had helped reduce gambling problems.
  • Illicit drug use — The evaluation could not provide a clear conclusion about whether the CDC influenced the personal or social harm caused by the use of illicit drugs. Results suggested a net five per cent increase in the use of illegal drugs when using the same methodology that was used to calculate the decrease in gambling.

4.31 The majority of CDC program participants reported not participating in the target behaviours prior to the CDC implementation. Findings in relation to second round effects and tertiary outcomes were mixed.

Cost, timing and reporting of the evaluation

4.32 The initial budget for the second impact evaluation was $1.5 million. During the evaluation fieldwork, two variations to the original contract price were agreed. Variations were requested to conduct additional interviews, surveys and stakeholder consultations; and for additional staff. The total cost was $2.5 million excluding goods and services tax.

4.33 The evaluation was initially scheduled to be completed in June 2019. The University of Adelaide advised the ANAO that the survey instrument was completed by the University and approved by DSS in early February 2019. The University of Adelaide advised that the delay in DSS providing participant data in September 2019 meant that fieldwork could not commence until October 2019. DSS stated that many activities were able to be undertaken by the University of Adelaide while the relevant Public Interest Certificates were being finalised; however, the University of Adelaide stated that the delayed commencement meant that the fieldwork needed to be redesigned and that the time available for analysis and report-writing was compressed. The University noted that the impact of the COVID-19 pandemic also led to delays.

4.34 Delays to the evaluation were first reported to government in July 2020. The second impact evaluation report was completed in November 2020 and published in February 2021 on the DSS website.43 Approval to publish was provided by the relevant DSS manager in line with DSS policy.

4.35 Six briefings regarding the second impact evaluation were provided to government between July 2020 and January 2021. The government was provided with the final evaluation consolidated report in October 2020 and the final supplementary reports in November 2020.

4.36 A decision to continue the CDC program was made by the Parliament in December 2020. DSS advised the ANAO in December 2021 that:

The February 2021 legislation changes defined the Cape York region to be a CDC program site, to allow for the transition from Income Management to the CDC in Cape York. The purpose of the second impact evaluation was not to make a decision on whether certain legislation should be introduced. The second impact evaluation data collection provided an opportunity to build on the existing evidence base, while responding to lessons learned.

4.37 DSS advised the ANAO that DSS provided briefings to the Minister, including draft findings from the evaluation, prior to the legislative changes in December 2020 and that the changes relating to Cape York ‘were informed, in part, by these findings’. The ANAO reviewed the advice to government; it included one brief reference to the second evaluation, which was not yet complete at the time the advice was provided.

Independent review of the evaluation

4.38 Under the amendments to the Social Security (Administration) Act 1999 (SSA Act) made in September 2018, section 124PS required that reviews or evaluations of the CDC Trial be subject to an independent review, with a report on that review to be tabled in Parliament.44 The contract with the University of Adelaide notes that ‘the evaluation’s methodological rigour and integrity will be independently reviewed via an independent, peer review process’. The DSS executive approved a review of the evaluation in February 2021. DSS subsequently decided that it was not required to undertake the review because the evaluation was commissioned in November 2018, prior to the commencement of section 124PS of the SSA Act in December 2018. To date, no independent review has taken place. This decision does not appear to be consistent with the intent of the legislation.

Recommendation no.2

4.39 Department of Social Services undertakes an external review of the second impact evaluation of the CDC.

Department of Social Services’ response: Disagreed.

4.40 The author of the second impact evaluation, the University of Adelaide, the department and the ANAO have acknowledged the significant limitations of the evaluation due to availability of robust data, including data held by the state and territory governments.

4.41 Significant time has elapsed since the second impact evaluation was conducted (between November 2018 and January 2021), impacting the relevance and timeliness of any insights. Given this, and the fact that a review of the second evaluation would encounter the same constraints regarding data accessibility, the department is strongly of the view that the relevance and value of insights provided by the recommended review would be limited. The department is of the strong view that a review of the second evaluation does not represent value for money.

4.42 As acknowledged in the ANAO’s report, the department has a strong and continuing focus on addressing accessibility to data through the CDC Data Infrastructure and Analytics Project, and it is anticipated that by late 2022 there will be a maturing data asset that would allow the department to provide further advice to government on scope and timing of any subsequent evaluation. The department is supportive of an independent external review for subsequent evaluations and commits to undertaking an independent review for these evaluations.

CDC Data Infrastructure and Analytics Project

4.43 In 2020–21, DSS received $2 million in funding for improved data and evidence collection under the Cashless Welfare Economic and Employment Support Services Package budget measure. The CDC Data Infrastructure and Analytics Project was initiated after it was found that the two impact evaluations of the CDC could not make substantial use of available data to establish impact.

4.44 The project aims to provide greater evidence on program impact and participant outcomes to inform future policy development. It also aims to support analysis across a range of topics such as community harm, social harm, unemployment and long-term welfare dependence. The project involves three areas of work: obtaining and consolidating relevant Australian Government and state and territory data; building new data assets; and expanding data analytics capability.

4.45 The Minister approved the first stage of the project in May 2021. An approach to market was released in October 2021 and a contract was signed in November 2021. DSS anticipates that the project will be completed in approximately three years.

DSS’ evaluation approach and capability

4.46 DSS’ evaluation policy, last updated in February 2018, outlines DSS’ approach to evaluation. The policy includes information about the types of evaluations, evaluation principles, the DSS Evaluation Unit’s roles and responsibilities, and the evaluation planning process.

4.47 The DSS Evaluation Unit was responsible for the CDC second impact evaluation when it was commissioned in November 2018. In December 2018, the CDC Evaluation Section was established to manage the evaluation of the CDC until the evaluation’s completion in 2021.

4.48 DSS has developed guidelines to assist with managing evaluations, including a guide to program logic, templates and factsheets.

4.49 A Chief Evaluator was appointed in August 2017. DSS advised that the following evaluation capability improvements have been made since 2018–19:

  • refinement and updating of resources and guidance for line areas;
  • development and launch of the DSS Evidence Portal in June 2021, which consolidates research and evaluation knowledge into one searchable database; and
  • development of evaluation training resources.

Were a cost–benefit analysis and a post-implementation review undertaken?

A cost–benefit analysis and post-implementation review were undertaken on the CDC. Due to significant delays and methodological limitations, this work has not clearly informed the extension of the CDC or its expansion to other regions.

Cost–benefit analysis

4.50 Auditor-General Report No.1 2018–19 recommended that DSS undertake a cost–benefit analysis of the CDC, as this was a commitment under the CDC Trial Monitoring and Evaluation Strategy endorsed by the Minister. At the time of that audit, DSS had not undertaken this work. An initial approach to market for a cost–benefit analysis of the Ceduna and East Kimberley trial sites was approved by DSS in November 2018 at a cost of $80,000. DSS later indicated that the cost of the work would be higher than originally anticipated due to the complexity involved; a revised budget of $300,000 was approved.

4.51 In April 2019, a decision was taken to postpone the cost–benefit analysis so that it could take into account the findings of the second impact evaluation. In June 2020, DSS undertook an approach to market and a contract with the Centre for International Economics was finalised in August 2020.

4.52 The cost–benefit analysis was undertaken between August 2020 and September 2021.

4.53 The request for quote and the contract for the cost–benefit analysis stated that the analysis would cover the first three CDC program regions. Through a contract variation in December 2020, coverage was expanded to four CDC program regions (Ceduna; East Kimberley; the Goldfields; and Bundaberg and Hervey Bay45) over the period 2015–16 to 2019–20.

4.54 The cost–benefit analysis sought to quantify the following effects of the CDC:

  • costs — system administration; policy evaluation; communications, legal and consultancy; other departmental; and inconvenience to participants who prefer cash; and
  • benefits — the effects of reduced alcohol misuse on criminal justice, the health system, productivity and traffic accidents; reduced gambling; and improvements to child wellbeing associated with health, food, safety and education.

4.55 The approach to the cost–benefit analysis was based on the Department of the Prime Minister and Cabinet’s Cost–benefit analysis — guidance note March 2020.46 The cost–benefit analysis followed the steps in the guidance note: establishing the base case; quantifying the changes from the base case; placing monetary values on those changes; and undertaking sensitivity analyses to test key assumptions and inputs.
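For context, analyses following this guidance are commonly summarised as a net present value comparison of monetised benefits and costs against the base case. The following is a standard formulation offered for illustration; it is not taken from the Centre for International Economics report:

$$ \mathrm{NPV} = \sum_{t=0}^{T} \frac{B_t - C_t}{(1+r)^t} $$

where $B_t$ and $C_t$ are the monetised benefits and costs attributable to the program in year $t$ relative to the base case, $r$ is the discount rate and $T$ is the final year of the analysis period. Sensitivity analysis then varies key inputs, such as $r$ or the assumed size of behavioural change, to test whether the sign of the result is robust.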

4.56 The methodology for the cost–benefit analysis included:

  • development of an economic framework by the Centre for International Economics in conjunction with DSS, to identify the full range of potential benefits, and direct the areas for analysis such as the base case, benefit selection and impact measurement;
  • the evidence base from the second impact evaluation up to June 2020, which was requested by DSS at the time of procurement;
  • additional analysis on changes in employment outcomes, consumption patterns and expenditure behaviours; and
  • a consultation process with stakeholders who interact with program participants, to validate impacts and specific modelling assumptions.

4.57 Correspondence from the Centre for International Economics to DSS in November 2021 noted limitations in the data that had been used for the cost–benefit analysis, including:

  • existing data was not designed to measure economic impacts;
  • the evidence base was typically limited to ‘averages’, preventing separate measurement of impacts for those that benefited and those that did not; and
  • previous evidence and evaluations did not include all of the CDC regions covered by the cost–benefit analysis.

4.58 Limitations were also acknowledged in the final report dated 20 September 2021. In particular:

  • the second impact evaluation relied heavily on self-reported survey data, which was viewed as partly unreliable due to negativity bias47;
  • conclusions could not be drawn on whether benefits could be attributed to the CDC because of concurrent policies and support services in place; and
  • not all impacts identified in the economic framework had a sufficient evidence base, leading to inconclusive results.

4.59 The report’s main findings are listed below.

  • The costs of alcohol misuse were reduced by an estimated 15–20 per cent across all four CDC Trial sites based on self-reported changes in alcohol consumption. Causal attribution to the CDC could not be clearly established.
  • It was estimated there was a total benefit value of $8.5 million from reduced alcohol misuse between 2015–16 and 2019–20 through improved productivity, reduced traffic accidents, and reduced interactions with the criminal justice and health systems.
  • The reduction in gambling-related harm was estimated at $2.3 million, with net child wellbeing benefits of around $0.1 million.
  • There was insufficient evidence to substantiate measurable benefits in relation to safety, illicit drug use, employment outcomes or mental health.
  • The total program costs for the CDC were $68.3 million48 from 2015–16 to 2019–20, including all costs incurred by DSS, the card providers and participants.

4.60 The Centre for International Economics concluded that ‘Despite uncertainty around benefit estimates, the core conclusion that the benefits of the CDC are outweighed by the costs appears to be robust’.
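As a simple reconciliation of the headline figures above, and assuming the reported amounts are on a comparable present value basis over 2015–16 to 2019–20:

$$ 8.5 + 2.3 + 0.1 = 10.9 \ (\$\text{m, quantified benefits}); \qquad 68.3 - 10.9 = 57.4 \ (\$\text{m, net cost}) $$

The resulting $57.4 million net cost is consistent with the present value figure reported at footnote 48.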

Reporting of the cost–benefit analysis

4.61 The cost–benefit analysis has not been published.

4.62 At the Senate Community Affairs Legislation Committee hearing in October 2021, when asked about the cost–benefit analysis, DSS officers testified that there was insufficient data to undertake an accurate cost–benefit analysis and that additional work was being undertaken to compile more thorough data, including data held by the states and territories (discussed further at paragraphs 4.43 to 4.45). The Committee noted the period of time that had elapsed since Auditor-General Report No.1 2018–19 and stated that DSS would have known of the need for improved data.

4.63 In November 2021, DSS provided its Audit Committee with an update on actions taken to address the recommendations from Auditor-General Report No.1 2018–19, which included progress on the cost–benefit analysis. Action to address the recommendation was noted as complete with the recommendation being progressed through the closure process.

4.64 In December 2021, DSS advised the ANAO that there were difficulties in applying the cost–benefit analysis report findings due to the underpinning data limitations. As at February 2022, it was ‘considering implications of the findings’. DSS further advised that it has acquired a cost–benefit analysis tool which aims to enable further analysis as additional data becomes available in the future.

4.65 In May 2022, DSS advised the ANAO that, given the limitations of the cost–benefit analysis, it considered the findings to be ‘of very limited value’.

Post-implementation review

4.66 DSS advised the ANAO that the post-implementation review of the CDC Trial commenced in January 2019. The review involved a presentation of the context; a desktop review; confirmation of the terms of reference; in-depth program and departmental interviews; testing of initial findings and recommendations with senior department staff; and drafting of the final report. The terms of reference covered the following matters: initial policy design; program implementation, planning and execution; stakeholder engagement; stakeholder communications; service delivery; merchant management; community panels; complementary support services; program monitoring; and management.

4.67 A draft report was prepared in August 2019. Issues noted in the draft report included:

  • limited scope (a desktop review and internal consultations);
  • delayed execution of the review; and
  • absence of an implementation performance framework against which to assess effectiveness.

4.68 The draft report presented seven suggestions for program management consideration:

  • develop a strategy for ongoing community engagement and communications;
  • develop a longer-term strategy for support service funding, for both existing and new sites;
  • continue the current approach to administrative data management;
  • review and revise the program’s logic framework and related implementation, monitoring and management responsibilities;
  • consolidate a program operations manual which formalises all key CDC documentation;
  • consider the application of a more vertical program management structure; and
  • conduct a facilitated program workshop to formalise the approach, resourcing, management and monitoring responsibilities of community panels.

4.69 DSS subsequently had concerns about the draft report, including its scope and methodology. The Branch Manager recommended significant amendments.

4.70 DSS advised the ANAO that in October 2020 a review of the draft report was completed to ensure more emphasis was placed on findings and actionable recommendations. Further revisions were subsequently undertaken by DSS and the report was finalised in September 2021. The final post-implementation review made 37 recommendations across ten program components comprising initial policy design; program implementation, planning and execution; stakeholder engagement; stakeholder communication; service delivery; merchant management; community panels; complementary support services; program monitoring; and program management.

4.71 In November 2021, DSS provided an update to its Audit Committee on actions taken to address the recommendations from Auditor-General Report No 1 2018–19, which included progress on the post-implementation review.

4.72 DSS advised that the delay in finalising the report was due to the need for substantial revisions and the decision to wait until a program evaluation was completed. DSS further advised that this delay meant that some of the CDC program changes relating to the issues identified in the post-implementation review had already been implemented by the time the report was finalised.49 DSS also advised that it continues to look for opportunities to implement the recommendations resulting from the post-implementation review.

4.73 Delays to the post-implementation review meant that the further expansion of the CDC to Cape York and the Northern Territory was not clearly informed by the findings and recommendations of the review.

Appendices

Appendix 1 Entity responses

 

Page one of the response from the Department of Social Services. You can view a summary of the response in the summary and recommendations chapter of this report.

 

 

Page two of the response from the Department of Social Services. You can view a summary of the response in the summary and recommendations chapter of this report.

 

ANAO comment on Department of Social Services response

In relation to Recommendation 2, the Social Services Legislation Amendment (Cashless Debit Card Trial Expansion) Act 2018 included a requirement that any review or evaluation of the CDC Trial must be reviewed by an independent expert within six months of the Minister receiving the final report (refer note ‘c’ to Table 1.1). Contrary to the legislative requirement, the evaluation of the CDC program was never reviewed by an independent expert. Although the Department of Social Services (DSS) describes the 2021 evaluation as having limitations, the results were used by DSS to conclude that it met one of two externally reported CDC program performance measures: ‘Extent to which the CDC supports a reduction in social harm in communities’ (refer paragraph 3.30). In making the legislative amendment, the clear intent of Parliament was to obtain independent assurance that the cashless welfare arrangements are effective in order to inform expansion of the arrangements beyond the trial areas. A review of the evaluation methodology would, moreover, help ensure that the design of future evaluation work is fit for purpose and represents an appropriate use of public resources.

 

Page one of the response from Services Australia. You can view a summary of the response in the summary and recommendations chapter of this report.

 

 

Page two of the response from Services Australia. You can view a summary of the response in the summary and recommendations chapter of this report.

 

 

Response from the University of Adelaide.

 

 

Response from the CIE.

 

 

Response from Indue.

 

Traditional Credit Union

Hello Renina,

TCU has reviewed the ‘Extract from Auditor‐General Proposed Audit Report on Implementation and Performance of the Cashless Debit Card Trial’. TCU accepts the wording as presented and proposes no change.

Regards

Tony Hampton

Chief Executive Officer

Appendix 2 Improvements observed by the ANAO

1. The existence of independent external audit, and the accompanying potential for scrutiny, improves performance. Improvements in administrative and management practices usually occur: in anticipation of ANAO audit activity; during an audit engagement; as interim findings are made; and/or after the audit has been completed and formal findings are communicated.

2. The Joint Committee of Public Accounts and Audit has encouraged the ANAO to consider ways in which the ANAO could capture and describe some of these impacts. The ANAO’s 2021–22 Corporate Plan states that the ANAO’s annual performance statements will provide a narrative that will consider, amongst other matters, analysis of key improvements made by entities during a performance audit process, based on information included in tabled performance audit reports.

3. Performance audits involve close engagement between the ANAO and the audited entity as well as other stakeholders involved in the program or activity being audited. Throughout the audit engagement, the ANAO outlines to the entity the preliminary audit findings, conclusions and potential audit recommendations. This ensures that final recommendations are appropriately targeted and encourages entities to take early remedial action on any identified matters during the course of an audit. Remedial actions entities may take during the audit include:

  • strengthening governance arrangements;
  • introducing or revising policies, strategies, guidelines or administrative processes;
  • initiating reviews or investigations.

4. In this context, the below actions were observed by the ANAO during the course of the audit. It is not clear whether these actions and/or the timing of these actions were planned in response to proposed or actual audit activity. The ANAO has not sought to obtain assurance over the source of all of these actions or whether they have been appropriately implemented.

  • The final report of the post-implementation review of the Cashless Debit Card (CDC) program was finalised in September 2021.
  • In September 2021 an issues log to track Indue operational issues was established by the Department of Social Services (DSS).
  • A Letter of Exchange between DSS and Services Australia was developed and signed in October 2021 to formalise the bilateral arrangement between the two entities for the delivery of the CDC. The transition of CDC service delivery to Services Australia occurred in March 2020.
  • A service level agreement between DSS and Services Australia was finalised in April 2022. This replaces the Letter of Exchange and contains specific service level standards for the CDC program’s service delivery.
  • A new CDC Compliance Section was established in the Cashless Welfare Policy and Technology Branch in DSS in November 2021.
  • DSS requested that Indue provide an updated risk register, which was received in February 2022.
  • A CDC shared risk register between DSS and Services Australia was finalised in October 2021. Previously, some shared risks were set out in each entity’s CDC risk management plans. An updated shared risk register was provided in March 2022.

Appendix 3 Entities responsible for the Cashless Debit Card

Table A.1: Cashless Debit Card entity roles and responsibilities

Entity

Responsibilities

Department of Social Services (DSS)

  • Designing and implementing the Cashless Debit Card (CDC) program — setting policy, guidelines, eligibility criteria, and legislation
  • Providing advice and guidance to Services Australia on program policy, guidelines, eligibility criteria, and legislation
  • Managing the contract for the CDC providers, Indue and the Traditional Credit Union
  • Community engagement and consultation
  • Management of CDC community panel applications
  • Data monitoring and evaluation of the program
  • Merchant management and product level blocking roll out

Services Australia

  • Program implementation and service delivery operations for the CDC, including digital and telephony channels, face-to-face services, and remote servicing arrangements
  • Issuing temporary cards
  • Administration of all assessment processes and determinations related to the CDC
  • Exit applications
  • Assessment and facilitation of funds transfer requests
  • Assessment and facilitation of housing and other expenses limits
  • Provision of CDC data to DSS

Indue

  • Creating bank accounts for participants and provision of supporting services
  • Issuing and activating permanent cards and replacing lost or stolen cards
  • Balance checking and transaction history
  • Assisting participants to set up deductions and transfers from the CDC
  • Approval of merchants
  • Provision of back-end banking systems infrastructure
  • Provision of transactional data to DSS

Traditional Credit Union

  • Creating bank accounts for participants in the Northern Territory and provision of supporting services, for example call centre and website
  • Issuing and activating permanent cards and replacing lost or stolen cards
  • Balance checking and transaction history
  • Assisting participants to set up deductions and transfers from the CDC
  • Providing a network of branches for CDC participants to access

Local partners

  • Issuing temporary cards and assisting with activation
  • Supporting participants to activate permanent cards
  • Providing participants with phone and online banking technology and associated support
  • Assisting participants, through agents, to provide documentation to Services Australia
   

Source: ANAO analysis.

Appendix 4 Cashless Debit Card program logic and theory of change

Figure A.1: Cashless Debit Card program logic, June 2016

 

Diagram showing the relationship between inputs, activities, outputs and outcomes of the CDC program. It also shows some of the potential additional benefits of the program and some of the unintended adverse consequences.

 

Source: DSS documentation.

Figure A.2: Cashless Debit Card theory of change, June 2016

 

Diagram presenting the changes that the CDC program is expected to contribute to in the CDC regions, such as reduced drug and alcohol use and gambling, and fewer crime and family violence incidents.

 

Source: DSS documentation.

Footnotes

1 As at January 2022, the two branches in DSS that manage the CDC program are the Cashless Welfare Engagement and Support Services Branch and the Cashless Welfare Policy and Technology Branch. In Services Australia, the CDC program is managed by the Deduction and Confirmation Branch.

2 Local partners are non-governmental organisations contracted by Indue to provide face-to-face support to CDC participants in their communities. Local partners are meant to assist with initial card set up, account balance checking, bill payments, temporary and replacement cards, and general issues.

The legislation for the CDC allows the Minister to authorise the establishment of a community body (called a community panel). The function of community panels is to make determinations by written direction to Services Australia to vary a CDC participant’s restricted amount of payment on the CDC to a percentage that is different to the standard percentage but still between 50 and 80 per cent, depending on the region. Community panels have delegated authority to review a CDC participant’s restricted percentage but do not have the power to determine who does or does not participate in the CDC program. Community panels are only in place in the Ceduna and East Kimberley regions.

3 The Newborn Upfront Payment is a lump sum payment when an individual starts caring for a child, is eligible for Family Tax Benefit Part A and is not receiving Parental Leave Pay for the same child.

4 eftpos is a domestic payments network that facilitates the electronic transfer of funds when debit or some credit cards are used at the point of sale.

5 Recommendation 1 of Auditor-General Report No.1 2018–19: Social Services should confirm risks are rated according to its Risk Management Framework and ensure mitigation strategies and treatments are appropriate and regularly reviewed.

Recommendation 2 of Auditor-General Report No.1 2018–19: Social Services should employ appropriate contract management practices to ensure service level agreements and contract requirements are reviewed on a timely basis.

Recommendation 3 of Auditor-General Report No.1 2018–19: Social Services should ensure a consistent and transparent approach when assessing tenders and fully document its decisions.

6 Department of Finance, Commonwealth Risk Management Policy, Finance, 2014, p.9.

7 DSS updated the Risk-E tool to automatically calculate risk ratings as part of addressing the ANAO recommendations from the previous CDC audit; however, the CDC program area does not use the tool for its management of CDC risk.

8 The CDC Steering Committee consists of Senior Executive Service and Executive Level 2 representatives from DSS, Services Australia, the National Indigenous Australians Agency and the Department of the Prime Minister and Cabinet. It meets monthly and its purpose is to guide the direction of the CDC program.

9 Three risks were not rated in line with Services Australia’s risk matrix. These are now closed risks in the current registers but two were rated incorrectly when the risks were open.

10 The Commonwealth Risk Policy states that ‘Shared risks are those risks extending beyond a single entity which require shared oversight and management. Accountability and responsibility for the management of shared risks must include any risks that extend across entities and may involve other sectors, community, industry or other jurisdictions’.

11 A letter of exchange between Services Australia and DSS sets out the requirements for the two entities to work together to implement policies and deliver services.

12 DSS advised that it conducts a daily review of social media content to identify any issues relating to the functionality of the card at merchants, and that an email is distributed each morning covering media reports and social media postings. The most common issue is a report of a declined purchase. In rare cases, where it is identified that participants share advice on circumvention strategies on social media, DSS will investigate the matter further.

13 The Department of Finance defines a ‘limited tender’ as a procurement based on quotes being sought directly from one or more suppliers. It includes what was previously referred to as ‘sole source’, ‘select’ or ‘restricted’ source procurements. A limited tender can only be used for procurements above the relevant value thresholds where it is specifically allowed by the Commonwealth Procurement Rules.

14 DSS is trialling new providers through the Additional Issuer Trial. The Trial is intended to provide financial institution branch network presence and offer participants more choice in their financial institution.

15 Department of Finance, Commonwealth Procurement Rules, DoF, 2020. Division 2, clause 10.3(g): ‘when a relevant entity procures a prototype or a first good or service that is intended for limited trial or that is developed at the relevant entity’s request in the course of, and for, a particular contract for research, experiment, study, or original development’.

16 The Minderoo Foundation is a philanthropic organisation.

17 If this criterion is met under Appendix A – Exemption 16 of the Commonwealth Procurement Rules (CPRs), then the procurement is exempt from meeting Division 1 paragraphs 4.7, 4.8 and 7.26, and Division 2 of the CPRs.

18 From 1 July 2016, Commonwealth entities are required to include mandatory minimum requirements for Indigenous participation for certain contracts. When tendering for procurements to which minimum requirement targets apply, tenderers must outline within an Indigenous Participation Plan how those targets will be achieved.

19 DSS advised the ANAO that the procurement did not exceed the overall budget approved by the delegate as the ‘split of funding between Indue and TCU shifted during contract finalisation based on legal advice regarding recommended commercial contracting practice’.

20 The Term Sheet (Deed) was developed temporarily in lieu of a Deed of Collaboration which was signed on 10 March 2022. DSS stated that the Deed of Collaboration aims to ensure that there are clear roles between DSS, Indue and TCU including boundaries, dependencies, collaboration requirements, mechanisms for governance and dispute avoidance and management.

21 The initial contract includes funding to cover approximately 12,500 participants with TCU CDC accounts, as it was anticipated that some participants would not choose to transition. DSS plans to vary the contract when TCU gets close to supporting 12,500 participants.

22 Services Australia’s Bilateral Agreement Framework states that it ‘lays the foundation for developing bilateral agreements and managing Services Australia’s relationships with partner entities’.

23 Under the Services Australia and DSS bilateral management arrangement, the Statement of Intent is underpinned by protocols, services schedules, letters of exchange and shared premises agreements.

24 Services Australia provides weekly reports to DSS on CDC exits and this information is available monthly on the data.gov.au website as part of a high-level ‘dashboard’ report on the CDC program.

25 Recommendation 5 of Auditor-General Report No.1 2018–19: Social Services should fully utilise all available data to measure performance, review its arrangements for monitoring, evaluation and collaboration between its evaluation and line areas, and build evaluation capability within the department to facilitate the effective review of evaluation methodology and the development of performance indicators.

26 Department of Finance, Resource Management Guide No.131: Developing good performance information, DOF, paragraph 77, available from https://www.finance.gov.au/government/managing-commonwealth-resources/d… [accessed 19 May 2022].

27 ibid, paragraph 76.

28 Department of Finance, Developing Good Performance Information: Resource Management Guide No. 131, Finance, 2020.

29 The CDC Program Profile details Key Activity 2.1.6 ‘Cashless Debit Card’, which is part of Program 2.1 ‘Families and Communities’. It explains the context and rationale for the CDC program, key activities and associated performance measures and targets.

30 There are a number of reasons that CDC participants do not activate their card such as: the person is transitioning into the program and has a temporary card; no funds are being deposited in their account due to existing deductions in place such as rent or bill payments; the person opted not to participate in the CDC (and therefore their income support payments accumulate in their account unused); or the person may be incarcerated for a period of time.

31 Department of Finance, Developing good performance information (RMG 131) [Internet], available from https://www.finance.gov.au/government/managing-commonwealth-resources/developing-good-performance-information-rmg-13 [accessed 6 December 2021].

Paragraph 46 states that exclusion bias ‘occurs when relevant information is not collected (for example, where information is not collected on activities that make key contributions to fulfilling a purpose)’. Operational bias ‘occurs when the process for collecting information is not followed or when errors are made in the recording and analysis of data’.

32 Recommendation 4 of Auditor-General Report No.1 2018–19: Social Services should undertake a cost–benefit analysis and a post-implementation review of the trial to inform the extension and further roll-out of the CDC.

Recommendation 5 of Auditor-General Report No.1 2018–19: Social Services should fully utilise all available data to measure performance, review its arrangements for monitoring, evaluation and collaboration between its evaluation and line areas, and build evaluation capability within the department to facilitate the effective review of evaluation methodology and the development of performance indicators.

Recommendation 6 of Auditor-General Report No.1 2018–19: Social Services should continue to monitor and evaluate the extension of the Cashless Debit Card in Ceduna, East Kimberley and any future locations to inform design and implementation.

33 The program logic describes how policy input, activities and outputs seek change in CDC Trial communities. A theory of change is an illustration of how and why a desired change is expected to happen in a particular context.

34 Auditor-General Report No.1 2018-19 The Implementation and Performance of the Cashless Debit Card Trial, p. 9, paragraph 18.

35 Department of Social Services, Cashless Debit Card Baseline Data Collection in the Bundaberg and Hervey Bay Region: Qualitative Findings, May 2020, https://www.dss.gov.au/families-and-children-programs-services-welfare-reform-cashless-debit-card/cashless-debit-card-baseline-data-collection-in-the-bundaberg-and-hervey-bay-region-qualitative-findings [accessed 12 January 2022].

36 At the time the baseline data was collected, contact details of participants were not systematically obtained that would allow for follow-up. The University of Adelaide advised that this was because there was no plan to reinterview respondents and the impact evaluation had not yet been commissioned.

37 The University of Adelaide advised that this was because they considered the methodology from the first evaluation to be flawed and that other previously conducted research was similarly methodologically inadmissible.

38 K Mavromaras, M Moskos, S Mahuteau and L Isherwood, Evaluation of the Cashless Debit Card in Ceduna, East Kimberley and the Goldfields Region: Consolidated Report, University of Adelaide, January 2021, pp. 33-35, available from https://www.dss.gov.au/families-and-children-programs-services-welfare-reform-cashless-debit-card-cashless-debit-card-evaluation/evaluation-of-the-cdc-in-ceduna-east-kimberley-and-the-goldfields-region-consolidated-report [accessed 9 May 2022].

39 ibid., p. 33 states ‘The social acceptability of some of the behaviours the CDC is attempting to reduce range from being partially socially unacceptable to being clearly illegal. As such, their reporting may suffer from severe biases and is difficult to measure in an accurate way’. The evaluators advised the ANAO that they attempted to mitigate this through using validated questionnaires, face-to-face interviews conducted by local community members, and avoiding questions about drug consumption. Recall bias occurs when interviewees do not remember previous events, behaviours or experiences accurately.

40 ibid., p.33.

41 ibid., p.33.

42 The proportion of CDC participants who gambled declined under the CDC, and those who continued to gamble reported doing so less frequently.

43 Department of Social Services, Families and Children: Cashless Debit Card - Evaluation [Internet], DSS, 2021, available from https://www.dss.gov.au/families-and-children-programs-services-welfare-reform-cashless-debit-card/cashless-debit-card-evaluation [accessed 19 February 2022].

44 124PS Evaluation of review of Part’s operation: The amendment was that ‘(1) If the Minister or the Secretary causes a review of the trial of the cashless welfare arrangements mentioned in section 124PF to be conducted, the Minister must cause the review to be evaluated. (2) The evaluation must: (a) be completed within 6 months from the time the Minister receives the review report; and (b) be conducted by an independent evaluation expert with significant expertise in the social and economic aspects of welfare policy. (3) The independent expert must: (a) consult trial participants; and (b) make recommendations as to (i) whether cashless welfare arrangements are effective; and (ii) whether such arrangements should be implemented outside of a trial area. (4) The Minister must cause a written report about the evaluation to be prepared. (5) The Minister must cause a copy of the report to be laid before each House of Parliament within 15 days after the completion of the report.

45 The Centre for International Economics report stated that ‘we have assumed that the impacts to Bundaberg and Hervey Bay as consistent with the average impact across the other three regions’.

46 Department of the Prime Minister and Cabinet, Cost–benefit analysis — guidance note March 2020 [Internet], available from www.pmc.gov.au › default › files › publications › cost-benefit-analysis [accessed 19 November 2020].

47 The authors noted that participants were more likely to report negative views rather than positive views.

48 The CDC program is associated with a net cost of $57.4 million in present value terms.

49 DSS advised that some examples of the changes include the transfer of the service delivery function of the program from DSS to Services Australia and additional communications activities.