The audit objective was to assess whether the Department of the Prime Minister and Cabinet has effectively established and implemented the Indigenous Advancement Strategy to achieve the outcomes desired by government.


Summary and recommendations

Background

1. In September 2013, responsibility for the majority of Indigenous-specific policy and programs, as well as some mainstream programs that predominantly service Indigenous Australians, was transferred into the Department of the Prime Minister and Cabinet (the department). These substantial changes saw 27 programs consisting of 150 administered items, activities and sub-activities from eight separate entities moved to the department.

2. In May 2014, the Indigenous Advancement Strategy (the Strategy) was announced by the Australian Government as a significant reform in the administration and delivery of services and programs for Indigenous Australians. Under the Strategy, the items, activities and sub-activities inherited by the department were consolidated into five broad programs under a single outcome. The Australian Government initially committed $4.8 billion to the Strategy over four years from 2014–15.1 The 2014–15 Budget reported that the Australian Government would save $534.4 million over five years by rationalising Indigenous programs, grants and activities.

3. In the first year of the Strategy, from July 2014 to June 2015, the department focused on transitioning over 3000 funding agreements, consolidating legacy financial systems, administering a grant funding round and establishing a regional network.

Audit objective and criteria

4. The objective of the audit was to assess whether the department had effectively established and implemented the Indigenous Advancement Strategy to achieve the outcomes desired by Government.

5. To form a conclusion against the audit objective, the Australian National Audit Office (ANAO) adopted the following high-level audit criteria:

  • the department has designed the Strategy to improve results for Indigenous Australians in the Australian Government’s identified priority areas;
  • the department’s implementation of the Strategy supports a flexible program approach focused on prioritising the needs of Indigenous communities;
  • the department’s administration of grants supports the selection of the best projects to achieve the outcomes desired by the Australian Government, complies with the Commonwealth Grants Rules and Guidelines, and reduces red tape for providers;
  • the department has designed and applied the Strengthening Organisational Governance policy to ensure funded providers have high standards of corporate governance; and
  • the department has established a performance framework that supports ongoing assessment of program performance and progress towards outcomes.

6. The audit examined the department’s activities relating to the Strategy’s establishment, implementation, grants administration and performance measurement leading up to the Strategy’s announcement in 2014 and its initial implementation until the audit commenced in March 2016.

Conclusion

7. While the Department of the Prime Minister and Cabinet’s design work was focused on achieving the Indigenous Advancement Strategy’s policy objectives, the department did not effectively implement the Strategy.

8. The Australian Government’s identified priority areas are reflected in the Strategy’s program structure, which is designed to be broad and flexible. The department considered the potential risks and benefits associated with the reforms. Planning and design for the Strategy were conducted within a seven-week timeframe, which limited the department’s ability to fully implement key processes and frameworks, such as consultation, risk management and advice to Ministers, as intended.

9. The implementation of the Strategy occurred in a short timeframe and this affected the department’s ability to establish transitional arrangements and structures that focused on prioritising the needs of Indigenous communities.

10. The department’s grants administration processes fell short of the standard required to effectively manage a billion dollars of Commonwealth resources. The basis by which projects were recommended to the Minister was not clear and, as a result, limited assurance is available that the projects funded support the department’s desired outcomes. Further, the department did not:

  • assess applications in a manner that was consistent with the guidelines and the department’s public statements;
  • meet some of its obligations under the Commonwealth Grants Rules and Guidelines;
  • keep records of key decisions; or
  • establish performance targets for all funded projects.

11. The performance framework and measures established for the Strategy do not provide sufficient information to make assessments about program performance and progress towards achievement of the program outcomes. The monitoring systems inhibit the department’s ability to effectively verify, analyse or report on program performance. The department has commenced some evaluations of individual projects delivered under the Strategy but has not planned its evaluation approach after 2016–17.

Supporting findings

Establishment

12. The five programs under the Strategy reflect the Government’s five priority investment areas and were designed to be broad and flexible by reducing the number of programs and activities. Clearer links could be established between funded activities and the program outcomes.

13. The department developed a consultation strategy but did not fully implement the approach outlined in the strategy.

14. The department considered the impacts of the Strengthening Organisational Governance policy upon organisations, including whether it was discriminatory to Indigenous organisations.

15. The department identified, recorded and assessed 11 risks to the implementation of the Strategy. The department implemented or partially implemented 27 of 30 risk treatments. Risk management for the Strategy focused on short-term risks associated with implementing the grant funding round.

16. The department provided advice to the Prime Minister and Minister on aspects of planning and design during the establishment of the Strategy. The department did not meet its commitments with respect to providing advice on all the elements identified as necessary for the implementation of the Strategy. The department also did not advise the Minister of the risks associated with establishing the Strategy within a short timeframe.

Implementation

17. The department identified but did not meet key implementation stages and timeframes.

18. The department established and implemented transitional arrangements at the same time as administering the application process for the grant funding round. During this time, general information was sent to service providers about the new arrangements under the Strategy that was not targeted at the project or activity level. To support continuity of investment, the department extended some contracts at funding levels reduced by five per cent or more. Following the grant funding round, additional service gaps were identified and action was taken to fill them.

19. The regional network did not have sufficient time to develop regional investment strategies and implement a partnership model, which were expected to support the grant funding round.

Grants Administration

20. The department developed Indigenous Advancement Strategy Guidelines and an Indigenous Advancement Strategy Application Kit prior to the opening of the grant funding round. The guidelines and the grant application process could have been improved by identifying the amount of funding available under the grant funding round and providing further detail on which activities would be funded under each of the five programs. Approximately half of the applicants did not meet the application documentation requirements and there may be benefit in the department testing its application process with potential applicants in future rounds.

21. The department provided adequate guidance to staff undertaking the assessment process. The department developed mandatory training for staff, but cannot provide assurance that staff attended the training.

22. The department’s assessment process was inconsistent with the guidelines and internal guidance. Applications were assessed against the selection criteria, but the Grant Selection Committee did not receive individual scores against the criteria or prioritise applications on this basis.

23. Internal quality controls and review mechanisms did not ensure that applications were assessed consistently.

24. The department did not maintain sufficient records throughout the assessment and decision-making process. In particular, the basis for the committee’s recommendations is not documented and so it is not possible to determine how the committee arrived at its funding recommendations. The department did not record compliance with probity requirements. Further, the department did not maintain adequate records of Ministerial approval of grant funding.

25. The list of recommended projects provided to the Minister did not provide sufficient information to comply with the mandatory requirements of the Commonwealth Grants Rules and Guidelines, or support informed decision-making, and contained administrative errors.

26. The department’s initial advice to applicants about funding outcomes did not adequately inform them of the projects and the amount of funding approved.

27. Negotiations with grant recipients were largely verbal and conducted in a short timeframe, with some agreements executed after the project start date. In most instances the department negotiated agreements that were consistent with the Minister’s approval. There is limited assurance that negotiations were fair and transparent.

28. The department has established arrangements to monitor the performance of service providers, but has not specified targets for all providers and arrangements are not always risk-based.

Performance Measurement

29. The department developed a performance framework and refined this in line with the requirements of the enhanced performance framework under the Public Governance, Performance and Accountability Act 2013. The current framework does not provide sufficient information about the extent to which program objectives and outcomes are being achieved.

30. The performance indicators against which funding recipients report cannot be easily linked to the achievement of results and intended outcomes across the Strategy.

31. The department developed a system by which entities could provide performance reporting information electronically. However, the processes to aggregate and use this information are not sufficiently developed to allow the department to report progress against outcomes at a program level, benchmark similarly funded projects, or undertake other analysis of program results.

32. The department drafted an evaluation strategy in 2014, which was not formalised. In May 2016 the Minister agreed to an evaluation approach and a budget to conduct evaluations of Strategy projects in 2016–17.

Recommendations

Recommendation No. 1

Paragraph 3.29

The Department of the Prime Minister and Cabinet ensure that administrative arrangements for the Indigenous Advancement Strategy provide for the regional network to work in partnership with Indigenous communities and deliver local solutions.

Department of the Prime Minister and Cabinet’s response: Agreed.

Recommendation No. 2

Paragraph 4.28

For future Indigenous Advancement Strategy funding rounds, the Department of the Prime Minister and Cabinet:

  1. publish adequate documentation that clearly outlines the assessment process and the department’s priorities and decision-making criteria; and
  2. consistently implement the process.

Department of the Prime Minister and Cabinet’s response: Agreed.

Recommendation No. 3

Paragraph 4.49

The Department of the Prime Minister and Cabinet, in providing sound advice to the Minister for Indigenous Affairs about the Indigenous Advancement Strategy:

  1. clearly and accurately outline the basis for funding recommendations; and
  2. document the outcomes of the decision-making process.

Department of the Prime Minister and Cabinet’s response: Agreed.

Recommendation No. 4

Paragraph 5.21

The Department of the Prime Minister and Cabinet identify the outcomes and results to be achieved through the Indigenous Advancement Strategy and analyse performance information to measure progress against these outcomes.

Department of the Prime Minister and Cabinet’s response: Agreed.

Summary of entity response

The department welcomes the audit of the processes involved in the establishment of the Indigenous Advancement Strategy, as a major reform in the administration and delivery of services and programs to Indigenous Australians in response to widespread criticism over many years.

The department acknowledges the findings of the report but notes that the implementation occurred in a very challenging timeframe and was accompanied by a reasonably well-managed transition from the old to the new arrangements without serious interruption to service delivery. Having moved through the transition period, with greater experience among applicants and stakeholders, the Strategy is moving into a more mature phase of implementation that draws on lessons learnt.

The ANAO report highlights broad stakeholder support for features of the Indigenous Advancement Strategy, including the benefits of a more consolidated program structure, greater flexibility in how organisations can receive funding, reduced red tape for service providers and the development of on-the-ground responses to issues.

The department accepts all four recommendations, with action already taken or underway to implement improvements consistent with the recommendations. This includes revisions to the Indigenous Advancement Strategy Guidelines; a strengthened role for regional network staff in supporting organisations to develop funding proposals; improvements in application assessment processes, briefing processes and record keeping; and strengthening of performance monitoring and evaluation approaches.

1. Background

The Indigenous Advancement Strategy and Indigenous affairs reform

1.1 In September 2013, responsibility for the majority of Indigenous-specific policy and programs, as well as some mainstream programs that predominantly service Indigenous Australians, was transferred into the Department of the Prime Minister and Cabinet (the department). These substantial changes saw 27 programs consisting of 150 administered items, activities and sub-activities from eight separate entities moved to the department.

1.2 In May 2014, the Indigenous Advancement Strategy (the Strategy) was announced by the Australian Government as a significant reform in the administration and delivery of services and programs for Indigenous Australians. Under the Strategy, the items, activities and sub-activities inherited by the department were consolidated into five broad programs under a single outcome. The Australian Government initially committed $4.8 billion to the Strategy over four years from 2014–15.2 The 2014–15 Budget reported that the Australian Government would save $534.4 million over five years by rationalising Indigenous programs, grants and activities.

1.3 In the first year of the Strategy, from July 2014 to June 2015, the department focused on transitioning over 3000 funding agreements, consolidating legacy financial systems, administering a grant funding round and establishing a regional network. The timeline for the establishment and implementation of the Strategy is set out in Figure 1.1. Figure 1.2 sets out the legacy programs inherited by the department and the new program structure.

Program objectives and outcomes

1.4 The Strategy’s outcome, as set out in Figure 1.2, is to improve results for Indigenous Australians, with a particular focus on the Government’s priorities of ensuring children go to school, adults work, Indigenous business is fostered, the ordinary rule of law is observed in Indigenous communities as in other Australian communities, and Indigenous culture is supported. In addition to these policy outcomes, the Strategy is intended to:

  • reduce program duplication and fragmentation, reduce delivery costs and more clearly link activity to outcomes;
  • ensure communities have the key role in designing and delivering local solutions to local problems; and
  • offer a simplified approach to funding that reduces the red tape burden on communities and providers.

Figure 1.1: Timeline of key milestones and events in establishing and implementing the Strategy

Source: ANAO analysis.

Figure 1.2: Transition of legacy programs to new program structure under the Strategy

Note: The National Partnership Agreement on Early Childhood Development ceased funding in 2013–14.

Source: ANAO analysis; Department of the Prime Minister and Cabinet, 2016–17 Portfolio Budget Statements, DPM&C, Canberra, 2016.

Grant funding through the Strategy

1.5 Under the Indigenous Advancement Strategy Guidelines 2014, organisations could receive funding through five methods: open competitive grant rounds; targeted or restricted grant rounds; direct grant allocation; demand-driven processes; and ad hoc grants. The department announced the first open competitive grant funding round in September 2014. Approximately $2.3 billion of funding was available, of which $1.65 billion was set aside for the grant funding round, with a further $515 million to support processes such as demand-driven applications and other emerging priorities across 2014–18.

1.6 In the grant funding round, the department received applications from 2345 organisations, which resulted in 996 organisations receiving approximately $1 billion to deliver 1350 projects across Australia.

1.7 The department released new guidelines in March 2016, which consolidated the funding mechanisms into three options. The department can:

  • invite applications through an open grant process or targeted grant process where it has identified a need to address specific outcomes on a national, regional or local basis;
  • approach an organisation where an unmet need is identified; or
  • respond to community-led proposals for support related to an emerging community need or opportunity.

1.8 Funding under the Strategy is administered through a single funding agreement between an organisation and the department, irrespective of the number of projects funded. The intended focus of project funding is clear and measurable results, with payments linked to the achievement of results and outcomes.

1.9 Under the Strategy’s Strengthening Organisational Governance policy, organisations receiving $500 000 or more (GST exclusive) in any one financial year are required to be incorporated under Commonwealth legislation, and remain so while receiving funding through the Strategy. Indigenous organisations are required to be incorporated under the Corporations (Aboriginal and Torres Strait Islander) Act 2006, and non-Indigenous organisations under the Corporations Act 2001.3

Audit approach

1.10 The objective of the audit was to assess whether the department had effectively established and implemented the Indigenous Advancement Strategy to achieve the outcomes desired by Government.

1.11 To form a conclusion against the audit objective, the Australian National Audit Office (ANAO) adopted the following high-level audit criteria:

  • the department has designed the Strategy to improve results for Indigenous Australians in the Australian Government’s identified priority areas;
  • the department’s implementation of the Strategy supports a flexible program approach focused on prioritising the needs of Indigenous communities;
  • the department’s administration of grants supports the selection of the best projects to achieve the outcomes desired by the Australian Government, complies with the Commonwealth Grants Rules and Guidelines, and reduces red tape for providers;
  • the department has designed and applied the Strengthening Organisational Governance policy to ensure funded providers have high standards of corporate governance; and
  • the department has established a performance framework that supports ongoing assessment of program performance and progress towards outcomes.

1.12 The audit examined the department’s activities relating to the Strategy’s establishment, implementation, grants administration and performance measurement leading up to the Strategy’s announcement in 2014 and its initial implementation until the audit commenced in March 2016. The audit methodology included examining and analysing: the department’s Strategy documentation; correspondence and advice to the responsible Minister; systems and documentation relating to performance and evaluation; departmental data relating to funding and organisations receiving funding; and funding assessments. The audit methodology also involved interviewing departmental staff, applicants from the grant funding round, and relevant peak bodies; and reviewing communication received through the citizens’ input facility.4

1.13 The audit was conducted in accordance with the ANAO’s Auditing Standards at a cost to the ANAO of approximately $908 000.

1.14 Since the commencement of the audit the department has implemented new grant guidelines, re-introduced an Assessment Management Office, implemented a risk management framework, initiated improvements to reporting and held consultation sessions with stakeholders. The department also informed the ANAO that it was in the process of establishing processes to drive risk and compliance management at the local level, developing place-based investment strategies and progressing a strategy to assess the effectiveness of the Strengthening Organisational Governance policy. These initiatives did not fall within the scope of this audit because they were implemented subsequent to the establishment and initial implementation of the Strategy and will be the subject of a future audit by the ANAO.

1.15 The department has acknowledged shortcomings in its record-keeping and management in relation to grants administration under the Strategy. The ANAO found that documentation relating to the planning and initial implementation of the Strategy was not held on a records management system and that the department did not have effective version controls. On this basis, the ANAO does not have assurance that the department has produced complete records of the design and implementation of the Strategy.

2. Establishment

Areas examined

This chapter examines whether the Department of the Prime Minister and Cabinet has designed the Indigenous Advancement Strategy to improve results for Indigenous Australians in the Australian Government’s identified priority areas.

Conclusion

The Australian Government’s identified priority areas are reflected in the Strategy’s program structure and the department considered the potential risks and benefits associated with the reforms. Planning and design for the Strategy were conducted within a seven-week timeframe, which limited the department’s ability to fully implement key processes and frameworks, such as consultation, risk management and advice to Ministers, as intended.

Were outcomes identified and did program activities align with desired outcomes?

The five programs under the Strategy reflect the Government’s five priority investment areas and were designed to be broad and flexible by reducing the number of programs and activities. Clearer links could be established between funded activities and the program outcomes.

2.1 The Indigenous Advancement Strategy’s (the Strategy) outcome is to ‘improve results for Indigenous Australians including in relation to school attendance, employment and community safety, through delivering services and programs, and through measures that recognise the place that Indigenous people hold in this nation’ and is set out in outcome two of the department’s Portfolio Budget Statements (see Figure 1.2).5 The Australian Government’s five areas for priority investment in Indigenous affairs, set out in paragraph 1.4, broadly align with the five programs under the Strategy.

2.2 The Department of the Prime Minister and Cabinet (the department) advised the ANAO that the program objectives were intended to be broad and flexible. The department also advised that to facilitate flexible and innovative approaches to investment activity it took a limited approach to identifying program activities under the five broad programs. The department provided examples of the types of high-level activities that would be funded under each program in the Indigenous Advancement Strategy Guidelines 2014 (the guidelines). In March 2016 the department revised the guidelines providing additional examples of the types of high-level activities that will be funded under each program.6

2.3 The department had not classified the projects funded under each program into common groups or activities, presenting challenges in linking program activities and the program outcomes. This is discussed further in Chapter 5 in relation to the department’s ability to collect and report performance information at the activity level. In January 2017, the department advised that it had completed a project to group and classify activities under the Strategy.

Were the risks and benefits of delivering the Strategy through a new program structure considered?

The department considered the risks and benefits of delivering the Strategy through a consolidated program structure. To manage the risks identified, the department implemented program guidelines and in 2016 approved an evaluation approach.

2.4 To support the development of a new program structure, the department undertook a review of Indigenous-specific programs and activities transferred during Machinery of Government changes in September 2013. As a part of this review the department outlined which administered items, activities and sub-activities it had inherited through the legacy programs and analysed the potential risks and benefits of moving to a more consolidated program structure.

2.5 The potential benefits identified by the department included: reduced program fragmentation and duplication; a simplified program structure and therefore simplified funding arrangements; and clearer linking of activities and investment to outcomes. Feedback provided to the Senate Finance and Public Administration References Committee in June 2015 and to the ANAO indicated broad support for features of the Strategy.7 These features included the benefits of a more consolidated program structure, greater flexibility in how organisations could receive funding, reduced red tape for service providers and the development of on-the-ground responses to issues.

2.6 The key risks of consolidating the program structure identified by the department in its review of Indigenous-specific programs included: continued duplication due to the introduction of large flexible funding streams; reduced flexibility in funding due to existing commitments under previous funding arrangements; inability to link activities to outcomes due to the limited existing evaluation of programs and activities; and the risk that consolidating programs would be viewed by Indigenous Australians as an attempt to reduce support. To manage these risks the department recommended that it establish robust guidelines, foster a strong culture of evaluation and manage the expectations of Indigenous communities, organisations and the service delivery sector.

2.7 The department published the program guidelines in August 2014. The evaluation and performance improvement strategy was drafted in 2014 but was not finalised, and evaluation activities have been limited. In May 2016 an evaluation approach and budget was approved by the Minister for Indigenous Affairs (the Minister) (see Chapter 5).

Were stakeholders consulted as intended?

The department developed a consultation strategy but did not fully implement the approach outlined in the strategy.

2.8 The department identified that broad consultation with individuals, governments, service providers, communities and Indigenous leaders was important to successfully deliver the Strategy’s reforms and the Government’s desired new engagement with Indigenous Australians. To provide direction and guidance in informing and consulting with key stakeholders during the implementation of the Strategy, the department developed a communication and stakeholder engagement strategy. This approach outlined the purpose of initial consultation—to provide information, seek feedback on concerns and opportunities, and seek views on how the new arrangements could best be implemented. Figure 2.1 sets out the key stakeholder groups the department intended to engage with and the purpose for engaging with each group.

Figure 2.1: Purpose of communication and engagement with key stakeholders

Note: The figure refers to three phases of engagement and communication during the implementation of the Strategy—phase one covers June/July 2014; phase two covers July/August 2014; and phase three covers September 2014 onwards.

Source: Department of the Prime Minister and Cabinet.

2.9 The department provided information to the Indigenous Advisory Council (the council) on the Strategy’s new reforms, and sought feedback from the council on concerns and opportunities.8 The department advised the ANAO that it met with Indigenous leaders and national peak bodies outside of the council but did not keep records of the meetings. As such, it is not clear who the department met with, what feedback the department received and how this was considered in developing the Strategy.

2.10 The department reported that it also held approximately 80 information sessions during August and September 2014 with key regional stakeholders, including potential applicants.9 The information sessions focused on providing information on the policy intent and key principles of the Strategy, and the grant funding round. The information sessions did not cover some of the main purposes for consultation set out in Figure 2.1, including how the department should action the Strategy at the local level through community consultation, co-designing regional strategies and actively partnering with government and other service providers to address needs in priority programs and locations.10 Of the 82 public submissions received from stakeholders during the audit, 27 per cent indicated that there had been a lack of consultation between the department and Indigenous communities. Of the 114 applicants interviewed, 12 per cent said that they had been consulted prior to the commencement of the Strategy and 69 per cent said that they had not.11

2.11 During the Senate Finance and Public Administration References Committee inquiry into the Commonwealth Indigenous Advancement Strategy tendering processes, the department stated that it did not have a consistent engagement plan and mechanism for engaging more broadly with service providers and the community.12 Further, the department acknowledged that more engagement should have been undertaken in the early stages of designing and delivering the Strategy.13

Was the impact on organisations considered in designing the Strengthening Organisational Governance policy?

The department considered the impacts of the Strengthening Organisational Governance policy upon organisations, including whether it was discriminatory to Indigenous organisations and the financial and administrative implications.

2.12 The purpose of the Strengthening Organisational Governance policy (the policy) is to ensure that funded organisations have high standards of governance and accountability that facilitate high quality service delivery for Indigenous Australians. In March 2014, the department sought approval from the Minister for the policy, noting that Indigenous organisations could perceive the policy as discriminatory and may resist the requirement to incorporate under the Corporations (Aboriginal and Torres Strait Islander) Act 2006 (the CATSI Act) due to the lack of choice and perceived additional regulatory requirements, reporting and scrutiny being applied. The department also noted that if given the choice of incorporating under the Corporations Act 2001 (the Corporations Act) Indigenous organisations would not be able to access the support services available to meet the specific needs of Indigenous people under the CATSI Act or benefit from the level of monitoring and scrutiny available through the Office of the Registrar of Indigenous Corporations. The department’s advice to the Minister also indicated that the policy would probably not contravene the Racial Discrimination Act 1975.

2.13 The department estimated that the financial impact on organisations required to transfer their incorporation could be between $10 000 and $20 000 for small organisations, and in the order of $100 000 for larger organisations with more complex arrangements.

2.14 Indigenous organisations already incorporated under the Corporations Act were expected to either transfer incorporation to the CATSI Act or apply for an exemption from the policy if able to demonstrate they were well-governed and high-performing. In May 2015 the department recommended to the Minister that the policy be amended to allow Indigenous organisations already incorporated under the Corporations Act to retain this incorporation without applying for an exemption. The department advised the Minister that to force Indigenous organisations already incorporated under the Corporations Act to transfer their incorporation ‘is onerous and could be perceived as heavy-handed’ as ‘both the Corporations Act and CATSI Act require similar governance standards’. The Minister agreed to the department’s recommendation.

2.15 The department has not finalised a formal strategy to assess the effectiveness of the policy and whether it is having a positive impact upon governance within applicable organisations.14 The department advised the ANAO in July 2016 that the regulatory requirements of Commonwealth incorporation legislation are expected to inherently strengthen organisational governance and that, as most organisations had transferred incorporation within the past six months, it is too early to identify broad improvements in governance.

Were risks identified and processes put in place for ongoing risk management?

The department identified, recorded and assessed 11 risks to the implementation of the Strategy. The department implemented or partially implemented 27 of 30 risk treatments. Of the risk treatments implemented or partially implemented, three had been fully implemented within the timeframes specified by the department. Risk management for the Strategy focused on short-term risks associated with implementing the grant funding round.

2.16 The department initially assessed the overall implementation risk for the Strategy as medium due to the complexity of the legacy arrangements transitioned to the department, short timeframes and the number of stakeholders impacted. After consulting with the Department of Finance, the department increased the implementation risk level to high for similar reasons, such as the significant structural changes and tight implementation timeframes involved with the reforms proposed under the Strategy.15

2.17 The department identified, recorded and assessed 11 risks to the implementation of the Strategy and identified 30 risk treatments. Of the 30 risk treatments, the ANAO identified that 17 have been implemented, 10 have been partially implemented and three have not been implemented. Of the 17 treatments implemented, three had been implemented by the date specified, nine had not been implemented by the date specified, four had been partially implemented by the date specified and one did not specify a date.

2.18 In assessing implementation risks, the department identified the key sources of risk as: complexities associated with the reform; the number of stakeholders involved; short timeframes; the pace and scale of the changes; and inadequate resourcing. The risks identified included: inadequate consultation with external stakeholders; organisations not engaged in a timely fashion; program objectives not achieved; and agreed implementation timeframes not met. The department developed a number of strategies to treat the risks, including: an implementation plan; high-level program objectives and key performance indicators; governance arrangements; a communication and stakeholder engagement strategy; and engagement with peak bodies and the council. The department developed the high-level program objectives, key performance indicators and governance arrangements by the dates specified in its risk assessment, but did not develop or implement the remaining strategies until later in the implementation of the Strategy.

2.19 The department reported risks on a monthly basis to the Indigenous Affairs Reform Implementation Project Board.16 The reports focused on short-term delivery risks associated with the timeframes, resources and systems required for the grant funding round. The department did not focus on strategic risks identified in the risk assessment and risk register.

2.20 An Implementation Readiness Assessment17 in December 2014 found that the department did not have a mature approach to risk management during the establishment of the Strategy, with no risk management framework implemented and risk registers not regularly updated. By the second Implementation Readiness Assessment in November 2015, the department had developed a risk management framework and was making progress in updating risk registers covering strategic, program and provider risk. In May 2016, the department developed a strategic risk assessment for the Strategy.

Was timely and appropriate advice provided to the responsible Minister to establish the Strategy?

The department provided advice to the Prime Minister and Minister on aspects of planning and design during the establishment of the Strategy. The department did not meet its commitments with respect to providing advice on all the elements identified as necessary for the implementation of the Strategy. The department also did not advise the Minister of the risks associated with establishing the Strategy within a short timeframe.

2.21 The department provided advice to the Prime Minister and Minister during the seven weeks between the Strategy’s approval by Government and the Strategy’s implementation in July 2014, including about:

  • Indigenous affairs funding received by non-Indigenous/non-government organisations;
  • the future framework for Indigenous affairs delivery;
  • budget savings and reinvestment in Indigenous affairs, the department’s review of Indigenous-specific programs and the proposed submission to Government;
  • background papers on proposed reforms to Indigenous affairs;
  • proposed 2014–15 transition strategy for the implementation of the Strategy;
  • draft items to be included in the Financial Management and Accountability Regulations 1997 for the five new programs under the Strategy; and
  • an update on the implementation of the Strategy.

2.22 In seeking Government approval to establish and implement the Strategy, the department undertook to provide a range of key documents to the Minister by 30 June 2014. These documents addressed key elements of planning and designing the Strategy and included an implementation plan, budget investment approach, program administration arrangements (including program outcomes and key indicators, program guidelines, funding agreements and risk management), an evaluation and performance improvement strategy and a community engagement approach. These were not provided to the Minister by 30 June 2014.

2.23 In July 2014 the department provided advice to the Minister on the program guidelines. These guidelines included information on program outcomes and key indicators and the initial approach the department would take to funding agreements. In June 2015 the department provided advice on a proposed consultation and engagement approach focused on reviewing the guidelines. In March 2016 the department also briefed the Minister on an evaluation approach.

2.24 In September 2014 the department provided advice to the Minister on the proposed approach to the grant funding round. In its advice the department noted that the tight timeframes and expected volume of applications would place significant pressure on current resourcing, but that the department would investigate ways to ensure that the round was finalised successfully. Prior to implementation of the funding round, the department did not provide any further advice to the Minister that agreed implementation timeframes would not be met, or on the implications of this for implementation (see Chapter 3, timeframes for implementation).

3. Implementation

Areas examined

This chapter examines whether the Department of the Prime Minister and Cabinet’s implementation of the Indigenous Advancement Strategy supports a flexible program approach focused on prioritising the needs of Indigenous communities.

Conclusion

The implementation of the Strategy occurred in a short timeframe and this affected the department’s ability to establish transitional arrangements and structures that focused on prioritising the needs of Indigenous communities.

Area for improvement

The ANAO made one recommendation aimed at ensuring that administrative arrangements provide for the regional network to work in partnership with Indigenous communities and deliver local solutions.

Did the department establish internal oversight arrangements for implementation?

The department established an Indigenous Affairs Reform Implementation Project Board to provide oversight and make decisions about the implementation of the Strategy and guide reforms to Indigenous affairs.

3.1 In May 2014 the Department of the Prime Minister and Cabinet (the department) established the Indigenous Affairs Reform Implementation Project Board (the implementation project board) as the decision-making body to oversee the Australian Government’s Indigenous affairs reform agenda. The implementation project board met eight times between May 2014 and October 2014. The membership of the implementation project board was made up of the Indigenous Affairs Group executive.

3.2 In line with its terms of reference, the implementation project board initially discussed overarching strategic issues such as the whole-of-government program framework, evaluation and performance improvement strategy, and communication and engagement strategy. From July 2014, the meetings became focused on the grant funding round and related issues requiring immediate attention.

3.3 In June 2015, the implementation project board was reconstituted as the Program Management Board as part of a new governance structure for the Indigenous Affairs Group of the department. Like the implementation project board, the Program Management Board’s membership comprised the Indigenous Affairs Group executive, with the addition of the regional network director and an Indigenous engagement special advisor. The responsibilities of the Program Management Board are similar to those of the implementation project board and include making decisions and providing advice to program owners18 and the department’s executive on the strategic directions and implementation of Indigenous-specific programs, in particular the Indigenous Advancement Strategy (the Strategy).

Were key stages and timeframes for implementation identified and met?

The department identified but did not meet key implementation stages and timeframes. The main timing delays occurred during the early implementation planning and at the assessment stage of the grant funding round.

3.4 The Strategy was a significant reform that the department committed to delivering within a short timeframe. In March 2014 the department developed an implementation strategy that included key deliverables and timeframes (see Table 3.1).

Table 3.1: Key deliverables for implementation of the Strategy

Deliverable | Timeframe for completion | Date achieved
Implementation plan | June 2014 | Draft developed but not approved (a)
Budget investment approach | June 2014 | Draft developed but not approved (b)
Program guidelines | June 2014 | August 2014
Risk management plan | June 2014 | November 2015
Program outcomes and performance indicators | June 2014 | May 2014
Evaluation and performance improvement strategy | June 2014 | Draft developed but not approved (c)
Community engagement strategy | June 2014 | July 2014
Departmental capability development plan | June 2014 | Not developed

Note a: The Secretary considered the draft implementation plan in July 2014.

Note b: The budget investment approach was initially presented to the implementation project board in June 2014.

Note c: The department developed a draft evaluation and performance improvement strategy which was considered by the Indigenous Affairs Reform Implementation Project Board in July 2014. While some evaluation and information analysis activity occurred, the plan was not formally agreed to, endorsed or funded. In May 2016, the Minister for Indigenous Affairs approved funding for evaluation activities.

Source: ANAO analysis.

3.5 The department identified pressures on implementation timeframes in early June 2014. This was partly related to the overall tight timeframes for delivering critical elements of the Strategy, and the lack of appropriately skilled resources for the implementation period. At this time, key tasks had not been completed or were unlikely to be completed within the desired timeframes. For example, the department had not determined the final budget amounts available for the 2014–15 and 2015–16 financial years because funding information was held in multiple financial systems, making it difficult to develop a comprehensive financial position. This affected the department’s ability to build a funding profile for the Strategy and determine priority activities. By August 2014, a number of deliverables had been completed or commenced, but the department was tracking behind schedule in developing a policy for regional strategies, funding allocation options, risk assessment guidelines, the evaluation and performance improvement strategy and a procurement strategy for the funding round.

3.6 The Minister for Indigenous Affairs (the Minister) agreed to timeframes for the grant funding round in September 2014. The department finalised its approach to the assessment process in October 2014, after the grant funding round had been advertised but prior to closing the round to applicants. The department did not meet the initial assessment timeframes because of issues including the volume of project applications received and the absence of a grants management system to support the assessment process (discussed further in Chapter 4). In November 2014, the Minister agreed to extend the timeframes for the assessment process. The department did not meet the revised timeframes. The original and revised timeframes for the grant funding round are outlined in Table 3.2. The department assessed 2472 applications19 between October 2014 and March 2015 at a cost of $2.2 million.20

Table 3.2: Original and revised timeframes for grant funding round

Action | Timeframe for completion | Date achieved

Original
Tender closes | 17 October 2014 | 17 October 2014
Application assessment complete | 2 November 2014 | 3 March 2015
Application assessment summary complete | 10 November 2014 | 3 March 2015
Initial advice provided to the Minister | 27 November 2014 | See below
Final advice provided to the Minister | 28 November 2014 | See below
Applicants notified of outcomes | 1 December 2014 | See below

Revised
Initial advice provided to the Minister | 24 December 2014 | 1 March 2015
Final advice provided to the Minister | 23 January 2015 | March–June 2015
Applicants notified of outcomes | February 2015 | From March 2015

Source: ANAO analysis.

3.7 The department has acknowledged that its planning and timeframes were optimistic and not achievable, and recognises the importance of realistic program planning and implementation timeframes that are commensurate with the scale and complexity of the intended activity.

Were transitional arrangements established that provided certainty for providers, and supported continuity of investment and employment?

From July 2014 to July 2015, the department established and implemented transitional arrangements at the same time as administering the application process for the grant funding round. During this time, general information was sent to service providers about the new arrangements under the Strategy that was not targeted at the project or activity level. To support continuity of investment, the department extended some contracts at funding levels reduced by five per cent or more. Following the grant funding round, additional service gaps were identified and action was taken to fill them.

3.8 The Strategy was implemented from 1 July 2014, with a transition period of 12 months to allow continuity of frontline services and time for communities and service providers to adjust to the new arrangements. The purpose of the transition arrangements was to ensure certainty for providers, continuity of investment and continuity of employment, while consultation was undertaken and longer term investment strategies across the five programs were developed. During the transition period, the department also implemented the application and assessment process for the grant funding round.

3.9 In May 2014, the department wrote to existing service providers to advise that existing programs and contracts were transitioning to new arrangements under the Strategy. This correspondence provided general information about the budget, the new Strategy programs and the regional network, and noted that contracts would be honoured, with some extensions offered for either six or 12 months. In September 2014, the department again wrote to service providers advising them of the upcoming 2014 grant funding round, and held information sessions for stakeholders. The correspondence was not targeted at the project or activity level and did not state under which of the five programs an existing project or activity would fit.21 Nor did it clearly state that existing service providers would need to reapply for funding for existing activities, instead noting that additional funding could not duplicate existing services. The department provided the ANAO with a sample of emails sent to existing providers prior to the opening of the funding round, advising under which Strategy program their existing activity fitted and encouraging providers to consider whether the proposed activity aligned with other programs.

3.10 Of the 114 applicants interviewed by the ANAO, 16 indicated that the process was confusing, 20 indicated that there needed to be more clarity regarding the types of organisations and activities being funded and 18 indicated that the department needed to more effectively engage with organisations and communicate changes to them. Nineteen submissions to the ANAO also indicated that organisations found it difficult to fit projects within the structure and that it was unclear what activities were being funded. This is consistent with feedback provided to the department. In November 2014, the department identified that over 100 existing service providers with contracts expiring in December 2014 had not applied for funding and that, as a result, there was a risk of service gaps.

3.11 In June 2014, the department had 2961 contracts that required transitioning to the new arrangements under the Strategy. The contract transition arrangements established by the department for June 2014 to December 2014 included:

  • honouring existing contracts;
  • extending some contracts that were ending on 30 June 2014 for six or 12 months; and
  • continuing other activities, such as Indigenous wage and cadet subsidies or other tailored employment grants, under legacy guidelines until the new guidelines were released, and limiting these activities to contracts of one year where possible.

3.12 The department estimated that the transition strategy would commit approximately $347.8 million of the $467.4 million funding available for 2014–15.

3.13 In order to make additional funding available for new activities, the Minister agreed to the department offering contract extensions with reduced funding of five per cent or more. The department advised the ANAO that the value of contract extensions was $130 million.

3.14 In November 2014, when timeframes for the grant funding round process were extended, the department estimated that over 1200 funding applicants were not contracted beyond December 2014. The Minister agreed to the department extending the assessment process to avoid service delivery gaps. Six month extensions were offered to providers with contracts expiring in December 2014, and 12 month extensions (to December 2015) were offered for some school-based activities. In both May and December 2014, the department advised existing service providers that contracts would be extended within a month of the contract expiring.

3.15 Consequences reported to the department included impacts on existing services, due to the lack of certainty about which activities would be continued, and some organisations reported losing staff because of the delays in finalising funding.22

Service delivery gaps following the grant funding round

3.16 The department identified that gaps in service delivery would emerge from July 2015 after it made the initial announcement of grant funding round recipients in March 2015. The department advised the Minister that the service delivery gaps had emerged because:

  • providers had not applied for funding;
  • providers received a large reduction in funding on the department’s recommendation and the subsequent level of approved funding would result in reduced levels of service;
  • providers were not initially recommended for funding by the department in the assessment process;
  • the department’s recommended funding reductions would result in reductions in employment;
  • services were not originally assessed as high need by the department; and
  • the department made administrative errors.

3.17 The department identified these gaps in March, May, June and November 2015, with all but one identified before the end of June 2015.23 As a result, the department re-briefed the Minister, entered into new contracts and adjusted funding amounts in existing contracts. The gap-filling exercise was completed in November 2015 and the total value of changes made to the original recommendations was an increase of $240 million.

Did the regional network support communities to have a key role in designing and delivering local solutions through a partnership approach?

The regional network did not have sufficient time to develop regional investment strategies and implement a partnership model, which were expected to support the grant funding round.

3.18 The Australian Government’s intent under the Strategy was for Indigenous communities to ‘have the key role in designing and delivering local solutions to local problems’.24 The primary mechanism by which the department engages with Indigenous communities is through its regional network of offices and staff distributed around Australia.

3.19 The regional network was designed to support working in partnership with Indigenous communities to tailor action and long-term strategies to achieve outcomes and clear accountability for driving Australian Government priorities on the ground. The regional network is made up of 37 regional offices, across 12 defined regions. The regions reflect similarities in culture, language, mobility and economy, rather than state and territory boundaries. The Indigenous Advancement Strategy Guidelines 2014 (the guidelines) state that the regional network would be established in July 2014, with a 12–18 month transition period. The department advised that the regional network was established in March 2015.

3.20 A key feature of the Strategy’s investment approach was to support regionally focused solutions through the development of regional strategies. The department envisaged that regional strategies would: map the region’s profile against priority indicators (for example, school attendance); identify the key policy and geographic areas that would have the greatest impact on improving outcomes; and propose strategies to improve outcomes and the measures by which these strategies would be assessed. Regional strategies were also intended to reflect community-identified priorities and inform government investment decisions following consultation with Indigenous communities.

3.21 Regional strategies were not developed to support the 2014–15 grants funding round due to time constraints. Instead, the department created simplified regional profiles to inform investment. Each regional profile contained demographic data and statistics about the disadvantage of the region’s Indigenous populations and information relevant to each of the five Strategy programs. There is limited evidence that regional profiles were considered in the grant assessment process. As at August 2016, two years after the commencement of the regional network, the department has drafted but not finalised regional strategies.

3.22 In July 2014, the department planned to develop a Network Performance and Accountability Strategy, setting out the key performance indicators and reporting requirements for regional network staff. The department did not develop the proposed strategy. To monitor the performance of the regional network, the department required regional network staff to report against regional strategies and provide regular reports to the Minister.

3.23 The department advised that it has provided the Minister with dashboard reports since August 2015. The reports include a short local issues section that, until January 2016, focused on the Community Development Program and Remote School Attendance Strategy. The dashboard reports provide a reflection of activity within the regions, but do not meaningfully report on the performance of the regional network.

Community involvement in designing and delivering local solutions

3.24 The department established a series of criteria for assessing grants. One specifically considered community consultation and participation, and another related to an applicant’s understanding of need in a community.

3.25 To date, the main funding opportunity under the Strategy has been through the grant funding round. In considering funding applications received in the grant funding round, the department:

  • assessed applications against the five published selection criteria and presented a single moderated score to the Grant Selection Committee25 (the committee); the committee did not receive scores against the individual selection criteria;
  • allocated each application a need score, based on an assessment of the regional need for the project, which was presented to the committee; the need score was not always accompanied by a supporting rationale explaining the basis of the score or how it related directly to need;
  • assessed applications at a national level against the merits of all other applications;
  • did not contact applicant referees to ascertain the extent of their support for projects in their community; and
  • did not contact applicants or engage with communities to discuss the impact of partial funding on the original projects.

3.26 The department intended that the regional network would partner with Indigenous communities to design and deliver local solutions to local problems. The regional network contributed local knowledge to the funding round through the determination of a need score and participation on the committee.26 The regional network was also responsible for negotiating funding agreements once the Minister had approved funding. However, the extent to which the regional network could adopt a partnership model during the administration of the grant funding round was limited because of the short timeframes involved in the application, assessment and negotiation process. The department advised the ANAO that, given the time constraints, a partnership approach was unrealistic during the grant funding round.

3.27 Outside of the funding round, communities could also apply for funding under the demand-driven process.27 While demand-driven applications were considered at a regional network level, the department’s administrative arrangements state that program owners are responsible for making funding recommendations within a program and the Minister is the ultimate decision-maker. To support a local solution through the demand-driven process, the department required that the proposal be submitted through the regional network to the relevant program area in national office for consideration. The demand-driven process was not supported by a consistent internal process, a grants investment strategy, a clear budget or guidance on what could be funded. The regional network reported issues with the demand-driven process, including a lack of clear articulation of program intent, limited transparency around the available funding, and program area specific considerations. Further, decisions about demand-driven project applications were often time-consuming.

3.28 Stakeholder feedback provided to the ANAO indicated that community involvement in the Indigenous Advancement Strategy was limited.28 Thirteen submissions expressed a concern that local solutions were not supported, 22 indicated that there was a lack of consultation with Indigenous communities and nine stated that there was still a top-down approach to service delivery. When the ANAO asked applicants what changes they would like to see, the two most common responses were greater partnership and collaboration with Indigenous communities to design solutions (40 applicants) and a bottom-up approach to service delivery (31 applicants).

Recommendation No.1

3.29 The Department of the Prime Minister and Cabinet ensure that administrative arrangements for the Indigenous Advancement Strategy provide for the regional network to work in partnership with Indigenous communities and deliver local solutions.

Entity response: Agreed.

3.30 This recommendation is consistent with steps already taken and ongoing work to ensure administrative arrangements for the Indigenous Advancement Strategy optimise opportunities for the department’s regional network to work in partnership with Indigenous communities and to deliver local solutions.

3.31 Regional network staff had an integral role in delivering information on the roll out of the Indigenous Advancement Strategy and the grant funding round. Approximately 80 information sessions were held in August and September 2014 with key regional stakeholders including potential applicants. These sessions by necessity focused on explaining the key elements of the Strategy and the grant funding round given the implementation timeframes. With a maturing of the Indigenous Advancement Strategy this role is being strengthened and extended.

3.32 Regional network staff also played a key part in assessing grant round applications and applying local knowledge to the assessment process. Regional Assessment Teams were established in network offices to consider all applications which had projects proposed for delivery within their region. These teams drew on the known current service footprint and considered local issues relevant to each application.

3.33 In March 2016 the department issued revised Indigenous Advancement Strategy Guidelines. These provide a stronger role for regional network staff in supporting organisations to develop proposals, particularly through the new community-led grants process. The revised Guidelines also include strengthened assessment criteria that relate to Indigenous community support for proposals.

3.34 The regional network has an extensive on-the-ground presence through 37 offices in capital cities, regional and remote locations, supplemented by a direct presence in approximately 75 communities. As such, it is integral to the effective implementation of the Indigenous Advancement Strategy. The network places senior staff close to communities and has specialist officers who lead direct engagement with communities: Government Engagement Coordinators and Indigenous Engagement Officers (IEOs). IEOs live in their community, speak the Indigenous language(s) used by the local community and use their knowledge of the community and language to help government understand local issues and to ensure community feedback is heard. The regional network supports active engagement with communities and stakeholders to explain government policies, identify and implement tailored local solutions to improve outcomes for Indigenous Australians and monitor funded projects and results achieved.

4. Grants Administration

Areas examined

This chapter examines whether the Department of the Prime Minister and Cabinet’s administration of grants supported the selection of the best projects to achieve the outcomes desired by the Australian Government, complied with the Commonwealth Grants Rules and Guidelines and reduced red tape for providers.

Conclusion

The department’s grants administration processes fell short of the standard required to effectively manage a billion dollars of Commonwealth resources. The basis by which projects were recommended to the Minister was not clear and, as a result, limited assurance is available that the projects funded support the department’s desired outcomes. Further, the department did not:

  • assess applications in a manner that was consistent with the guidelines and the department’s public statements;
  • meet some of its obligations under the Commonwealth Grants Rules and Guidelines;
  • keep records of key decisions; or
  • establish performance targets for all funded projects.

Areas for improvement

The ANAO has made two recommendations aimed at ensuring consistency between the department’s internal assessment process and the advice provided to applicants and improving the quality of the advice provided to the Minister for Indigenous Affairs.

The ANAO suggests that the Department of the Prime Minister and Cabinet explore options for implementing an effective grant management system that: facilitates effective and efficient management of the Indigenous Advancement Strategy; streamlines the process by which applicants engage with the department; is consistent with the grant assessment and performance monitoring processes implemented by the department; and supports decentralised approaches and greater regional network engagement.

Did the published guidelines and application process support applicants to apply for funding?

The department developed Indigenous Advancement Strategy Guidelines and an Indigenous Advancement Strategy Application Kit prior to the opening of the grant funding round. The guidelines and the grant application process could have been improved by identifying the amount of funding available under the grant funding round and providing further detail on which activities would be funded under each of the five programs. Approximately half of the applicants did not meet the application documentation requirements and there may be benefit in the department testing its application process with potential applicants in future rounds.

The guidelines

4.1 The Indigenous Advancement Strategy Guidelines 2014 (the guidelines) and the Indigenous Advancement Strategy Application Kit (the application kit) were the key references for applicants applying for funding under the Indigenous Advancement Strategy (the Strategy).29 The guidelines described the five general selection criteria of the Strategy, which addressed the extent to which applicants understood the needs of the target group or area, would achieve outcomes and value for money, could show experience or capacity in both program delivery and achieving outcomes, and fostered Indigenous participation.

4.2 The guidelines stated that the Australian Government had committed $4.8 billion over four years to the Strategy. The application kit stated that $2.3 billion was available over four years, primarily through the initial grant funding round. The Department of the Prime Minister and Cabinet (the department) did not publicly communicate how available funding would be distributed between the five programs. Figure 4.1 illustrates the distribution of funding between programs for the 2015–16 financial year for the grant funding round, relative to the amount of funding sought by applicants.

Figure 4.1: Funding requested against funding available, by program, 2015–16

Source: ANAO analysis.

4.3 As funding was allocated to specific programs30, the program for which an applicant chose to apply was a factor in determining the amount of funding that could be received. Publicly communicating the available funds would have benefited both applicants and the department, assisting applicants to scope activities appropriately relative to the funds available, and the department to manage applicants’ expectations of funding success.

4.4 The guidelines could have been improved by providing further detail and clarity to support applicants in identifying the most suitable program for which to apply. The department had mapped legacy programs to the new program structure, but this information was not included in the guidelines. Further, the language used is complex; simpler language would assist in making the guidelines more accessible for first-time applicants.31

4.5 Applicants provided feedback to the ANAO and the department that a lack of clarity regarding available funding and eligible program activities complicated the application process. Following the grant funding round, a departmental review indicated that there was insufficient clarity about which activities were eligible for funding, which program an applicant should apply under, and the amount of funding available.32 The department advised that it consulted with stakeholders on the guidelines; however, as noted in Chapter 2, the department did not keep records of these meetings. As such, the ANAO is not able to confirm that the department tested the guidelines with stakeholders to identify concerns prior to the application process commencing.33

Application process

4.6 The ANAO examined the manner in which applicants could submit applications to the department, and the associated requirements for application forms and documentation. The application kit stated that applicants could apply for Strategy grants via email, or by providing a hard-copy application form. The department advised the ANAO that timeframe pressures and limitations of existing systems prevented the development of an online grant management system.34

4.7 Applicants were required to submit a single application form for funding under the Strategy, regardless of the number of initiatives for which funding was sought.35 The application kit also established a series of requirements for application documentation including word limits36, maximum page counts, attachments and application size. The department advised that 1233 (50 per cent) of the 2472 applications received were assessed as non-compliant against these requirements.37

4.8 The department did not test the application form, requirements and process with applicants prior to the opening of applications. Of the 108 applicants that provided feedback to the ANAO about this issue, 44 per cent (47 applicants) rated the difficulty of the application process as high, and 18.4 per cent (21 applicants) as medium.

Were department staff provided with adequate guidance and training to achieve consistency in the assessment process?

The department provided adequate guidance to staff undertaking the assessment process. The department developed mandatory training for staff, but cannot provide assurance that staff attended the training.

4.9 The department developed guidance for staff involved in the assessment process, in the form of process diagrams and documents, an overall assessment plan, and a probity plan. These documents described roles and responsibilities, and expected outputs. The department also established an internal email address for staff to contact with questions and concerns regarding probity matters.

4.10 The written guidance was adequate to support staff to undertake the assessment process, but could be improved by providing additional clarity about how individual projects on the same application could be differentiated.38 Departmental staff were responsible for identifying separate projects on the single application form submitted by applicants, as the application form did not provide a means for applicants to identify separate projects themselves. In around 90 per cent of instances, the two assessing staff identified the same number of projects on application forms. For 9.3 per cent of the applications, the two assessors identified different numbers of projects. For half of these applications, the number of projects identified by each assessor differed by two or more, and the greatest difference in projects identified in an application was 19.

4.11 The department advised the ANAO that verbal guidance regarding distinguishing projects on an application was provided in training and by assessment panel chairs to ensure assessors had assistance where required and to maintain the integrity of assessments. No evidence of this guidance or these discussions was provided to the ANAO.

4.12 The department required that staff assessing grant applications attend a mandatory training session and it developed a series of training materials. The department was unable to provide a training register or other documentation recording which staff had attended training. As such, no assurance is available that all staff assessing grant applications completed the mandatory training requirements.39

Was the assessment process consistent with the process outlined in the Strategy’s grant guidelines?

The department’s assessment process was inconsistent with the guidelines and internal guidance. Applications were assessed against the selection criteria, but the Grant Selection Committee did not receive individual scores against the criteria or prioritise applications on this basis. The basis for the committee’s recommendations is not documented and so it is not possible to determine how the committee arrived at its funding recommendations.

4.13 As noted in Chapter 1, funding under the Strategy was available through multiple mechanisms including open competitive tender and demand-driven funding processes. The bulk of Strategy funding was made available via the competitive tender process. The funding mechanisms were assessed via different processes.

Assessment of open competitive tender applications

4.14 The guidelines and application kit specified that applications for funding would be assessed by a series of assessment panels according to five equally-weighted general selection criteria (discussed in paragraph 4.1). The department manually registered all applications received in a database developed specifically for the grant assessment process. The database and several additional spreadsheets, stored on network drives without adequate version controls, were used to manage the assessment process.

4.15 Following registration, applications were assessed for compliance against the requirements outlined in the guidelines. The department advised the ANAO and Parliament that, due to the high rates of non-compliance, these requirements were waived. The ANAO conducted a limited review of 60 of the 1233 non-compliant applications and found that the majority were assessed in accordance with the standard assessment process. However, the department did not waive the compliance requirements for all applications. The ANAO identified four applications that were not assessed40, and a further two instances were identified in which organisations submitted multiple compliant applications for different projects, but only one application progressed to an assessment. Applications that did proceed through the compliance process were assessed and assigned:

  • an assessment score (up to 35 points), assessed by two staff from one of seven assessment panels against the five Strategy selection criteria; and
  • a need score (up to seven points), assessed by regional network staff as to whether the project demonstrated a response or solution to demand and need within the region (as discussed in Chapter 3).41

4.16 The application was also assessed by a risk assessment team. The assessment process required that the risk assessment team conduct a comprehensive examination of an applicant’s viability. Due to time pressures, the risk assessment was reduced to background checks that did not meet the department’s requirements.

4.17 The Grant Selection Committee42 (the committee) received information about each project, including a project description, assessment score, need score, project risk rating and the amount of funding available in each program. The committee did not receive each project’s scores against the five selection criteria, as the scores were aggregated into a single moderated score. As a result, the committee could not assess the competitiveness of each project against individual selection criteria, or identify which applicants had or had not sufficiently addressed each criterion. The information from the application assessments was therefore so aggregated that it did not provide the committee, and subsequently the Minister, with meaningful information against the selection criteria.

Alignment of competitive tender assessment process and outcomes with the guidelines

4.18 The application kit stated that the selection criteria would be equally weighted. In practice, the assessment process assigned 14 points (out of a total of 42) to assessments relating to need (seven from the assessment panel and seven from the regional network assessment), while a maximum of seven points was available for each of the other four criteria. In doing so, the department increased the weighting of consideration of need, relative to the other selection criteria.
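
To make the shift in weighting concrete, the sketch below works through the arithmetic implied by the point allocations described in paragraph 4.18. It is an illustrative calculation only, not departmental code, and the variable names are illustrative.

```python
# Illustrative arithmetic only: the effective criterion weightings implied by
# the 42-point scale described in paragraph 4.18. Not departmental code.

TOTAL_POINTS = 42
POINTS_PER_CRITERION = 7   # each of the five published selection criteria
REGIONAL_NEED_POINTS = 7   # additional need score assessed by the regional network

# Need attracted points twice: once as a selection criterion and once as the
# regional network's need score.
need_weight = (POINTS_PER_CRITERION + REGIONAL_NEED_POINTS) / TOTAL_POINTS
other_criterion_weight = POINTS_PER_CRITERION / TOTAL_POINTS
published_equal_weight = 1 / 5   # the equal weighting stated in the application kit

print(f"Effective weight of need:                 {need_weight:.1%}")             # 33.3%
print(f"Effective weight of each other criterion: {other_criterion_weight:.1%}")  # 16.7%
print(f"Published equal weighting per criterion:  {published_equal_weight:.1%}")  # 20.0%
```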

4.19 The guidelines and application kit stated that applicants would be assessed against the selection criteria, in the context of place-based regional needs and available funding, and then prioritised against competing eligible applicants. Applications were not ranked according to selection criteria scores. The department advised the ANAO that, in forming its recommendations to the Minister, the committee considered other factors and information such as the merits of the entire set of applications. These considerations were not documented or reflected in minutes of the committee, nor were they linked back to the scores against the selection criteria.

4.20 ANAO analysis shows that some projects that were awarded a high score against the selection criteria and need score were not recommended for funding, and some low-scoring applicants were recommended. For example, 59 projects that were awarded assessment scores of 20 or below and a need score of three or less were recommended for funding. Further, 222 projects that were awarded assessment scores of 26 or above and a need score of six or above were not recommended for funding. This is shown in Table 4.1 and Table 4.2. ANAO analysis also indicates that a different funding profile would have resulted had the recommendations been based on assessments against the selection criteria.

Table 4.1: Number of projects recommended for funding, by need and assessment score

Note 1: The above analysis is based on recommendations to the Minister on 3 March 2015.

Note 2: Shading indicates frequency of occurrence in range. Lighter shading is less frequent, darker shading is more frequent.

Source: ANAO analysis.

Table 4.2: Number of projects not recommended for funding, by need and assessment score

Note 1: The above analysis is based on recommendations to the Minister on 3 March 2015.

Note 2: Shading indicates frequency of occurrence in range. Lighter shading is less frequent, darker shading is more frequent with red representing the most common occurrences.

Source: ANAO analysis.

4.21 ANAO analysis also shows that there were variations in recommended project scores by program, particularly in relation to the lowest score recommended (see Table 4.3).

Table 4.3: Variations in recommended project scores by program

Program | Lowest score recommended (out of 42) | Highest score not recommended (out of 42)
Jobs, Land and Economy | 17 | 41
Children and Schooling | 12 | 39
Safety and Wellbeing | 11 | 41
Culture and Capability | 10 | 41
Remote Australia Strategies | 18 | 41

Note: The above analysis is based on recommendations to the Minister on 3 March 2015.

Source: ANAO analysis.

4.22 For approximately 80 per cent of recommended projects, the committee recommended an amount of funding less than that requested by the applicant. The department advised that this was necessary as the round was over-subscribed. The changes to funding amounts are likely to have altered the nature of the affected projects and the deliverables applicants could achieve, and potentially affected the financial viability of those projects. The committee did not request new assessment scores against the proposed funding amounts, nor did the committee consult with applicants to determine the impact the proposed funding changes may have on the project.43

4.23 The basis for making funding recommendations was not adequately documented by the committee. The department provided the ANAO with a spreadsheet recording the rationale for individual funding recommendations made by the committee. For 61 of the 85 projects for which the committee recommended a funding amount of five per cent or less of the amount requested, the basis for the funding decision was not recorded. For a further 11 projects, the funding decision was partially explained, but was not clear.44 Two examples are provided below.

Box 1: Examples of rationales for individual funding recommendations documented by the Grant Selection Committee

Example 1

An applicant requested funding of $50.9 million under the Children and Schooling Program. The application received an assessment score of 30 and a need score of 5. The committee recommended funding of $1.5 million, 3 per cent of the funding requested. The project was listed as ‘supported’ with the rationale stating ‘the application demonstrated close alignment with the Government’s children and schooling program priorities and the project activities appear likely to deliver good outcomes on the ground’. The reason for the level of funding recommended was not recorded.

Example 2

An applicant requested funding of $10.9 million under the Safety and Wellbeing Program. The application received an assessment score of 28 and a need score of 2. The committee recommended funding of $221 612, 2 per cent of the funding requested. The rationale stated that ‘the project is closely targeted to community needs’. The reason for the level of funding recommended was not recorded.

Assessment of demand-driven applications

4.24 The department developed an abbreviated process for assessing demand-driven applications, in which a single staff member assessed the application against the selection criteria and provided a proposal to the relevant program area. The program area would then consider the proposal, having regard to available funding, and make a recommendation to the department’s executive, which would in turn decide whether to recommend the project to the Minister.

4.25 The department provided the ANAO with a list of 415 demand-driven applications it assessed, but could not confirm that it was a complete list of the applications received. The ANAO identified 11 applications that were under assessment for more than one year, with the longest time between application and notification of outcome being 592 days. The department advised the ANAO that it has now completed the assessment of all demand-driven applications.

4.26 The department has not provided the ANAO with details, for all of the 415 applications, of the amount of funding requested by the applicant, the amount recommended by the department, or the amount approved by the delegate. The department is aware that the lack of oversight of demand-driven applications has been an issue, with the Program Management Board noting:

… it is clear that applications come through different doors, and are assessed inconsistently across the Department. This creates a lack of visibility, transparency and capacity to track and report on those applications. Recent questions from Senate Estimates and the Senate Inquiry, and requests from the Minister’s office, indicate that this is not an acceptable position, and that the Department needs to be able to report coherently on these applications and their outcomes. This is not possible with current processes.45

4.27 In response to these issues, the department advised that it has made changes to the function of the work area responsible for managing assessments. The Assessment Management Office was re-introduced after the release of the revised guidelines in March 2016. The department advised that the Assessment Management Office centrally monitors the progress of applications received under the revised guidelines.

Recommendation No.2

4.28 For future Indigenous Advancement Strategy funding rounds, the Department of the Prime Minister and Cabinet:

  1. publishes adequate documentation that clearly outlines the assessment process and the department’s priorities and decision-making criteria; and
  2. consistently implements the process.

Entity response: Agreed.

4.29 The initial Indigenous Advancement Strategy Guidelines and Application Kit did outline the assessment criteria and approach for the 2014 grant round. However, the department has previously acknowledged there were lessons learned from the first round, including the need for clearer communications with potential applicants and associated documentation. With the transition period behind us and greater knowledge and experience held among participants and stakeholders, the Indigenous Advancement Strategy is now bedding down in a more certain environment.

4.30 The department undertook a robust assessment process for the 2014 grant funding round but acknowledges there were some deficiencies in record keeping which impacted upon the department’s ability to demonstrate the approach taken.

4.31 Since the initial funding round the department has released revised Indigenous Advancement Strategy Guidelines with a range of supporting application material and information for applicants. This material clearly describes the application assessment process and the criteria that an application will be assessed against by the department.

4.32 The department has also built into its grants management arrangements a strengthened Assessment Management Office which is responsible for coordinating, oversighting and providing assurance on application and assessment processes.

Were assessments and decisions well-documented and supported by internal quality control and review?

Internal quality controls and review mechanisms did not ensure that applications were assessed consistently, and in some instances introduced errors. The department did not document key decisions or record compliance with probity requirements.

4.33 The department’s probity plan specified key principles for the grant funding round, including fairness, consistency and transparency, and the identification and resolution of conflicts of interest. The process by which the department sought to assure itself that assessments were conducted in this manner included:

  • reviewing applications to identify errors and omissions, and assessing any applications that had been missed;
  • requiring that the assessment score component of the assessment be performed by two separate staff, and moderating differences; and
  • establishing probity requirements and procedures to manage conflicts of interest for assessment panel staff and the committee.

Review of applications

4.34 The review of applications identified 300 applications that the department had received but not registered for assessment. In order to assess these applications the department formed an additional panel.46 The review also resulted in a department-generated number being entered into the ABN field for the 20 applicants that did not provide an ABN on their application form, as required.47 The invalid ABNs remained in assessment documentation and were not corrected or removed at any time during the process.

Assessment and moderation by two assessors

4.35 The department’s assessment plan required that, in the event that the total scores derived by the two assessors differed, the scores would be moderated by either the Chair of the relevant assessment panel (for differences greater than five points), or via consensus between the two assessors (for differences of five or fewer points). Moderation was conducted on the basis of the total score, rather than the scores against individual selection criteria. Post-moderation, it was not possible to determine how assessment panels had scored projects against each criterion. The amalgamation of the individual criteria scores into a single score resulted in the loss of valuable information from the assessment process, and meant that the department was unable to provide the committee or the Minister with assessments against the selection criteria in accordance with the Commonwealth Grants Rules and Guidelines.

4.36 The requirement that all projects be assessed by two staff was not applied to all projects. Analysis by the ANAO identified that 815 of approximately 5000 projects did not receive two assessment scores. Advice from the department indicated that this occurred when:

  • the two assessors identified different numbers of projects on the same application;
  • a processing error caused the project to be sent to a single staff member only; or
  • both assessing staff identified themselves in the system as the same assessor, causing the second assessment to be discarded.

4.37 The 815 projects received an automatically assigned score of 16 (out of 35) for their moderated assessment score, regardless of the score assigned by the assessor. As a consequence of being assigned an automated score, the 815 projects were not appropriately considered against each of the individual selection criteria at the assessment panel stage. The department advised the ANAO that it considers that these projects were given fair consideration by the committee and that the automatic assignment of scores was consistent with probity advice.
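
The moderation and fallback rules described in paragraphs 4.35 to 4.37 can be sketched as follows. This is an illustrative reconstruction only, assuming the five-point threshold and the automatic score of 16 stated above; the function, its name and the structure of its output are hypothetical and do not represent the department’s systems.

```python
# Illustrative sketch of the moderation rules described in paragraphs 4.35 to 4.37.
# Not the department's code; the function and its return values are hypothetical.
from typing import Optional, Tuple

CHAIR_MODERATION_THRESHOLD = 5   # differences greater than five points went to the panel Chair
AUTO_ASSIGNED_SCORE = 16         # score assigned (out of 35) when a project had only one assessment

def moderation_outcome(score_a: Optional[int], score_b: Optional[int]) -> Tuple[str, Optional[int]]:
    """Return the moderation path applied to a project's two total scores (each out of 35)."""
    if score_a is None or score_b is None:
        # Projects without two assessment scores were automatically assigned 16,
        # regardless of the score the single assessor had recorded.
        return ("auto_assigned", AUTO_ASSIGNED_SCORE)
    if abs(score_a - score_b) > CHAIR_MODERATION_THRESHOLD:
        # Larger differences were moderated by the Chair of the relevant assessment panel.
        return ("chair_moderation", None)
    # Differences of five points or fewer were resolved by consensus between the assessors.
    return ("assessor_consensus", None)

# Moderation operated on total scores only, so the per-criterion scores behind
# score_a and score_b were not retained after this step.
print(moderation_outcome(28, 20))    # ('chair_moderation', None)
print(moderation_outcome(24, 22))    # ('assessor_consensus', None)
print(moderation_outcome(30, None))  # ('auto_assigned', 16)
```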

Probity and managing conflicts of interest

4.38 All staff involved in grant application assessments were required to declare conflicts of interest, which were recorded centrally in a register. The department advised that staff responding to questions and requests for assistance from Strategy applicants were not assigned to assessment panels, to ensure the impartiality of staff assessing applications.

4.39 Assessments were matched to staff names by a unique identifier recorded against each assessment. The ANAO was not able to conclusively match all staff identifiers with corresponding entries in the probity register, as the department did not enforce controls around the identifier that could be entered, consistently record the names associated with these identifiers, or conduct its own reconciliation of identifiers against the probity register.

4.40 The probity plan required that committee meetings be recorded, with minutes signed by the Chair of the committee, and that the Chair ensure that the grant assessment process be comprehensively documented and tied explicitly to the selection criteria. The committee did not record minutes of its meetings, keep a record of the actions taken by the committee to address conflicts of interest recorded by committee members, or keep records of key decisions.48

4.41 The ANAO was provided with two different spreadsheets that, according to the department, recorded the funding recommendations made by the committee. Neither spreadsheet matched the value of funding recommendations recorded on the brief that the committee Chair signed to approve the recommendations to the Minister, and one of the spreadsheets recorded recommendations that the department later advised the Minister were not an accurate representation of the committee’s recommendations.

Did the department provide complete, accurate and timely advice to the Minister in support of his role as decision-maker?

The list of recommended projects provided to the Minister did not provide sufficient information to comply with the mandatory requirements of the Commonwealth Grants Rules and Guidelines, or support informed decision-making, and contained administrative errors. The department did not maintain adequate records of Ministerial approval of grant funding.

4.42 The department provided advice to the Minister at various stages throughout the grant funding round, in the form of verbal and written briefs. The advice provided estimates of the time required to complete the funding round, progress updates, funding recommendations, and formalised approvals. However, the department did not maintain adequate records of the content and outcome of these discussions.

4.43 In October 2014, while the application process was open, the department provided the Minister with an estimate of the time and staff required to assess all applications. The timeframes were based on an estimate that 4000 applications would be received, each containing a range of projects requiring assessment. The department received 2345 applications for approximately 5000 projects. On 20 November 2014 the department requested the Minister’s approval to extend the assessment period by three months. The Minister agreed to this request.

4.44 At the time the department advised the Minister of these estimates, it did not have a grant management system capable of supporting the assessment of the applications received.49 The department advised the ANAO that the lack of a suitable existing system contributed to the assessment process exceeding expected timeframes.

4.45 The department provided initial funding recommendations to the Minister on 1 March 2015, recommending the awarding of $917 million to 1278 projects. The department subsequently revised these recommendations to an amount of $844.8 million and presented them to the Minister on 3 March.50 The department made several further recommendations to the Minister in order to correct administrative errors in funding amounts, fill service gaps, and supplement projects that had been provided insufficient funding to deliver services or employment at current levels. These recommendations were substantively completed by June 2015, and the total value of the adjustments was approximately $240 million.

4.46 The recommendations provided by the department, which were presented in the form of a spreadsheet that listed the projects, contained significant shortcomings. The department provided descriptive information (such as project locations and electorates), but did not provide the Minister with assessments of the extent to which projects met each of the five Strategy selection criteria.51 The department advised the ANAO that it provided the assessment score and need score to the Minister in its initial $917 million recommendation brief, but was unable to provide evidence of this.52

4.47 The department retained inadequate records of the information it provided to the Minister. In the course of the audit, the ANAO was provided with lists of recommendations for several key briefs that the department later advised were incorrect, that were not as described in the relevant brief, or that did not match the values listed in the relevant brief. These issues limit the assurance that the department is able to provide regarding recommendations about specific projects.

4.48 Similarly, the department did not maintain adequate records of the grants approved by the Minister.53 While the department retained signed briefs of the total amount approved, the department did not keep adequate records of the supporting attachments which detailed project approvals. The lack of assurance as to the specific projects (and project amounts) approved means that the terms of that approval are unclear.54

Recommendation No.3

4.49 The Department of the Prime Minister and Cabinet, in providing sound advice to the Minister for Indigenous Affairs about the Indigenous Advancement Strategy:

  1. clearly and accurately outlines the basis for funding recommendations; and
  2. documents the outcomes of the decision-making process.

Entity response: Agreed.

4.50 The department provided advice to the Minister at various stages throughout the grant assessment process for the 2014 grant round, together with a range of detailed information to support and inform decision making.

4.51 The department acknowledges the need to capture all relevant information in consolidated briefing and to ensure comprehensive record keeping.

4.52 Since the initial funding round the department has implemented improved grant assessment processes which underpin and promote consistency in the briefing advice provided to the Minister for Indigenous Affairs. Each brief submitted for delegate consideration includes a range of information to support funding recommendations. Standardised briefing templates have been established for briefing purposes to ensure consistency. Building on the success of the Indigenous Advancement Strategy into the future requires processes that ensure departmental advice provided to the Minister is clear and accurate, and that outcomes from the decision-making process are appropriately documented.

Outcomes of the decision-making process

4.53 The final funding outcomes approved by the Minister were similar to those recommended by the department. The Minister’s funding decisions differed from the final recommendations of the department in the following ways:

  • eight projects not recommended by the department were funded; and
  • the department recommended agreement durations of varying lengths for projects delivering pre-school services, and the Minister approved funding for all such projects for 2.5 years.

4.54 The Minister reported these decisions to the Finance Minister, as required by the Commonwealth Grants Rules and Guidelines.

Was timely and accurate feedback provided to unsuccessful applicants?

The department’s initial advice to applicants about funding outcomes did not adequately advise applicants of the projects and amount of funding approved. Feedback provided to applicants was high-level, insufficiently detailed and slower than the department would have preferred.

4.55 Under the guidelines, applicants may be provided with reasons for an application not being eligible or successful, although the department reserved the right not to offer individualised feedback.

4.56 Applicants approved for funding were advised in writing that their application was successful, but not the amount of funding approved or the project(s) for which funding was approved. The letter noted that the funding offered may differ from what was applied for and the department has acknowledged that this aspect of the process frustrated applicants. Of 1245 requests for feedback received by the department, 436 (35 per cent) of the requests came from applicants that had been recommended for funding.

4.57 Templates and instructions were provided to staff providing feedback to applicants to support consistent preparation of replies. Departmental staff were not able to provide assessment scores to applicants, as these were considered confidential, or discuss with applicants why an application was recommended for less funding than requested.

4.58 As of 10 August 2015, the department had received 892 requests for feedback via email and 353 requests via telephone.55 Of the 892 email requests, 665 were considered resolved by the department. The average response time for these requests was 22 days; 234 requests had a response time in excess of 28 days, in some cases taking up to three months. The department has noted that the process of providing feedback to successful and unsuccessful applicants was slower than preferred.56

4.59 Of the 114 applicants interviewed by the ANAO, 35 advised that they had received feedback on their application. Of the 35, 18 (51 per cent) indicated that feedback informed them that there was limited funding available for the Strategy. In addition, 14 of the 35 applicants (40 per cent) indicated that the feedback was not helpful, with 11 of the 14 applicants (79 per cent) indicating that the feedback was not helpful because it was not meaningful. A concern raised with the department was that applicants felt that the officer providing them feedback had not read their application.

Were negotiations with grant recipients fair, transparent and consistent with the Minister’s approval of the grant application?

Negotiations with grant recipients were largely verbal and conducted in a short timeframe, with some agreements executed after the project start date. In most instances the department negotiated agreements that were consistent with the Minister’s approval. There is limited assurance that negotiations were fair and transparent.

4.60 The negotiation process was designed to provide an opportunity for outcomes to be negotiated with applicants to, among other things, ensure that projects were possible with the funding offered.57 As discussed in paragraph 4.22, the committee did not approach applicants that it recommended for substantially less funding than requested prior to making recommendations to the Minister. Subsequent to funding approval, the department approached the Minister to request adjustments to funding amounts for some providers.

4.61 The department advised the ANAO that the process of negotiating funding agreements was largely verbal, with some documentation held in file notes and emails stored within the regional network. A small number of funding agreements (19 of 91 projects rated as high, major or severe risk) that were negotiated by the regional network did not contain required clauses to mitigate risks, or targets for service provider performance (these issues are discussed further at paragraph 4.69). As such, the department has limited assurance that the process of negotiating funding agreements and amounts was conducted fairly, and that applicants received similar treatment across the regional network.

4.62 As discussed above, the department delivered recommendations to the Minister in March 2015, with adjustments to approved amounts and project terms substantively complete by June 2015. As a result, funding agreement negotiations took place later than anticipated, limiting the time available to negotiate with providers whose agreements were due to expire at the end of the financial year. The ANAO analysed a sample of 158 funding agreements and found that for one quarter (39 projects) the funding agreement was executed after the contracted start date.

Consistency with the Minister’s approval

4.63 In the 3 March 2015 brief, the Minister delegated some aspects of ongoing grant administration to the department, including the ability to effect variations to the grant that were not substantial in nature, subject to the funded project continuing to promote the Strategy outcomes. The threshold at which a change could be considered ‘substantial’ was not defined.58

4.64 In most instances, the department negotiated agreements that were consistent with the Minister’s approval. However, the ANAO identified an instance in which funds were moved between separate projects for a single organisation without the approval of the department’s executive or the Minister.59 The ANAO also observed that the department reallocated funding from projects approved by the Minister to 92 projects that were not. The department advised the ANAO that these movements were between projects proposed by the same organisation that were often closely related or similar, and that the movements were approved by departmental delegates and considered to be in line with the department’s delegation. The ANAO identified 68 organisations with movements (totalling $51.8 million) between Strategy programs.60 Examples of projects funded by the department that were not consistent with the Minister’s approval include:

  • $1.2 million approved by the Minister for a community capacity building project, described as supporting a playgroup, engagement in early childhood services, and adult literacy and numeracy support for parents and carers. The department moved funding from the original project to three projects relating to community capacity building (as per the original project), schooling, and reducing substance misuse and harm; and
  • $200 000 moved from a project for counselling and healing services for the stolen generations to a playgroup program.

4.65 Maintaining funding accountability by staggering payments was a key principle of the negotiation process approved by the Minister. The principle was applied for the majority of funding agreements. The ANAO found that 33 of the 996 agreements negotiated by the department were inconsistent with this principle and the guidelines, resulting in these projects receiving a single payment. The total value of these single payments was $13.6 million. Three projects, with low risk ratings, received a single payment of more than $500 000 each.

Were arrangements in place to monitor the performance of service providers?

The department has established arrangements to monitor the performance of service providers, but has not specified targets for all providers and arrangements are not always risk-based.

4.66 The department employs a range of approaches to monitor the performance of service providers, such as reporting, site visits and telephone contact. Funding agreements outline the performance requirements, including outcomes to be achieved, performance indicators, and timeframes for performance reports and milestone payments.

4.67 The department primarily uses a grant management system to record performance monitoring information.61 The system records project milestones and performance indicators, and funding recipients report their progress against these indicators using an electronic form. Applicants interviewed by the ANAO advised that customised reports were also used by some regional offices to monitor the performance of funded projects.

4.68 While the department established performance indicators for service providers, performance targets were not clearly defined. For example, 1100 projects are required to report against a mandatory indicator for the number and proportion of Indigenous people employed in delivering the project; no target number or proportion of Indigenous people was specified for 832 of those projects. The lack of targets compromises the department’s ability to monitor performance, identify underperforming providers and enforce contractual penalties. Performance indicators for funding recipients are discussed in more detail in Chapter 5.

4.69 To manage high-risk organisations, the department has developed several standardised clauses that may be inserted into the schedules of relevant funding agreements.62 A departmental review conducted in October 2015 of 91 funded projects with high, major or severe risk ratings identified that the clauses to mitigate higher risk levels were included in funding agreements for 67 organisations. The clauses had not been included in the funding agreements of the remaining 19 organisations63, which may impact the department’s ability to effectively manage high risks associated with activities carried out by those organisations.

4.70 The frequency and quantity of performance reporting obligations for funding recipients were not consistent with the amount of funding received and the department’s assessment of the provider’s level of risk (Table 4.4).

Table 4.4: Performance reporting obligations relative to payment and risk rating

Risk rating | Average days between reporting events | Average number of reports required per payment received
Severe | 127.29 | 1.20
Major | 106.28 | 1.47
Medium | 125.17 | 1.26
Minimal | 136.62 | 1.13
Insignificant | 134.70 | 1.15

Source: ANAO analysis.

4.71 The department advised the ANAO that the effective administration of Strategy grants was complicated by the diversity of grant management systems inherited by the department in the Machinery of Government changes of 201364, and the lack of a suitable system available at the time of the grant funding round (see paragraphs 4.6 and 4.44).

4.72 The ANAO notes that limitations of available systems were a factor in the requirement for applicants to submit hard copy application forms (see paragraph 4.6), the considerable manual processing undertaken by department staff (see paragraph 4.14), and the approach to collecting and reporting performance information from funding recipients (discussed further in Chapter 5).

4.73 The second Implementation Readiness Assessment of the Strategy noted that the existing grant management system may need to be supplemented by an additional management information system operated by the department. The department is currently trialling such a system, and the ANAO will consider the department’s progress in implementing changes to its grant management systems in its future audit work program.

Was the Strengthening Organisational Governance Policy applied consistently to all organisations?

The department applied the Strengthening Organisational Governance policy consistently to 57 relevant organisations. The department did not apply the exemption policy consistently to four out of 26 applicable organisations and incorrectly assessed the incorporation status of three organisations to which the policy applied.

4.74 As at August 2016, the Strengthening Organisational Governance policy (the policy) had been applied to 57 organisations, including 37 Indigenous organisations and 20 non-Indigenous organisations.

4.75 The Minister, or an approved delegate, may approve an exemption from the policy. As at September 2016, a total of 26 organisations had applied for an exemption. This includes 15 Indigenous organisations, of which 40 per cent had exemption applications approved, and 11 non-Indigenous organisations, of which 73 per cent had exemption applications approved. The time taken to assess applications varied from 10 to 221 days, with an average of 132 days.65 One of the two criteria for exemption from the policy is that grant funding received through the Strategy is a ‘small portion’ of an organisation’s total revenue, such that changing incorporation status would unfairly impose additional requirements on the organisation’s operations and business model. The department has not formally defined what constitutes a ‘small portion’ of an organisation’s total revenue, but advised the ANAO that in approximately September 2015 it determined that 25 per cent or less of an organisation’s total revenue would be considered a small portion. The absence of a documented definition increases the risk that exemption applications will be assessed inconsistently.
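
As a simple illustration of how such a threshold might operate, the sketch below applies the approximately 25 per cent benchmark the department advised it used; the function name and the example figures are hypothetical and are not drawn from departmental guidance.

```python
# Illustrative only: the 'small portion' test described in paragraph 4.75, using
# the approximately 25 per cent threshold the department advised it applied from
# around September 2015. The function name and example figures are hypothetical.

SMALL_PORTION_THRESHOLD = 0.25

def is_small_portion(strategy_funding: float, total_revenue: float) -> bool:
    """Return True if Strategy grant funding is 25 per cent or less of total revenue."""
    if total_revenue <= 0:
        raise ValueError("total revenue must be positive")
    return strategy_funding / total_revenue <= SMALL_PORTION_THRESHOLD

# Example: $400 000 in Strategy grants against $2.0 million total revenue is a
# 20 per cent share, which would fall within the threshold.
print(is_small_portion(400_000, 2_000_000))  # True
```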

4.76 The guidelines and exemption policy did not state that Indigenous organisations could apply to incorporate under the Corporations Act 2001 (the Corporations Act) instead of the Corporations (Aboriginal and Torres Strait Islander) Act 2006 (CATSI Act). However, four Indigenous organisations were exempted from incorporating under the CATSI Act, conditional upon their incorporation under the Corporations Act.66

4.77 The ANAO identified three non-Indigenous organisations to which the policy applied that registered with the Australian Securities and Investments Commission as Registrable Australian Bodies, retaining state or territory incorporation. The department incorrectly assessed these organisations as having incorporated under the Corporations Act and meeting the requirements of the policy. The department issued payments of $10 000 to assist these organisations with the costs of transferring their incorporation. To correct this error, the department has advised that two organisations are in the process of transitioning to the Corporations Act and one organisation has applied for an exemption.

5. Performance Measurement

Areas examined

This chapter examines whether the Department of the Prime Minister and Cabinet has established a performance framework and performance measures that support ongoing assessment of program performance and progress towards outcomes.

Conclusion

The performance framework and measures established for the Strategy do not provide sufficient information to make assessments about program performance and progress towards achievement of the program outcomes. The monitoring systems inhibit the department’s ability to effectively verify, analyse or report on program performance. The department has commenced some evaluations of individual projects delivered under the Strategy but has not planned its evaluation approach after 2016–17.

Areas for improvement

The ANAO has made one recommendation aimed at improving the department’s performance framework and its ability to report on performance.

Did the department develop a performance framework to measure success in the Strategy?

The department developed a performance framework and refined this in line with the requirements of the enhanced performance framework under the Public Governance, Performance and Accountability Act 2013. The current framework does not provide sufficient information about the extent to which program objectives and outcomes are being achieved. There is an opportunity for the department to further develop performance measures to more clearly report on the department’s progress towards improving the lives of Indigenous Australians.

5.1 The Department of the Prime Minister and Cabinet (the department) developed a performance framework for the Indigenous Advancement Strategy (the Strategy) in 2014–15 under previous Australian Government requirements for reporting performance.67 In 2015 the department refined its performance framework in line with the requirements of the enhanced performance framework under the Public Governance, Performance and Accountability Act 2013. The department’s performance framework for the Strategy is outlined in Figure 5.1.

Figure 5.1: Performance framework for the Strategy

[Figure 5.1, a diagram of the performance framework for the Strategy, is not reproduced here.]

Note 1: The 2014–15 Portfolio Budget Statements did not link to the department’s first corporate plan, as the plan was developed to cover the period 2015–19. The performance measures in the 2014–15 and 2015–16 Portfolio Budget Statements were the same.

Note 2: PBS means Portfolio Budget Statements; KPI means key performance indicators.

Source: ANAO analysis.

5.2 When the Strategy was established in 2014, the main performance measures were outlined in its Portfolio Budget Statements and the department reported against these measures in its annual report. The performance indicators for 2014–15 and 2015–16 were primarily quantitative and output-focused, and did not provide targets or benchmarks by which to measure progress in achieving program objectives. At the time, the Department of Finance advised the department that it was necessary to develop more meaningful performance indicators that linked to program outcomes.

5.3 The department published its 2015–19 Corporate Plan in 2015 under the new performance framework, setting out the department’s strategic priorities.68 The plan did not provide specific performance measures relating to the Strategy or the strategic priority of improving the lives of Indigenous Australians. Instead, the corporate plan noted that priorities should be read in conjunction with the Portfolio Budget Statements, which provided more detail on activities.69

5.4 The department’s revised corporate plan, published in August 2016, provides more detail in relation to the purpose of improving the lives of Indigenous Australians. The plan sets out nine performance indicators and 17 measures against the activities of: providing policy advice; collaboration, support and advice; policy coordination, and monitoring and implementation (program delivery); enhancing capability; and contract management and delivery.70

5.5 The supporting strategic performance information for the Strategy is included under five program objectives in the department’s Portfolio Budget Statements. This performance information is not provided in the department’s corporate plan, and there is not a clear alignment between the information provided in the Portfolio Budget Statements and the corporate plan.71 The performance criteria outlined in the department’s 2016–17 Portfolio Budget Statements are almost identical to the deliverables in the 2014–15 and 2015–16 Portfolio Budget Statements. All five programs include the target that a majority of funded activities will meet the mandatory performance indicator on the extent of compliance with funding agreement terms and conditions.72 While this is a reasonable expectation at the project level, as a measure of program performance the target does not provide meaningful information about the activities within a program, nor can it measure success and progress towards achieving program objectives and outcomes.

5.6 The current suite of performance measures in the 2016–20 Corporate Plan and the 2016–17 Portfolio Budget Statements provides some useful insights into program delivery. As the department matures its approach to performance measurement and reporting under the Public Governance, Performance and Accountability Act 2013, there is an opportunity to improve its performance measures to provide a more meaningful and complete picture of whether the department is achieving its purpose of improving the lives of Indigenous Australians.

5.7 The department reported its performance for 2015–16 in its annual performance statement.73 As the department had not specified measures to report against in its corporate plan for 2015–16, it reported against the deliverables and performance indicators set out in the 2015–16 Portfolio Budget Statements. The department’s annual performance statement sets out the performance indicators for each program within the Strategy and includes a narrative against each. The majority of results reported against the performance indicators are expressed in general terms, stating that the performance indicator was either ‘met’ or ‘on track’, without specifying a target or benchmark.

5.8 The ANAO has made a recommendation to improve the department’s performance framework for the Strategy in paragraph 5.21.

Did the department establish clear and measurable performance indicators for funding recipients?

The performance indicators against which funding recipients report cannot be easily linked to the achievement of results and intended outcomes across the Strategy. More than half of the indicators are deliverables and do not measure outcomes. In addition, the performance indicators do not clearly address the relevant program objectives and are not clearly defined.

5.9 Central to the reforms under the Strategy was having clear and measurable results for service providers, with payments linked to the achievement of outcomes.74 The Strategy’s performance framework states that performance reporting undertaken by funding recipients is a key element of the framework. The department developed a ‘menu’ of key performance indicators for each program in the Strategy, comprising 85 indicators in total, to guide the selection of indicators for projects delivered by funding recipients. Some scope was included to allow the department to tailor individual key performance indicators. Of the 85 key performance indicators, two are mandatory for all projects and eight are shared by multiple programs.75

5.10 The ANAO assessed whether the key performance indicators represented a support activity attributable to the management of a program (deliverable), an output measure (deliverable), or a measure of effectiveness in achieving an objective (key performance indicator).76 This analysis, shown in Table 5.1, indicates that more than half of the 85 key performance indicators are deliverables.

Table 5.1: Assessment of key performance indicators within the menu

Program | Deliverables: support activity | Deliverables: output measures | Key performance indicators
Jobs, Land and Economy | 11 | 2 | 13
Children and Schooling | 10 | 1 | 15
Safety and Wellbeing | 14 | 2 | 9
Culture and Capability | 7 | 5 | 9
Remote Australia Strategies | 13 | 4 | 7
Total (a) | 55 | 14 | 53

Note a: Each key performance indicator within a program was assessed in relation to its relevant program objective, meaning that those that were shared were assessed more than once. As a result, the total number of key performance indicators assessed is 122, rather than 85.

Source: ANAO analysis.

5.11 The analysis shows that, of the 53 key performance indicators identified by the ANAO, 26 did not clearly address their respective program objectives, 15 did not clearly outline an indicator value and 11 did not clearly outline whether an increase or decrease in results should be interpreted as positive or negative. For example, an indicator such as the number of Indigenous people who participated in the activity does not specify an expected value against which results can be assessed.

5.12 The department considers that performance reporting undertaken by funding recipients underpins the Strategy’s performance approach. Of the 122 key performance indicators assessed by the ANAO, 21.7 per cent assist a funding recipient to measure its effectiveness in achieving the Strategy’s program objectives. The department advised the ANAO that it is not always practical to require funding recipients to report on outcomes. Developing appropriate key performance indicators that measure the impact or effectiveness of a program in achieving outcomes is challenging; however, a principle of the Strategy is linking payments to the achievement of results and intended outcomes. Without appropriate key performance indicators, the department has not been able to link performance reporting at the project level to the achievement of results and outcomes, or to ensure that funding is resulting in improved outcomes for Indigenous Australians.

5.13 In August 2016 the department established the Indigenous Affairs Group Contract Reporting and Key Performance Indicator Framework Working Group to attempt a redesign of the suite of project-level performance indicators. The group is scheduled to complete this task by February 2017.

Are systems in place to effectively collect performance information and accurately report performance?

The department developed a system by which funding recipients could provide performance reporting information electronically. However, the processes to aggregate and use this information are not sufficiently developed to allow the department to report progress against outcomes at a program level, benchmark similarly funded projects, or undertake other analysis of program results.

5.14 As discussed in Chapter 4, the department established a facility by which funding recipients could report progress against their key performance indicators using an electronic form. This form is generated by the department and, where possible, is pre-filled with information regarding the funding recipient and the key performance indicators that are a part of their funding agreement.

5.15 Funding recipients indicate the extent of their progress against key performance indicators (as on track, partially on track or not on track). A narrative description of progress and specific performance data (such as the number of employees engaged) are also included where relevant. The form is submitted electronically, along with any attachments containing supporting evidence relevant to the report.

5.16 The Federal Online Funds Management System (the system) identifies projects that self-report as not meeting (or partially meeting) the requirements of their indicators. However, the department’s ability to make use of the more detailed performance information collected through this process is constrained by a number of factors, primarily:

  • the department did not have a mechanism for classifying funded projects by type, meaning it is not possible to readily compare performance data between like projects for benchmarking purposes; and
  • the fields in which funding recipients provide performance data are free text, with no separate field to indicate the units, or type of data, being entered (a purely illustrative sketch of a structured alternative follows this list).
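The following is a minimal, purely illustrative sketch, not the department’s system; all names and figures are hypothetical. It shows how capturing the value, unit and indicator in separate structured fields, rather than free text, would allow results from like projects to be aggregated and benchmarked.

```python
# Illustrative only: a structured key performance indicator record with
# separate value and unit fields, so results can be aggregated rather than
# parsed out of free-text responses.
from dataclasses import dataclass
from statistics import mean


@dataclass
class KpiResult:
    project_id: str   # hypothetical project identifier
    program: str      # e.g. "Jobs, Land and Economy"
    indicator: str    # e.g. "Indigenous people employed"
    value: float      # numeric result captured as a number, not free text
    unit: str         # e.g. "people", "per cent"
    status: str       # "on track", "partially on track" or "not on track"


def benchmark(results: list[KpiResult], program: str, indicator: str) -> float:
    """Average reported value for one indicator across like projects."""
    matching = [r.value for r in results
                if r.program == program and r.indicator == indicator]
    return mean(matching) if matching else 0.0


# Example: comparing employment results across two similar (hypothetical) projects.
results = [
    KpiResult("P-001", "Jobs, Land and Economy", "Indigenous people employed",
              12, "people", "on track"),
    KpiResult("P-002", "Jobs, Land and Economy", "Indigenous people employed",
              8, "people", "partially on track"),
]
print(benchmark(results, "Jobs, Land and Economy", "Indigenous people employed"))  # 10.0
```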

5.17 The department advised that additional constraints are imposed by limitations in the system. While funding agreements can contain performance indicators relating to any of the five Strategy programs, the system permits indicators to be recorded against a single program only. As a result, if a project has indicators relating to more than one program, the indicators for the additional programs are not pre-filled into the performance report and are not centrally tracked within the system unless they are manually added against those programs.

5.18 The department further advised that its ability to address these issues is constrained by the terms of its Memorandum of Understanding for the use of the system. Changes must be implemented by the Department of Social Services, which manages the system, and incur a fee for service. Due to the difficulties and costs associated with adapting the system to the needs of the Strategy, the department is exploring other options for managing grants.

5.19 The department has also taken steps to address these shortcomings through the working group referred to in paragraph 5.13, which will attempt to:

  • develop a framework for classifying funded projects into common groups;
  • identify sets of indicators that can be applied to the groups identified above; and
  • implement an improved reporting solution, to expand the department’s ability to report on project outcomes.

5.20 In the interim, analysis of performance data can only be conducted through resource-intensive manual approaches. This limits the department’s use of the detailed performance data provided by funding recipients for reporting against outcomes, monitoring compliance with departmental performance targets, benchmarking similarly funded projects, or other analytical activities. In January 2017, the department advised that it had completed a project to group and classify activities under the Strategy.

Recommendation No.4

5.21 The Department of the Prime Minister and Cabinet identify the outcomes and results to be achieved through the Indigenous Advancement Strategy and analyse performance information to measure progress against these outcomes.

Entity response: Agreed.

5.22 The department will review the performance framework and measures related to the Indigenous Advancement Strategy included in its Corporate Plan and Portfolio Budget Statements.

5.23 The department has recently designed and implemented an Indigenous Advancement Strategy Activity Coding Framework which will inform the design and application of key performance indicators for similar types of grants. An Indigenous Advancement Strategy Reporting Solution has also been implemented to support performance analysis and reporting on grants. These initiatives will better enable the department to compare performance results for similar grants across service providers and locations.

5.24 The department has in place an evaluation plan for the Indigenous Advancement Strategy and work is underway to build a longer term program of evaluations. This program of work will be critical to measuring the impact and effectiveness of the Strategy in achieving outcomes.

Was an evaluation strategy developed to assess the overall success of the Strategy?

The department drafted an evaluation strategy in 2014, but the strategy was not formalised. In May 2016 the Minister agreed to an evaluation approach and a budget to conduct evaluations of Strategy projects in 2016–17.

5.25 In 2014, the department identified that the existing approach to evaluation across Indigenous programs and departments was inconsistent. A review of Indigenous programs in 2013 reported that fewer than 30 per cent of the programs inherited by the department had been evaluated in the previous five years and that only seven per cent of activities were able to demonstrate they were achieving positive results against stated objectives. As a result, the department considered that improved evaluation strategies would be an essential component of the new approach. In 2014, the Australian Government agreed to the development of an Evidence, Evaluation and Performance Improvement Strategy (evaluation strategy) by June 2014 to address the shortage of outcome evaluations for Indigenous activities and to support timely analysis of performance within the Strategy.

5.26 The department drafted an evaluation strategy in June 2014. The evaluation strategy was presented to the Indigenous Affairs Reform Implementation Project Board, which requested further refinements. This work was deferred until the results of the funding round were known. In July 2015 a refined draft was presented to the Program Management Board, which agreed to the establishment of an evaluation register. Evaluation and information analysis activity from July 2014 to July 2015 focused on legacy program evaluations, advice on the development of performance indicators, and data collation to support the funding round assessment. A constraint on evaluation activity was that the evaluation strategy was not formalised and no funding was set aside to implement it. The department advised the Minister in March 2016 that:

We do not currently have enough evidence about the impact of the [Strategy] on Indigenous people and communities. The Closing the Gap Clearinghouse and an internal stocktake of evaluation shows that there are substantial gaps in the evidence base about outcomes and impact. At the moment a very high proportion of what is funded through the [Strategy] lacks a good evidence base. We do not have enough good quantitative studies testing the effects that can be attributed to interventions.

5.27 In May 2016, the department requested that the Minister approve an evaluation approach and that a budget of $3.5 million (GST exclusive) be set aside in 2016–17 and 2017–18 from existing Strategy administered funds to conduct evaluations of Strategy projects. The Minister approved funds for 2016–17 but did not approve funds for evaluation in 2017–18. As at May 2016, nine evaluation activities were underway or planned, and 13 activities were approved and funded for 2016–17. These activities include impact analysis, case studies, reviews, wellbeing studies, and outcomes measurement development and testing.

Appendices

Appendix 1 Entity response

[The entity’s response letter is not reproduced here.]

Appendix 2 Feedback from stakeholder consultation

The Australian National Audit Office (ANAO) engaged with a range of stakeholders about the audit’s focus areas and approach. This engagement was followed by:

  • face-to-face interviews and teleconferences with a range of applicants across metropolitan, regional and remote areas to discuss their experiences with the Strategy; and
  • establishment of an online portal through which submissions could be made to the audit.

All applicants who took part in the grant funding round were invited to provide submissions. The ANAO interviewed 114 randomly selected applicants from selected locations across Australia and received 82 public submissions. Interviewed organisations were asked a standard set of questions, and respondents to the ANAO’s invitation to provide submissions were asked to comment against the audit’s four criteria. The information gained through interviews and submissions was considered by the ANAO in developing the audit findings.

Key themes and issues identified through the stakeholder consultation are outlined below.

Consultation on the establishment of the Strategy
  • Lack of consultation with Indigenous communities prior to the implementation of the Strategy.
  • Insufficient focus on partnership and collaboration with Indigenous organisations to design solutions.
  • Information sessions on the Strategy, provided by the Department of the Prime Minister and Cabinet (the department), were not useful.
  • Support for the principles behind the Strategy and for the consolidation of activities into five broad programs.
Grant funding application process
  • The process did not comply with the Commonwealth Grants Rules and Guidelines.
  • The application process was difficult, time consuming and resource intensive, with the application structure proving complex and confusing.
  • Applicants had difficulty classifying projects under the five programs.
  • There was a lack of clarity around the Indigenous Advancement Strategy Guidelines 2014, funding priorities, who was eligible to apply for funding and what was able to be funded.
  • Small organisations lacked the resources to complete a competitive application, with the open tender process favouring large, non-Indigenous, non-governmental organisations.
  • The timeframes for the process adversely affected organisations.
Communication of funding outcomes and feedback
  • Communication surrounding the outcomes of the grant funding round was poor.
  • It took a long time to find out the outcome of the applications and the details of the funding.
  • The initial letters and emails sent to applicants did not state the funding amount or the projects for which funding had been approved.
  • There were inaccuracies and inconsistencies in the amount of funding applicants were told they had been recommended for.
  • Applicants received conflicting messages about the outcome of their application.
  • Applicants did not receive feedback on their applications, or the feedback received was not meaningful and would not assist in developing future applications.
  • No explanation was provided for the amount or length of funding received, or why one project was successful but others were not.
Funding outcomes
  • Funding recipients reported significant discrepancies between the amounts requested and the amounts received.
  • The amount of funding received by applicants was inadequate for service delivery or had adversely affected it.
  • The funding decisions made by the department did not reflect need.
  • The funding decisions and selection process lacked transparency.
  • The funding system is flexible.
Performance reporting
  • Reporting under the Strategy did not reduce red tape.
  • It was unclear how the information reported would be used.
  • The reporting template was not fit for purpose.
  • Reporting cannot be linked to the activities undertaken by the organisation or the achievement of results and outcomes.
The Department of the Prime Minister and Cabinet
  • Overall communication between the department and stakeholders was poor.
  • There was a lack of corporate knowledge in the department.
  • There was a lack of continuity in the staff working within the Strategy.
  • Regional network staff and personal contacts within the department were supportive.

Footnotes

1 Department of the Prime Minister and Cabinet, Indigenous Advancement Strategy Guidelines, DPM&C, Canberra, 2016, p.3.

2 Department of the Prime Minister and Cabinet, Indigenous Advancement Strategy Guidelines, DPM&C, Canberra, 2016, p.3.

3 Statutory bodies, government bodies and organisations operating under a specific piece of legislation are exempted from the requirement. It also does not apply to funding for capital works or procurement. The policy was amended in May 2015 to allow Indigenous organisations already incorporated under the Corporations Act to retain this incorporation without applying for an exemption.

4 Appendix 2 outlines the key themes and issues identified through the stakeholder consultation.

5 Department of the Prime Minister and Cabinet, 2016–17 Portfolio Budget Statements, DPM&C, Canberra, 2016.

6 Department of the Prime Minister and Cabinet, Indigenous Advancement Strategy Guidelines, DPM&C, Canberra, 2016.

7 This includes feedback received during ANAO stakeholder consultation and written submissions received by the Senate Finance and Public Administration References Committee during the Commonwealth Indigenous Advancement Strategy tendering processes inquiry in 2015.

8 The Indigenous Advisory Council was established on 25 September 2013 to provide ongoing advice to the Government on emerging policy and implementation issues related to Indigenous affairs.

9 Department of the Prime Minister and Cabinet, Submission to the Senate Finance and Public Administration References Committee—Impact on service quality, efficiency and sustainability of recent Commonwealth Indigenous Advancement Strategy tendering processes by the Department of the Prime Minister and Cabinet, 30 April 2015, p. 11 and Attachment G.1.

10 The Strategy was a substantial reform implemented within a very short timeframe. As such, the department advised that it would have been very difficult to undertake broader community consultation, co-design regional strategies and actively partner with government and other service providers in these early days.

11 Six per cent of applicants said they were unsure whether they were consulted and 13 per cent did not answer the question.

12 Commonwealth, Senate Finance and Public Administration References Committee – Commonwealth Indigenous Advancement Strategy tendering processes, 29 June 2015, p. 69.

13 ibid., p. 55.

14 The department advised in January 2017 that it has drafted a formal strategy to assess the effectiveness of the Strengthening Organisational Governance policy and will shortly seek approval of the strategy through the Indigenous Affairs Group Executive Programme Management Board.

15 Officials involved in the development or revision of program guidelines are required to complete a risk assessment of the granting activities and associated guidelines, in consultation with the Department of Finance. See Department of Finance, Commonwealth Grants Rules and Guidelines, Finance, Canberra, 2014, p. 11.

16 The department established the Indigenous Affairs Reform Implementation Project Board in May 2014 as the decision-making body to oversee the implementation of the Australian Government’s reform agenda for Indigenous affairs (see Chapter 3 for more detail).

17 Implementation Readiness Assessments are short, independent reviews undertaken by the Department of Finance that provide additional assurance to Government and the relevant entity on how well practical delivery issues are being addressed.

18 A program owner is an executive in the department who has overall responsibility for a Strategy program.

19 The number of applications assessed is more than the number of organisations that applied due to some organisations applying more than once.

20 The $2.2 million included specialist services to support the operation of the grant funding round, advertising costs, contracting staff to assist with data entry and support with financial viability assessments. The department advised the Senate Finance and Public Administration Legislation Committee and the ANAO that staff time was not recorded or costed as the assessment of applications for funding was part of the regular business of the department and there was no additional impact on internal staffing costs.

21 Although the department had mapped legacy programs to the new program structure, it had not made this public.

22 The department recorded the feedback provided by applicants but not how many applicants responded on individual issues.

23 The gap identified in November 2015 related to a single activity that required Ministerial approval to address an identified service delivery gap.

24 Department of the Prime Minister and Cabinet, 2014–15 Portfolio Budget Statements, DPM&C, Canberra, 2014, p. 36.

25 The Grant Selection Committee was a departmental governance body which oversaw the assessment and recommendations of the Strategy grant funding round. The committee considered recommendations from assessment panels and regional assessment teams on each application and, through the Chair, made recommendations to the Minister.

26 Initially, there were two senior regional network staff on the Grant Selection Committee, however one was only involved from November to December 2014. As Grant Selection Committee meetings were not documented, the degree of input from regional network representatives in this forum is unknown.

27 The new Strategy guidelines, released in March 2016, include a community-led funding proposal model which replaced the demand-driven process. This process was not part of the scope of this audit but may be examined in a future audit.

28 See Appendix 2 for details of ANAO consultation.

29 The department released revised guidelines in March 2016.

30 The department was able to move between programs 12.5 per cent of the value of the appropriation for each program in a single financial year.

31 The guidelines’ Flesch-Kincaid Grade Level is 15.3, indicating that readers require a tertiary education to comprehend them. The Flesch-Kincaid Grade Level assigns text a numerical score broadly equivalent to a school grade reading level.
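For reference, the standard published Flesch-Kincaid Grade Level formula (the general formula, not necessarily the specific tool used for this assessment) is:

Grade Level = 0.39 × (total words ÷ total sentences) + 11.8 × (total syllables ÷ total words) − 15.59

On this scale, a score of 15.3 corresponds to text requiring roughly 15 years of formal education to read comfortably, consistent with the tertiary reading level noted above.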

32 Of the 108 applicants that provided the ANAO with feedback on the application process, 31 (27.9 per cent) indicated that they experienced difficulty classifying a project within the five programs. The revised guidelines removed the requirement for applicants to classify their project by program.

33 Paragraph 8.8 of the Commonwealth Grants Rules and Guidelines states that in order to ensure that the rules of granting activities are simply expressed and clear in their intent, officials should consider testing the clarity of grant guidelines with stakeholders prior to release. As discussed in Chapter 2, the department advised the ANAO that it met with Indigenous stakeholders regarding the guidelines, but was unable to provide records of the outcomes of these meetings or how feedback was addressed.

34 The department did not have a system to support an online application process. The benefits of an online application system can include pre-population of application forms for currently funded providers and reduction in the processing burden on department staff.

35 The revised Strategy guidelines, introduced in March 2016, have removed the requirement for applicants to submit a single application form.

36 Three versions of this form were released, with the initial version of the application form specifying a 1000 word limit for responses to selection criteria.

37 The ANAO identified 10 applicants that submitted applications using previous versions of the application form (including those with word limits) that were also assessed as non-compliant.

38 As discussed later in this chapter, each application to the Strategy was assessed by two staff, who then produced a moderated assessment score.

39 Paragraph 12.5 of the Commonwealth Grants Rules and Guidelines states that officials involved in assessing grant applications should be appropriately skilled, and have access to procedural instructions and/or training before processing grant applications.

40 The department advised the ANAO that the applications were not assessed as they did not use the application form. However, the ANAO identified other applications that also did not use the application form but were assessed.

41 This need score and assessment was separate to the selection criteria and was not reflected in the program guidelines.

42 The committee comprised a representative from each of the five Strategy program divisions, a regional network representative, an Indigenous advisor, and the Deputy Secretary (Indigenous Affairs), who served as Chair.

43 The department advised that, for a large number of recommended projects, the department already funded the applicant to deliver similar services and understood that the existing funding applied to the services to be delivered.

44 For example, specifying that a project was to be funded for ‘some’ elements, but not specifying which.

45 Department of the Prime Minister and Cabinet, IAG Programme Management Board – Meeting 4 Decisions and Action Items, 18 August 2015 (unpublished).

46 The ANAO identified that, to complete the additional assessments, the department used 44 contracted staff to undertake 854 project assessments, and that 359 projects were assessed exclusively by contractors.

47 The department advised that the number was added in order to uniquely identify documentation that had been received in multiple emails but related to the same applicant.

48 Three members of the Grant Selection Committee declared conflicts of interest.

49 The department did not consider the available systems, including systems used by some other agencies, were suitable to assess grants under the Strategy.

50 The department advised the ANAO that the recommendations were revised as they had originally been developed to be available for a meeting between the Prime Minister and the Minister for Indigenous Affairs on 2 March and that, upon reflection, the department was not fully satisfied with the recommendations made.

51 Paragraph 4.7 of the Mandatory Requirements section of the Commonwealth Grants Rules and Guidelines states that officials need not rank grants, but should, at a minimum, identify which grant applications fully meet, partially meet or do not meet the selection criteria.

52 While the provision of the scores would have improved the advice, they did not convey the department’s assessment against each of the criteria, and as such did not meet the requirements of paragraph 4.7 of the Commonwealth Grants Rules and Guidelines.

53 Paragraph 4.11 of the Commonwealth Grants Rules and Guidelines states that where Ministers approve proposed expenditure relating to a grant, Ministers must also record, in writing, the basis for the approval relative to the grant guidelines and the key consideration of value with relevant money.

54 As required under section 71 of the Public Governance, Performance and Accountability Act 2013.

55 The data as at 10 August 2015 was the most recent the ANAO received from the department.

56 Department of the Prime Minister and Cabinet, Submission to the Senate Finance and Public Administration References Committee—Impact on service quality, efficiency and sustainability of recent Commonwealth Indigenous Advancement Strategy tendering processes by the Department of the Prime Minister and Cabinet, 30 April 2015, p. 18.

57 Commonwealth, Senate Finance and Public Administration References Committee – Commonwealth Indigenous Advancement Strategy tendering processes, 29 June 2015, p. 53.

58 The department did not define what ‘not substantial’ changes were, but provided examples: changes to allocations across financial years, to the grant value, or to the scope of the project or activities.

59 In total, this affected eight projects with a total value of $6.3 million.

60 The department disagrees with the ANAO’s analysis in this section. The department considers that the projects were executed under the programs for which they were approved, or that additional funds were approved and added after the original grant round. The ANAO acknowledges that overall funding amounts were approved by the Minister. The ANAO’s analysis is based on consideration of Ministerial briefs and extracts from the Federal Online Funds Management System. This analysis shows there is not a direct relationship between a project and a corresponding Ministerial approval.

61 The grant management system is the Federal Online Funds Management System. The system is managed by the Department of Social Services, which provides access to the department under a Memorandum of Understanding on a fee-for-service basis.

62 These clauses relate to additional governance and accountability measures for providers. For example, requiring changes to organisation constitutions, criminal background checks on governing officials, or restricting expenditure to a list of budgeted items.

63 An additional agreement lacked these clauses on the basis that the project being funded was an employment program in which payments were provided in arrears based on outcomes. The department could not verify if the clauses were present for four projects.

64 As part of the Machinery of Government changes the department also inherited several legacy systems, which were used to manage a range of grants. The department has since consolidated several of these systems.

65 This excludes exemption applications which involved the department providing recommendations to the Minister more than once, such as review of decisions. The period is measured from the date the organisation signed the exemption application, to the date the Minister signed the exemption decision.

66 Three of the four Indigenous organisations requested approval to incorporate under the Corporations Act. One of the four Indigenous organisations was granted a conditional exemption to correct a departmental error. The department incorrectly advised the organisation that it had to incorporate under the Corporations Act, instead of the CATSI Act.

67 Performance reporting in the Australian Government was previously provided under the Outcomes and Programs framework.

68 The performance measurement and reporting requirements for Commonwealth entities are established under the Public Governance, Performance and Accountability Act 2013. Since 2015–16, entities have been required to develop a corporate plan, setting out the entity’s strategies for achieving its purposes and determining how success will be measured. Portfolio Budget Statements, which are published as part of the budget each year, are to describe at a strategic level the outcomes intended to be achieved with the funding appropriated by the Parliament.

69 Department of the Prime Minister and Cabinet, Corporate Plan 2015–19, DPM&C, Canberra, 2015, p. 2.

70 Department of the Prime Minister and Cabinet, Corporate Plan 2016–20, DPM&C, Canberra, 2016, pp. 9-11.

71 Department of Finance, Resource Management Guide No. 132: Corporate plans for Commonwealth entities, Finance, Canberra, 2015, p. 5.

72 Majority refers to an acceptable level of between 70 and 90 per cent (Department of the Prime Minister and Cabinet, Portfolio Budget Statements 2016–17, DPM&C, Canberra, 2016, p. 42). For three of the five programs this is the only target set for 2016–17.

73 Under the enhanced Australian Government performance framework entities are required to prepare annual performance statements, which are to be included in their annual reports. Annual performance statements are intended to be the key location for all public data on the actual performance of an entity in a reporting period. The content reported by entities in their statements should directly reflect the actual results achieved against the planned performance measures and intended results set out in their corporate plans.

74 Department of the Prime Minister and Cabinet, Portfolio Budget Statements 2014–15, DPM&C, Canberra, 2014, p. 36; Department of the Prime Minister and Cabinet, Portfolio Budget Statements 2015–16, DPM&C, Canberra, 2015, p. 33.

75 The mandatory key performance indicators relate to the number of Indigenous Australians employed in a project, and project compliance with funding agreement terms and conditions.

76 Guidance from the Department of Finance explains the distinction between deliverables and key performance indicators and the components of each (Department of Finance, Guidance for the Preparation of the 2015–16 Portfolio Budget Statements, March 2014, pp. 24 and 37). The guidance further states that agencies should focus on the impacts the program will make rather than support activities.