Audit snapshot

Why did we do this audit?

  • The Building Better Regions Fund (BBRF) is the largest open and competitive grants program administered by the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (Infrastructure).

Key facts

  • The program was established in 2016.
  • Five funding rounds have been completed, with $1.15 billion in grant funding awarded to 1293 projects. A sixth round is underway.
  • Funding awarded under the program was to be directed to projects outside of major capital cities to drive economic growth and build stronger regional communities into the future.

What did we find?

  • The award of funding was partly effective and partly consistent with the Commonwealth Grants Rules and Guidelines.
  • While the BBRF was well designed in a number of respects, there were also deficiencies in some important areas.
  • Appropriate funding recommendations were provided for three of the five completed rounds.
  • Funding decisions were not appropriately informed by departmental advice, and the basis for the funding decisions has not been appropriately documented.
  • The award of funding was partly consistent with the guidelines.

What did we recommend?

  • The Auditor-General made one recommendation to Infrastructure and four aimed at improving the grants administration framework.
  • The Department of Finance noted all four of the recommendations directed at improving the framework and Infrastructure agreed to its one recommendation.

$1.15bn

in grant funding awarded across the five completed rounds.

94%

of approved projects were located in regional areas.

65%

of infrastructure projects stream applications approved for funding were not those assessed as being the most meritorious in the departmental assessment process.

Summary and recommendations

Background

1. The Building Better Regions Fund (BBRF) is a $1.38 billion open and competitive grant program administered by the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (Infrastructure) under the Commonwealth Grants Rules and Guidelines (CGRGs). Commencing in November 2016, it was one of the first grant programs to be delivered in partnership between Infrastructure and the Department of Industry, Science and Resources (DISR).

2. The objectives of the BBRF are to drive economic growth and build stronger regional communities into the future. A total of $1.15 billion has been awarded to 1293 projects across the first five funding rounds. A sixth round was underway at the time of the audit, with successful applications expected to be announced between June and August 2022. The ANAO has not audited the assessment of applications and award of grant funding under the sixth round.

3. Separate guidelines were prepared in each round for the ‘community investments’ and ‘infrastructure projects’ funding streams. Grants awarded under the infrastructure projects stream were significantly larger (an average of $1.57 million in funding approved per project, compared with an average of $45,782 under the community investments stream), which was reflected in 98 per cent of total BBRF funding being awarded under the infrastructure projects stream.

Rationale for undertaking the audit

4. The BBRF is the largest open, competitive and merits-based grants program administered by Infrastructure. This audit was undertaken concurrent with completion of the fifth round and design of the sixth round, providing opportunities to examine any improvements made over time. An audit of the BBRF also allowed the ANAO to provide assurance to the Parliament as to whether Infrastructure has implemented relevant agreed recommendations from the last audit of a regional grant funding program (tabled in November 2019), embedded grants administration improvements previously observed by the ANAO, and addressed matters previously raised by the Joint Committee of Public Accounts and Audit (JCPAA) in reviews of earlier audits of regional grants programs. The JCPAA identified the potential audit topic as a priority of the Parliament for 2021–22.

Audit objective and criteria

5. The objective of the audit was to assess whether the award of funding under the BBRF was effective as well as being consistent with the CGRGs.

6. To form a conclusion against the objective, the following high-level criteria were adopted.

  • Was the program well designed?
  • Were appropriate funding recommendations provided?
  • Were funding decisions informed by the advice provided and appropriately documented?
  • Was the distribution of funding consistent with program objectives and grant opportunity guidelines?

Conclusion

7. The award of funding under the first five rounds of the Building Better Regions Fund was partly effective and partly consistent with the Commonwealth Grants Rules and Guidelines.

8. While the BBRF was well designed in a number of respects, there were also deficiencies in some important areas. Positive aspects included the guidelines clearly setting out: that an open competitive application process was being employed; relevant and appropriate eligibility requirements; and the process through which the Business Grants Hub would assess the merits of applications against the four published criteria. Key shortcomings in the design of the program were that the published program guidelines across all six rounds:

  • did not transparently set out the membership of the panel that was to make decisions about which applications would receive grant funding; and
  • stated that the decision-making panel may use at its discretion the consideration of a non-exhaustive list of ‘other factors’ to override the results of the merit assessment process, with applicants not asked to specifically address those other factors in their applications for grant funding.

9. Infrastructure provided appropriate funding recommendations based on merit assessment results for three of the five rounds that have been completed. This was not the case for the third and fifth funding rounds. In those two rounds, rather than clearly identifying which applications should be successful up to the limit of the available funding, Infrastructure recommended the panel select from a ‘pool’ of projects. The total value of those projects was significantly in excess of the funding available to be awarded (more than double the funding available in the third round, and more than triple the funding available in the fifth round). The significance of the changes made to Infrastructure’s approach was that, for the fifth round, there would be no circumstance in which any of the 528 applications seeking $761.4 million in grant funding that had been assessed as representing value with relevant money would need to be reported to the Finance Minister (in accordance with the CGRGs) as an approval of an application that the department had recommended be rejected.

10. The decisions about the award of grant funding across each of the five funding rounds were not appropriately informed by departmental advice, particularly with respect to the third and fifth rounds. The basis for the funding decisions has not been appropriately documented, particularly in the three most recently completed rounds. It was common in the infrastructure projects stream (the larger of the two streams) for the approved applications to not be those assessed as best meeting the published criteria and for applications assessed as highly meritorious against the published criteria to not be awarded funding. There were inadequate records of the inputs to the decision-making process and the basis for decisions to approve projects assessed as less meritorious against the four published merit criteria and to reject applications assessed as having higher merit.

11. Overall, the award of funding was partly consistent with the grant opportunity guidelines. Consistent with the program design, the funding has been directed to projects in rural and regional areas. As the program has progressed through the first five rounds, there has been an increasing disconnect between the assessment results against the published merit criteria and the applications approved for funding under the infrastructure projects stream (which comprised the majority of approved projects and funding). This reflects the extent to which the ministerial panel has increasingly relied upon the ‘other factors’ outlined in the published program guidelines when making funding decisions.

Supporting findings

Program design

12. Grant opportunity guidelines were developed, approved and published for each of the six rounds of the BBRF that have been conducted (see paragraphs 2.2 to 2.5).

13. The guidelines for each round clearly outlined the way in which funding candidates would be identified, including the application process. An open competitive approach to selecting the most meritorious applications from those applications assessed as eligible was to be adopted. The guidelines for each round underscored the importance of applications’ performance against the assessment criteria as this was to be the basis for the award of funding (see paragraphs 2.6 to 2.8).

14. Relevant and appropriate eligibility requirements were set out in the grant opportunity guidelines and have remained relatively consistent for each round. Key changes included: adding eligible project locations in outer metropolitan areas; strengthening the definition of an eligible applicant; and decreasing the volume of mandatory supporting documentation required for some projects. Additional eligibility requirements were used in round four to facilitate the targeting and funding of projects in drought-affected communities (see paragraphs 2.9 to 2.14).

15. Four relevant and appropriate merit criteria were identified in the guidelines for each round and stream. In a section separate to the merit criteria, the guidelines for each round also included a non-exhaustive list of ‘other factors’ the ministerial panel could consider when deciding which applications to fund. The inclusion of the other factors was not consistent with the key principles for grants administration set out in the CGRGs as it had the potential to undermine the award of grant funding on the basis of merit (see paragraphs 2.15 to 2.30).

16. Up to and including round four, the grant opportunity guidelines for each round set out how applications would be scored against the merit criteria and stated that the results would provide the basis for determining the most meritorious applications for funding under the program. The guidelines have not accurately reflected the significance of ministerial discretion (represented in the guidelines by the ‘other factors’ clause) as a basis for the panel’s funding decisions (see paragraphs 2.33 to 2.36).

17. While the decision-making role performed by the ministerial panel was clearly identified in the published guidelines for each round, the membership of the panel was not. In addition, the division of roles and responsibilities between the two administering departments has been refined over time and changes reflected in various services schedules to a December 2016 Memorandum of Understanding (see paragraphs 2.37 to 2.41).

18. Infrastructure has practices in place to use lessons learned from previous rounds to inform the design and delivery of subsequent BBRF funding rounds. Notwithstanding a recommendation it agreed to in an earlier audit of the award of regional grant funding, Infrastructure has not taken appropriate steps to analyse the reasons for the panel diverging from the results of the documented merit assessments so as to inform the development of the guidelines for later rounds and/or make adjustments to the conduct of the assessment process (for example, to undertake a departmental assessment of all factors the guidelines state can be taken into account when funding decisions are taken). (See paragraphs 2.42 to 2.49).

Funding recommendations

19. In each round, Infrastructure provided decision-makers with funding recommendations in written briefings and submissions. The guidelines outlined that Cabinet would be consulted on the funding decisions, and written submissions to Cabinet were prepared for each round (see paragraphs 3.3 to 3.7).

20. While the approach differed across rounds, the results of the assessment process were used to place competing, eligible applications that had met the published criteria to a minimum standard into orders of merit that:

  • for rounds one and two, individually ranked almost all applications by overall merit assessment score from highest scoring to lowest; and
  • from round three onwards, ranked applications in groups, or ranking bands, of between one and 61 applications by overall merit assessment score from highest scoring to lowest (see paragraphs 3.8 to 3.14).

21. For the third, fourth and fifth rounds, applicant requests to be exempted from the requirement to include partnership funding were assessed, and clear recommendations were provided to the ministerial panel as to whether those requests should be granted. No such assessment was undertaken in the first two rounds. In those rounds, Infrastructure provided the panel with details of the applications seeking exemptions and recommended that the chair record which requests the panel had approved, without recommending which should be granted. Across all five rounds, every exemption request was approved by the respective ministerial panel, irrespective of any departmental recommendations (see paragraphs 3.15 to 3.23).

22. For the first, second and fourth rounds, Infrastructure clearly recommended that the ministerial panel approve specific applications up to the limit of the available funding. Consistent with the published guidelines, the department recommended the panel approve the 591 highest ranked applications across those three rounds up to the aggregate value of $661 million. In a departure from the competitive and merits-based process outlined in the program guidelines, Infrastructure’s recommendations for the third and fifth rounds were presented as ‘pools’ of candidates for the panel to choose from and, while sorted in descending order based on their total assessment scores, the department did not recommend that the highest scored and ranked applications be chosen as successful. Specifically, in:

  • round three, there were 306 ‘recommended’ projects seeking grants of $426 million put forward by Infrastructure for the panel to choose from, notwithstanding that there was only $205 million available; and
  • round five, there were 528 ‘recommended’ projects seeking grants of $761 million put forward by Infrastructure for the panel to choose from, compared with the $200 million the guidelines had said was available and the $300 million subsequently made available (see paragraphs 3.24 to 3.34).

Funding decisions

23. All applications approved under the five concluded funding rounds were submitted through the open processes outlined in the program guidelines. With one exception, only those applications assessed as eligible were then assessed against the merit criteria and awarded grant funding (see paragraphs 4.2 to 4.3).

24. Approval processes have varied over time and have not been conducted fully in accordance with the published grant opportunity guidelines. Consistent with those guidelines, funding approvals for each round were provided by the chair of the panel after consultation with Cabinet. The inclusion of the ‘other factors’ in the guidelines was inconsistent with the intent of the CGRGs and eroded the role of the published merit assessment process by giving decision-makers broad discretion with little or no transparency. In addition, the guidelines did not provide for the panel considering applications against the other factors before it received Infrastructure’s recommendations or, as occurred in the third round, in parallel with DISR’s merit assessment process (see paragraphs 4.4 to 4.21).

25. Appropriate records were not made of the decision-making process. Record-keeping practices, which fell short of good practice in the first round, deteriorated significantly in later rounds. Infrastructure performed a secretariat function for the panel in the first round and, although secretariat services could readily be provided by DISR’s grants hub, the hub has not been engaged to provide administrative support for the panel (see paragraphs 4.22 to 4.34).

26. Across the five concluded rounds, in which 1293 projects were approved, there were 179 instances where the basis for the panel’s funding decisions was not recorded. This comprised:

  • 164 instances where the panel decided not to approve applications that had been recommended by Infrastructure for funding (which had been assessed as having a high degree of merit against the four published criteria), without recording the reasons for not approving those applications. This contrasted with the approach taken for 40 other recommended applications where the panel’s reasons for not approving were recorded; and
  • 15 instances (10 missing from the 76 that should have been recorded in the second round, and five from the 173 in the fifth round) where projects not recommended by the department were funded.

27. In addition, and particularly in more recent rounds, while there has been some record made of why applications assessed as less meritorious were approved for funding, the basis recorded did not adequately explain why applications assessed as more meritorious in terms of the published criteria have been overlooked in favour of those other applications with lower merit scores (see paragraphs 4.35 to 4.41).

28. Processes were not in place for decision-makers to re-score applications where they disagreed with the departmental merit assessments. In a departure from previously observed better practice, Infrastructure advised the minister in the later stages of the first round that it was unnecessary to rescore applications because it was not required by the CGRGs. Accordingly, where the panel indicated disagreement with departmental scoring in the first two rounds, it did not revise scores. By the third round, the panel’s written basis for funding decisions made no reference to the merit assessment results, instead focusing on the ‘other factors’ mentioned in the guidelines for the program (see paragraphs 4.42 to 4.48).

Award of funding

29. Projects approved for funding under the community investments stream (grant funding totalling $26.3 million) have consistently been those assessed as being the most meritorious against the published assessment criteria. For projects approved under the infrastructure projects (IP) stream (involving grant funding of $1.1 billion, or 98 per cent of total program funding awarded), 65 per cent of IP stream applications approved for funding were not those assessed as being the most meritorious in the assessment process. Under the first round, 75 per cent of IP applications funded had been scored highly (and recommended by Infrastructure for funding). For subsequent rounds, highly scored IP applications were approved at a lower rate of between 13 and 55 per cent (see paragraphs 5.3 to 5.11).

30. Across the five rounds, the approach for reporting the panel’s approval of applications that were not recommended for funding falls short of the intent of the CGRGs. This has been the result of Infrastructure changing its approach to making funding recommendations. Certain types of decisions under some rounds have been reported to the Finance Minister, with similar decisions in other rounds not reported. In addition, Infrastructure’s approach to categorising applications into ‘pools’ in conjunction with alternatively worded funding recommendations has resulted in categories of applications where there is no clear recommendation and, as a result and based on Infrastructure’s interpretation of the CGRGs, no requirement for applications approved in those pools to be reported to the Finance Minister (see paragraphs 5.12 to 5.19).

31. Consistent with the objectives for the BBRF, almost $1.08 billion (94 per cent) of the funding awarded under the first five rounds has been directed towards projects in rural (74 per cent or $850 million) and provincial (20 per cent or $227 million) regions. Compared with the results of the merit assessment process, the award of funding favoured electorates held by The Nationals as those electorates were awarded $104 million (or 29 per cent) more funding in total across the five rounds than would have been the case had the results of the merit assessment process been relied upon. Electorates held by all other parties were awarded less grant funding than would have been the case had the merit assessment results been relied upon (see paragraphs 5.22 to 5.26).

32. While there were delays between final approvals and announcements under the first two rounds, public announcements of the funding outcomes have been timely for recent rounds. There have been some delays with the completion of projects, reflected in seven projects from round one and 33 from round two still being underway (see paragraphs 5.27 to 5.30).

Recommendations

Recommendation no. 1

Paragraph 2.31

The Australian Government amend the Commonwealth Grants Rules and Guidelines to require that, in circumstances where funding decisions may be made by reference to factors that are in addition to, or instead of, the published assessment criteria:

  • applicants be afforded the opportunity to address those other factors as part of their application for funding; and
  • records be made as part of the decision-making process as to how each competing applicant had been assessed to perform against each of those factors.

Department of Finance response: Noted.

Recommendation no. 2

Paragraph 3.35

The Australian Government amend the Commonwealth Grants Rules and Guidelines to strengthen the written advice prepared for approvers on the merits of a proposed grant or group of grants by requiring that advice to include a clear and unambiguous funding recommendation that:

  • identifies the recommended applications that have been assessed as eligible and the most meritorious against the published assessment criteria; and
  • does not recommend applications for an aggregate value of grant funding that exceeds the total amount available for the particular grant opportunity.

Department of Finance response: Noted.

Recommendation no. 3

Paragraph 4.17

The Australian Government amend the Commonwealth Grants Rules and Guidelines to apply the principles for grants administration to situations where stakeholders, such as parliamentarians, play a role in the assessment and award of grant funding.

Department of Finance response: Noted.

Recommendation no. 4

Paragraph 4.49

The Department of Infrastructure, Transport, Regional Development, Communications and the Arts:

  • put appropriate processes in place to ensure that, where more than one minister (such as a ministerial panel) performs the role of grants decision-maker, its written advice on the merits of proposed grants is provided to all panel members prior to funding decisions being taken; and
  • improve record-keeping practices so that the basis for decisions is clear, including in circumstances where the decision-maker has not agreed with the assessment of candidates undertaken by officials.

Department of Infrastructure, Transport, Regional Development, Communications and the Arts response: Agreed.

Recommendation no. 5

Paragraph 5.20

The Australian Government amend the Commonwealth Grants Rules and Guidelines to require that:

  • when advising on the award of grant funding, officials recommend that the decision-maker reject all applications not supported for the award of a grant within the available funding envelope; and
  • the basis for any decisions to not approve applications that were recommended for funding be recorded.

Department of Finance response: Noted.

Summary of entity response

33. The proposed audit report was circulated to the three relevant departments and the chair of each ministerial panel from the five concluded funding rounds (as stakeholders with a special interest in the report under subsection 17(5) of the Auditor-General Act 1997). Letters of response are included at Appendix 1 and were provided by Infrastructure, DISR, the Department of Finance, the Hon Michael McCormack MP and the Hon Fiona Nash. Summary responses, where provided, are included below.

Department of Infrastructure, Transport, Regional Development, Communications and the Arts

The Department of Infrastructure, Transport, Regional Development and Communications welcomes the report, agrees with the recommendation directed to us, and notes the remaining four recommendations propose modifications to the Commonwealth Grants Rules and Guidelines (CGRGs).

In discharging our grant administration functions across a range of diverse programs, the Department has consistently complied with both the requirements and the spirit of the CGRGs, whilst operating within the context of the design of each program and the Government’s policy objectives.

We note the ANAO’s commentary about the purported use of a ‘pool’ in rounds three and five of the program and acknowledge we incorrectly used the term ‘pool’ in our briefing – it was not an accurate characterisation of the advice we provided to Ministers, in which every eligible application in each round was merit assessed, scored and ranked.

We also note the ANAO’s broader remarks to all entities on the importance of instituting a governance culture that enlivens the principles-based framework of grants administration. We strongly support this sentiment – seeking to live it in our published Values and through our reforms such as development of a pro-integrity framework, improved record keeping, establishment of a regional program governance office, and strengthened probity arrangements.

ANAO comments on Infrastructure’s summary response

34. The ANAO concluded that the award of funding under the first five rounds of the Building Better Regions Fund was partly effective and partly consistent with the CGRGs. The four recommendations to the Australian Government to improve aspects of the CGRGs reflect findings that a stronger grants administration framework would guard against the practices employed in the award of BBRF funding that were not consistent with the underlying intent of the framework.

Department of Industry, Science and Resources

The Department of Industry, Science, Energy and Resources acknowledges the Australian National Audit Office’s report on Award of Funding under the Building Better Regions Fund.

The department notes the ANAO’s conclusion that the award of funding was partly effective and partly consistent with the Commonwealth Grants Rules and Guidelines (CGRGs). The department supports the ANAO’s recommendations to Finance to amend the Commonwealth Grants Rules and Guidelines to strengthen the framework for Commonwealth grants administration.

The department welcomes the ANAO’s finding that the program was well designed in a number of respects.

The department thanks the ANAO for its report, and commits to working with policy partners that engage the Business Grants Hub to ensure adequate transparency in program guidelines when co-designing future programs to ensure alignment with the underlying intent of the CGRGs framework.

Department of Finance

The Department of Finance notes recommendations 1, 2, 3 and 5, which are directed to the Australian Government to amend the Commonwealth Grants Rules and Guidelines.

Any amendment to the Commonwealth Grants Rules and Guidelines is a matter for consideration by Government. The Department of Finance will brief the Government on the ANAO’s findings and recommendations.

The Hon Fiona Nash

I support the ANAO’s recommendations.

The Building Better Regions Fund’s purpose is to ensure funding for projects designed to strengthen regional communities in rural, regional and remote areas of Australia.

While the departmental processes for assessing and scoring applications are in my view sound, it must be recognised that the departmental decision-makers in that process are located in the cities. They do not have the benefit of an on-the-ground understanding of the regional communities, and their circumstances, where projects are proposed to be located, and the potential impact and benefit of those projects.

One of the intentions of the Ministerial Panel was to bring local community knowledge to the decision-making process to strengthen the robustness of funding decisions.

My involvement in the program was through Round 1. I would note that I believe my decisions were appropriately informed by departmental advice.

Regarding Round 1, the ANAO stated:

  • In Rounds 1 and 2 “there was a focus on identifying where applicants’ claims against the merit criteria had been overstated or understated”
  • “Under the first round, 75% of IP projects [funded] had been scored highly …”
  • “Round 1 had the greatest alignment between the merit assessment results and funding approval.” (IP stream).

Key messages from this audit for all Australian Government entities

Below is a summary of key messages, including instances of good practice, which have been identified in this audit and may be relevant for the operations of other Australian Government entities.

Governance and risk management

  • Good governance involves entity leaders developing a culture requiring and supporting actions which are not only in compliance with rule frameworks but also with the intent of those frameworks. For principles-based frameworks such as the Commonwealth Procurement Rules and the Commonwealth Grants Rules and Guidelines, this means that accountable authorities and officials must put in place practices and procedures that are consistent with the overarching intent of the principles, not just focus on adhering to the small number of specific mandatory requirements.

 

Grants

  • Potential applicants and other stakeholders may reasonably expect that program funding decisions will be made in a manner and on a basis consistent with the published program guidelines. While different conclusions as to which applications are most worthy of funding may legitimately be reached, it is important that the criteria being applied in reaching those conclusions be transparent to applicants, that applicants are afforded the opportunity to make their case against each of the criteria and that, for accountability purposes, the performance of competing applications against the criteria be appropriately recorded.
  • To promote adherence to the requirements of the Commonwealth Grants Rules and Guidelines, all criteria used to assess the merits of applications should be clearly linked to the program objectives and contained within the assessment criteria section of the program guidelines.
  • Decision-making in competitive grants programs is best supported by unambiguous departmental advice that prioritises applications on the basis of their assessed merit against each of the published criteria, clearly identifying which applications are recommended for approval and which for rejection.
  • Officials responsible for grants administration activities should be supported by robust internal grant management frameworks which clearly outline granting procedures, risk management processes and roles and responsibilities.

1. Background

Introduction

1.1 The objectives of the Building Better Regions Fund (BBRF) are to drive economic growth and build stronger regional communities into the future. A total of $1.38 billion has been allocated to the program out to 2024–25.

1.2 As illustrated by Figure 1.1, five funding rounds have been completed. Across the five rounds, more than 4300 applications were received seeking $6.26 billion in grant funding. A total of $1.15 billion in grant funding has been awarded to 1293 projects. Rounds three and five were the most over-subscribed, with the funding available for award in round five increased by $100 million in October 2021 from the $200 million announced in the grant opportunity guidelines. A sixth funding round commenced during the latter part of the audit, with the successful applications expected to be announced between June and August 2022.7

1.3 Two streams of funding have been available in each round — infrastructure projects and community investments. The infrastructure projects stream funds new infrastructure or the upgrade or extension of existing infrastructure. The community investments stream funds new or expanded local events, strategic regional plans, or leadership and capability strengthening activities.

1.4 Of the 1293 successful applications across the five rounds, 56 per cent were within the infrastructure projects stream and 44 per cent were in the community investments stream. Grants awarded under the infrastructure projects stream were significantly larger on average ($1.57 million of funding approved, compared with $45,782 under the community investments stream), which was reflected in the infrastructure projects stream accounting for 98 per cent of the total BBRF funding awarded.

Program administration

1.5 The Commonwealth Grants Rules and Guidelines (CGRGs) are issued by the Finance Minister under section 105C of the Public Governance, Performance and Accountability Act 2013 (PGPA Act). Introduced in July 2009, the CGRGs state that the objective of grants administration is to ‘promote proper use and management of public resources through collaboration with government and non-government stakeholders to achieve government policy outcomes’.8 The CGRGs provide a largely principles-based framework, with some specific requirements.9 They are set out in two interrelated parts, with Part 1 containing mandatory requirements (including that practices and procedures must be put in place to ensure adherence to the seven key grants administration principles10), and Part 2 further explaining how entities should apply the key principles.

1.6 Principles-based grants administration frameworks continue to be adopted by governments in preference to prescriptive approaches because they provide the flexibility required to be fit-for-purpose across a range of granting activities and circumstances.11 For these frameworks to operate effectively, the key principles must be prioritised and applied with appropriate judgement to each grant opportunity in a way that meets both the intent of the framework and its requirements.

1.7 The Department of Infrastructure, Transport, Regional Development, Communications and the Arts (Infrastructure) is the Australian Government entity with policy responsibility for the BBRF.12 The Department of Industry, Science and Resources (DISR)13 through its Business Grants Hub has been engaged by Infrastructure to administer some aspects of the program (discussed at paragraph 2.38).

1.8 A ministerial panel (the panel, see Appendix 3 for the membership of the panel) makes the key decisions on the award of BBRF funding.14 The panel:

  • decides whether to agree to applicant requests for an exemption to the requirement that they make a cash contribution towards the cost of the project; and
  • in consultation with Cabinet, decides which applications are funded.

1.9 Infrastructure has provided written briefings to inform the panel in its decision-making at both key points, with secretariat support for the panel and its members provided by the office of the chair of the ministerial panel.15 For round five, this support was to include documenting any conflicts of interest declared by panel members in respect to applications from within their electorates and whether they had recused themselves from discussing those projects.16

Figure 1.1: Applications received, assessed and awarded funding

Note a: In round five, there were three applications (seeking total funding of $595,515) among those that had been identified for funding by the ministerial panel that were later withdrawn by the respective applicants between 23 and 30 September 2021. On 6 October 2021, five other applications (seeking a total of $589,951) that had been assessed as being value with relevant money were approved instead.

Source: ANAO analysis of departmental records.

Rationale for undertaking the audit

1.10 Commencing in 2016, the BBRF is a longstanding grants program with a value that has grown over time to $1.38 billion. It is the largest open, competitive and merits-based program subject to the CGRGs administered by Infrastructure. This audit was undertaken concurrently with the completion of the fifth round and the commencement of the design of the sixth round, providing opportunities to examine any improvements that have been made over time. An audit of the BBRF also allowed the ANAO to provide assurance to the Parliament as to whether Infrastructure had:

  • implemented relevant agreed recommendations from the last audit of a regional grant funding program, tabled in November 201917; and
  • embedded the grants administration improvements observed by the ANAO in audits of regional grants programs between August 2013 and December 2016.18 These improvements had addressed matters raised previously by the Joint Committee of Public Accounts and Audit (JCPAA) in its reviews of earlier ANAO audits of regional grants programs (Appendix 4 provides further detail).

1.11 The JCPAA identified the potential audit topic as a priority of the Parliament for 2021–22.

Audit approach

Audit objective, criteria and scope

1.12 The objective of the audit was to assess whether the award of funding under the BBRF was effective as well as being consistent with the CGRGs.

1.13 To form a conclusion against the objective, the following high-level criteria were adopted.

  • Was the program well designed?
  • Were appropriate funding recommendations provided?
  • Were funding decisions informed by the advice provided and appropriately documented?
  • Was the distribution of funding consistent with program objectives and grant opportunity guidelines?

1.14 The audit examined the award of funding under each of the five concluded funding rounds and the implementation of the sixth round. The audit scope did not include the conduct of the eligibility and merit assessment work undertaken by DISR (the use of the results of DISR’s assessment work by Infrastructure in developing funding recommendations was examined), or the negotiation and management of grant agreements by DISR.

Audit methodology

1.15 The audit methodology included examination and analysis of Infrastructure and DISR records and engagement with relevant Infrastructure and DISR staff.19

1.16 Access to relevant email records was delayed by 12 weeks due to issues with Infrastructure’s IT system. It was agreed that Infrastructure’s process for identifying and gathering email records in response to requests under the Freedom of Information Act 1982 (FOI Act) provided an appropriate mechanism for retrieving the records requested by the ANAO. In addition to enabling Infrastructure officials to work remotely and in line with COVID-19 related public health directions, this approach was also expected to provide more timely and accurate results than those expected from Infrastructure’s IT system back-ups.

1.17 Initial records provided to the ANAO represented around 75 per cent of the expected volume, indicating that there were underlying issues with Infrastructure’s FOI processes. Infrastructure undertook work to rectify this during the course of the audit so as to provide the ANAO with a complete set of the requested records. Infrastructure advised the ANAO in April 2022 that this work had resulted in improvements to the way its records are identified for FOI purposes.

1.18 The audit was conducted in accordance with ANAO Auditing Standards at a cost to the ANAO of approximately $614,000.

1.19 The team members for this audit were Amy Willmott, Swatilekha Ahmed, Jessica Carroll, Tessa Osborne and Brian Boyd.

2. Program design

Areas examined

The ANAO examined whether the Building Better Regions Fund (BBRF) was well designed, with a focus on whether appropriate grant opportunity guidelines were in place for each round.

Conclusion

While the BBRF was well designed in a number of respects, there were also deficiencies in some important areas. Positive aspects included the guidelines clearly setting out: that an open competitive application process was being employed; relevant and appropriate eligibility requirements; and the process through which the Business Grants Hub would assess the merits of applications against the four published criteria. Key shortcomings in the design of the program were that the published program guidelines across all six rounds:

  • did not transparently set out the membership of the panel that was to make decisions about which applications would receive grant funding; and
  • stated that the decision-making panel could, at its discretion, consider a non-exhaustive list of ‘other factors’ to override the results of the merit assessment process, with applicants not asked to specifically address those other factors in their applications for grant funding.
Areas for improvement

The ANAO made one recommendation aimed at improving the way the grants administration framework promotes alignment between the program objectives, the conduct of merit assessments and decision-making.

2.1 The Commonwealth Grants Rules and Guidelines (CGRGs) require that practices and procedures be in place to ensure that the conduct of grants administration is consistent with the seven key principles for grants administration. To promote open, transparent and equitable access to available funding, the CGRGs require that grant opportunity guidelines be developed and made publicly available where grant applications are to be sought. The ANAO examined whether the design of the program included the development of grant opportunity guidelines for each selection process, and whether those guidelines were consistent with the CGRGs.

Were guidelines for each round developed, approved and published?

Grant opportunity guidelines were developed, approved and published for each of the six rounds of the BBRF that have been conducted.

2.2 For each of the six rounds, grant opportunity guidelines were developed by the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (Infrastructure) and the Department of Industry, Science and Resources (DISR). Separate guidelines were prepared in each round for the ‘community investments’ (CI) and ‘infrastructure projects’ (IP) funding streams.

2.3 Under the CGRGs, entities must conduct a risk assessment of the grants and associated guidelines. Unless a grant is a one-off or ad hoc grant, entities must provide the risk assessment to, and agree the risk rating with, the Department of Finance20 (Finance) and the Department of the Prime Minister and Cabinet (PM&C). The agreed rating then informs the handling and public release process for the guidelines. Finance’s ‘streamlined approval process for grant guidelines’ allows guidelines to be approved for publishing by the responsible portfolio minister, without the Finance Minister’s approval, in the following circumstances:

  • for grant opportunities where amendments are to ‘existing guidelines’ and constitute ‘minor administrative changes’; and
  • where the agreed grant opportunity ‘risk rating is low and the [whole-of-government guidelines] templates have been populated as instructed’.

2.4 Guidelines were approved by the chair of the ministerial panel prior to their publication, and for the first three rounds, were also approved by either the Finance Minister or the Prime Minister. All guidelines were made publicly available on business.gov.au on the same day that applications opened, with the exception of the first and fifth rounds where the guidelines were made available earlier. Table 2.1 provides an overview of the approval and release of the guidelines for each round.

Table 2.1: Release of grant opportunity guidelines across funding rounds

| Round | Guidelines approvedᵃ | Approver | Guidelines published | Applications open | Applications closed (IP stream / CI stream) |
|---|---|---|---|---|---|
| 1 | 22 Nov 2016 | Finance Minister | 23 Nov 2016 | 18 Jan 2017 | 28 Feb 2017 / 31 Mar 2017 |
| 2 | 6 Nov 2017 | Prime Minister | 7 Nov 2017 | 7 Nov 2017 | 19 Dec 2017 (both streams) |
| 3 | 20 Sep 2018 | Finance Minister | 27 Sep 2018 | 27 Sep 2018 | 15 Nov 2018 (both streams) |
| 4 | 12 Nov 2019 | Chair of the panel | 14 Nov 2019 | 14 Nov 2019 | 19 Dec 2019 (both streams) |
| 5 | 9 Dec 2020 | Chair of the panel | 16 Dec 2020ᵇ | 12 Jan 2021 | 12 Mar 2021ᵇ (both streams) |
| 6 | 2 Dec 2021 | Chair of the panel | 13 Dec 2021 | 13 Dec 2021 | 10 Feb 2022 (both streams) |

Note a: From round four, approval of the guidelines has been provided by the chair of the ministerial panel in accordance with agreement between Infrastructure and Finance that the changes in the program guidelines were ‘minor administrative changes’.

Note b: The round five guidelines were amended on 4 March 2021 to reflect the extended application deadline for both funding streams from 4 to 12 March 2021.

Source: ANAO analysis of departmental records.

2.5 With some variations and departures (see paragraph 2.22), the BBRF guidelines were based largely on Finance’s whole-of-government guidelines templates and so reflected the minimum content requirements set out in the CGRGs. Revisions were made to the guidelines prior to each new round based on lessons learned from previous rounds.

Did the guidelines clearly outline the way in which funding candidates would be identified, including any application process?

The guidelines for each round clearly outlined the way in which funding candidates would be identified, including the application process. An open competitive approach to selecting the most meritorious applications from those applications assessed as eligible was to be adopted. The guidelines for each round underscored the importance of applications’ performance against the assessment criteria as this was to be the basis for the award of funding.

2.6 The CGRGs outline that competitive, merits-based selection processes can achieve better outcomes and value with relevant money and should be used unless specifically agreed otherwise by a minister, accountable authority or delegate.21 Between 31 December 2017 and 30 June 2021, 20 per cent of grants representing 20 per cent of grant funding were reported by entities subject to the CGRGs as having been the result of an open competitive selection process.22

2.7 The grant opportunity guidelines for each round of the BBRF outlined that an open call for applications was being undertaken. A significant number of applications were received in each round (see Figure 1.1), with the $6.26 billion applied for being more than four times the $1.38 billion available.

2.8 The guidelines further outlined that applications would be assessed against specified eligibility criteria with eligible applications to then proceed to be assessed against specified merit criteria. Consistent with the CGRGs, the guidelines for the six rounds each emphasised the importance of applications’ performance against the merit criteria. After merit assessments were completed, advice was to be provided to the decision-maker on the merit of the competing applications. The ministerial panel, as the decision-maker, had final approval on which projects would be chosen for funding.

Were relevant and appropriate eligibility requirements established?

Relevant and appropriate eligibility requirements were set out in the grant opportunity guidelines and have remained relatively consistent for each round. Key changes included: expanding eligible project locations to include outer metropolitan areas; strengthening the definition of an eligible applicant; and decreasing the volume of mandatory supporting documentation required for some projects. Additional eligibility requirements were used in round four to facilitate the targeting and funding of projects in drought-affected communities.

2.9 The CGRGs explain that eligibility criteria represent mandatory requirements which must be met to qualify for a grant. Eligibility criteria should be straightforward and effectively communicated in the grant opportunity guidelines to enable the selection of grantees in a consistent, transparent and accountable manner.

2.10 The guidelines for each round of the BBRF included two sections setting out the mandatory requirements that needed to be satisfied to qualify to proceed to the merit assessment stage of the grant awarding process.

2.11 The first section was titled ‘eligibility criteria’, with the guidelines for each round advising that applications could not be considered for funding if they did not satisfy all eligibility criteria. This section addressed who was eligible to apply and who was not eligible to apply.

2.12 The second section was titled ‘what the grant money can be used for’. It addressed eligible activities, eligible locations and eligible expenditure, with further detail on the latter appended to the guidelines.

2.13 While there was a high degree of consistency in the eligibility requirements for each stream across the five rounds, there were some key changes over time to entity type requirements; eligible project locations; and the supporting information applicants were required to include with their application.

  • To be eligible, applicants had to have an Australian Business Number (ABN) and be either a local governing body or a not-for-profit organisation. Key changes were that:
    • from round two, the requirement for a not-for-profit organisation to be established for at least two years was removed; and
    • from round three, not-for-profit entities were limited to those with incorporated status.
  • As a regional grants program, projects were required to be located in an eligible regional area (as defined by the ‘remoteness classification’ of projects). From round two, this requirement was modified to make projects eligible for funding if located in:
    • outer metropolitan regions, or ‘peri-urban areas’, by redefining the geographic boundaries around excluded capital cities23; and
    • an otherwise excluded area if the project seeking funding could clearly demonstrate that significant benefits and employment outcomes would flow to an eligible area.
  • The grant opportunity guidelines for round four stated that funding under that round would be targeted at drought-affected communities. To give effect to this, additional eligibility requirements were included such that applications would be assessed as ineligible if the project was not located in a drought-affected location, or if this claim was not satisfactorily evidenced.24
  • From round two, applicants to the IP stream were no longer required to include a business case, project management plan, or risk management plan to be eligible.
  • The provision of a cost benefit analysis under the IP stream was mandatory in round one for applicants seeking over $1 million and optional for other applicants. This became mandatory for all applicants in round two but returned to being mandatory for only those applicants seeking more than $1 million from round three onwards.

2.14 While November 2021 advice from Infrastructure to the ANAO25 did not identify any issues with the construction of the eligibility criteria, Infrastructure outlined that there had been some shortcomings in the assessments against the eligibility criteria: one ineligible application was awarded funding in the first round; another ineligible application was not removed from consideration in the second round (although it was not awarded grant funding); and a ‘potentially ineligible’ entity received grant funding in the third round.26

Were relevant and appropriate assessment criteria established?

Four relevant and appropriate merit criteria were identified in the guidelines for each round and stream. In a section separate to the merit criteria, the guidelines for each round also included a non-exhaustive list of ‘other factors’ the ministerial panel could consider when deciding which applications to fund. The inclusion of the other factors was not consistent with the key principles for grants administration set out in the CGRGs as it had the potential to undermine the award of grant funding on the basis of merit.

2.15 Under the CGRGs framework, assessment (or merit) criteria play an important role in relation to the key principle of achieving value for money from granting activities. Assessment criteria are ‘used to assess the merits of proposals and, in the case of a competitive grant opportunity, to determine application rankings.’27

2.16 Administering entities are advised by the Department of Finance28 that all assessment criteria for a competitive grant opportunity should be contained within the assessment criteria section of the program guidelines.29 Other details that should be contained there include:

  • any ‘prioritisation criteria’ to be applied to applications in the event of oversubscription, such as location, policy priority or target groups; and
  • whether the selection process occurs in two stages.

2.17 As reflected in Infrastructure’s responses to questions during Senate Estimates in February 202230, the assessment criteria for the BBRF comprised two independent sets of criteria that are applied to applications at separate points in the process prior to final funding decisions. Specifically, these are the:

  • four merit criteria (see paragraph 2.18), against which eligible applications are assessed by DISR. The results from these assessments provide the basis for Infrastructure’s funding recommendations to the ministerial panel (see paragraph 3.24); and
  • ‘other factors’31, that the ministerial panel may apply to applications when deciding which projects to fund (see paragraph 2.22).

Published assessment criteria

2.18 In a section titled ‘assessment criteria’, the grant opportunity guidelines for each round and stream of the BBRF included the following four merit criteria that applications would be assessed against:

  • economic benefits the project would deliver to the region;
  • social benefits the project would deliver to the region;
  • capacity, capability and resources of the applicant to deliver the project; and
  • the impact of grant funding on the project (referred to as ‘value for money’ in the first two rounds32).

2.19 The merit criteria were appropriate having regard to the program objectives.

2.20 While the merit criteria remained largely consistent across the rounds, slight changes were made over time to the scores achievable across some criteria. Table 2.2 sets out the scores and weightings applied to each criterion across each funding round and stream.

Table 2.2: Maximum score achievable per criterion for each funding stream and round

| Merit criterion | Round 1 | Round 2 | Round 3 (IP) | Round 3 (CI) | Round 4 | Round 5 |
|---|---|---|---|---|---|---|
| Economic benefits | 15 | 15 | 15 | 10 | 15 | 15 |
| Social benefits | 10 | 10 | 15 | 10 | 15 | 15 |
| Value for moneyᵃ | 5 | 5 | 5 | 5 | 5 | 5 |
| Project delivery | 5 | 5 | 5 | 5 | 5 | 5 |
| Total score | 35 | 35 | 40 | 30 | 40 | 40 |

Note a: The ‘value for money’ criterion was revised to be ‘impact of grant funding’ from round three onwards.

Source: ANAO analysis of BBRF program guidelines.

2.21 When seeking approval of the round three program guidelines, Infrastructure advised the Deputy Prime Minister that equal scoring points for the first two criteria (within each stream) had been adopted to provide better alignment with program objectives. The equivalent briefing for the fourth round did not outline the basis for further amendments to the scores in round four. In April 2022, Infrastructure advised the ANAO that:

In Round Four, changes were made to merit criteria one and two, to make it clear what needed to be addressed by the applicant against these two criterion [sic]. The merit scores for the CI stream were then aligned with those of the IP stream, which enables consistent application of the merit framework when looking at similar criterion across the two streams, consistent with the principles of proportionality set out in the CGRGs.

‘Other factors’

2.22 Reflecting the expectations established by the CGRGs, the Department of Finance’s grant opportunity guidelines template does not contemplate the inclusion of broad and non-specific criteria such as ‘[any] other factors’.33

2.23 In a departure from Finance’s template guidance (see paragraph 2.16), the guidelines for each round of the BBRF have included additional criteria, or ‘other factors’, in a separate section of the guidelines.34 When considering draft guidelines for the program, Finance raised the inclusion of the other factors as an issue and asked whether they involved ‘another assessment process’ and, if so, whether applicants should be given an opportunity to provide information on them in their applications. No changes to the draft guidelines or application form were made in response to Finance’s questions.

2.24 The published guidelines stated that the other factors were to be considered by the ministerial panel at the funding decision stage. These included, but were not limited to:

  • the spread of projects and funding across regions;
  • the regional impact of each project, including Indigenous employment and supplier-use outcomes;
  • other similar existing or planned projects in the region to ensure that there is genuine demand and no duplication of facilities or services;
  • other projects or planned projects in the region, and the extent to which the proposed project supports or builds on those projects and the services that they offer;
  • the level of funding allocated to an applicant in previous programs;
  • reputational risk to the Australian Government; and
  • the Australian Government’s priorities.

2.25 In this latter respect (the Australian Government’s priorities), some additional detail was provided against this dot-point in the fourth and fifth rounds. Specifically, the:

  • guidelines for both funding streams in round four included an additional sentence that ‘round four of the program will support drought-affected locations by targeting projects that will benefit communities affected by drought’; and
  • IP stream guidelines for round five included an additional sentence that ‘round five of the program includes $100 million of funding dedicated to supporting tourism-related infrastructure projects’.35

2.26 An eighth factor was introduced in both sets of program guidelines for the sixth round underway at the time of this ANAO audit.36 Specifically, the panel may also consider the:

community support for projects, which can include support from local MPs [Members of Parliament], councils and other organisations confirming the benefits that will flow to their region, provided through information included in applications and letters of support.

2.27 The ANAO’s analysis was that, although not previously recognised in the guidelines, this factor was a key consideration across all previous funding rounds. While its wording suggests that community support would be evidenced through the application process, the ANAO’s analysis is that such support, particularly from parliamentarians, has been largely coordinated outside the application process by the chairs, and some members of the panel (see paragraph 4.10), through their offices. Applicants were not asked to address this or any of the other factors when submitting their application.

2.28 The extent to which the funding outcomes have been influenced by the departmental assessments against the four criteria and the ministerial panel’s application of the other factors is examined in the section starting at paragraph 5.3. For no round was DISR asked to undertake an assessment of eligible applications against any of the other factors and Infrastructure also did not conduct an assessment of each eligible application against each of the other factors.37 In April 2022, Infrastructure advised the ANAO of its view as follows:

While a list of such ‘other factors’ is included in the Guidelines, it is non-exhaustive and is not intended to form an assessment criteria under the Guidelines.38 Rather, the reference to ‘other factors’ is no more than an explanation of the Ministerial Panel’s discretion to approve or reject, for whatever proper reason the Panel considered appropriate, grants that met the eligibility and assessment criteria.

2.29 This approach to the other factors is not consistent with the key principles for grants administration set out in the CGRGs, noting that the CGRGs require that practices and procedures be in place that ensure compliance with the key principles. In this respect, similar to the recent report into grants administration by the NSW Government (see footnote 29), the strategic review that informed the development and introduction of the Australian Government’s grants administration framework identified that:

Decision-making in grant programs has been a matter of strong public interest, widespread parliamentary and audit scrutiny, and significant political contention in recent times. The reasons for this lie largely in the ‘discretionary’ nature of many grant programs, the high levels of flexibility built into application assessment procedures, and the consequent lack of transparency in many decision-making processes.39

2.30 It is important that the underlying intent of the CGRGs framework be applied in the design of programs and award of funding under grant programs subject to that framework. The inclusion of the other factors clause in the BBRF guidelines was not consistent with the key principles for grants administration set out in the CGRGs, for reasons including that:

  • applicants were not afforded the opportunity to address those factors when applying for grant funding; and
  • the introduction of the other factors in the guidelines subsequent to and separate from the merit assessment process undermined the award of grant funding on the basis of competitive and comparative merit.

Recommendation no.1

2.31 The Australian Government amend the Commonwealth Grants Rules and Guidelines to require that, in circumstances where funding decisions may be made by reference to factors that are in addition to or instead of the published assessment criteria:

  • applicants be afforded the opportunity to address those other factors as part of their application for funding; and
  • records be made as part of the decision-making process as to how each competing applicant had been assessed to perform against each of those factors.

Department of Finance response: Noted.

2.32 Any amendment to the Commonwealth Grants Rules and Guidelines is a matter for consideration by Government. The Department of Finance will brief the Government on the ANAO’s findings and recommendations.

Did the guidelines clearly set out how applications would be scored, and how the scoring results would be used?

Up to and including round four, the grant opportunity guidelines for each round set out how applications would be scored against the merit criteria and stated that the results would provide the basis for determining the most meritorious applications for funding under the program. The guidelines have not accurately reflected the significance of ministerial discretion (represented in the guidelines by the ‘other factors’ clause) as a basis for the panel’s funding decisions.

2.33 For each round, the grant opportunity guidelines identified that eligible applications would be assessed and scored against each of the published merit criteria. The guidelines for each round set out the maximum score that could be awarded against each criterion and, implicitly, the weighting applied to each. The economic benefits criterion was either the highest weighted criterion, or the equal highest (with the social benefits criterion), in each selection process.

2.34 The guidelines for each round outlined that the assessment score awarded to each eligible application would be used to identify which applications represented value with relevant money and which did not. As illustrated by Table 2.3, the guidelines also outlined that the assessment scores would determine which applications would be recommended for funding approval.

Table 2.3: Grant opportunity guidelines statements on meeting the merit criteria

| Round | Statement from the BBRF grant opportunity guidelines | Scoring threshold adopted to identify applications considered to be value with relevant money |
|---|---|---|
| 1, 2 | We will only recommend funding applications that score highly against each of the merit criteria. This ensures Commonwealth funding represents value with relevant money. | 60 per cent of the maximum score that could be achieved against three of the four criteria, 20 per cent against the other criterionᵃ, and 54 per cent overall. |
| 3 | To recommend an application for funding it must score highly against each merit criterion. | 60 per cent of the maximum score that could be achieved against each of the four criteria, and 60 per cent overall. |
| 4 | As this is an open competitive merit-based program, only the highest-ranking applications will be recommended for funding. | |
| 5, 6 | We will only consider funding applications that score at least 60 per cent against each assessment criterion, as these represent best value with relevant money. | |

Note a: Advice from Infrastructure to the minister was that this criterion ‘assesses the extent to which an application has leveraged partner contributions beyond the mandatory co-funding requirement. All eligible applications are assessed as representing value with relevant money against this criterion having already met the eligibility requirement for co-funding (either having met the mandatory co-funding requirement or having been granted an exceptional circumstances co-funding exemption). Therefore, any score against merit criterion 3, including a score of 1, is assessed as value with relevant money’.

Source: ANAO analysis of departmental records.
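Where a numeric threshold applied, the value-with-relevant-money test in Table 2.3 reduces to a per-criterion cut-off plus an overall cut-off. The following is a minimal sketch of that logic using the rounds three and four settings (60 per cent against each criterion and 60 per cent overall); the criterion names and maximum scores are illustrative assumptions, not the actual BBRF weightings.

```python
# Sketch of the value-with-relevant-money threshold used in rounds three and
# four: an application must score at least 60 per cent of the maximum against
# every criterion and at least 60 per cent overall. Criterion names and
# maximums are illustrative assumptions, not the actual BBRF weightings.
CRITERION_MAX = {"economic": 40, "social": 40, "co_funding": 10, "capacity": 10}
THRESHOLD = 0.60

def represents_value(scores: dict[str, int]) -> bool:
    """Return True if every criterion and the overall total meet the cut-off."""
    per_criterion = all(
        scores[c] >= THRESHOLD * CRITERION_MAX[c] for c in CRITERION_MAX
    )
    overall = sum(scores.values()) >= THRESHOLD * sum(CRITERION_MAX.values())
    return per_criterion and overall

print(represents_value({"economic": 30, "social": 28, "co_funding": 7, "capacity": 6}))   # meets every cut-off
print(represents_value({"economic": 20, "social": 40, "co_funding": 10, "capacity": 10})) # fails the economic cut-off
```

The round one threshold differed only in its parameters (a 20 per cent cut-off for one criterion and a 54 per cent overall cut-off), so the same shape of check applies with different constants.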

2.35 The guidelines indicated that a project remoteness loading may be applied to the total assessment scores of projects to assist applicants from small and remote areas to compete against those that are in larger regional areas and better resourced. Very remote projects were to receive the highest loading while inner regional projects were to receive the lowest. Although the program guidelines for the first four rounds flagged the use of a loading, it was only applied to applications under the first two rounds.40 The effect of the loading on scores is discussed from paragraph 3.13.

2.36 Any assessment of applications against any assessment criteria, including ‘other factors’, should be transparent and systematic.41 Across all rounds, the guidelines did not set out how applications would be assessed or considered against the ‘other factors’. This adversely impacted transparency and gave applicants little insight as to how their applications were affected by the panel’s consideration of those factors, and the discretionary nature of those considerations. The impact and significance of the panel’s consideration of the other factors on the approval of applications is examined from paragraph 5.6.

Were roles and responsibilities, including for the decision-making process, clearly identified?

While the decision-making role performed by the ministerial panel was clearly identified in the published guidelines for each round, the membership of the panel was not. In addition, the division of roles and responsibilities between the two administering departments has been refined over time, with the changes reflected in various services schedules to a December 2016 Memorandum of Understanding (MoU).

2.37 Accountable authorities are required to put in place practices and procedures to ensure that grants administration is conducted in a manner consistent with the CGRGs. This includes program documentation that clearly defines the roles and responsibilities of the various parties involved in undertaking grants administration on behalf of the Commonwealth. These roles and responsibilities are typically outlined in the grant opportunity guidelines, where appropriate, or within departmental governance and guidance documents.42

Division of departmental responsibilities

2.38 The relevant roles and responsibilities between Infrastructure (as the program policy entity) and DISR, for the provision of Business Grant Hub services, were negotiated and then set out in a Memorandum of Understanding (MoU) signed on 19 December 2016.43 While the roles of each department have been refined over time, the general division of responsibilities between the two are outlined below.

  • Infrastructure has had responsibility for:
    • co-development and approval of all BBRF grant opportunity guidelines;
    • administration of co-funding exemptions under the first three rounds;
    • design and application of the remoteness loading for the first two rounds (see paragraphs 2.35 and 3.13);
    • providing advice on the relative merits of the grant applications to the panel;
    • providing secretariat support to the panel during its decision-making process for the first round44; and
    • providing DISR with details of successful applicants.
  • DISR has undertaken the following tasks:
    • co-developing and publishing the guidelines and associated materials;
    • receiving and assessing applications against eligibility45 and merit criteria;
    • assessing requests for co-funding exemptions from the fourth round;
    • advising applicants of the funding decisions and providing feedback when requested; and
    • developing and executing grant agreements with successful applicants.

Decision-making responsibilities

2.39 The grant guidelines for each round identified that decisions on projects to be funded would be taken by a ministerial panel, with the membership of each panel to be determined at a later point in time (membership for each round is set out at Appendix 3). The panel’s membership has not been published in the program guidelines or otherwise announced.46

2.40 The guidelines stated that the panel would make its decisions in consultation with:

  • the National Infrastructure Committee of Cabinet (NICC) in the first and second rounds; and
  • Cabinet, in the subsequent four rounds.

2.41 The ANAO’s analysis confirmed that the guidelines accurately identified the panel as the decision-maker. There was only one round in which any changes to the list of successful applicants were made through the consultation process. Specifically, two projects selected by the panel to receive funding in the second round were removed from the list of successful applicants following the consultation process.

Have lessons learned from previous rounds been effectively used to inform the design and delivery of subsequent rounds?

Infrastructure has practices in place to use lessons learned from previous rounds to inform the design and delivery of subsequent BBRF funding rounds. Notwithstanding a recommendation it agreed to in an earlier audit of the award of regional grant funding, Infrastructure has not taken appropriate steps to analyse the reasons the panel diverged from the results of the documented merit assessments, so as to inform the development of the guidelines for later rounds and/or adjust the conduct of the assessment process (for example, by undertaking a departmental assessment of all factors that the guidelines allow to be taken into account when funding decisions are made).

2.42 The ANAO has examined the award of funding under a number of regional grant programs administered by Infrastructure since the introduction of the grants administration framework in 2009 (Appendix 4 provides a summary of these audits).

2.43 In November 2021, Infrastructure advised the ANAO of the approach it has taken to implementing a continuous improvement approach to its management of the BBRF. The actions taken have included conducting post-implementation reviews of each round. Various changes have been made over time to the grant opportunity guidelines, assessment work practices and the approach to providing funding recommendations to the ministerial panel.

2.44 In its advice to the ANAO, Infrastructure outlined that there had been some shortcomings in the application of the eligibility criteria which had resulted in ineligible applications being erroneously assessed as eligible in multiple rounds of the BBRF.47 As requested by Infrastructure, DISR undertook an internal assurance process of the round five eligibility assessments prior to the assessment results being provided for consideration by the ministerial panel. Infrastructure advised the ANAO that ‘while the report did not identify any systemic or significant issues, it did identify potential issues with the eligibility of a number of applications.’48 This differed from DISR’s view, which was that the report had not identified any eligibility issues affecting the applications provided to the panel.

2.45 Infrastructure advised the ANAO that it has also undertaken some ‘targeted continuous improvement activities’ over the past 12 to 18 months. It drew particular attention to work it has undertaken to improve the reliance it is able to place on eligibility assessments undertaken by DISR (see paragraph 2.49). Infrastructure also outlined the following broader improvements:

  • November 2020: establishment of a Regional Initiatives Implementation Office, as the program management office function for the Regional Development, Local Government and Regional Recovery Division;
  • December 2020: endorsement of the program implementation framework for the division, and approval of the project management office roles and functions;
  • February 2021: a number of foundational project management documents were endorsed by the Regional Initiatives Governance Board, the board’s terms of reference were refreshed, and program governance and management roles and responsibilities were clarified. In addition, revised program status report templates were developed, to enable the board and departmental executive to quickly understand the level of risk inherent in individual programs at any given time, and barriers to program delivery;
  • March 2021: the Regional Initiatives Implementation Framework was finalised, and a phased governance re-set was undertaken for divisional programs with the approach to the BBRF endorsed in April 2021; and
  • July 2021: the board considered the Regional Programs Branch risk and probity plans. A probity adviser was to be appointed to review the branch probity plan as well as to develop a divisional probity plan.49

2.46 Infrastructure also advised the ANAO in February 2022 that, after each round of the BBRF, a review of the guidelines was conducted and a number of changes were made throughout the life of the program. This included some changes to the merit criteria and adjustments to the assessment criteria framework.

2.47 Infrastructure agreed to a recommendation in the ANAO audit of the Regional Jobs and Investment Packages (RJIP)50 that:

In circumstances where there is a high incidence of grant decision-makers indicating disagreement with the assessment and scoring of applications against published criteria, the Department of Infrastructure, Transport, Cities and Regional Development require, in consultation with any service provider, that assessment practices and procedures be reviewed to identify whether there are any shortcomings and, if appropriate, make adjustments.

2.48 Similar to what was observed in the audit of that program, for earlier rounds of the BBRF there was a high incidence of funding recommendations not being agreed to by the panel. In more recent rounds Infrastructure has become less precise in its funding recommendations, with practices similar to those observed in ANAO audit reports of the predecessor programs (the Regional and Local Community Infrastructure Program and the Regional Development Australia Fund), rather than continuing the improvement evident in the ANAO’s audit of the National Stronger Regions Fund (see paragraph 2.42). Specifically, the ANAO identified a return to broadbanding of applications rather than individual merit rankings, and to the department pooling applications for the decision-maker to select from, rather than recommending which particular applications should be awarded grants from within the available program funding. The Joint Committee of Public Accounts and Audit, in its review of those earlier ANAO audit reports, supported the ANAO’s findings and recommendations (see Appendix 4).

2.49 In the more recent BBRF rounds, applications assessed to have performed best against the published assessment criteria have not been the most successful, with the decision-making records indicating a greater reliance on the ‘other factors’ (see the section starting at paragraph 5.3). Even so, there has been no consequential adjustment to either the criteria in the grant guidelines or the assessment process. In April 2022, Infrastructure advised the ANAO that:

Given the issues raised by the BBRF Ministerial Panel about the Department’s assessment of certain project applications against the assessment criteria for rounds one and two, and in response to lessons being learned through the Yarrabee Consulting – Assurance Review of the Regional Jobs and Investment Packages Program, the Department worked with DISER to review and update the documents forming part of the BBRF Assessment Framework. The Department conducted a quality assurance check on a sample of assessments during the merit assessment process, and DISER used a separate team of experienced staff to undertake quality assurance checks in real time, with a view to achieving a better quality of assessments during the merit assessment process; thereby ensuring assessments complied with agreed procedures. More recently, the Department has worked with DISER to implement external assurance of assessment results, prior to them being submitted to the Ministerial Panel for consideration.

3. Funding recommendations

Areas examined

The ANAO examined the advice provided to the ministerial panel by the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (Infrastructure) to inform the funding decisions under the first five rounds of the Building Better Regions Fund (BBRF).

Conclusion

Infrastructure provided appropriate funding recommendations based on merit assessment results for three of the five rounds that have been completed. This was not the case for the third and fifth funding rounds. In those two rounds, rather than clearly identifying which applications should be successful up to the limit of the available funding, Infrastructure recommended the panel select from a ‘pool’ of projects. The total value of those projects was significantly in excess of the funding available to be awarded (more than double the funding available in the third round, and more than triple the funding available in the fifth round). The significance of the changes made to the department’s approach was that, for the fifth round, there would be no circumstance in which any of the 528 applications seeking $761.4 million in grant funding that had been assessed as representing value with relevant money would need to be reported to the Finance Minister (in accordance with the Commonwealth Grants Rules and Guidelines) as an approval of an application that the department had recommended be rejected.

Areas for improvement

The ANAO has recommended that the Commonwealth Grants Rules and Guidelines (CGRGs) be amended to strengthen the requirements relating to written advice on funding recommendations.

3.1 Section 71 of the Public Governance, Performance and Accountability Act 2013 (PGPA Act) states that ‘A Minister must not approve a proposed expenditure of relevant money unless the Minister is satisfied, after making reasonable inquiries, that the expenditure would be a proper51 use of relevant money’.

3.2 Departmental advice, if agreed to by the minister, can constitute those reasonable inquiries.52 Within the context of the CGRGs, that advice must include, amongst other things, the merits of the proposed grant or grants relative to the grant opportunity guidelines and the principle of achieving value with relevant money.

Were decision-makers provided with funding recommendations in written briefings and submissions?

Decision-makers were provided with funding recommendations in written briefings and submissions in each round by Infrastructure. The guidelines outlined that Cabinet would be consulted on the funding decisions, and written submissions to Cabinet were prepared for each round.

3.3 In addition to the general requirement that practices and procedures be in place to ensure that the conduct of grants administration is consistent with the grants administration principles, when providing written advice on the merits of grants, paragraphs 4.6 and 4.7 of the CGRGs require officials to:

  • advise on the merits of the proposed grant or grants relative to the grant opportunity guidelines and the key principle of achieving value with relevant money. In relation to this obligation, the CGRGs explain that the basis for recommending or rejecting each proposed grant should be set out in the assessment material and should reflect the particular merits of each potential grant in terms of the grant opportunity guidelines, including assessment against the published criteria; and
  • at a minimum, indicate which grant applications: fully meet the selection criteria; partially meet the selection criteria; and do not meet any of the selection criteria. Specific recommendations regarding grant applications for approval can be in addition to this information (the department’s approach to ranking applications is discussed from paragraph 3.8).

3.4 Consistent with the CGRGs, Infrastructure prepared written briefing material to inform deliberations and decision-making by the ministerial panel. As set out in Table 3.1, while for some rounds a separate briefing and/or submission was provided for each stream, it was more common for a consolidated briefing and subsequent submission to be prepared that covered both funding streams.

3.5 Following the panel’s deliberations, Infrastructure prepared further written briefing material for the panel chair to consult with Cabinet on funding decisions, as required by the grant opportunity guidelines for each round. The consultation process was coordinated to comprise decisions for both funding streams for all but the first round. Changes to funding decisions as a result of this process occurred only in round two (see paragraphs 2.39 to 2.41).

Table 3.1: Written funding recommendation briefings and submissions

| Round | Briefing to ministerial panel chair: infrastructure projects stream | Briefing to ministerial panel chair: community investments stream | Submission to Cabinet: infrastructure projects stream | Submission to Cabinet: community investments stream |
|---|---|---|---|---|
| 1 | 2 June 2017 | 14 July 2017 | 18 July 2017 | 22 August 2017 |
| 2 | 21 May 2018 (consolidated) | | 12 June 2018 (consolidated) | |
| 3 | 11 February 2019 | 15 February 2019 | 26 February 2019 (consolidated) | |
| 4 | 15 April 2020 (consolidated) | | 27 May 2020 (consolidated) | |
| 5 | 27 August 2021 (consolidated) | | 27 September 2021 (consolidated) | |

Source: ANAO analysis of departmental records.

3.6 Each of the briefing packages was submitted and logged through the established electronic system to the respective chair of each panel.53 The packages each consisted of a covering brief with between eight and 11 separate attachments (including all merit assessment summaries for each of the eligible applications), meeting most of the content requirements set out in paragraphs 4.6 and 4.7 of the CGRGs in full. Specifically, the written briefings:

  • clearly identified that the spending proposals related to grants;
  • provided information on the requirements of the PGPA Act and the CGRGs, including the ministerial reporting obligations and the legal authority for the grants;
  • outlined the application and selection process followed, including the selection criteria that were used; and
  • identified the extent to which eligible applications had been assessed to meet the merit criteria, and applied the assessment results54 to rank eligible applications in bands (see paragraph 3.12).

3.7 Infrastructure also provided the chair, through his or her office, supplementary advice on projects before and after providing the briefing materials. This included early project-specific information before and during Department of Industry, Science and Resources’ (DISR’s) application assessment process (examined from paragraph 4.6)55; and materials at the decision-making stage to assist the ministerial panel’s assessment of projects against the ‘other factors’56 set out in the program guidelines. The latter was prepared for at least two of the five concluded funding rounds as follows:

  • in round one, departmental officials attended three of the four ministerial panel meetings held in June and August 2017.57 Ahead of the first meeting, Infrastructure prepared material separate to its written briefing, detailing which applications involved, amongst other things, saleyards, airports, and roads or bridges projects; and
  • in round three, Infrastructure provided information on individual projects to the chair through his office in late January 2019. This material was hand-delivered on a USB device 12 days in advance of the department’s written briefing.58 Infrastructure did not keep a full record of the contents of the USB drive that was delivered on 30 January 2019.59

Were the results of the assessment process used to rank competing, eligible applications?

While the approach differed across rounds, the results of the assessment process were used to place competing, eligible applications that had met the published criteria to a minimum standard into orders of merit that:

  • for rounds one and two, individually ranked almost all applications by overall merit assessment score from highest scoring to lowest; and
  • from round three onwards, ranked applications in groups, or ranking bands, of between one and 61 applications by overall merit assessment score from highest scoring to lowest.

3.8 Under the CGRGs framework, selection (or assessment) criteria play an important role in relation to the key principle of achieving value with relevant money from granting activities. The CGRGs outline that selection criteria are used to assess the merits of proposals and, in the case of a competitive grant opportunity, to determine application rankings.60 Funding recommendations to decision-makers are required to address the merits of applications in terms of the criteria included in the grant opportunity guidelines.

3.9 For the competitive, application-based grant programs administered by Infrastructure that have been examined by the ANAO, eligible applications have on some occasions been individually ranked based on their assessment scores, while for other programs they have been collectively ranked in groups or bands. For example, for the Regional Development Australia Fund, later rounds used ranking bands rather than the individual rankings used in the first round. In the November 2014 performance audit of the third and fourth rounds of that program61, the ANAO concluded that the use of ranking bands:

represented a marked decline in the degree of differentiation provided to the Minister compared to the first funding round and was not particularly helpful to inform the Minister’s decision-making.

3.10 Similarly, for the BBRF, the department’s approach changed over time from ranking eligible applications individually in the first two rounds to an increasing extent of bands (or ‘pooling’) being adopted, particularly in the third and fifth rounds.

3.11 As illustrated by Figure 1.1, applications assessed as eligible proceeded to the merit assessment stage.62 Each application was assessed against the four assessment criteria set out in the program guidelines and an overall merit score determined (see paragraph 2.18). The implementation of a loading, added to overall assessment scores in rounds one and two for outer regional and very remote areas, resulted in the generation of unique total assessment scores across almost all projects (see paragraph 2.35). From round three, the loading was no longer applied, and unique total scores were therefore not generated. For example, in round five up to 61 applications each shared the same total score.

3.12 Infrastructure’s approach to ranking projects in rounds one and two involved sorting applications that had been assessed as representing value with relevant money into three categories (for each funding stream) based on total project values.63 A separate order of merit was developed for each category by sorting projects in descending order by total score. Projects were then ranked by assigning a ranking place for each unique score (see Table 3.2).
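The rounds one and two approach described in paragraph 3.12 can be sketched as follows. This is an illustrative reconstruction only, not departmental code; the field names and figures are assumptions.

```python
# Sketch of the rounds one and two ranking approach: applications assessed as
# value with relevant money are split into three categories by total project
# value, each category is sorted by total score (merit score plus remoteness
# loading) in descending order, and one ranking place is assigned per unique
# score (so tied totals share a place). Field names and figures are invented
# for illustration.
def category(project_value: float) -> str:
    if project_value < 1_000_000:
        return "under $1m"
    return "$1m to $5m" if project_value <= 5_000_000 else "over $5m"

def rank_applications(apps: list[dict]) -> dict[str, list[tuple[int, dict]]]:
    """Return, per value category, (ranking place, application) pairs."""
    by_category: dict[str, list[dict]] = {}
    for app in apps:
        by_category.setdefault(category(app["value"]), []).append(app)
    ranked = {}
    for cat, group in by_category.items():
        group.sort(key=lambda a: a["score"] + a["loading"], reverse=True)
        places, place, last_total = [], 0, None
        for app in group:
            total = app["score"] + app["loading"]
            if total != last_total:  # a new unique score opens the next place
                place += 1
                last_total = total
            places.append((place, app))
        ranked[cat] = places
    return ranked
```

Because the remoteness loading made almost all totals unique in rounds one and two, each ranking place typically held a single project; once the loading was dropped from round three, tied totals collapsed many applications into a single place.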

Table 3.2: Quantity of projects recommended for funding per ranking point for the infrastructure project stream (across rounds up to the 20th place)

| Ranking point | Round 1: Under $1m | Round 1: $1m to $5m | Round 1: Over $5m | Round 2: Under $1m | Round 2: $1m to $5m | Round 2: Over $5m | Round 3 | Round 4 | Round 5 |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 2 | 2 |
| 2 | 1 | 1 | 1 | 1 | 1 | 1 | 2 | 1 | 5 |
| 3 | 1 | 1 | 1 | 1 | 1 | 1 | 3 | 8 | 13 |
| 4 | 1 | 1 | 1 | 1 | 1 | 1 | 2 | 13 | 15 |
| 5 | 1 | 1 | 1 | 1 | 1 | 1 | 3 | 20 | 27 |
| 6 | 1 | 1 | 1 | 1 | 1 | 1 | 8 | 32 | 38 |
| 7 | 1 | 1 | 1 | 1 | 1 | 1 | | 1 | 36 |
| 8 | 1 | 1 | 1 | 1 | 1 | 1 | | 10 | 43 |
| 9 | 1 | 1 | 1 | 1 | 2 | 1 | | 9 | 59 |
| 10 | 1 | 1 | 1 | 1 | 1 | 1 | | 17 | 58 |
| 11 | 1 | 1 | 1 | 1 | 1 | 1 | | 12 | 61 |
| 12 | 1 | 1 | 1 | 1 | 1 | 1 | | 19 | 46 |
| 13 | 1 | 1 | 1 | 1 | 1 | 1 | | 24 | 24 |
| 14 | 1 | 1 | 1 | 1 | 1 | 1 | | 31 | |
| 15 | 1 | 1 | 1 | 1 | 1 | 1 | | | |
| 16 | 1 | 1 | 1 | 1 | 1 | 1 | | | |
| 17 | 1 | 1 | 1 | 1 | 1 | 1 | | | |
| 18 | 1 | 1 | 1 | 2 | 2 | 1 | | | |
| 20 | 1 | 1 | 1 | 1 | 1 | 1 | | | |

Note: Applications in rounds one and two were presented in three orders of merit according to total project value, which extended beyond the 20th ranking position (from a total of 24 ranking places for round two projects over $5 million and up to 54 for round two projects between $1 million to $5 million).

The illustration above relates only to the applications in the ‘Value with Relevant Money – Recommended for Funding’ (for the first four rounds) and the ‘Value with Relevant Money’ (for round five) pools of projects. Applications in other categories, including the ‘Not Value with Relevant Money’ pool were also ranked using a similar approach.

Source: ANAO analysis of departmental records.

3.13 From the third round, Infrastructure stopped applying the remoteness loading to the total scores of applications, and no longer split projects into three separate orders of merit based on their project values. Instead, one list of applications representing value with relevant money was generated for each funding stream. These applications were sorted in descending order by their aggregate assessment score, resulting in the generation of ranking bands with up to 61 applications sharing a ranking position.
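The effect described in paragraph 3.13 is that each unique aggregate score forms a ranking band holding every application with that score. A minimal sketch (the scores are invented for illustration):

```python
# Sketch of the banding effect from round three onwards: with no loading
# applied, applications sharing an aggregate score fall into the same
# ranking band, so a band's size equals the count of tied applications.
# Scores are invented for illustration.
from itertools import groupby

def ranking_bands(scores: list[int]) -> list[tuple[int, int, int]]:
    """Return (band number, shared score, band size), highest score first."""
    ordered = sorted(scores, reverse=True)
    return [
        (band, score, len(list(group)))
        for band, (score, group) in enumerate(groupby(ordered), start=1)
    ]

print(ranking_bands([92, 90, 90, 90, 88, 88]))
# [(1, 92, 1), (2, 90, 3), (3, 88, 2)]
```

With hundreds of applications scored against four criteria and no tie-breaking loading, bands of dozens of applications (up to 61 in round five) arise naturally from this grouping.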

3.14 Infrastructure’s use of these orders of merit and ranking bands in formulating its funding recommendations is discussed from paragraph 3.24.

Did the department transparently assess, and provide clear advice to decision-makers, on whether any exemption requests permitted under grant program guidelines should be granted?

For the third, fourth and fifth rounds, applicant requests to be exempted from the requirement to include partnership funding were assessed and clear recommendations were provided to the ministerial panel as to whether those requests should be granted. An assessment of these requests was not undertaken in the first two rounds. Without recommending which of the requests should be approved by the panel in those rounds, Infrastructure provided the panel with details of the applications seeking exemptions and recommended that the chair record which of those exemption requests the panel had approved. Across all five rounds, all exemption requests were approved by the respective ministerial panels, irrespective of any departmental recommendations.

3.15 The ANAO audit of the Regional Jobs and Investment Packages (RJIP)64 found that co-funding exemption requests for that program were not appropriately considered. Under that program, Infrastructure did not assess each exemption request against the program guidelines, nor did it provide a recommendation on each request to the RJIP panel that was consistent with any assessment that was undertaken. The department agreed to an ANAO recommendation that it transparently assess any exemption requests permitted under grant program guidelines and provide clear advice to decision-makers on whether they should be granted, and why.65

3.16 The grant opportunity guidelines for each round and funding stream of the BBRF:

  • included an eligibility requirement setting out the amount of co-funding required for an application to be eligible (less co-funding was required for projects in a location classified as remote or very remote); and
  • identified that the extent to which an application included co-funding would impact on the assessment score achieved against one of the merit criteria.66

3.17 The guidelines also outlined that applicants were able to seek an exemption to these cash contribution requirements in ‘exceptional circumstances’. Applicants seeking an exemption were required to submit a supporting case including evidence demonstrating:

  • the exceptional circumstances being experienced;
  • how the exceptional circumstances were preventing the applicant from meeting the co-funding requirements; and
  • the applicant’s capability to maintain and fully utilise the project funding.

3.18 The guidelines stated that exemption requests would be approved in ‘very limited circumstances’ and that applications approved for an exemption to the co-funding requirement would still need to be assessed against the merit criteria (and that, where an exemption request was not approved, the application would not proceed to the merit assessment stage).

Assessment of co-funding exemption requests

3.19 The responsibilities for the administration of the co-funding exemption requests in each round have been shared between Infrastructure and DISR. Infrastructure has been consistently responsible for managing the process of seeking approvals for these from the ministerial panel, following receipt of the co-funding requests and associated supporting evidence in each round from DISR.

3.20 The responsibility for assessing the requests was not defined in the services schedule67 until the fourth round, when DISR assumed this responsibility. Prior to this, the schedule did not clearly record whether assessments of these requests were to be undertaken. Rather, the following relevant arrangements were in place.68

  • DISR was to provide Infrastructure with:
    • a list of applicants who have requested a co-funding exemption; and
    • each applicant’s application and supporting documentation.
  • Infrastructure was to provide DISR with the list of applications which have been granted a co-funding exemption by the ministerial panel.

3.21 Assessments of co-funding requests were not conducted by either department in the first two funding rounds. Concurrent with eligibility assessments, DISR collated and provided the co-funding requests and associated supporting evidence to Infrastructure to allow it to seek the required approvals from the ministerial panel. Reflecting that the department was not required by the ministerial panel to assess the requests, Infrastructure’s briefing to the chair of the ministerial panel for co-funding exemptions in the second round stated:

Given there are no assessment criteria, and following consultation with your Office, we have not assessed the exemption requests. However, we note some of the requests provide limited information on why the exemption should be granted. The more exemptions agreed by the ministerial panel will reinforce applicant expectations they will always be agreed.

3.22 Consistent with Infrastructure’s advice, requests for co-funding exemptions, as a percentage of all applications, increased from six per cent in the first round to 14 per cent in the fifth round. Appendix 5 summarises the co-funding exemption requests and approvals for each round.

3.23 An ‘assessment framework’69 for the exemption requests was developed by Infrastructure during the third round. Infrastructure (in round three) and DISR (from round four) used a checklist template to record whether applicants had met the evidentiary requirements set out in the program guidelines for a co-funding exemption. While advice to the ministerial panel since the third round was informed by these processes and included recommendations as to which requests should be granted, every exemption request was approved by the respective ministerial panels irrespective of the department’s recommendation. The chairs of the panel for the fourth and fifth rounds recorded rationales for approving all requests. Specifically, all requests were approved:

  • on the basis of economic challenges in addition to existing challenges brought about by drought, fire and flood for round four; and
  • due to the continuing impact of COVID-19 in round five.

Were the recommended applications those identified as being the most meritorious eligible applications up to the limit of the available funding?

For the first, second and fourth rounds, Infrastructure clearly recommended that the ministerial panel approve specific applications up to the limit of the available funding. Consistent with the published guidelines, the department recommended the panel approve the 591 highest ranked applications across those three rounds up to the aggregate value of $661 million. In a departure from the competitive and merits-based process outlined in the program guidelines, Infrastructure’s recommendations for the third and fifth rounds were presented as ‘pools’ of candidates for the panel to choose from and, while sorted in descending order based on their total assessment scores, the department did not recommend that the highest scored and ranked applications be chosen as successful. Specifically, in:

  • round three, there were 306 ‘recommended’ projects seeking grants of $426 million put forward by Infrastructure for the panel to choose from, notwithstanding that there was only $205 million available; and
  • round five, there were 528 ‘recommended’ projects seeking grants of $761 million put forward by Infrastructure for the panel to choose from, compared with the $200 million the guidelines had said was available and the $300 million subsequently made available.

3.24 While the precise form of words varied across rounds, the published grant opportunity guidelines for the BBRF have consistently advised that because the program is an open and competitive merit-based program, only the highest-ranking applications would be recommended for funding (see Table 2.1). Eligible applications were required to achieve a score of at least 60 per cent70 of the maximum that was achievable against each merit criterion in order to be assessed as ‘value with relevant money’ (see paragraph 3.1).71

3.25 In each round, Infrastructure categorised and colour-coded applications based on their final assessment results. For the first four rounds, the assessment scores were used to separate the applications assessed to be ‘value with relevant money’ into ‘recommended’ applications (those with the highest scores and colour-coded green) and those that were not recommended for the award of funding (colour-coded orange or red). The approach adopted for the fifth round is discussed from paragraph 3.31.

3.26 While the categories, or ‘pools’, remained relatively consistent across funding rounds, the way they have been used to form the basis for Infrastructure’s advice on the merits of applications to the ministerial panel varied across rounds. These categories are summarised in Table 3.3.

Table 3.3: Application categories referred to in Infrastructure’s advice to the ministerial panel

Value with relevant money — recommended for funding (colour code: Green; category not used in round five)

Used in the first four rounds, this category represented the:

  • highest-ranking and most meritorious applications in the context of the published merit criteria; and
  • cohort of applications the department recommended to the ministerial panel for funding.

Value with relevant money (colour code: Orangeᵃ)

Applications that had achieved or exceeded the minimum score required across all individual merit criteria. The department advised the panel that applications in this category were considered either:

  • value with relevant money, but not recommended for funding (in rounds one to four); or
  • ‘suitable for funding’ and therefore recommended for selection by the panel (in round five).

Not value with relevant money (colour code: Red/Pink)

Applications that did not meet the minimum score threshold against one or more individual criteria and were not recommended for funding. Applications in this category were attached as a ranked list in the department’s advice to the ministerial panel in all five concluded rounds.ᵇ

Ineligible (no colour code)

Applications not assessed against the merit criteria. Applications in this category were attached as a non-ranked list in the department’s advice to the ministerial panel.

Withdrawn

Applications withdrawn at the request of the applicant at any stage between close of applications and final funding decisions.ᶜ Applications in this category were quantified in the covering brief but were not detailed in attachments.

Note a: The orange colouring for this category was used in the second, third and fourth rounds. The category was coloured green in the fifth round, and in the first round only applications recommended for funding were colour-coded.

Note b: Applications in this list in the fifth round were not assigned rankings but were sorted by total assessment score.

Note c: Three projects that were selected for funding by the ministerial panel withdrew their applications approximately nine days after the panel’s deliberations and six days prior to the required consultation with Cabinet.

Source: ANAO analysis of departmental records.

3.27 Principles-based frameworks do not prescribe detailed steps that must be complied with.72 Rather, the aim is to provide an overarching framework that guides and assists entities to develop an appreciation of the core goals of the regulatory scheme.73 The CGRGs provide a largely principles-based framework, with some specific requirements.74 For example:

  • the CGRGs require accountable authorities and officials to have regard to the seven key principles for grants administration and put in place practices and procedures to ensure that the conduct of grants administration is consistent with those principles75; and
  • the specific requirement under paragraph 4.12 of the CGRGs for ministers, including senators, to report annually to the Finance Minister (in all instances where they have made a decision to approve a particular grant which the relevant official has recommended be rejected76) aims to enhance transparency and accountability around funding decisions. That is, this requirement is directly linked to the key principles of ‘governance and accountability’ and ‘probity and transparency’.

3.28 In this latter respect, and in the context of the BBRF as an open and competitive, merits-based program, the preconditions for achieving transparent and accountable funding decisions as envisaged by the CGRGs involve:

  • officials having appropriate regard to the principles across all aspects of the grant opportunity (rather than compliance solely with the letter of paragraphs 4.6, 4.777 and 4.12 of the CGRGs); and
  • consistent with this, ministers receiving clear and unambiguous departmental funding recommendations.

3.29 Open and competitive grants programs are often oversubscribed and are designed with a focus on officials using the results of the merit assessment process to specify and clearly recommend to approvers:

  • their assessment of the most meritorious applications for funding, up to the funding limit available; and
  • applications that should be rejected for reasons including ineligibility; not being value with relevant money; or being assessed as meritorious but not able to be funded within the available funding limit, or ahead of other applications that were assessed as more meritorious under the published guidelines.

3.30 Up to and including the fourth round, Infrastructure consistently recommended that the ministerial panel approve specific projects listed in its green ‘recommended for funding’ category. With the exception of the infrastructure projects stream in the third round (discussed at paragraph 3.31), this category represented a clear funding recommendation that identified which applications were the most meritorious up to the limit of funding available. This is reflected in Table 3.4 and Table 3.5.

Table 3.4: Number of applications per assessment category for the infrastructure projects (IP) streams per funding round

| Infrastructure Projects | Value with relevant money — recommended for funding (#) | ($m) | Value with relevant money (#) | ($m) | Not value with relevant money (#) | ($m) | Ineligible (#) | ($m) | Approved by the ministerial panel (#) | ($m) |
|---|---|---|---|---|---|---|---|---|---|---|
| Round 1 | 101 | $234.5 | 159 | $321.0 | 158 | $257.4 | 119 | $170.7 | 110 | $219.5 |
| Round 2 | 108 | $209.0 | 372 | $793.2 | 21 | $33.0 | 37 | $67.6 | 136 | $208.2 |
| Round 3 | 142 | $419.1 | 301 | $572.2 | 94 | $244.3 | 71 | $194.3 | 166 | $197.4 |
| Round 4 | 76 | $203.5 | 123 | $183.2 | 130 | $537.8 | 41 | $101.9 | 109 | $205.0 |
| Round 5 | — | — | 427 | $755.3 | 260 | $612.8 | 96 | $213.5 | 197 | $294.0 |

Note: The funding available under the fifth round was increased from $200 million to $300 million in October 2021.

Source: ANAO analysis of departmental records.

Table 3.5: Number of applications per assessment category for the community investments (CI) streams per funding round

| Community Investments | Value with relevant money — recommended for funding (#) | ($m) | Value with relevant money (#) | ($m) | Not value with relevant money (#) | ($m) | Ineligible (#) | ($m) | Approved by the ministerial panel (#) | ($m) |
|---|---|---|---|---|---|---|---|---|---|---|
| Round 1 | 131 | $6.2 | 56 | $4.9 | 201 | $42.3 | — | — | 147 | $6.9 |
| Round 2 | 121 | $5.8 | 48 | $2.4 | 8 | $0.3 | 92 | $6.8 | 109 | $4.4 |
| Round 3 | 164 | $6.9 | 52 | $2.8 | 84 | $19.4 | — | — | 164 | $6.9 |
| Round 4 | 54 | $2.1 | 37 | $4.2 | 74 | $10.4 | — | — | 54 | $2.1 |
| Round 5 | — | — | 101 | $6.1 | 34 | $4.2 | 149 | $24.6 | 101 | $6.1 |

Note: The funding available under the fifth round was increased from $200 million to $300 million in October 2021.

Source: ANAO analysis of departmental records.

3.31 Indicating inadequate regard for, and departures from, the key principles for grants administration under the principles-based CGRGs framework, Infrastructure made changes to the way it presented its funding recommendations to the ministerial panel in rounds three and five. Specifically, and as illustrated by Figure 3.1:

  • under the third round, Infrastructure recommended that the ministerial panel ‘consider and agree to select’ IP stream projects to the value of up to $198.5 million ‘from the pool of 142 recommended projects’, seeking a total of $419.1 million, or twice the amount of funding available to be awarded; and
  • in round five, Infrastructure stopped separating applications into ‘recommended’ and ‘not recommended’ pools (see Tables 3.4 and 3.5).78 Rather, it recommended that the panel approve for funding any of the 528 applications that were eligible and met the 60 per cent score threshold required to be assessed as ‘value with relevant money’. Those 528 applications had sought $761.4 million, substantially more than the $200 million the program guidelines had said was available to be awarded and the $300 million subsequently made available (see Figure 3.1).79

3.32 The significance of the changes made to Infrastructure’s approach in round five is that there would be no circumstance in which any of the 528 applications seeking $761.4 million in grant funding that had been assessed as representing value with relevant money would need to be reported to the Finance Minister as an approval of an application that the department had recommended be rejected. These reporting requirements are discussed from paragraph 5.12.

3.33 Infrastructure advised the ANAO in April 2022 that its approach ‘better enables the Department to meet Government objectives for this program (e.g. in enabling the exercise of Ministerial discretion including in relation to factors that Ministers are better placed to make judgements on)’.

Figure 3.1: Aggregate value of applications recommended for funding in each round

 

Note a: The total funding awarded in round five was increased in October 2021, after Infrastructure had provided its recommendations to the panel, by $100 million from the $200 million announced in the grant opportunity guidelines.

Source: ANAO analysis of departmental records.

3.34 As outlined at paragraph 1.6, a recent review of grants administration in the New South Wales Government made 19 recommendations, the first of which was for the issue of a grants administration guide based on the principles set out in the CGRGs. While the briefing requirements in the grants administration guide developed as part of that review were similar to those included in the CGRGs, they were more explicit in certain respects, requiring that the recommended grantees and the proposed funding amounts be identified.

Recommendation no.2

3.35 The Australian Government amend the Commonwealth Grants Rules and Guidelines to strengthen the written advice prepared for approvers on the merits of a proposed grant or group of grants by requiring that advice to include a clear and unambiguous funding recommendation that:

  • identifies the recommended applications that have been assessed as eligible and the most meritorious against the published assessment criteria; and
  • does not recommend applications for an aggregate value of grant funding that exceeds the total amount available for the particular grant opportunity.

Department of Finance response: Noted.

3.36 Any amendment to the Commonwealth Grants Rules and Guidelines is a matter for consideration by Government. The Department of Finance will brief the Government on the ANAO’s findings and recommendations.

4. Funding decisions

Areas examined

The ANAO examined whether the funding decisions were informed by departmental advice and then appropriately documented, as required by the Public Governance, Performance and Accountability Act 2013 and the Commonwealth Grants Rules and Guidelines (CGRGs).

Conclusion

The decisions about the award of grant funding across each of the five funding rounds were not appropriately informed by departmental advice, particularly with respect to the third and fifth rounds. The basis for the funding decisions has not been appropriately documented, particularly in the three most recently completed rounds. It was common in the infrastructure projects (IP) stream (the larger of the two streams) for the approved applications to not be those assessed as best meeting the published criteria and for applications assessed as highly meritorious against the published criteria to not be awarded funding. There were inadequate records of the inputs to the decision-making process and the basis for decisions to approve projects assessed as less meritorious against the four published merit criteria and to reject applications assessed as having higher merit.

Areas for improvement

The ANAO made one recommendation to Infrastructure aimed at more transparent and accountable decision making when awarding grant funding. The ANAO has also recommended that the CGRGs be amended to address situations where stakeholders such as parliamentarians play a role in the assessment and award of grant funding.

4.1 Within the legislative framework governing decisions to spend public money, the CGRGs contain a number of specific decision-making requirements in addition to the overarching requirement that practices and procedures be in place to ensure grants administration is conducted consistent with the seven key grants administration principles. Consistent with those principles, the ANAO examined the conduct of the decision-making process including the records of that process.

Were candidates identified in the way outlined in the program guidelines?

All applications approved under the five concluded funding rounds were submitted through the open processes outlined in the program guidelines. With one exception, only those applications assessed as eligible were then assessed against the merit criteria and awarded grant funding.

4.2 Consistent with the program guidelines, all applications were submitted online and logged through the Department of Industry, Science and Resources’ (DISR) Business Grants Hub portal.80 Alongside settling any requests for extensions to the deadline, all applications were then registered and assessed for eligibility by DISR, with those that were eligible proceeding to be assessed against the merit criteria.

4.3 The program guidelines for each round have consistently outlined that ineligible applications would not proceed to merit assessment. There was one exception to this in the first round on 15 June 2017, when DISR was asked to assess the merits of an application it had previously assessed as ineligible. This request was made by Infrastructure on behalf of the ministerial panel, the day after the panel had met to make funding decisions and 17 days after DISR had delivered the final merit assessment results. This application was approved for funding of $6 million on 19 July 2017 and, as of November 2021, was one of seven projects yet to be completed from the first round.81 Following a letter and public request from the applicant in January 2022, the Deputy Prime Minister approved a variation to the completion date for the project from 30 April 2022 to 30 June 2023.

Was the funding approval process conducted in accordance with the published grant opportunity guidelines?

Approval processes have varied over time and have not been conducted fully in accordance with the published grant opportunity guidelines. Consistent with these guidelines, funding approvals for each round were provided by the chair of the panel after consultation with Cabinet. Including the ‘other factors’ in the guidelines was inconsistent with the intent of the CGRGs and eroded the role of the published merit assessment process by giving decision-makers discretion that could be exercised with little or no transparency. In addition, the guidelines did not provide for the panel to consider applications against the other factors before receiving Infrastructure’s recommendations or, as occurred in the third round, in parallel with DISR’s merit assessment process.

4.4 The program guidelines outlined that the ministerial panel, in consultation with Cabinet, would decide which grants to approve, after taking into account departmental recommendations and the availability of grant funds. It was also open to the panel in each round to apply a range of ‘other factors’ to its funding decisions (see paragraph 2.17).82 As set out at paragraphs 2.22 to 2.30, the approach to the ‘other factors’ is not consistent with the key principles for grants administration set out in the CGRGs.

4.5 Any assessment of applications, including against the other factors, should be well-documented, transparent and systematic. This enhances, among other things, confidence in the grant opportunity outcomes and grants administration processes, for both stakeholders and the public.83 It is also necessary for the administration of the BBRF to comply with the key principles for grants administration set out in the CGRGs (as required by paragraph 6.3 of the CGRGs).

Early application details

4.6 The guidelines did not outline that Infrastructure would provide panel members with early project information before providing its funding recommendations (see paragraph 3.7). This resulted in some decision-making processes commencing before being informed by departmental advice and, for at least one round, in parallel with the departmental merit assessment process. This approach poses risks to the key principle of probity and transparency and to the CGRGs requirement that funding decisions be taken after being informed by departmental advice on the merits of competing applications.

4.7 This early information was provided to the chairs through their offices in all but the fourth of the five concluded funding rounds (see Appendix 7). The early information did not include the actual applications submitted. In almost all cases it included the key characteristics of the applications, such as the applicant, amount sought, project name and location and, later in the process, merit assessment scores. The exception was under round three, where Infrastructure also provided the chair through his office with early copies of its order of merit, all individual application summaries and a one-page summary of ‘sensitive projects’ 12 days in advance of its funding recommendations (see paragraph 3.7). In response to the ANAO’s request for a rationale for the provision of this early information, Infrastructure advised the ANAO in April 2022 that:

The appropriate role of the public service is to support the Government of the day, including by providing the Minister with factual information when requested.

4.8 Appendix 6 illustrates the time between the panel’s receipt of early information, provision of Infrastructure’s funding recommendations, and the panel’s deliberations for each round. The earliest provision of this information occurred in the first three rounds, with the most extensive use of the details evident in the third round (discussed at paragraph 4.12).

Assessment of applications against the ‘other factors’

4.9 While the CGRGs (at paragraph 2.8) identify the assessment of potential grantees as a key grants administration process, there was no documentation by either department setting out how the other factors were to be assessed, including what evidence would be sought and from whom or where. The ANAO’s analysis was that the panel members, through their offices, assessed applications against factors other than the four published merit criteria. With some variation across the rounds, these activities involved the following elements.

  • For all rounds, collating input on, and analysis of, applications against various criteria, including:
    • the spread of projects and funding across regions based on federal electorate;
    • the funding allocated to applicants under other Commonwealth grant programs; and
    • community support for projects, with an emphasis on support from local parliamentarians.
  • Based on the above analysis, preparing funding recommendations for distribution to the broader ministerial panel ahead of deliberations (except for the first and third rounds).
  • Providing secretariat support before, during and after panel deliberations from the second round onwards (discussed at paragraph 4.24).

4.10 The chair, through his or her office, had primary responsibility for undertaking these activities, with assistance provided by other panel members through their offices during the third and fourth rounds.84 Aided by the early application information provided by Infrastructure, ministers through their offices collated electorate details and parliamentarians’ preferences for those applications submitted from within their constituencies.

4.11 Except for round one, this information formed the basis for application shortlists, or ‘revised’ recommendations to inform the panel’s funding decisions.85 For the second, fourth and fifth rounds, these were provided to panel members through their offices up to three days before the panel meetings and were largely developed after receiving Infrastructure’s briefings. Specifically:

  • for round two, Infrastructure’s written funding recommendations were provided on 11 May 2018, with a 25 May 2018 shortlist considered by the panel on 28 May 2018;
  • for round four, the written departmental briefing was provided on 15 April 2020 and the shortlist, provided in two parts on 28 and 30 April 2020, was considered by the panel over two respective sessions on 29 April and 1 May 2020; and
  • for round five, the written departmental briefing was received on 27 August 2021 and the 15 September 2021 revised recommendations were provided that morning by the chair to other panel members through their respective offices before the meeting that afternoon.

4.12 Shortlist development was most prominent in the approval process for the IP funding stream in the third round. This was largely due to its development commencing earlier in the process and the cancellation of the panel’s planned meeting. Development of a shortlist commenced 12 days before the results of the merit assessment process were provided to the chair through his office, and over three weeks before Infrastructure provided its written funding recommendations.86

4.13 The ANAO examined the progressive development of the shortlist before and after the chair of the panel, through his office, had initially received Infrastructure’s recommendations on 11 February 2019. Figure 4.1 illustrates that IP applications identified on all iterations of the shortlist were more likely to be approved for funding than those that were recommended for funding by Infrastructure.

Figure 4.1: Round three funding outcomes for IP applications listed on each version of the ministerial panel shortlist compared with applications scored highly and recommended by Infrastructure for funding on 11 February 2019

Note: The first shortlist for 14 February 2019 represented the cohort of IP applications the ministerial panel was to consider that evening. The meeting was cancelled due to some members being unavailable and was not rescheduled for a later date.

Source: ANAO analysis of departmental records.

4.14 The chair’s approval on behalf of the ministerial panel for the final shortlist was obtained out-of-session on 20 February 2019. Aside from this, and the Cabinet consultation required by the program guidelines, there was no other involvement of panel members evident in the records examined by the ANAO for the third round.87 This contrasted with the other rounds, where panel members’ involvement was evidenced in the relevant records and by their attendance at meetings (either in person or virtually) to agree the successful projects.

4.15 As outlined at paragraph 1.6, a recent review of grants administration in the New South Wales Government made 19 recommendations, the first of which was for the issue of a grants administration guide that is based on the principles set out in the CGRGs. The grants administration guide developed as part of that review directly addresses how stakeholders such as parliamentarians, the responsible minister, other ministers and ministerial staff are able to be involved in the award of grant funding. The guide requires that, where it is anticipated that a grant opportunity will involve input from those stakeholders and/or other levels of government or industry representatives:

  • the grant guidelines are to clearly outline the role of stakeholders;
  • there be processes in place to manage this interaction (including equitable opportunity for parliamentarians);
  • all stakeholder input be documented as part of the assessment process, and that where such input is received outside of the process set out in the grant guidelines, this must be documented; and
  • relevant input from key stakeholders (such as parliamentarians, the responsible minister, other ministers and ministerial staff) and the consideration given to that input in the assessment process be identified in the written advice prepared to inform funding decisions.

4.16 The guide also specifically outlines the obligations placed on ministerial staff where they are involved in grants administration. Those staff are required to:

  • be familiar and comply with the principles and grants administration processes set out in the guide, as well as applicable laws and policies that guide ethical behaviour;
  • put in place practices and procedures to ensure that ministerial involvement in grants administration is conducted in a manner that is consistent with the key principles and requirements in the guide; and
  • where a minister is the decision-maker, ensure that the decision is recorded in writing and the records are managed in accordance with the requirements of the state’s recordkeeping act.

Recommendation no.3

4.17 The Australian Government amend the Commonwealth Grants Rules and Guidelines to apply the principles for grants administration to situations where stakeholders, such as parliamentarians, play a role in the assessment and award of grant funding.

Department of Finance response: Noted.

4.18 Any amendment to the Commonwealth Grants Rules and Guidelines is a matter for consideration by Government. The Department of Finance will brief the Government on the ANAO’s findings and recommendations.

Community investment stream applications

4.19 Applications for the community investments (CI) stream were considered for approval separately under the first and third funding rounds (see Table 3.1). The ministerial panel met twice in August 2017 to select the CI applications for funding under the first round. In accordance with the published guidelines, this was after receiving the department’s funding recommendations and complete briefing package.

4.20 By the time Infrastructure had provided the chair with the equivalent CI briefing for round three on 15 February 2019, the panel members through their offices had already agreed to approve the department’s forthcoming recommendations for the CI stream in full. The ANAO’s analysis was that the rest of the panel members were not provided with this advice from Infrastructure, and therefore it could not have been considered by them before the associated approvals were provided.

4.21 In the remaining rounds, consolidated advice was provided for the two streams ahead of the panel’s decision-making processes. Similar to the approach adopted in round three, the panel approved the department’s CI stream recommendations with little to no changes.

Were appropriate records made of the decision-making process?

Appropriate records were not made of the decision-making process. Record keeping for the decision-making process, which fell short of good practice in the first round, deteriorated significantly in later rounds. Infrastructure performed a secretariat function for the panel in the first round and, although secretariat services are readily able to be provided by DISR’s grants hub, the hub has not been engaged to provide administrative support for the panel.

4.22 In addition to enhancing accountability, which is one of the key principles of grants administration, adequate record-keeping practices are essential for demonstrating compliance with the grants administration framework and showing that due process has been followed. This is particularly important for an open and competitive merit-based program.

Receipt of departmental recommendations

4.23 A key feature of the decision-making process in each round was the meeting (or meetings) at which the panel planned to consider and select applications for funding. Consistent with the CGRGs, the department provided its funding recommendations to the ministerial panel in a written briefing prior to these scheduled meetings for each round. Table 4.1 summarises the role played by Infrastructure in the decision-making process for each round.

4.24 Infrastructure provided secretariat support to the panel during the first funding round. This included the department providing a complete briefing package for each panel member.88 From round two, the chairs through their offices were responsible for providing this support. In addition to creating records during meetings, this involved distributing the department’s funding recommendations and any other materials required by the panel to inform its decision-making.

Table 4.1: Departmental attendance at panel meetings for the first five funding rounds

| Round | Was a meeting held? | Was secretariat support provided? | Were departmental officials in attendance? |
|-------|--------------------|-----------------------------------|--------------------------------------------|
| 1 | ✔ | ✔ | ✔ |
| 2 | ✔ | ✘ | ✘ |
| 3 | ✔ | ✘ | ✘ |
| 4 | ✔ᵃ | ✘ | ✘ᵇ |
| 5 | ✔ᵃ | ✘ | ✘ᵇ |

Note a: Meetings were held virtually in line with public health directions associated with the COVID-19 pandemic.

Note b: Departmental attendance was limited to the first 15 minutes of the meeting to provide a general overview of the process and guidelines and to answer any panel member questions.

Source: ANAO analysis of departmental records.

4.25 Departmental advice has an important role in the context of the mandatory requirements of the CGRGs (see paragraph 3.3). The ANAO’s analysis was that after the first round, the broader ministerial panel membership was not provided with complete sets of the department’s recommendations. The CGRGs require that this advice must be received in writing by ministers before funding decisions are made.89 The materials distributed to panel members by the chair through his or her office for each round are specified in Appendix 7.

Records of the decision-making process

4.26 In October 2021, the Australian Government agreed to recommendations made by the Joint Committee of Public Accounts and Audit (JCPAA) aimed at improving the Commonwealth grants administration framework.90 This included a recommendation to the Department of Finance to review the official record-keeping requirements of the CGRGs with a view to:

  • addressing probity issues, including a requirement for all parties involved in grant administration to disclose and record any conflicts of interest91; and
  • ensuring records are kept of the reasoning for decisions of a relevant minister(s) to approve or reject grant applications and recommendations, including ministerial panels.92

4.27 For round one, a file note for each funding stream was created in July 2017 to document the outcomes of the decision-making processes for the first round. This was around a month after the panel approved the IP stream applications and four weeks prior to the CI stream decisions in August 2017. The outcomes from the CI stream were not subsequently recorded.93

4.28 These records, along with the notes taken during the deliberations by the Infrastructure officials in attendance, provided little insight into why the panel disagreed with the department’s recommendation to fund 18 projects (seeking a total of $79.7 million) and instead decided to fund 43 others94 (for $65.4 million) that had not been recommended.

4.29 From round two onwards, secretariat support was not provided by Infrastructure during panel meetings, and departmental officials did not attend those meetings.95 While departmental documents suggested that the chair’s office would provide secretariat support, there was no corresponding documentation to support this. The ANAO received one response to its requests for further information in this respect (see footnote 19). Therefore, the ANAO’s analysis of the decision-making processes is limited to email records from departmental IT systems (see paragraphs 1.16 and 1.17).

4.30 Indicating continuing merit in the JCPAA’s recommendations (see paragraph 4.26), the absence of records across all funding rounds has meant that the ANAO was unable to confirm that panel members had recused themselves from discussions when applications from their electorates were being considered (which was the process the panel had agreed to follow for the fifth round).

4.31 Secretariat support is one of the service offerings available to agencies when engaging grants hub services through DISR. In April 2022, DISR advised the ANAO that 166 of 405, or 41 per cent, of grant opportunities delivered by its Business Grant Hub have a committee with associated secretariat support.

4.32 In its funding advice for the fourth and fifth rounds in April 2020 and August 2021, Infrastructure advised the panel that:

the ANAO has previously raised concerns with the Joint Committee on Public Accounts and Audit about ministerial panels not keeping records of meetings or inadequate records; and not minuting how conflicts of interest were managed by each ministerial panel.

4.33 In response to ANAO questions in February 2022 as to whether the department has revisited with the chair of the panel the need for a secretariat function, Infrastructure advised the ANAO that ‘the chair of the ministerial panel has changed multiple times since the BBRF program was established. Outside of the first round of the program, the department has not proposed or been asked to provide secretariat support for the ministerial panel.’

4.34 Where there is misalignment between the results of merit assessment processes and the applications approved for funding, a line of sight between the divergent results can be achieved through adequate record-keeping. The inadequate records of decision-making for the award of BBRF funding have been detrimental to accountability and have eroded transparency.

Was the basis for the funding decisions recorded including by reference to the grant opportunity guidelines and the principle of achieving value with relevant money?

Out of the 1293 projects approved across the five concluded rounds, there were 179 instances where the bases for the panel’s funding decisions were not recorded. This comprised:

  • 164 instances where the panel decided not to approve applications that had been recommended by Infrastructure for funding (which had been assessed as having a high degree of merit against the four published criteria), without recording the reasons for not approving those applications. This contrasted with the approach taken for 40 other recommended applications where the panel’s reasons for not approving were recorded; and
  • 15 instances (10 missing from the 76 that should have been recorded in the second round, and 5 from 173 in the fifth round) where projects not recommended by the department were funded.

In addition, and particularly in more recent rounds, while there has been some record made of why applications assessed as less meritorious were approved for funding, the basis recorded did not adequately explain why applications assessed as more meritorious in terms of the published criteria have been overlooked in favour of those other applications with lower merit scores.

4.35 The CGRGs require decision-makers to record the basis for their approvals, relative to the grant opportunity guidelines and the principle of achieving value with relevant money. Where ministers agree with the department’s written recommendations, those recommendations provide the basis for approvals and can constitute the reasonable inquiries required under the Public Governance, Performance and Accountability Act 2013 (PGPA Act).96

Recorded basis for decisions

4.36 Infrastructure has consistently recommended that its ministers, as chairs of the panel, populate and sign the recommendation review template provided in each of the five concluded rounds. The approach taken to recording the basis for decisions changed over time, in line with amendments implemented by Infrastructure (see paragraph 4.46). The following sets out the ANAO’s analysis of the recorded bases for decisions.

  • For rounds one and two, although there was a focus on identifying where applicants’ claims against the merit criteria had been under or overstated, the scores were not amended.97 Reasons were documented:
    • in round one, for 18 projects that were recommended by Infrastructure but not approved, and 43 projects98 that were approved but not recommended; and
    • in round two, for 66 projects that were approved by the panel but had not been recommended for funding by Infrastructure.99 There were 60 projects without a recorded basis that were recommended but not approved.
  • For rounds three and four the reasons documented did not refer to panel disagreements with the merit assessment scores. Rather, they referred to the ‘other factors’ set out in the published guidelines as the basis for the funding decisions. Separate templates for individual projects were not populated. Blanket statements relating in whole or part to the other factors were recorded as the basis for decisions for groups of projects not recommended for funding by Infrastructure.100 Reasons were documented:
    • in round three, for the 112 projects not recommended by Infrastructure but approved for funding. There were 88 projects without a recorded basis that were recommended by Infrastructure but not approved; and
    • in round four, for the 49 projects not recommended by Infrastructure but approved for funding. There were 16 projects without a recorded basis that were recommended by Infrastructure but not approved.
  • For round five the recommendation review templates were not used. The basis for the approval of projects where the panel’s decisions differed from the merit assessment results was documented in a spreadsheet.101 Similar to the approach taken for rounds three and four, blanket statements referencing the ‘other factors’ were attributed to groups of projects.102 Reasons were documented for:
    • all 22 projects that were highly scored by the department but not approved; and
    • 168 out of 173 projects that were not highly scored by the department but approved.

4.37 The implications of Infrastructure’s changed approach to funding recommendations in the fifth round are discussed further at paragraphs 3.27 to 3.32, 5.4 and 5.8.

Adequacy of the written basis for funding decisions

4.38 The adequacy of reasons provided as the basis for funding decisions has progressively declined. The reasons have not represented robust justifications for elevating certain applications above others when compared with the ranking of applications based on their assessed merit against the four published criteria.

4.39 For round two, results of the ANAO’s analysis were at odds with the reasons recorded in recommendation review templates as to why the panel chose to fund particular projects over others. Specifically, the reasons recorded in the 66 completed recommendation review templates predominantly pointed to the panel considering those applications should have been scored more highly against assessment criteria. This did not accord with the primary reason evident from the ANAO’s review of the program records, which indicated that deviations between the funding decisions and departmental recommendations were aimed at managing the electoral distributions of projects, including whether the approval of projects could be defended under scrutiny.103

4.40 By way of example of the disconnect between the scores achieved against the published merit criteria and the award of funding, in round four, there were 24 eligible applications submitted from the Calare electorate under the IP stream. Of these, there were:

  • five recommended by Infrastructure, of which one, despite having the second highest score (30) of applications in the electorate, was not approved;
  • seven projects assessed as value with relevant money, but not recommended for funding by Infrastructure, with scores between 24 and 28. Of these, the two with the lowest scores of 24 were funded, whereas two other applications with scores of 25 and 26 were not approved; and
  • ten assessed as not value with relevant money (none were funded).

4.41 The reasons provided, which were in the category of either stating that the project was located in a drought-affected area, or that the regional impact of projects was a consideration, provided little insight as to why projects with lower scores were funded over those with higher scores that had been recommended by the department. The increasing disconnect between Infrastructure’s recommendations and the final funding outcomes over the life of the program is further examined from paragraph 5.6.

Were processes in place for decision-makers to re-score grant applications in circumstances where they disagreed with the scoring presented by the department?

Processes were not in place for decision-makers to re-score applications where they disagreed with the departmental merit assessments. In a departure from previously observed better practice, Infrastructure advised the minister in the later stages of the first round that it was unnecessary to rescore applications because it was not required by the CGRGs. Accordingly, where the panel indicated disagreement with departmental scoring in the first two rounds, it did not revise scores. By the third round, the panel’s written basis for funding decisions made no reference to the merit assessment results, instead focusing on the ‘other factors’ mentioned in the guidelines for the program.

4.42 The grants administration framework recognises that different conclusions can be drawn by assessors and decision-makers. In these instances, transparency and accountability is enhanced where decision-makers accurately document the reasons for not following departmental advice.

4.43 The ANAO’s December 2016 audit of the predecessor program (the National Stronger Regions Fund, or NSRF)104 noted improvements in Infrastructure’s approach towards recording the basis for funding decisions and highlighted it as an example of better practice.105 Departures from this approach were observed in the November 2019 audit of the Regional Jobs and Investment Packages (RJIP), leading to an ANAO recommendation aimed at Infrastructure embedding the better practice observed in the NSRF for the re-scoring of applications where decision-makers conclude the assessment results are incorrect.

4.44 Based on the template used for the NSRF, the briefing packages for each of the five concluded rounds of the BBRF included a ‘departmental recommendation review template’ for the panel to readily document the basis for its funding decisions where they differed from Infrastructure’s recommendations.

4.45 The template comprised two sections in which the panel could document the details of its disagreement with Infrastructure’s recommendations:

  • ‘Section A’ was for recording where the panel disagreed with the departmental assessment against the merit criteria; and
  • ‘Section B’ was for recording where the panel’s decision varied from the department’s recommendation due to ‘factors other than the assessment criteria’.

4.46 Over time, Infrastructure moved away from the better practice approach exhibited under the NSRF. The corresponding impacts on the quality and accuracy of the decision documentation are outlined at paragraph 4.38, and summarised below.

  • Changes in early 2019 to the wording of the instructions for when to use the review template to document the basis for the panel’s funding decisions. Specifically:
    • for the first two rounds, the chair was asked to populate a template for each project by recording the basis for the panel’s decisions in respect to projects that were: recommended by Infrastructure, but not approved by the panel; and approved by the panel, but were not recommended for funding by the department; whereas
    • from round three onwards, the chair was asked to only record the basis for the panel’s decisions where the panel decided to approve projects that had not been recommended by Infrastructure.
  • Deleting sections from the recommendation review template during the first and third funding rounds. Specifically:
    • after advising the chair through his office in June 2017 that it was not necessary for applications to be rescored where the panel disagreed with the departmental merit assessment (because it is not a requirement of the CGRGs), Infrastructure deleted the part of Section A that was for documenting the rescoring of applications before it returned the 45 populated templates (for the IP applications) to the chair through his office for signing; and
    • in February 2019, Infrastructure deleted the rest of Section A (which was for recording where the panel disagreed with the departmental assessment against the merit criteria) before providing it as part of the round three briefing materials, removing the mechanism and reference point for documenting a basis for decisions where the panel disagreed with assessments against merit criteria.

4.47 Notably, in its response to a related recommendation in the ANAO audit of the Regional Jobs and Investment Packages, Infrastructure indicated in late 2019 that new assurance arrangements over DISR’s assessments of applications were likely to minimise the need to re-score applications in the future.

4.48 Noting the increasing disconnect between Infrastructure’s recommendations and the final funding outcomes over the life of the BBRF, the ANAO sought advice from Infrastructure in February 2022 as to whether it had received any more recent feedback from the panel regarding issues with the merit assessment scoring. In its March 2022 response, the department advised the ANAO that it:

has not received feedback from the Ministerial Panel about there being problems with the Merit Assessments provided to them, such that the Ministerial Panel is selecting different projects in line with their discretionary powers outlined in the Published Grant Opportunity Guidelines.

Recommendation no.4

4.49 The Department of Infrastructure, Transport, Regional Development, Communications and the Arts:

  • put appropriate processes in place to ensure that, where more than one minister (such as a ministerial panel) performs the role of grants decision-maker, its written advice on the merits of proposed grants is provided to all panel members prior to funding decisions being taken; and
  • improve record-keeping practices so that the basis for decisions is clear, including in circumstances where the decision-maker has not agreed with the assessment of candidates undertaken by officials.

Department of Infrastructure, Transport, Regional Development and Communications response: Agreed.

4.50 The Department of Infrastructure, Transport, Regional Development and Communications (the Department):

  • provided appropriate, fulsome advice to Government for each round of the BBRF, through the Chair of the Ministerial Panel, including recommendations, applications in ranked order of highest merit score, detailed merit assessments, probity requirements and obligations on Ministers under the CGRGs. Where panels are used in future grant programs, the Department will recommend options to provide further practical support; for example, the provision of secretariat services and options for transmitting electronic or hard copies of the full briefing materials directly to all panel members.
  • is committed to quality record-keeping practices to support transparency and accountability in decision-making. The Department will continue to advise Ministers of their obligations under the CGRGs to record the basis for decisions and to offer secretariat support to facilitate this. The Department has also invested in the acquisition of an improved IT solution for records management and commenced a change management program to support staged implementation of the new record keeping arrangements across the Department.

5. Award of funding

Areas examined

The ANAO examined whether the award of funding under the first five rounds of the Building Better Regions Fund (BBRF) was consistent with the published grant opportunity guidelines.

Conclusion

Overall, the award of funding was partly consistent with the grant opportunity guidelines. Consistent with the program design, the funding has been directed to projects in rural and regional areas. As the program has progressed through the first five rounds, there has been an increasing disconnect between the assessment results against the published merit criteria and the applications approved for funding under the infrastructure projects stream (which comprised the majority of approved projects and funding). This reflects the extent to which the ministerial panel has increasingly relied upon the ‘other factors’ outlined in the published program guidelines when making funding decisions.

Areas for improvement

The ANAO made one recommendation to amend the Commonwealth Grants Rules and Guidelines (CGRGs) to improve the requirements relating to advice about the award of grant funding and consequential reporting to the Finance Minister.

5.1 Under the CGRGs, achieving Australian Government policy outcomes through the award of grant funding is a key consideration. In the context of an open and competitive grants program, applications that are most likely to achieve those outcomes are, in a well-designed and administered program, those that are assessed as the most meritorious against the published merit criteria. Departmental recommendations that are consistent with these results promote the proper use and management of public resources in a transparent and equitable manner.106

5.2 When approving projects for funding, it is open to decision-makers to arrive at a different view to the departmental assessment against the published criteria. Where ministers performing the role of approver decide to approve projects that had been recommended for rejection, each such approval must be included in an annual report to the Finance Minister by 31 March of each year.

What was the relationship between the population of applications funded and those that had been assessed as eligible and the most meritorious against the published assessment criteria?

Projects approved for funding under the community investments (CI) stream (grant funding totalling $26.3 million) have consistently been those assessed as being the most meritorious against the published assessment criteria. For projects approved under the infrastructure projects (IP) stream (involving grant funding of $1.1 billion, or 98 per cent of total program funding awarded), 65 per cent of IP stream applications approved for funding were not those assessed as being the most meritorious in the assessment process. Under the first round, 75 per cent of IP applications funded had been scored highly (and recommended by Infrastructure for funding). For subsequent rounds, highly scored IP applications were approved at a lesser rate of between 13 and 55 per cent.

5.3 The briefing packages provided by the Department of Infrastructure, Transport, Regional Development, Communications and the Arts (Infrastructure) for each round of the BBRF have included comprehensive details on the merits of eligible applications. While clear funding recommendations were not provided in rounds three and five (see paragraph 3.31), sufficient detail (assessment scores, an order of merit and application assessment summaries) was provided to enable the ministerial panel to discern which applications had been assessed as the most meritorious against the published assessment criteria.

5.4 As part of its funding recommendations for each funding stream across the five concluded rounds, Infrastructure advised the panel of the benchmark score applications needed to have achieved to be considered ‘high scoring projects’. With the exception of round five107, the department’s funding recommendations involved recommending that the panel approve the applications that had met or exceeded these ‘high scoring’ benchmarks.108

5.5 Appendices 8 and 9 illustrate the benchmark scores that were required across each stream and round for applications to be considered highly scored, and in most cases, recommended for funding by Infrastructure.

Approval rates of meritorious applications

5.6 The program was designed to allow projects of similar size to compete against each other.109 Accordingly, the benchmark score determined for projects under the CI stream to be recommended for funding under the first four rounds was between three and 13 points lower than that required for the IP stream. The score required for projects to be recommended for funding under round five was the same for both streams (24 out of 40).

5.7 There was a strong alignment between merit assessment results and funding outcomes in the CI stream, with 97 per cent of the CI applications funded over the five concluded rounds being those that were highly scored and recommended for funding by the department. This was not the case for projects approved under the IP stream. The ANAO’s analysis of the approval rates for highly scored projects under the IP stream shows a progressive decline in the alignment between the merit assessment results and the applications approved for funding. Collectively across the funding rounds, highly scored IP applications were approved at a significantly lower rate (35 per cent) than the CI applications.

Table 5.1: Approval rates for applications that achieved the benchmark score required to be considered a ‘high scoring’ application under each funding stream

 

 

| Funding stream | Total | IP (≥ benchmark) | CI (≥ benchmark) | Both (≥ benchmark) | IP (< benchmark) | CI (< benchmark) | Both (< benchmark) |
|----------------|-------|------------------|------------------|--------------------|------------------|------------------|--------------------|
| Approved applications | 1293 | 248 (35%) | 559 (97%) | 807 (62%) | 470 (65%) | 16 (3%) | 486 (38%) |
| Not approved applications | 3027 | 151 (7%) | 12 (1%) | 163 (5%) | 2029 (93%) | 835 (99%) | 2864 (95%) |

Note: Where the aggregate value of Infrastructure’s funding recommendations exceeded the total funding available for the round, the ANAO’s analysis is based on the highest scoring applications with an aggregate value closest to the total funding awarded for that round. The benchmark scores required by Infrastructure for each stream and round for applications to be considered highly scored are outlined in Appendices 8 and 9.

Source: ANAO analysis of departmental records.
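The percentages in Table 5.1 can be reproduced from the application counts in that table. The following is an illustrative sketch (not part of the ANAO’s methodology); the counts are transcribed from the table, and each percentage is assumed to be the share of applications within a stream and outcome that met (or did not meet) the benchmark score:

```python
# Application counts from Table 5.1, split by funding stream (IP, CI) and by
# whether the application met the 'high scoring' benchmark for its stream/round.
counts = {
    "approved":     {"at_or_above": {"IP": 248, "CI": 559}, "below": {"IP": 470, "CI": 16}},
    "not_approved": {"at_or_above": {"IP": 151, "CI": 12},  "below": {"IP": 2029, "CI": 835}},
}

def share_at_benchmark(outcome: str, stream: str) -> int:
    """Percentage of applications with the given outcome, in the given stream,
    that met or exceeded the benchmark score (rounded to a whole percent)."""
    hi = counts[outcome]["at_or_above"][stream]
    lo = counts[outcome]["below"][stream]
    return round(100 * hi / (hi + lo))

print(share_at_benchmark("approved", "IP"))      # 35 (so 65% of approved IP projects were not highly scored)
print(share_at_benchmark("approved", "CI"))      # 97
print(share_at_benchmark("not_approved", "IP"))  # 7
```

The complement of the first figure (65 per cent) is the headline statistic quoted in the audit snapshot for the IP stream.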

5.8 Round one had the greatest alignment between the merit assessment results and funding approvals for the IP stream at 75 per cent, with the third round representing the least alignment with highly scored applications approved at a rate of 13 per cent. Under the fifth round, there were 139 projects with scores below the benchmark score for highly scored applications that were approved by the panel for a total of $128 million. This meant that there were 53 projects seeking a total of $167 million with higher scores that were not approved for funding. Figure 5.1, together with Table 5.2, illustrates the funding outcomes for applications under the IP stream across the five concluded rounds.

Figure 5.1: Applications approved per round under the infrastructure projects stream

 

Source: ANAO analysis of departmental records.

Table 5.2: Approval rates for applications under the infrastructure projects stream

| Round | # Approved | Proportion of approved that were recommended for funding | Proportion of approved that were not recommended for funding | # Not approved | Proportion of not approved that were recommended for funding | Proportion of not approved that were not recommended for funding |
|-------|-----------|------|------|-----|------|------|
| 1 | 110 | 75% | 25% | 435 | 4% | 96% |
| 2 | 136 | 44% | 56% | 431 | 11% | 89% |
| 3ᵃ | 166 | 13% | 87% | 447 | 6% | 94% |
| 4 | 109 | 55% | 45% | 265 | 6% | 94% |
| 5ᵇ | 197 | 25% | 75% | 602 | 7% | 93% |

Note a: Rather than the IP applications that were recommended by Infrastructure for funding, the ANAO’s analysis for round three was based on the 48 highest scoring applications (that were seeking grants of $201 million). This is because Infrastructure recommended that the panel choose projects for funding from a pool of 142 IP applications that were seeking an aggregate of $419 million, notwithstanding that there was only $199 million available to be awarded under the IP stream (see paragraphs 3.31 to 3.33).

Note b: Rather than the IP applications that were recommended by Infrastructure for funding, the ANAO’s analysis for round five was based on the 95 highest scoring applications (that were seeking grants of $302 million). This is because Infrastructure recommended 427 IP applications for funding that were seeking grants of $755 million, which was substantially more than the $200 million the program guidelines had said was available to be awarded and the $294 million subsequently awarded under the IP stream.

Source: ANAO analysis of departmental records.
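As an illustrative consistency check (not part of the ANAO’s analysis), the per-round IP stream counts in Table 5.2 sum to the IP stream totals implied by Table 5.1:

```python
# Per-round IP stream counts transcribed from Table 5.2 (rounds 1 to 5).
approved_per_round = [110, 136, 166, 109, 197]
not_approved_per_round = [435, 431, 447, 265, 602]

# IP stream totals implied by Table 5.1 (highly scored + below benchmark).
ip_approved_total = 248 + 470        # 718 approved IP applications
ip_not_approved_total = 151 + 2029   # 2180 unsuccessful IP applications

assert sum(approved_per_round) == ip_approved_total
assert sum(not_approved_per_round) == ip_not_approved_total
print(sum(approved_per_round), sum(not_approved_per_round))  # 718 2180
```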

5.9 All applications approved under the CI stream since the third round have been those assessed as eligible and the most meritorious against the published assessment criteria. In contrast with the IP stream, and apart from the first round, there have been no instances in the CI stream where the panel either did not approve a highly scored application or approved an application that was not highly scored. Under the first funding round, there were 16 projects approved that were assessed as not value with relevant money, for a total value of $747,061; and 12 projects recommended for funding by the department, seeking a total of $1.4 million, that were not approved by the panel.110 Figure 5.2 illustrates the funding outcomes for applications under the CI stream across the five concluded rounds.

Figure 5.2: Applications approved per round under the community investments stream

 

Source: ANAO analysis of departmental records.

5.10 As outlined at paragraph 4.48, Infrastructure advised the ANAO in March 2022 that the increasing disconnect between the results of the merit assessment process and the final funding outcomes for the IP stream was the result of the ministerial panel ‘selecting different projects in line with their discretionary powers outlined in the published grant opportunity guidelines.’ This advice did not address why there has been a strong level of alignment between the results of the merit assessments and funding outcomes for the CI stream.

5.11 The impact of the approach taken has been that the program is not being delivered as an open competitive grant opportunity where eligible applications are competing in terms of their assessed merit against the four published criteria. Rather, the ANAO’s analysis is outlined below.

  • There have been two levels of eligibility:
    • the first being the application of the published eligibility requirements to identify which applications would proceed to be assessed against the four published merit criteria; and
    • the second involved those applications achieving a score of 60 per cent or greater against the published merit criteria, which formed the pool from which the ministerial panel selected applications.
  • Decisions about which of those applications in the pool were to be awarded funding were made on the basis of the ‘other factors’ set out in the program guidelines.

Was the approval of any not recommended applications reported as required?

Across the five rounds, the approach for reporting the panel’s approval of applications that were not recommended for funding falls short of the intent of the CGRGs. This has been the result of Infrastructure changing its approach to making funding recommendations. Certain types of decisions under some rounds have been reported to the Finance Minister, with similar decisions in other rounds not reported. In addition, Infrastructure’s approach to categorising applications into ‘pools’, in conjunction with alternatively worded funding recommendations, has resulted in categories of applications where there is no clear recommendation and, as a result and based on Infrastructure’s interpretation of the CGRGs, no requirement for applications approved in those pools to be reported to the Finance Minister.

5.12 Where a minister approves a grant that the relevant agency had recommended be rejected, they must report each instance annually to the Finance Minister. The report must include a brief statement of reasons summarising the basis for the approval and be provided to the Finance Minister by 31 March of each year for the preceding calendar year. Together with clear and consistent departmental advice, these reporting requirements promote transparency and accountability in the grants administration process and were introduced due to the ‘sensitivity’ of overturned funding decisions.

Infrastructure’s funding recommendations

5.13 The wording and structure of the department’s funding recommendations have changed over the life of the program. A key feature in each round has been the categorising, or ‘pooling’, of applications based on the merit assessment results (see paragraph 3.25). Blanket advice and recommendations have then been attributed to all of the applications within each category in the written briefings for the panel.

5.14 With the exception of round five, applications colour-coded in green in the briefing attachments (‘value with relevant money – recommended’) have been those recommended to the panel for funding. The department’s advice relating to the other application categories has been less consistent across rounds. Appendix 10 provides an overview of the advice attributed to each of the following: ineligible applications; those assessed as not satisfactorily meeting one or more of the merit criteria (‘not value with relevant money’); and those that met the merit criteria but were not ranked high enough to be accommodated within the available quantum of grant funding (‘value with relevant money’).

5.15 Key observations about the department’s approach are outlined below.

  • Since round one, the department has not recommended that any category of applications be ‘rejected’, nor has it provided any advice in relation to ineligible applications.
  • The approach to providing recommendations for the ‘value with relevant money’ but lower scored (below the high scoring threshold) category of applications has been the most inconsistent across the funding rounds, with:
    • no recommendations provided for these projects under the first two rounds (98 related approvals were not reported to the Finance Minister);
    • advice that these projects were ‘not recommended’ for funding under the third111 and fourth rounds (161 related approvals were reported to the Finance Minister); and
    • under the fifth round, Infrastructure advised the panel that all 528 applications (seeking $761 million) in this category were ‘suitable for selection’ and recommended that the panel approve projects from that list up to the funding available (none of the 298 approvals have been reported to the Finance Minister).

5.16 The approach to making funding recommendations in the third and fifth BBRF rounds was similar to the approach employed by Infrastructure in a predecessor program audited by the ANAO in 2010–11 (see paragraph 2.42). In November 2021, the ANAO sought advice from Infrastructure as to why it had adopted an approach to making funding recommendations for the IP stream in rounds three and five such that a pool of recommended applications was presented to the panel that significantly exceeded the amount of funding available under the rounds. Infrastructure advised the ANAO in November 2021 that, rather than providing the panel with applications in separate colour-coded lists to indicate where the funding allocation would be exhausted112, it provided one list of all applications that had met or exceeded the minimum score requirements to be classified as ‘value with relevant money’ and inserted an orange line on the list to delineate the point at which available funding would be exhausted. The department further advised that:

This approach ensured the Panel was aware of which applications fell within the available funding envelope on a strictly merit score-based approach, but also recognised that the published program guidelines explicitly allow the Ministerial Panel to consider other factors when deciding which projects to fund, and these factors are set out in the guidelines.

In the Department’s view, the presentation of projects in this way better enabled decision-makers to undertake their role as the Guidelines envisage, i.e. to consider the Department’s and the Hub’s initial rating and apply additional discretionary factors in relation to applications assessed as eligible and value for money.

5.17 The wording of the round five briefing meant that there would be no circumstance in which any of the 427 IP applications seeking $755.3 million in grant funding that had been assessed as representing value with relevant money would need to be reported to the Finance Minister as an approval of an application the department had recommended be rejected.

5.18 Illustrating the extent to which this has reduced transparency, if the approach adopted for the fifth round had been modelled on the approach from the previous round (for example, by not recommending more applications for funding than could be approved within the available funding113), there would have been 53 IP applications (seeking $169.0 million) and 42 IP tourism applications (seeking $132.5 million) that would have been ‘recommended for funding’. In the context of the 197 IP applications (totalling $294 million) approved by the panel under the fifth round:

  • 30 of those 53 IP applications were awarded grant funding totalling $95.0 million.114 In addition to those, there were 90 lower scored and ranked applications approved for funding (totalling $73.8 million). Using the approach adopted under rounds three and four, those 90 applications would have been reported to the Finance Minister; and
  • 20 of those 42 IP tourism applications were awarded funding totalling $53.5 million.115 In addition to those, 57 lower scored and ranked applications were awarded funding (totalling $71.6 million). Had the department not changed its approach from the previous round, those 57 applications would have been reported to the Finance Minister.

5.19 Similar to the principles outlined in the CGRGs, the recent review of grants administration in the New South Wales Government mentioned at paragraph 1.6 concluded that, ‘to ensure accountability and transparency in grants administration, there must be a clear delineation of what officials recommended and what decisions the minister made, and the reasons for those decisions.’116 The grants administration guide developed as part of that review, while based in part on the CGRGs, includes stronger requirements than the CGRGs on the recording of the basis for funding decisions. These include a requirement that the basis for any decisions departing from the recommendations of the assessment team be recorded. By contrast, the CGRGs do not require that a record be made of the basis for decisions not to award funding to recommended applications, or that the written advice to decision-makers recommend the rejection of those applications that were not recommended.

Recommendation no.5

5.20 The Australian Government amend the Commonwealth Grants Rules and Guidelines to require that:

  • when advising on the award of grant funding, officials recommend that the decision-maker reject all applications not supported for the award of a grant within the available funding envelope; and
  • the basis for any decisions to not approve applications that were recommended for funding be recorded.

Department of Finance response: Noted.

5.21 Any amendment to the Commonwealth Grants Rules and Guidelines is a matter for consideration by Government. The Department of Finance will brief the Government on the ANAO’s findings and recommendations.

What was the distributional outcome of the applications received, assessed, recommended and approved for funding?

Consistent with the objectives for the BBRF, almost $1.08 billion (94 per cent) of the funding awarded under the first five rounds has been directed towards projects in rural (74 per cent or $850 million) and provincial (20 per cent or $227 million) regions. Compared with the results of the merit assessment process, the award of funding favoured electorates held by The Nationals as those electorates were awarded $104 million (or 29 per cent) more funding in total across the five rounds than would have been the case had the results of the merit assessment process been relied upon. Electorates held by all other parties were awarded less grant funding than would have been the case had the merit assessment results been relied upon.

5.22 A recurring theme in ANAO audits of grants administration has been the importance of granting activities being implemented in a manner that accords with published program guidelines. Similarly, the grants administration framework was developed based, in part, on recognition that potential applicants and other stakeholders expect grant funding decisions to be made in a manner, and on a basis, consistent with the published program guidelines.117 Reflecting their importance, the guidelines for each program set out the Australian Government policy outcomes that the proposed grants are intended to achieve.

Targeting regional projects

5.23 A key change in the policy objectives for the BBRF in comparison to the predecessor program, the National Stronger Regions Fund (NSRF), was that funding was to be directed to projects outside of major capital cities. Consistent with this, almost $1.08 billion (94 per cent) of the $1.15 billion approved under the first five rounds has been awarded to projects located in regional areas, with $48.5 million (4.2 per cent) awarded to outer metropolitan projects and $25.3 million (2.2 per cent) to inner metropolitan projects. A breakdown of the proportion of funding awarded for each round by demographic profile is provided in Table 5.3.

Table 5.3: Proportion of funding directed to regional areas by funding round

| Demographic profile | Round 1 | Round 2 | Round 3 | Round 4 | Round 5 | Total |
|---|---|---|---|---|---|---|
| Rural | 66.0% | 74.7% | 71.7% | 84.4% | 73.5% | 73.9% |
| Provincial | 26.8% | 14.1% | 22.7% | 14.8% | 19.6% | 19.7% |
| Outer Metropolitan | 6.0% | 7.2% | 5.2% | 0.7% | 2.5% | 4.2% |
| Inner Metropolitan | 1.3% | 4.0% | 0.3% | 0.0% | 4.4% | 2.2% |
| Total funding ($m) | $226.4 | $212.5 | $204.3 | $207.1 | $300.1 | $1,150.5 |

Source: ANAO analysis of departmental records and Australian Electoral Commission (AEC) data.

5.24 ANAO analysis of the geographic distribution of funding found that the aggregate value of projects funded for each demographic profile was generally consistent with both the populations of eligible applications and those recommended for funding (or ‘highly scored’) by the department. Key observations for individual funding rounds were that:

  • with the exception of the first round, the proportion of funding awarded to rural applications was between three and 16 per cent lower than what would have been awarded based on the department’s recommendations; and
  • the proportion of funding awarded to provincial areas (generally larger regional centres) was between four and 11 per cent higher than the department’s recommendations under the first four rounds.

Electoral distribution of funding

5.25 The ANAO examined the amount of funding awarded to each electorate under each round. The award of funding in a way that did not rely upon the results of the merit assessment process saw a redistribution of grant funding towards electorates held by The Nationals (see Table 5.4). Specifically, applications located in electorates held by The Nationals were awarded $104 million (or 29 per cent) more grant funding than would have been the case had funding been awarded to those applications assessed as the most meritorious in each round. Applications located in electorates held by all other political parties were awarded less grant funding than would have been the case had funding been awarded based on the results of the merit assessment process. The most significant reductions were to electorates held by the Liberal Party ($73.5 million less grant funding awarded) and the Australian Labor Party (ALP) ($26.1 million less grant funding awarded). In terms of the five rounds:

  • increased funding compared with the results of the merit assessment process was awarded to electorates held by The Nationals in four of the five rounds (an average increase of $26.2 million for each of those four rounds) with round three being the only round where electorates held by The Nationals were awarded less grant funding than would have been the case had the merit assessment results been relied upon (by a total of $0.9 million in that round);
  • conversely, round three was the only round where electorates held by the Liberal Party were awarded more grant funding than would have been the case had the merit assessment results been relied upon (by a total of $8.6 million in that round). The average reduction across the other four rounds for the Liberal Party compared with the merit assessment results was $20.5 million; and
  • round two was the only round where electorates held by the ALP were awarded more grant funding than would have been the case had the merit assessment results been relied upon (by a total of $3.6 million in that round). The average reduction across the other four rounds for the ALP compared with the merit assessment results was $7.4 million.
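The counterfactual underlying this comparison (and Table 5.4) can be sketched as follows. This is an illustrative simplification, not the ANAO’s actual analysis code: it greedily selects the highest-scoring applications until the round’s funding is exhausted, then totals awards by the party holding each application’s electorate. All field names and figures are hypothetical.

```python
# Sketch of a merit-based counterfactual allocation: fund the highest
# scoring applications within the available envelope, then aggregate
# the awarded amounts by the party holding each electorate.

from collections import defaultdict

def merit_based_allocation(applications, funding_available):
    """Select the highest-scoring applications whose aggregate value
    fits within the available funding (greedy, by merit rank)."""
    selected, remaining = [], funding_available
    for app in sorted(applications, key=lambda a: a["merit_score"], reverse=True):
        if app["amount"] <= remaining:
            selected.append(app)
            remaining -= app["amount"]
    return selected

def funding_by_party(applications):
    """Total awarded amounts ($m) by the party holding the electorate."""
    totals = defaultdict(float)
    for app in applications:
        totals[app["party"]] += app["amount"]
    return dict(totals)

apps = [
    {"merit_score": 0.9, "amount": 4.0, "party": "A"},
    {"merit_score": 0.8, "amount": 3.0, "party": "B"},
    {"merit_score": 0.7, "amount": 2.0, "party": "A"},
]
counterfactual = merit_based_allocation(apps, funding_available=7.0)
print(funding_by_party(counterfactual))  # {'A': 4.0, 'B': 3.0}
```

Comparing the party totals from such a counterfactual against the party totals of the funding actually awarded yields the redistribution figures reported above.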

Table 5.4: Distribution of BBRF funding by demographic profile and political party

 

Number of seats held by party

| Demographic profile | The Nationals | Liberal | Australian Labor Party | Katter’s Australian Party | Independent | Centre Alliance | The Greens | Total |
|---|---|---|---|---|---|---|---|---|
| Inner Metro | 0 (0.0%) | 17 (11.3%) | 25 (17%) | 0 (0.0%) | 2 (1.3%) | 0 (0.0%) | 1 (0.7%) | 45 (30%) |
| Outer Metro | 0 (0.0%) | 21 (13.9%) | 24 (16%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 45 (30%) |
| Provincial | 3 (2.0%) | 8 (5.3%) | 12 (8%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 0 (0.0%) | 23 (15%) |
| Rural | 13 (8.6%) | 15 (9.9%) | 7 (5%) | 1 (0.7%) | 1 (0.7%) | 1 (0.7%) | 0 (0.0%) | 38 (25%) |
| Total | 16 (10.6%) | 61 (40.4%) | 68 (45%) | 1 (0.7%) | 3 (2.0%) | 1 (0.7%) | 1 (0.7%) | 151 (100%) |

Funding if awarded by merit assessment results ($m)

| Demographic profile | The Nationals | Liberal | Australian Labor Party | Katter’s Australian Party | Independent | Centre Alliance | The Greens | Total |
|---|---|---|---|---|---|---|---|---|
| Inner Metro | $0.0m (0.0%) | $0.2m (0.0%) | $21.4m (2%) | $0.0m (0.0%) | $3.5m (0.3%) | $0.0m (0.0%) | $0.0m (0.0%) | $25.1m (2%) |
| Outer Metro | $0.0m (0.0%) | $45.3m (3.9%) | $10.6m (1%) | $0.0m (0.0%) | $0.0m (0.0%) | $0.0m (0.0%) | $0.0m (0.0%) | $55.9m (5%) |
| Provincial | $48.7m (4.2%) | $39.2m (3.4%) | $82.6m (7%) | $0.0m (0.0%) | $0.0m (0.0%) | $0.0m (0.0%) | $0.0m (0.0%) | $170.6m (15%) |
| Rural | $306.3m (26.3%) | $419.4m (36.1%) | $95.3m (8%) | $49.8m (4.3%) | $24.2m (2.1%) | $16.3m (1.4%) | $0.0m (0.0%) | $911.3m (78%) |
| Total | $355.0m (30.5%) | $504.2m (43.4%) | $209.9m (18%) | $49.8m (4.3%) | $27.7m (2.4%) | $16.3m (1.4%) | $0.0m (0.0%) | $1,162.8m (100%) |

Total funding awarded ($m)

| Demographic profile | The Nationals | Liberal | Australian Labor Party | Katter’s Australian Party | Independent | Centre Alliance | The Greens | Total |
|---|---|---|---|---|---|---|---|---|
| Inner Metro | $0.0m (0.0%) | $0.8m (0.1%) | $20.8m (2%) | $0.0m (0.0%) | $3.7m (0.3%) | $0.0m (0.0%) | $0.0m (0.0%) | $25.3m (2%) |
| Outer Metro | $0.0m (0.0%) | $41.5m (3.6%) | $7.1m (1%) | $0.0m (0.0%) | $0.0m (0.0%) | $0.0m (0.0%) | $0.0m (0.0%) | $48.5m (4%) |
| Provincial | $83.5m (7.3%) | $74.4m (6.5%) | $68.8m (6%) | $0.0m (0.0%) | $0.0m (0.0%) | $0.0m (0.0%) | $0.0m (0.0%) | $226.7m (20%) |
| Rural | $375.5m (32.6%) | $314.0m (27.3%) | $87.1m (8%) | $38.6m (3.4%) | $21.0m (1.8%) | $13.8m (1.2%) | $0.0m (0.0%) | $850.0m (74%) |
| Total | $459.0m (39.9%) | $430.6m (37.4%) | $183.8m (16%) | $38.6m (3.4%) | $24.7m (2.1%) | $13.8m (1.2%) | $0.0m (0.0%) | $1,150.5m (100%) |

Note: Analysis of the distribution of seats was based on the 2019 Federal Election results. Where the aggregate value of Infrastructure’s funding recommendations exceeded the total funding available for the round, the ANAO’s analysis is based on the highest scoring applications with an aggregate value closest to the total funding awarded for that round.

Source: ANAO analysis of departmental records and AEC data.

5.26 To account for the large proportion of electorates that could not attract funding, the average number of grants and average funding awarded were examined for electorates where there was at least one eligible application submitted.118 The results of this analysis are set out in Appendix 11. In summary:

  • the Liberal Party and ALP held the greatest number of electorates associated with eligible applications (on average, each had 24 electorates, or 35 per cent of applicable electorates). Funding outcomes were more favourable for applications from Liberal held electorates, which received twice as many grants per electorate, with average funding per electorate being between $1.6 million (round one) and $3.1 million (round four) more than the amount awarded to ALP electorates;
  • while The Nationals119 accounted for fewer electorates (16 in each round or 24 per cent of applicable electorates), applications funded from these electorates outperformed those in Liberal Party and ALP electorates in terms of both average number of grants and value of grants; and
  • performing favourably in both respects, Katter’s Australian Party (KAP) had up to 16 grants awarded in round two and the largest average value of funding per electorate ($17.5 million in round five).

Has the announcement, commencement and completion of projects been timely?

While there were delays between final approvals and announcements under the first two rounds, public announcements of the funding outcomes have been timely for recent rounds. There have been some delays with the completion of projects, and this has been reflected in seven projects from round one and 33 from round two still being underway.

5.27 The CGRGs outline that timely and effective disclosure and reporting of grants is essential for transparency and accountability. The timeframe between final application approvals and the public announcement of successful projects ranged from three days in round five up to 66 days for the CI stream applications under round one. Table 5.5 provides an overview of these timeframes for each round.
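The intervals reported for each round are simple calendar-day differences between final approval and public announcement. As a worked check for round one, using the dates reported in this section:

```python
# Calendar-day gaps between final approval and public announcement
# for round one (IP and CI streams announced on different dates).

from datetime import date

approval = date(2017, 7, 18)          # final approval, round one
ip_announcement = date(2017, 8, 4)    # IP stream announcement
ci_announcement = date(2017, 9, 22)   # CI stream announcement

print((ip_announcement - approval).days)  # 17 (IP stream)
print((ci_announcement - approval).days)  # 66 (CI stream)
```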

Table 5.5: Timing of final approvals and public announcements of projects

| Round | Final approval | Public announcement | Days between approval and announcement |
|---|---|---|---|
| 1 | 18 July 2017 | 4 August 2017 (IP) | 17 |
|   |              | 22 September 2017 (CI) | 66 |
| 2 | 12 June 2018 | 18 July 2018 | 36 |
| 3 | 25 February 2019 | 11 March 2019 | 14 |
| 4 | 27 May 2020 | 5 June 2020 | 9 |
| 5 | 5 October 2021 | 8 October 2021 | 3 |

Source: ANAO analysis of departmental records.

Completion of projects

5.28 The establishment of ongoing monitoring and management arrangements throughout the grants lifecycle should assure entities that grant opportunities are proceeding as planned and that relevant money is being appropriately managed. DISR is responsible for the project management of the BBRF including contract management, risks, variations and managing payments and acquittals. DISR provides regular monthly reporting on the delivery of projects to Infrastructure, including a consolidated project management report. This report covers all projects across the five rounds and updates the status of projects.

5.29 There have been some delays to contract executions, with 87 contracts executed more than 60 days after the respective planned project commencement dates. The majority of these projects were from the fourth round, a situation at odds with the guidelines’ statement that the round would ‘invest in shovel-ready projects delivering jobs, driving economic growth and building up drought-affected regional communities.’ Table 5.6 provides an overview of the 38 contracts that were either terminated or withdrawn over the first four rounds, with an aggregate value of $28 million.

Table 5.6: Contracts terminated or withdrawn for the first five rounds

| Round | Projects in progress | Completed projects | Number of contracts terminated or withdrawn | Value of contracts terminated or withdrawn | Payments made to terminated projects |
|---|---|---|---|---|---|
| 1 | 7 | 237 | 13 | $524,757 | $9,600 |
| 2 | 33 | 203 | 9 | $2,426,606 | nil |
| 3 | 124 | 196 | 10 | $11,894,606 | nil |
| 4 | 133 | 21 | 6 | $13,154,109 | nil |
| 5 | 298 | – | – | – | – |

Source: ANAO analysis of departmental records.

5.30 Only one terminated contract in round one received a payment, with all others terminated before payments were made. While no terminations have been recorded for rounds four or five, there have been six funding offers either declined by the applicant or withdrawn by DISR for the fourth round. One of the contracts withdrawn from round three related to an application that had been awarded only 25 per cent ($2.5 million) of the $10 million requested by the applicant.

Appendices

Appendix 1 Entity responses

ANAO comments on the Department of Infrastructure, Transport, Regional Development, Communications and the Arts’ response
  1. The ANAO concluded that the award of funding under the first five rounds of the Building Better Regions Fund was partly effective and partly consistent with the CGRGs. See paragraphs 7 to 11 in the report summary.
  2. The recommendations to improve aspects of the CGRGs reflect findings that a stronger grants administration framework would guard against the practices employed in the award of BBRF funding that were not consistent with the underlying intent of the framework.
  3. Refer to paragraph 8 of the report summary. The audit found that while the BBRF was well designed in a number of respects, there were also shortcomings in a number of important areas.
  4. The audit finding in this respect is outlined fully at paragraphs 12 to 17 of the report summary.
  5. While Infrastructure identified the assessed merit of eligible applications against the published criteria, it did not recommend that funding be awarded to those applications assessed as most meritorious. Rather, it recommended that the ministerial panel advise the department which applications — from those that had been assessed to meet the minimum assessment score required to be considered as ‘value with relevant money’ — should be awarded funding, with the aggregate value of that ‘pool’ of applications being significantly in excess of the funding available to be awarded in both rounds three and five. See further at paragraph 22 of the report summary.

Appendix 2 Improvements observed by the ANAO

1. The existence of independent external audit, and the accompanying potential for scrutiny, improves performance. Improvements in administrative and management practices usually occur in anticipation of ANAO audit activity; during an audit engagement; as interim findings are made; and/or after the audit has been completed and formal findings are communicated.

2. The Joint Committee of Public Accounts and Audit (JCPAA) has encouraged the ANAO to consider ways in which the ANAO could capture and describe some of these impacts. The ANAO’s 2021–22 Corporate Plan states that the ANAO’s annual performance statements will provide a narrative that will consider, amongst other matters, analysis of key improvements made by entities during a performance audit process based on information included in tabled performance audit reports.

3. Performance audits involve close engagement between the ANAO and the audited entity as well as other stakeholders involved in the program or activity being audited. Throughout the audit engagement, the ANAO outlines to the entity the preliminary audit findings, conclusions and potential audit recommendations. This ensures that final recommendations are appropriately targeted and encourages entities to take early remedial action on any identified matters during the course of an audit. Remedial actions entities may take during the audit include:

  • strengthening governance arrangements;
  • introducing or revising policies, strategies, guidelines or administrative processes; and
  • initiating reviews or investigations.

4. In this context, the below actions were observed by the ANAO during the course of the audit. It is not clear whether these actions and/or the timing of these actions were planned in response to proposed or actual audit activity. The ANAO has not sought to obtain assurance over the source of all of these actions or whether they have been appropriately implemented.

  • Improvements to the Department of Infrastructure, Transport, Regional Development, Communications and the Arts’ (Infrastructure’s) accuracy in, and process for, identifying relevant records for Freedom of Information Act 1982 (FOI Act) purposes.
  • The services schedules to the Memorandum of Understanding (MOU) between Infrastructure and the Department of Industry, Science and Resources (DISR) for the provision of Business Grant Hub services for the fifth and sixth rounds of the Building Better Regions Fund (BBRF) were executed on 9 August 2021 (for round five) and 19 April 2022 (for round six) after protracted negotiations between the departments had delayed the finalisation.
  • Infrastructure appointed an external probity adviser in November 2021 and held a probity information session for staff in April 2022 with a focus on presenting the new draft divisional probity plan.

Appendix 3 Membership of the ministerial panel for each round

| Round | Chair | Members |
|---|---|---|
| 1 | Senator the Hon Fiona Nash | The Hon Darren Chester MP; Senator the Hon Arthur Sinodinos AO; Senator the Hon James McGrath |
| 2 | The Hon Dr John McVeigh MP | The Hon Michael McCormack MP; Senator the Hon Bridget McKenzie; Senator the Hon Michaelia Cash; Senator the Hon James McGrath; The Hon Keith Pitt MP |
| 3 | The Hon Michael McCormack MP | Senator the Hon Bridget McKenzie; Senator the Hon Matthew Canavan; Senator the Hon Simon Birmingham; The Hon Sussan Ley MP; The Hon Steve Irons MP |
| 4 | Senator the Hon Simon Birmingham | The Hon Dan Tehan MP; The Hon David Littleproud MP; The Hon Mark Coulton MP; The Hon Nola Marino MP; The Hon Ben Morton MP; The Hon Kevin Hogan MP; Senator the Hon Jonathon Duniam |
| 5 | The Hon Barnaby Joyce MP | The Hon David Littleproud MP; The Hon Dan Tehan MP; Senator the Hon Anne Ruston; Senator the Hon Bridget McKenzie; Senator the Hon Richard Colbeck; The Hon David Gillespie MP; The Hon Ben Morton MP; The Hon Kevin Hogan MP; The Hon Nola Marino MP |

Source: ANAO analysis of departmental records.

Appendix 4 ANAO audits of regional grants programs since the introduction of the Commonwealth grants administration framework in 2009

1. Auditor-General Report No.3 2010–11 The Establishment, Implementation and Administration of the Strategic Projects Component of the Regional and Local Community Infrastructure Program found that no version of the program guidelines outlined the assessment criteria that would be used to select the successful applications, and that the department did not provide recommendations to the minister about which projects should be approved within the available funding of $550 million. This ‘was a significant failing on the part of the department given that, since December 2007, the enhanced grants administration framework has required departments to provide advice to ministers on the merits of each grant application relative to the guidelines for the program’.120 In its review of this audit report, the Joint Committee of Public Accounts and Audit (JCPAA) concluded that:

  • the department had not taken its previous assurances to implement and adhere to improved grants administration seriously;
  • the processes put in place to rectify the lack of published assessment criteria did not appear to have provided the minister with clear funding recommendations and the Committee was concerned that the department did not provide the minister with clear, documented advice on which to base their decisions; and
  • overall, it was the lack of documentation surrounding the final selection of successful applications that was of greatest concern to the JCPAA as it signalled a lack of accountability and transparency (which are included in the Commonwealth Grant Rules and Guidelines’ (CGRG’s) seven key principles of grants administration).121

2. Auditor-General Report No.3 2012–13 The Design and Conduct of the First Application Round for the Regional Development Australia Fund identified that while selection criteria had been published in the program guidelines and applied, the department’s advice to the decision-maker continued to require improvement as did its approach to recording the basis for funding decisions.122

3. Auditor-General Report No.9 2014–15 The Design and Conduct of the Third and Fourth Funding Rounds of the Regional Development Australia Fund made similar findings to the audit of the first round and concluded that considerable work remained to be done to design and conduct regional grant programs in a way where funding is awarded, and can be seen to have been awarded, to those applications that demonstrate the greatest merit in terms of the published program guidelines.123 In its review of this audit report, the JCPAA concluded that there would be benefits from the department providing a clear statement in its funding recommendations as to whether each grant application should be approved or rejected.124

4. Auditor-General Report No.30 2016–17 The Design and Implementation of Round Two of the National Stronger Regions Fund concluded that program design and implementation was largely effective and, in addition, earlier audit recommendations aimed at improving the department’s assessment of applications, funding advice to decision-makers and evaluation of program outcomes had been implemented.125

5. Auditor-General Report No.12 2019–20 Award of Funding Under the Regional Jobs and Investment Packages concluded that, while advice provided by the department was largely appropriate, the assessment processes were not to the standard required by the grants administration framework and the records of funding decisions did not provide sufficient transparency.126 The JCPAA review of this audit report concluded that recording more information about the ministerial panel decision-making, including probity issues such as conflicts of interest and the decision-making process used, could have avoided questions about the bias or otherwise of the panel as well as improving public confidence.127

Appendix 5 Departmental advice on the consideration of exemption requests

| Round | Were the requests assessed? | Departmental recommendation (briefs provided to panel) | Panel decision |
|---|---|---|---|
| 1 | Yes | No recommendation provided. A total of 93 exemption requests were received, with 58 of those otherwise meeting all other eligibility criteria. | All 58 applications otherwise meeting all other eligibility criteria were approved, and 13 of these projects were subsequently approved to receive a grant. |
| 2 | Yes | No recommendation provided. 71 requests were received, with Infrastructure noting that some of the requests provided limited information on why the exemption should be granted. | All 71 requests were approved, and 15 of these projects were subsequently approved to receive a grant. |
| 3 | Yes | Department recommended that 96 of the 115 requests be approved. | All 115 requests were approved, and 27 of these projects were subsequently approved to receive a grant. All 27 of those projects had been recommended by the department to receive a co-funding exemption. |
| 4 | Yes | Department recommended that 60 of the 99 requests be approved. | All 99 requests were approved, and 22 of these projects were subsequently approved to receive a grant. Of those 22, 17 projects had been recommended by the department to receive a co-funding exemption. |
| 5 | Yes | Department recommended that 111 of the 148 requests be approved. | All 148 requests were approved, and 34 of these projects were subsequently approved to receive a grant. Of those 34, 30 applications had been recommended by the department to receive a co-funding exemption. |

Source: ANAO analysis of departmental records.

Appendix 6 Timeline of the decision-making processes for each funding round


Note a: After at least two cancellations, the ministerial panel did not meet in round three to decide which grants to approve.

Source: ANAO analysis of departmental records.

Appendix 7 Information distributed by the chair through his or her office to the ministerial panel in each funding round

| Round and timing | Material prepared within and distributed among panel members through their offices | Components of departmental advice provided to panel members through their offices |
|---|---|---|
| Round 1 | | |
| 6/06/2017 | – | Full infrastructure project (IP) stream briefing pack (cover brief and all 11 attachments). |
| 13/06/2017 | List of 418 eligible IP stream applications sorted by electorate. A comments column included parliamentarians’ preferences and level of community support for projects. | – |
| 14/07/2017ᵃ | – | Full community investments (CI) stream briefing pack (cover brief and all 11 attachments). |
| 11/08/2017 | Electorate details and comments for the 187 eligible CI stream applications. | – |
| Round 2 | | |
| 25/05/2018 | An application shortlist requiring further refinement by the panel.ᵇ | No cover brief for either stream. One of 10 attachments (list of funding recommendations covering both IP and CI streamsᶜ). |
| Round 3 | | |
| 14/02/2019 | An application ‘shortlist’ for the IP stream only. | No cover brief for the IP stream. Three of 11 attachments for the IP stream (the list of funding recommendations; PGPA Act and CGRGs obligations; and list of ineligible applications). No parts of the CI briefing were provided beyond the chair to the broader panel membership. |
| Round 4 | | |
| 21/04/2020 | – | Cover brief and two of eight attachments (list of funding recommendations covering both IP and CI streamsᶜ; and an overview of the eligibility and assessment process). |
| 28/04/2020 | A meeting agenda and a list of 59 IP applications to be considered the next day by the panel (57 were funded). | – |
| 30/04/2020 | A meeting agenda and list of 53 IP applications to be considered the next day for funding (52 were funded). | – |
| Round 5 | | |
| 3/09/2021 | – | No cover brief. Three of eight attachments (list of funding recommendations covering both IP and CI streams; PGPA Act and CGRGs obligations; and an overview of the eligibility and assessment process). |
| 15/09/2021 | An agenda and list of 270 projects to be considered later that day for funding (260 were funded). | – |

Note a: This is the date of lodgement through the briefing system.

Note b: The shortlist consisted of applications requesting an aggregate value that exceeded available funding by at least $45 million. It was described as a ‘summary document’ providing ‘a state by state, and seat by seat analysis of projects, member priorities and other feedback.’

Note c: Attachment A had been amended by the chair through his office to include the electorate details for each application.

Source: ANAO analysis of departmental records.

Appendix 8 Distribution of application total scores for the infrastructure project stream

R1–R5 = rounds one to five; T = total; A = approved; N = not approved. A dash (–) indicates that total application scores of this magnitude were not available in that round.

| Application score or outcome | R1 T | R1 A | R1 N | R2 T | R2 A | R2 N | R3 T | R3 A | R3 N | R4 T | R4 A | R4 N | R5 T | R5 A | R5 N |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 40 | – | – | – | – | – | – | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a |
| 39 | – | – | – | – | – | – | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a |
| 38 | – | – | – | – | – | – | 3 | 1b | 2 | 0a | 0a | 0a | 0a | 0a | 0a |
| 37 | – | – | – | – | – | – | 3 | 1b | 2 | 0a | 0a | 0a | 0a | 0a | 0a |
| 36 | – | – | – | – | – | – | 5 | 0b | 5 | 0a | 0a | 0a | 2 | 0a | 2 |
| 35 | 0a | 0a | 0a | 0a | 0a | 0a | 9 | 3b | 6 | 0a | 0a | 0a | 1 | 0a | 1 |
| 34 | 2 | 2b | 0 | 0a | 0a | 0a | 19 | 11b | 8 | 2 | 2b | 0 | 4 | 4b | 0 |
| 33 | 1 | 1b | 0 | 0a | 0a | 0a | 30 | 11b | 19 | 1 | 1b | 0 | 6 | 3b | 3 |
| 32 | 4 | 2b | 2 | 0a | 0a | 0a | 43 | 16b | 27 | 8 | 6b | 2 | 21 | 11b | 10 |
| 31 | 18 | 15b | 3 | 4 | 2b | 2 | 55cd | 18cd | 37cd | 13 | 11b | 2 | 34d | 18bd | 16d |
| 30 | 16 | 13b | 3 | 10 | 6b | 4 | 69 | 24b | 45 | 20 | 15b | 5 | 29 | 14b | 15 |
| 29 | 16 | 11b | 5 | 14 | 9b | 5 | 32 | 14b | 18 | 32cd | 25cd | 7cd | 41 | 19b | 22 |
| 28 | 21 | 16b | 5 | 30 | 14b | 16 | 62 | 21b | 41 | 35 | 8b | 27 | 56 | 23b | 33 |
| 27 | 30 | 17b | 13 | 55 | 24b | 31 | 46 | 15b | 31 | 27 | 11b | 16 | 63 | 22b | 41 |
| 26 | 25 | 11b | 14 | 63cd | 19cd | 44cd | 46 | 15b | 31 | 26 | 9b | 17 | 77 | 34b | 43 |
| 25 | 45cd | 10cd | 35cd | 85 | 16b | 69 | 38 | 13b | 25 | 34 | 10b | 24 | 84 | 26b | 58 |
| 24 | 34 | 4b | 30 | 77 | 20b | 57 | 16 | 3b | 13 | 34 | 11b | 23 | 66c | 23c | 43c |
| 23 | 37 | 1b | 36 | 79 | 10b | 69 | 12a | 0a | 12a | 14a | 0a | 14a | 19a | 0a | 19a |
| 22 | 32 | 4b | 28 | 59 | 15b | 44 | 7a | 0a | 7a | 12a | 0a | 12a | 22a | 0a | 22a |
| 21 | 40 | 1b | 39 | 8 | 1b | 7 | 5a | 0a | 5a | 12a | 0a | 12a | 26a | 0a | 26a |
| 20 | 27 | 0b | 27 | 4a | 0a | 4a | 5a | 0a | 5a | 9a | 0a | 9a | 17a | 0a | 17a |
| 19 | 25 | 0b | 25 | 1a | 0a | 1a | 5a | 0a | 5a | 16a | 0a | 16a | 27a | 0a | 27a |
| 18 | 24 | 1b | 23 | 1a | 0a | 1a | 3a | 0a | 3a | 8a | 0a | 8a | 16a | 0a | 16a |
| 17 | 9 | 0b | 9 | 2a | 0a | 2a | 11a | 0a | 11a | 15a | 0a | 15a | 38a | 0a | 38a |
| 16 | 3 | 0b | 3 | 4a | 0a | 4a | 4a | 0a | 4a | 4a | 0a | 4a | 7a | 0a | 7a |
| 15 | 2 | 0b | 2 | 0a | 0a | 0a | 3a | 0a | 3a | 3a | 0a | 3a | 7a | 0a | 7a |
| 14 | 5 | 0b | 5 | 1a | 0a | 1a | 4a | 0a | 4a | 1a | 0a | 1a | 7a | 0a | 7a |
| 13 | 2 | 0b | 2 | 0a | 0a | 0a | 1a | 0a | 1a | 0a | 0a | 0a | 5a | 0a | 5a |
| 12 | 0 | 0b | 0 | 2a | 0a | 2a | 0a | 0a | 0a | 1a | 0a | 1a | 4a | 0a | 4a |
| 11 | 0 | 0b | 0 | 2a | 0a | 2a | 1a | 0a | 1a | 1a | 0a | 1a | 6a | 0a | 6a |
| 10 | 0 | 0b | 0 | 0a | 0a | 0a | 0a | 0a | 0a | 1a | 0a | 1a | 2a | 0a | 2a |
| 9 | 0 | 0b | 0 | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a |
| 8 | 0 | 0b | 0 | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a |
| 7 | 0 | 0b | 0 | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a |
| Ineligible | 119 | 1b | 118 | 37a | 0a | 37a | 71a | 0a | 71a | 41a | 0a | 41a | 96a | 0a | 96a |
| Withdrawn | 8a | 0a | 8a | 29a | 0a | 29a | 5a | 0a | 5a | 4a | 0a | 4a | 16a | 0a | 16a |
| Total | 545 | 110 | 435 | 567 | 136 | 431 | 613 | 166 | 447 | 374 | 109 | 265 | 799 | 197 | 602 |

Key:

a: No applications at this score point were approved for funding.

b: Illustrates the spread of total application scores for those approved under this round.

c: Indicates the minimum total score required for the application to be recommended for funding by the department for this round.

d: Indicates the total score required for the application to be considered ‘highly scored’ by the department for this round.

Source: ANAO analysis of departmental records.
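The column totals provide a simple arithmetic cross-check on the distribution above. As a minimal sketch (the variable names are ours; the counts are transcribed from the round 1 infrastructure project stream column), summing the ‘Approved’ cells, including the one ineligible application that was approved in round 1, reproduces the round 1 approved total of 110:

```python
# Round 1 IP-stream 'Approved' counts by total application score,
# transcribed from the distribution table above (score: approved count).
round1_approved_by_score = {
    34: 2, 33: 1, 32: 2, 31: 15, 30: 13, 29: 11, 28: 16, 27: 17,
    26: 11, 25: 10, 24: 4, 23: 1, 22: 4, 21: 1, 18: 1,
}
# One ineligible application was approved for funding in round 1.
ineligible_approved = 1

total_approved = sum(round1_approved_by_score.values()) + ineligible_approved
print(total_approved)  # 110, matching the 'Total' row for round 1
```

The same check can be repeated for the other rounds and for the community investment stream table in Appendix 9.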

Appendix 9 Distribution of application total scores for the community investment stream

R1–R5 = rounds one to five; T = total; A = approved; N = not approved. A dash (–) indicates that total application scores of this magnitude were not available in that round.

| Application score or outcome | R1 T | R1 A | R1 N | R2 T | R2 A | R2 N | R3 T | R3 A | R3 N | R4 T | R4 A | R4 N | R5 T | R5 A | R5 N |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 40 | – | – | – | – | – | – | – | – | – | 0a | 0a | 0a | 0a | 0a | 0a |
| 39 | – | – | – | – | – | – | – | – | – | 0a | 0a | 0a | 0a | 0a | 0a |
| 38 | – | – | – | – | – | – | – | – | – | 0a | 0a | 0a | 0a | 0a | 0a |
| 37 | – | – | – | – | – | – | – | – | – | 0a | 0a | 0a | 0a | 0a | 0a |
| 36 | – | – | – | – | – | – | – | – | – | 0a | 0a | 0a | 0a | 0a | 0a |
| 35 | 0a | 0a | 0a | 0a | 0a | 0a | – | – | – | 1 | 1b | 0 | 0a | 0a | 0a |
| 34 | 0a | 0a | 0a | 0a | 0a | 0a | – | – | – | 1 | 1b | 0 | 1 | 1b | 0 |
| 33 | 0a | 0a | 0a | 0a | 0a | 0a | – | – | – | 0 | 0b | 0 | 2 | 2b | 0 |
| 32 | 1 | 1b | 0 | 0a | 0a | 0a | – | – | – | 2 | 2b | 0 | 3 | 3b | 0 |
| 31 | 0 | 0b | 0 | 0a | 0a | 0a | – | – | – | 2 | 2b | 0 | 8 | 8b | 0 |
| 30 | 4 | 4b | 0 | 0a | 0a | 0a | 0a | 0a | 0a | 3 | 3b | 0 | 10 | 10b | 0 |
| 29 | 5 | 5b | 0 | 8 | 6b | 2 | 0a | 0a | 0a | 10 | 8b | 2 | 11 | 11b | 0 |
| 28 | 3 | 3b | 0 | 7 | 7b | 0 | 0a | 0a | 0a | 12 | 10b | 2 | 17 | 16b | 1 |
| 27 | 10 | 10b | 0 | 14 | 13b | 1 | 1 | 1b | 0 | 3 | 3b | 0 | 12 | 12b | 0 |
| 26 | 9 | 8b | 1 | 23 | 20b | 3 | 5 | 5b | 0 | 10 | 9b | 1 | 10 | 10b | 0 |
| 25 | 25 | 25b | 0 | 31 | 27b | 4 | 7 | 7b | 0 | 9 | 8b | 1 | 15 | 12b | 3 |
| 24 | 20 | 20b | 0 | 30 | 25b | 5 | 15 | 15b | 0 | 9cd | 7cd | 2cd | 16cd | 16cd | 0cd |
| 23 | 27 | 27b | 0 | 26cd | 11cd | 15cd | 16 | 16b | 0 | 2a | 0a | 2a | 3a | 0a | 3a |
| 22 | 29 | 28b | 1 | 29a | 0a | 29a | 33 | 32b | 1 | 1a | 0a | 1a | 2a | 0a | 2a |
| 21 | 10cd | 7cd | 3cd | 4a | 0a | 4a | 21 | 20b | 1 | 2a | 0a | 2a | 4a | 0a | 4a |
| 20 | 6 | 4b | 2 | 0a | 0a | 0a | 43 | 43b | 0 | 1a | 0a | 1a | 1a | 0a | 1a |
| 19 | 3 | 1b | 2 | 0a | 0a | 0a | 24 | 22b | 2 | 3a | 0a | 3a | 0a | 0a | 0a |
| 18 | 3 | 1b | 2 | 1a | 0a | 1a | 8cd | 3cd | 5cd | 3a | 0a | 3a | 3a | 0a | 3a |
| 17 | 2 | 1b | 1 | 1a | 0a | 1a | 2a | 0a | 2a | 8a | 0a | 8a | 9a | 0a | 9a |
| 16 | 1 | 1b | 0 | 1a | 0a | 1a | 3a | 0a | 3a | 3a | 0a | 3a | 1a | 0a | 1a |
| 15 | 4 | 1b | 3 | 1a | 0a | 1a | 9a | 0a | 9a | 0a | 0a | 0a | 0a | 0a | 0a |
| 14 | 4a | 0a | 4a | 1a | 0a | 1a | 6a | 0a | 6a | 1a | 0a | 1a | 1a | 0a | 1a |
| 13 | 3a | 0a | 3a | 0a | 0a | 0a | 2a | 0a | 2a | 3a | 0a | 3a | 1a | 0a | 1a |
| 12 | 8a | 0a | 8a | 0a | 0a | 0a | 6a | 0a | 6a | 0a | 0a | 0a | 1a | 0a | 1a |
| 11 | 4a | 0a | 4a | 0a | 0a | 0a | 6a | 0a | 6a | 1a | 0a | 1a | 0a | 0a | 0a |
| 10 | 1a | 0a | 1a | 0a | 0a | 0a | 3a | 0a | 3a | 0a | 0a | 0a | 4a | 0a | 4a |
| 9 | 5a | 0a | 5a | 0a | 0a | 0a | 2a | 0a | 2a | 0a | 0a | 0a | 0a | 0a | 0a |
| 8 | 0a | 0a | 0a | 0a | 0a | 0a | 4a | 0a | 4a | 1a | 0a | 1a | 0a | 0a | 0a |
| 7 | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a | 0a |
| Ineligible | 201a | 0a | 201a | 92a | 0a | 92a | 84a | 0a | 84a | 74a | 0a | 74a | 149a | 0a | 149a |
| Withdrawn | 4a | 0a | 4a | 3a | 0a | 3a | 2a | 0a | 2a | 5a | 0a | 5a | 2a | 0a | 2a |
| Total | 392 | 147 | 245 | 272 | 109 | 163 | 302 | 164 | 138 | 170 | 54 | 116 | 286 | 101 | 185 |

Key:

a: No applications at this score point were approved for funding.

b: Illustrates the spread of total application scores for those approved under this round.

c: Indicates the minimum total score required for the application to be recommended for funding by the department for this round.

d: Indicates the total score required for the application to be considered ‘highly scored’ by the department for this round.

Source: ANAO analysis of departmental records.

Appendix 10 Infrastructure advice for application categories and approach taken for reporting to the Finance Minister in each round

Column groups: (1) ineligible applications; (2) applications assessed as not value with relevant money; (3) applications assessed as value with relevant money but scored and ranked lower.

| Round | (1) Advice | (1) Approvals | (2) Advice | (2) Approvals | (3) Advice | (3) Approvals |
|---|---|---|---|---|---|---|
| 1 | Recommended that these be rejected. | One ineligible application approved. This application was reported to the Finance Minister. | Recommended that these applications be rejected. | Twenty such applications were approved. These 20 applications were reported to the Finance Minister. | Did not recommend that these applications be ‘rejected’. | Twenty-two such applications were approved. These 22 applications were not reported to the Finance Minister. |
| 2 | Did not recommendᵃ that these applications be rejected. | None of these applications were approved. | Did not recommendᵃ that these applications be rejected. | None of these applications were approved. | Did not recommend that these applications be ‘rejected’. | Seventy-six of these applications were approved. These 76 applications were not reported to the Finance Minister. |
| 3 | As for round two. | None of these applications were approved. | Recommended that these applications not be funded. | None of these applications were approved. | Infrastructure advised that lower scored and ranked applications were ‘not recommended’ but did not advise that these applications be ‘rejected’. | One hundred and twelve of these applications were approved. These 112 applications were reported to the Finance Minister. |
| 4 | As for rounds two and three. | None of these applications were approved. | ‘Strongly recommended’ these applications not be funded. | None of these applications were approved. | Infrastructure advised that lower scored and ranked applications were ‘not recommended’ but did not advise that these applications be ‘rejected’. | Forty-nine such applications were approved. These 49 applications were reported to the Finance Minister. |
| 5 | As for rounds two, three and four. | None of these applications were approved. | Recommended that these applications not be funded. | None of these applications were approved. | Infrastructure changed its approach to making recommendations and, as a result, all applications assessed as value with relevant money, irrespective of their scoring and rank, were included in the pool the panel was advised to select from. Infrastructure advised the panel that all applications assessed as value with relevant money were ‘suitable for selection’. | – |

Key:

- Denotes a positive correlation between the approach adopted and the principles and/or requirements of the Commonwealth Grants Rules and Guidelines (CGRGs).

- Denotes that the approach adopted represents some degree of departure from the principles and/or requirements of the CGRGs.

- Denotes an inconsistency between the approach adopted and the principles and/or requirements of the CGRGs.

Note a: Infrastructure advised the ANAO in November 2021 that it is of the view that ineligible applications cannot be considered for funding under the program guidelines, and for this reason it has moved away from recommending the panel reject these applications.

Source: ANAO analysis of departmental records.

Appendix 11 Average funding and number of grants awarded per electorate

1. Table A.1 below summarises the number of electorates per political party that were associated with at least one eligible application in each funding round. Tables A.2 to A.4 outline the corresponding average number of grants and amount of funding awarded per electorate.

Table A.1: Number of electorates per round with at least one eligible application

| Party | Round 1 | Round 2 | Round 3 | Round 4 | Round 5 |
|---|---|---|---|---|---|
| LIB | 22 | 25 | 29 | 15 | 28 |
| NAT | 16 | 16 | 16 | 16 | 16 |
| ALP | 27 | 25 | 31 | 11 | 24 |
| KAP | 1 | 1 | 1 | 1 | 1 |
| CA | 1 | 1 | 1 | 1 | 1 |
| IND | 2 | 2 | 2 | 0 | 2 |
| GRN | 0 | 0 | 1 | 0 | 0 |
| Total | 69 | 70 | 81 | 44 | 72 |

Source: ANAO analysis of departmental records.

Table A.2: Average number of grants awarded per electorate with eligible applications

| Party | Round 1 | Round 2 | Round 3 | Round 4 | Round 5 |
|---|---|---|---|---|---|
| LIB | 4.4 | 3.7 | 4.7 | 3.1 | 4.4 |
| NAT | 5.0 | 4.9 | 6.2 | 6.1 | 6.1 |
| ALP | 2.2 | 2.0 | 1.9 | 1.3 | 2.3 |
| KAP | 11.0 | 16.0 | 12.0 | 3.0 | 5.0 |
| CA | 5.0 | 4.0 | 9.0 | 2.0 | 11.0 |
| IND | 2.0 | 2.5 | 7.5 | 0.0 | 3.5 |
| GRN | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |

Source: ANAO analysis of departmental records.

Table A.3: Average amount of funding per electorate with eligible applications if awarded based on merit scores

| Party | Round 1 | Round 2 | Round 3 | Round 4 | Round 5 |
|---|---|---|---|---|---|
| LIB | $5,139,095 | $4,149,706 | $2,473,762 | $5,099,598 | $3,346,803 |
| NAT | $2,537,849 | $4,367,889 | $5,818,586 | $5,996,369 | $2,995,969 |
| ALP | $2,895,344 | $1,123,320 | $897,814 | $1,795,386 | $1,515,359 |
| KAP | $626,874 | $1,893,984 | $13,549,275 | $7,605,091 | $23,845,740 |
| CA | $417,588 | $5,215,000 | $351,441 | $5,794,224 | $82,200 |
| IND | $3,921,500 | $2,980,889 | $771,285 | $0 | $3,669,402 |
| GRN | $0 | $0 | $0 | $0 | $0 |

Source: ANAO analysis of departmental records.

Table A.4: Average amount of funding awarded per electorate with eligible applications

| Party | Round 1 | Round 2 | Round 3 | Round 4 | Round 5 |
|---|---|---|---|---|---|
| LIB | $3,900,266 | $2,916,338 | $2,769,519 | $4,782,506 | $4,281,439 |
| NAT | $4,351,966 | $5,704,137 | $5,764,047 | $6,609,071 | $6,259,958 |
| ALP | $2,334,581 | $1,268,768 | $754,755 | $1,656,408 | $1,942,932 |
| KAP | $4,735,949 | $6,260,631 | $2,491,912 | $7,605,091 | $17,522,542 |
| CA | $673,521 | $5,215,000 | $914,455 | $3,770,000 | $3,180,262 |
| IND | $1,281,500 | $2,589,889 | $2,101,285 | $0 | $6,358,452 |
| GRN | $0 | $0 | $0 | $0 | $0 |

Source: ANAO analysis of departmental records.
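The averages in Tables A.2 and A.4 above are simple ratios: the total grants (or funding) awarded to a party's electorates in a round, divided by the number of that party's electorates with at least one eligible application (Table A.1). A minimal sketch of that calculation (the function name and the example figures are illustrative only, not drawn from departmental records):

```python
def average_per_electorate(total_awarded: float, electorate_count: int) -> float:
    """Average grants (or funding) awarded per electorate with eligible applications."""
    # A party with no electorates holding eligible applications averages to zero,
    # as shown for IND in round 4 of Table A.2.
    if electorate_count == 0:
        return 0.0
    return total_awarded / electorate_count

# Illustrative only: 80 grants spread across 16 electorates average 5.0 per electorate.
print(average_per_electorate(80, 16))  # 5.0
```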

Footnotes

1 During the course of the audit and prior to 1 July 2022, the administering entity was the Department of Infrastructure, Transport, Regional Development and Communications.

2 DISR was responsible for, among other things, conducting the application eligibility and merit assessment processes for each round of the BBRF. The results from those processes were provided to Infrastructure, as the department responsible for advising the ministerial panel, and used to form the basis of Infrastructure’s funding recommendations for the ministerial panel. These responsibilities are discussed at paragraph 2.38.

3 During the course of the audit and prior to 1 July 2022, the department’s name was the Department of Industry, Science, Energy and Resources (or DISER).

4 In accordance with the Streamlining Government Grants Administration (SGGA) program, participating entities are required to consume grants administration services from one of the Community or Business Grants Hubs. See: Department of Finance, Grants: Getting Started, available from https://www.finance.gov.au/government/commonwealth-grants/grants-gettin… [accessed 23 March 2022].

5 Auditor-General Report No.12 2019–20 Award of Funding Under the Regional Jobs and Investment Packages.

6 See, for example, Auditor-General Report No.30 2016–17 Design and Implementation of Round Two of the National Stronger Regions Fund and Auditor-General Report No.1 2013–14 Design and Implementation of the Liveable Cities Program.

7 The Department of Infrastructure, Transport, Regional Development, Communications and the Arts (Infrastructure) advised the ANAO in February and March 2022 that the merit assessments for round six applications were expected to be finalised by 27 April 2022, with the funding decisions and announcements expected between mid-June and early August 2022. The ANAO has not audited the assessment of applications and award of grant funding under the sixth round.

8 Department of Finance, Commonwealth Grants Rules and Guidelines, Department of Finance, Canberra, 2017, paragraph 2.1.

9 Department of Finance, Submission to the Joint Committee of Public Accounts and Audit Inquiry into Auditor-General’s Reports 5, 12 and 23 (2019–20), Submission 10, March 2020, p. 3.

10 Department of Finance, Commonwealth Grants Rules and Guidelines, Department of Finance, Canberra, 2017, paragraph 6.3. The seven key principles are: robust planning and design; collaboration and partnership; proportionality; outcomes orientation; achieving value with relevant money; governance and accountability; and probity and transparency. In October 2021, the Australian Government agreed to a recommendation from the Joint Committee of Public Accounts and Audit (JCPAA) in its Report 484 The Administration of Government Grants that the CGRGs be amended to include an eighth key principle for grants administration of ‘adherence to published guidelines’.

11 Most recently, the April 2022 report of the New South Wales Government’s review of grants administration made 19 recommendations, which included that a grants administration guide based on the principles set out in the CGRGs (containing mandatory requirements for officials, ministers and ministerial staff) be issued by legislative instrument. See: NSW Department of Premier and Cabinet and the NSW Productivity Commissioner, Review of Grants Administration in NSW, April 2022, available from https://www.dpc.nsw.gov.au/publications/ [accessed 9 May 2022].

12 During the course of the audit and prior to 1 July 2022, the administering entity was the Department of Infrastructure, Transport, Regional Development and Communications.

13 During the course of the audit and prior to 1 July 2022, the department’s name was the Department of Industry, Science, Energy and Resources (or DISER).

14 Approval for the purposes of section 71 of the PGPA Act was provided by the chair of each ministerial panel.

15 Secretariat support for the panel was provided by Infrastructure under the first round. In response to its offer in the week leading up to the panel’s deliberations for round two to do so again, Infrastructure was advised by the chair (through his office) that it did not need to attend. The chair (through his office) then assumed responsibility for providing these services for all subsequent funding rounds.

16 Application electorate details were collated by ministerial staff and applications from within panel members’ electorates were flagged in those records. Any subsequent declarations of conflicts of interest by panel members, and any associated processes for managing them, were not recorded.

17 Auditor-General Report No.12 2019–20 Award of Funding Under the Regional Jobs and Investment Packages.

18 See, for example, Auditor-General Report No.30 2016–17 Design and Implementation of Round Two of the National Stronger Regions Fund and Auditor-General Report No.1 2013–14 Design and Implementation of the Liveable Cities Program.

19 A number of staff from ministerial offices involved in the award of grant funding under one or more rounds were invited to meet with the ANAO to provide information on how applications were assessed against the program guidelines and funding decisions taken. As of May 2022, one of the 13 invited advisers had met with the ANAO.

20 The Department of Finance is responsible for developing policy and advising on the resource management framework for public sector agencies. The Public Governance, Performance and Accountability Act 2013 is the cornerstone of the resource management framework and the CGRGs are issued under section 105C of that Act.

21 Department of Finance, Commonwealth Grants Rules and Guidelines, Department of Finance, Canberra, 2017, paragraph 11.5, p. 31.

22 Auditor-General Report No.7 2021–22 Australian Government Grants Reporting, p. 23.

23 Eligibility related to project location was changed from round two onwards. In round one, to be eligible, a project had to be located outside an excluded area. Excluded areas were defined in the program guidelines to be comprised of the significant urban areas of Sydney, Melbourne, Brisbane, Perth, Adelaide and Canberra (Hobart and Darwin have been eligible locations in all funding rounds). From round two onwards, the location restrictions were amended to extend the geographical eligibility criterion to include peri-urban areas, and to allow projects in excluded areas to be funded if it could be demonstrated that the benefits of the project would flow to an eligible area.

24 The guidelines set out that this evidence could include: being located in a local government area that is eligible for the Australian Government’s Drought Communities Programme – Extension; being located in a locality drought-declared by the relevant state or territory government; official Bureau of Meteorology rainfall data indicating an extended period without or significant decline in rainfall; or demonstrated impact of economic and/or employment decline as a result of drought.

25 The ANAO requested advice from Infrastructure in November 2021 as to how lessons learned in subsequent funding rounds had been used to inform the design and delivery of subsequent rounds of the BBRF.

26 Infrastructure advised the ANAO that legal advice obtained by DISR provided assurance that the eligibility criteria under the program guidelines does not affect the validity of the grant agreement.

27 Department of Finance, Commonwealth Grants Rules and Guidelines, Department of Finance, Canberra, 2017, Glossary, p. 40, available from https://www.finance.gov.au/government/commonwealth-grants/commonwealth-grants-rules-and-guidelines [accessed 4 May 2022].

28 Department of Finance, Grant Opportunity Guidelines Templates, Competitive/Non-Competitive Template, p. 14, available from https://www.finance.gov.au/sites/default/files/2021-01/Competitive%20-%20Non-competitive%20grants%20Jan%202021.pdf [accessed 4 March 2022].

29 This section has been titled ‘The assessment criteria’ since the third round and was titled ‘The merit criteria you will need to address’ under the first two rounds.

30 During the 14 February 2022 hearing of the Senate Rural and Regional Affairs and Transport Legislation Committee, Infrastructure was asked questions as to why applications that had been highly scored against the merit criteria had not been approved for funding, given the program was an open and competitive program. Infrastructure advised the committee that its funding recommendations are an ‘initial ranking based on [the] four assessment criteria’, with the ministerial panel’s consideration of applications taking ‘into account both those sets of criteria’ and that the scoring of applications against the four published merit criteria represented, according to the department’s evidence, ‘just half the equation’.

31 These criteria have been listed in the sub-section titled ‘Who will approve grants?’ since the third round and under the sub-heading ‘Final decision’ in the first two rounds.

32 This was set out as the extent to which the project leveraged additional funding and partnerships, and the likelihood of the project going ahead without the grant funding. From round three, this criterion was revised as the ‘impact of funding on the project’.

33 In an April 2022 review of its grants administration, the NSW Government acknowledged that broad and non-specific criteria such as ‘any other relevant factors’ are not good practice and should not be included in grant guidelines. It also noted that the broad discretion provided to assessors and decision-makers by such criteria is ‘not suited to objective, merit-based grants administration’ and ‘creates the risk that decisions may not be—and may not be seen to be—fair, accountable, and transparent’. See: NSW Department of Premier and Cabinet and the NSW Productivity Commissioner, Review of Grants Administration in NSW, April 2022, p. 34, available from https://www.dpc.nsw.gov.au/publications/ [accessed 9 May 2022].

34 These have been listed in the sub-section titled ‘Who will approve grants?’ since the third round and under the sub-heading ‘Final decision’ in the first two rounds of the BBRF.

35 Tourism-related infrastructure projects were also a focus of the third round. While the list of discretionary factors for that round did not draw attention to this, the introductory sections of those guidelines outlined that up to $45 million of that round had been earmarked for such projects.

36 Round six applications closed at 5pm Australian Eastern Daylight Time on 10 February 2022.

37 In round three, Infrastructure recorded an intention to include in its briefing to the ministerial panel any relevant information against these factors that may prove useful to the ministerial panel in its considerations, and for this purpose the division with responsibility for the BBRF consulted with the Cities Division within the department on applications for projects relevant to city/regional deals that had been concluded or largely concluded.

38 See also paragraph 2.16.

39 Mr Peter Grant PSM, Strategic Review of the Administration of Australian Government Grant Programs, 31 July 2008, p. 7.

40 The round one program guidelines included that the loading ‘would’ be applied to applications’ total scores. From rounds two to four, the program guidelines indicated that the loading ‘may’ be applied.

41 Department of Finance, Commonwealth Grants Rules and Guidelines, Department of Finance, Canberra, 2017, paragraph 13.9, p. 36.

42 In addition to the respective higher-level Accountable Authority’s Instructions, these can include internal grants management guides or handbooks, standard operating procedures (SOPs) and, where applicable, memoranda of understanding.

43 Individual services schedules to this MoU have been signed on 3 February 2017 (for rounds one and two), 26 October 2018 (for round three), 14 January 2020 (for round four), 9 August 2021 (for round five) and 19 April 2022 (for round six).

44 Infrastructure advised the ANAO in February 2022 that ‘[o]utside of the first round of the program, the Department has not been asked to provide Secretariat support for the ministerial Panel’.

45 The published guidelines identified that the ‘program delegate’ (that is, DISR), would make the final decisions on what constituted eligible activities or expenditure for both streams across all rounds.

46 Panel membership for three of the rounds has been provided in response to questions at Senate Estimates. For the first two rounds, and the current sixth round, the membership has not been published in the guidelines, otherwise announced or provided to the Parliament. In commenting on draft guidelines, the Department of Finance had suggested that the panel membership should be included.

47 In round one, an ineligible application was awarded funding; in round two, another ineligible application was not removed from consideration (although it was not awarded grant funding); and in round three, a ‘potentially ineligible’ entity received grant funding. Infrastructure advised the ANAO that legal advice obtained by DISR was that ‘the eligibility criteria under the program guidelines does not affect the validity of the grant agreement.’

48 Eligibility was not able to be confirmed for 11 (13 per cent) of the sample of 83 applications assessed as eligible; and ineligibility was not able to be confirmed for one application (eight per cent) from a sample of 13 applications assessed as ineligible.

49 Infrastructure advised the ANAO in April 2022 that the external probity adviser had been appointed in November 2021 and a probity information session was held in April 2022 to inform and advise Infrastructure staff on the new draft divisional probity plan.

50 Auditor-General Report No.12 2019–20 Award of Funding Under the Regional Jobs and Investment Packages.

51 The PGPA Act defines ‘proper’ as efficient, effective, economical and ethical.

52 Consequently, where that advice is not agreed to, it no longer provides, in its original form, the written basis required by the CGRGs and PGPA Act for funding approvals. Decision-makers are therefore required to record in writing, by modifying the advice or attaching to it, where and how they have formed a different view.

53 In accordance with the whole-of-government Parliamentary Workflow Solution Program (PWS), each briefing package was lodged by Infrastructure through the Parliamentary Document Management System (PDMS). See: Department of Finance, Parliamentary Document Management System (PDMS), available from https://www.finance.gov.au/government/whole-government-information-and-communications-technology-services/parliamentary-document-management-system-pdms [accessed 20 March 2022].

54 Merit assessments were conducted by the Business Grants Hub within DISR with the final results provided to Infrastructure.

55 This information was provided in spreadsheet format (provided first by DISR to Infrastructure) and included comprehensive project information and applicant contact details and, as the round progressed, applications’ merit assessment scores.

56 The other factors are outlined from paragraph 2.22.

57 Specifically, two officials from Infrastructure and two officials from DISR attended the 14 June 2017 session. The same two Infrastructure officials attended the subsequent 21 June and 17 August 2017 meetings. The DISR representatives did not attend the latter sessions.

58 The USB device contained — as a minimum and potentially amongst other things — an advanced copy of Infrastructure’s order of merit (which later formed ‘Attachment A’ to the department’s briefing) and a one-page summary of 17 ‘sensitive projects’ flagging varying levels of support or concern for those projects from within Infrastructure. These were provided to the chair’s office on 30 January 2019. The departmental briefing was provided on 11 February 2019.

59 The ANAO’s analysis of Infrastructure’s email records identified the delivery of the USB and some, but not all, of the documents provided. Infrastructure was unable to advise the exhaustive contents of the USB.

60 Department of Finance, Commonwealth Grants Rules and Guidelines, Department of Finance, Canberra, 2017, Glossary, p. 40.

61 Auditor-General Report No.9 2014–15 The Design and Conduct of the Third and Fourth Funding Rounds of the Regional Development Australia Fund.

62 Merit assessments were conducted by the Business Grants Hub within DISR with the results provided to Infrastructure.

63 The three categories for the Infrastructure Projects (IP) stream were projects valued at: under $1 million; between $1 million and $5 million; and over $5 million. Categories for the Community Investments (CI) stream were projects: under $20,000; between $20,000 and $100,000; and over $100,000.

64 Auditor-General Report No.12 2019–20 Award of Funding Under the Regional Jobs and Investment Packages.

65 ibid., pp. 25–28.

66 In the first two rounds, this criterion was titled ‘value for money’ and in the more recent rounds it was called ‘impact of grant funding on your project’.

67 The departments entered into a Memorandum of Understanding (MOU) for the provision of Business Grant Hub services on 19 December 2016. Schedules to this MOU were signed on 3 February 2017 (for rounds one and two), 26 October 2018 (for round three), 14 January 2020 (for round four), and 9 August 2021 (for round five).

68 These details were recorded in writing in Annex 2 of the services schedule for the third round. While arrangements for the administration of co-funding exemptions were not addressed in the previous services schedule, the processes adopted in the first two rounds were consistent with the 26 October 2018 schedule.

69 Also referred to as ‘guiding principles’, the two-page framework was provided as an attachment to the exemption approvals brief provided to the ministerial panels in the third, fourth and fifth rounds.

70 The only exception to this was during the first two funding rounds for the third merit criterion (the ‘value with relevant money’ criterion), which involved assessing the extent to which projects would leverage funding from sources in addition to the requested grant, and the likelihood of the project going ahead without the grant funding. For applications to be assessed as being value with relevant money under those two funding rounds, the minimum score accepted by the departments against the third criterion was one out of the five available (or 20 per cent), resulting in minimum overall scores of 19 out of 35 (or 54 per cent).

71 Applications that did not meet the 60 per cent threshold against one or more individual criteria were identified as ‘not value with relevant money’.

72 Principles-based frameworks are adopted in favour of prescriptive or rules-based approaches where the framework requires flexibility to be fit-for-purpose across a range of circumstances.

73 Australian Law Reform Commission, For Your Information: Australian Privacy Law and Practice (ALRC Report 108), August 2010, paragraph 4.7, p. 238.

74 Department of Finance, Submission to the Joint Committee of Public Accounts and Audit Inquiry into Auditor-General’s Reports 5, 12 and 23 (2019–20), Submission 10, March 2020, p. 3.

75 Australian Government Solicitor, Commonwealth grants: an overview of legal issues, 12 February 2019, p. 8; Department of Finance, Commonwealth Grants Rules and Guidelines, Department of Finance, Canberra, 2017, paragraph 6.3, p. 16.

76 ibid., paragraph 4.12, p. 11. This report must include a brief statement of reasons outlining the basis for the approval of each grant, and must be provided to the Finance Minister by 31 March each year for the preceding calendar year.

77 Paragraph 4.7 sets out the minimum information that officials must provide to ministers in relation to the assessment of applications against the selection criteria, and notes that ‘[a]ny specific recommendations regarding grant applications for approval can be in addition to this information’.

78 For the first four rounds, the ‘recommended’ pool of applications was a subset of those assessed as value with relevant money and formed the basis for Infrastructure’s recommendations to the ministerial panel.

79 The approach adopted in the third and fifth rounds of the BBRF was similar to the one about which the ANAO made findings in its audit of a predecessor program, the Strategic Projects Component of the Regional and Local Community Infrastructure Program (Auditor-General Report No.3 2010–11). In that audit, the department put forward as ‘recommended’ applications with an aggregate value significantly above the amount of grant funding available to be awarded, with funding decisions then taken on the basis of criteria for which there was no documented assessment of each competing application. Until this audit of the award of BBRF funding, subsequent audits of the award of grant funding under programs administered by Infrastructure had found the department had improved its approach to providing funding recommendations.

80 Secure online portal for business.gov.au, available at https://portal.business.gov.au/ [accessed 28 February 2022].

81 Round one projects were expected to be completed by no later than 30 June 2020.

82 This was reflected in Infrastructure’s advice to the ANAO in December 2021 and in its responses to questions during Senate Estimates in February 2022.

83 Department of Finance, Commonwealth Grants Rules and Guidelines, Department of Finance, Canberra, 2017, paragraph 13.9, p. 36, available from www.finance.gov.au [accessed 6 March 2022].

84 The offices of the respective Assistant Ministers for Regional Development and Territories assisted with coordinating the parliamentarian input and monitoring the spread of funding as application shortlists were developed.

85 Although the same information was compiled for the first round, it was not used to develop a shortlist. Instead, it supplemented the project details provided by the department and, for the IP stream, was used to sort applications by electorate ahead of the panel’s consideration. It was provided electronically to panel members’ offices on 13 June 2017 (the day before the first of two panel meetings for the IP stream).

86 Ministers, through their offices, commenced collating the electorate details for projects upon receipt of the application details from Infrastructure on 21 December 2018. The shortlist was refined over a four-week period and was based largely on input and preferences gathered from parliamentarians. The published guidelines did not disclose that this process would be used to inform the award of grant funding.

87 All round three email communication examined by the ANAO involved communication between panel members’ offices. This included provision of written approval for the final shortlist by panel members’ offices, two of which were provided after the final list had been provided to Infrastructure on 20 February 2019.

88 Packs were provided in hard copy for the first round, with the exception of the application summaries for the CI stream (Attachment I), which were provided on a USB device.

89 Department of Finance, Commonwealth Grants Rules and Guidelines, Department of Finance, Canberra, 2017, paragraphs 4.6 and 4.10, pp. 11–12, available from https://www.finance.gov.au/government/commonwealth-grants/commonwealth-grants-rules-and-guidelines [accessed 6 March 2022].

90 Joint Committee of Public Accounts and Audit, Inquiry into Auditor-General’s Reports 5, 12 and 23 (2019–20), Canberra, December 2020, paragraphs 2.60 to 2.66, pp. 35–36, available from https://www.aph.gov.au/Parliamentary_Business/Committees/Joint/Public_Accounts_and_Audit/AdminGovGrants/Report [accessed 27 March 2022].

91 For the BBRF, there are no processes in place to manage conflicts of interest concerning advisers involved in the assessment and decision-making process. During the course of the audit, the ANAO raised with Infrastructure the absence of any process to address conflicts of interest arising from a former executive level Infrastructure staff member (with responsibility for the BBRF in rounds three and four) and a former adviser from the Deputy Prime Minister’s office (also with responsibility for the BBRF in rounds three and four) forming a consultancy firm whose purposes include assisting entities to apply for grant funding. Another example identified by the ANAO involved a different former adviser (with responsibility for the BBRF in round two) contacting an adviser involved with the fifth round during the period in which funding decisions were being taken; his new employer was subsequently awarded a grant (the application had not scored sufficiently highly on the merit criteria to have received a grant on the basis of that assessment).

92 The JCPAA noted that this is particularly important where a minister approves a grant that a relevant official/entity has recommended be rejected or assessed as ineligible.

93 The file notes were created by a departmental official who was not present at the meetings. Departmental records suggest that the official based the record on the notes taken during the deliberations by two other attending officials.

94 Including 20 that were assessed as not being value with relevant money and one assessed as ineligible.

95 Noting that the panel was to convene for round two on an ACT public holiday, Infrastructure contacted the chair through his office four days prior to the meeting to see if officials were required to attend, as was the case for the prior round. The chair through his office advised the department that ‘no support [would] be required on the day’.

96 Consistent with this, Infrastructure advised the panel in each round that: ‘By following the recommendations of the Ministerial Briefing Package, the Approver will be declaring that they have made reasonable inquiries and are satisfied that approving the proposed expenditure represents a proper use of relevant money, in accordance with section 71’ of the Act.

97 This was similar to a finding made in Auditor-General Report No.12 2019–20 Award of Funding Under the Regional Jobs and Investment Packages, leading to an ANAO recommendation (agreed to by Infrastructure) that the department implement processes for decision-makers to re-score grant applications in circumstances where they disagree with the scoring presented by the department.

98 Including one application assessed as ineligible by DISR but assessed against the merit criteria at the request of the panel following its deliberations in June 2017 (see paragraph 4.3).

99 Ten completed recommendation review templates were missing from round two, as there were 76 projects that were approved by the panel but had not been recommended for funding.

100 For example, under the third round, the same basis for the approval of 112 projects not recommended by Infrastructure was recorded as:

In relation to the projects selected, the panel took into consideration factors as set out in the BBRF Round Three Guidelines at Section 8.1. Based on the pool of applications submitted to the panel members for consideration, the regional spread of projects and funding across the regions relative to current and previous Australian Government investments, was a key factor in the selection process. Selected projects address key infrastructure priorities in each region, taking into account Australian Government priorities that align with the Government’s intent of supporting regional Australia, focusing on economic growth and job creation. Each selected project impacts on their local community in a unique way and builds on their local, state and federal funding to contribute to the objectives of the BBRF.

101 Due to Infrastructure recommending that any application with a merit score of 60 per cent or more against each merit criterion could be funded, the basis for the panel’s decisions was recorded in respect to whether projects were considered ‘highly scored’ by Infrastructure. Highly scored applications were defined in advice from Infrastructure to the panel as those with overall scores of: 32 for the IP stream; 31 for tourism projects within the IP stream; and 24 for the CI stream. Highly scored projects able to be funded within the available funding envelope were presented at the top of the order of merit above an orange line.

102 For example, the recorded basis for the approval of 133 projects not ‘highly scored’ was: ‘The Australian Government has injected an additional $100 Million into Round five of the BBRF. This gave the panel the flexibility to support these additional projects taking into account the Australian Governments Priorities, the spread of projects and funding across the regions and the impact of each project and its benefit for communities.’

103 For example, the analysis undertaken by panel members through their offices recorded that:

Of concern only 8% of funding is forecast to be delivered in Labor electorates. Labor seats submitted 26% of applications by value. The Coalition vs Labor/Independent split will become apparent as soon as announcements are made. I have reconsidered our discussion points from last night, and made some recommendations which incorporate Members feedback and boost this Labor allocation to a more defensible percentage. These are reflected in yellow in the updated attachment. The updated project list brings the Labor funding up to 15%. Independents will receive 7% of funding.

104 Auditor-General Report No.30 2016–17 Design and Implementation of Round Two of the National Stronger Regions Fund, paragraphs 4.28 to 4.31.

105 ‘Ministerial Panel Review Templates’ were used to provide a written record of how the decision-makers for that program: arrived at a different view than that of the department as to the merits of applications; and were satisfied that their approval represented value with relevant money.

106 Conversely, applications that are assessed as ineligible or insufficiently meritorious in relation to the assessment criteria in the guidelines are unlikely to provide value with relevant money and promote the intended program objectives.

107 Although Infrastructure defined highly scored applications in its funding advice for round five, it recommended that the panel select any projects for funding that had achieved a score of at least 60 per cent against each individual merit criterion (that is, an overall score of 24 out of 40). Highly scored applications for round five were those with merit assessment scores of: 32 for the IP stream; 31 for tourism projects within the IP stream; and 24 for the CI stream.

108 Under round three, the aggregate value of grants being sought across the high scoring applications exceeded the total available funding for the round. This was not the case for rounds one, two and four, where the aggregate values aligned with the total available funding for the rounds.

109 Building Better Regions Fund Infrastructure Projects Stream, Program Guidelines, November 2016, p. 2.

110 Two of these projects were removed on 18 June 2018 at the request of ministers after consultation with the panel.

111 In an approach that was similar to the one adopted for the fifth round, the department recommended the panel select from the ‘value with relevant money – recommended’ category of 306 projects seeking an aggregate value of $426 million. The available funding under the round was $205 million. Notwithstanding this, there were 88 projects from the recommended pool that were not approved for funding in favour of others lower on the order of merit from the ‘value with relevant money’ category.

112 In rounds one, two and four, the aggregate value of funding sought from applications in the ‘value with relevant money – recommended’ category did not exceed the total available funding for those rounds.

113 The availability of the additional $100 million awarded under round five was not confirmed until after Infrastructure had provided its recommendations to the panel. If Infrastructure’s funding recommendations had not exceeded the aggregate funding available for the round, they would have been based on the $200 million originally available, consistent with the published guidelines. Given this was increased and $300 million was awarded, the ANAO’s analysis was based on the 95 highest scoring IP applications with an aggregate value of $301.5 million (across eight ranking bands or groups of applications that most closely aligned with the total funding awarded).

114 The two highest scored and ranked applications were not awarded funding.

115 Four out of the five highest scored and ranked applications were not awarded funding.

116 NSW Department of Premier and Cabinet and the NSW Productivity Commissioner, Review of Grants Administration in NSW, April 2022, available from https://www.dpc.nsw.gov.au/publications/ [accessed 9 May 2022].

117 In October 2021, the Australian Government agreed to a recommendation from the Joint Committee of Public Accounts and Audit in its Report 484 The Administration of Government Grants that the CGRGs be amended to include an eighth key principle for grants administration of ‘Adherence to published guidelines’.

118 A large proportion of federal electorates are not relevant to — and if included would skew — the analysis of the distribution of BBRF funding. This is due to the BBRF program objective to direct funding to projects outside of major capital cities. Unless applicants from major cities can demonstrate that the benefits from projects will flow into regional areas, they are not eligible for funding under the program.

119 Analysis undertaken to inform the panel’s decision-making took into consideration whether the seat was held by The Nationals or the Liberal Party.

120 Auditor-General Report No.3 2010–11 The Establishment, Implementation and Administration of the Strategic Projects Component of the Regional and Local Community Infrastructure Program.

121 Joint Committee of Public Accounts and Audit, Report 423: Review of Auditor-General’s Reports Nos 39 2009–10 to 15 2010–11, Canberra, July 2011, p. 48, available from: https://www.aph.gov.au/Parliamentary_Business/Committees/House_of_representatives_Committees?url=jcpaa/auditgen3_10/report.htm [accessed 1 June 2022].

122 Auditor-General Report No.3 2012–13 The Design and Conduct of the First Application Round for the Regional Development Australia Fund.

123 Auditor-General Report No.9 2014–15 The Design and Conduct of the Third and Fourth Funding Rounds of the Regional Development Australia Fund.

124 Joint Committee of Public Accounts and Audit, Report 449: Regional Development Australia Fund, Military Equipment Disposal and Tariff Concessions: Review of Auditor-General Reports Nos 1–23 (2014–15), Canberra, August 2015, paragraph 2.67, p. 28, available from: https://www.aph.gov.au/Parliamentary_Business/Committees/Joint/Public_Accounts_and_Audit/Report_No_9/Report_449 [accessed 1 June 2022].

125 Auditor-General Report No.30 2016–17 Design and Implementation of Round Two of the National Stronger Regions Fund.

126 Auditor-General Report No.12 2019–20 Award of Funding Under the Regional Jobs and Investment Packages.

127 Joint Committee of Public Accounts and Audit, Report 484: The Administration of Government Grants: Inquiry into Auditor-General’s Reports 5, 12 and 23 (2019–20), Canberra, December 2020, paragraph 4.91, p. 66.