Operation of Grants Hubs
Audit snapshot
Why did we do this audit?
- The Streamlining Government Grants Administration (SGGA) Program was introduced in 2015–16 to deliver simpler, more consistent and efficient grants administration across government.
- Recent audits of grants programs have identified issues with the management of grants through the hubs.
Key facts
- Funding of $106.8 million was provided to support implementation of the SGGA Program, anticipating total annual financial benefits of $400 million.
- The Departments of Industry, Science, Energy and Resources (Industry) and Social Services (DSS) built and operated two grants hubs from 1 July 2016, with oversight from the Department of Finance (Finance) and assistance from the Digital Transformation Agency (DTA).
- Between 1 July 2016 and 9 February 2021 around $45 billion in grant funding was awarded through 79,651 grants.
- Each grants hub had its own grants management information and communications technology system.
What did we find?
- There is insufficient evidence to demonstrate that the SGGA Program improved the effective and efficient delivery of grants administration.
- The interim Digital Transformation Office's and DSS's design, and Finance's governance, of the SGGA Program were not effective.
- Industry's and DSS's build and operation of the grants hubs were partly effective.
What did we recommend?
- Recommendations were aimed at: agreeing a methodology to demonstrate improvements in efficiency and effectiveness of grants administration through the hubs; developing and agreeing a future plan for the operation of grants hubs; and establishing a whole-of-government grants administration and payments dataset.
- Finance agreed to the three recommendations directed to it.
- Industry and DSS agreed to the two recommendations made to the grants hubs.
$8.4bn
in grants payments were administered by the grants service providers in 2019–20.
36,279
grants awarded were administered by the grants service providers in 2019–20.
768
grant programs were administered by grants service providers in 2019–20.
Summary and recommendations
Background
1. In the 2015–16 Budget, the Streamlining Government Grants Administration (SGGA) Program was introduced as part of the Digital Transformation Agenda—Stage One measure. To support SGGA Program implementation, $106.8 million was provided over four years. The objective of the SGGA Program was to deliver simpler, more consistent and efficient grants administration across government. This approach would enable government to deliver grants more effectively at lower cost and risk.
2. Effective implementation of this initiative was intended to lead to improved policy outcomes, an improved experience for grant applicants and recipients, a reduction in red tape, efficiencies for government entities administering grant programs, and improved transparency of, and capability to analyse, whole-of-government grants administration and payment data. Total annual financial benefits were estimated to be around $400 million.
3. The Department of Industry, Science, Energy and Resources (Industry) and the Department of Social Services (DSS) were responsible for building and operating the Business Grants Hub (BGH) and Community Grants Hub (CGH) respectively. The hubs were meant to provide standardised business processes.
4. The Department of Finance (Finance) was responsible for SGGA Program governance, with the assistance and support of the Digital Transformation Agency (DTA). The interim Digital Transformation Office (DTO) was responsible for delivery of a data warehouse; this responsibility was transferred to Finance in late 2015.
5. Initially, the 12 Commonwealth entities awarding the most grants funding were designated as participating entities in the SGGA Program. In 2018–19 and 2019–20 BGH and CGH each provided grants administration services to around 10 entities, including several that had not been designated.
6. From October 2016 the government issued amended Budget Process Operational Rules and Estimates Memoranda that set out requirements for participating entities. Participating entities were required to use the hubs unless an exemption or deferral had been granted by the Minister for Finance. This applied to all New Policy Proposals for grant programs commencing from and including Mid-Year Economic and Fiscal Outlook 2016–17. Existing grant programs were to be transitioned to the hubs by 30 June 2019.1
Rationale for undertaking the audit
7. Between 1 July 2016 and 9 February 2021, the 15 entities participating in the SGGA Program published on GrantConnect more than $44.9 billion in grant funding across 79,651 grants, representing around 85 per cent of total reported funding and awards.4 The government has invested $157.8 million since 2013–14 to support the SGGA Program and related information and communications technology (ICT) initiatives.
8. The ANAO has not previously audited the effectiveness of the SGGA Program, BGH or CGH. Recent audits of grants programs5 have identified issues with the management of grants through the hubs. The audit will provide assurance to the Parliament about the extent to which the hubs have achieved SGGA Program objectives.
Audit objective and criteria
9. The audit assessed whether the Streamlining Government Grants Administration Program improved the effective and efficient delivery of grants administration.
10. To form a conclusion against the audit objective, the audit adopted two high-level criteria:
- Was there robust design and governance of the SGGA Program to support achievement of intended outcomes?
- Have the hubs been built and operated to deliver SGGA Program objectives?
Conclusion
11. There is insufficient evidence to demonstrate the SGGA Program improved the effective and efficient delivery of grants administration. Core deliverables were not achieved, and shortcomings in the design and operation of the hubs impacted on the realisation of the intended SGGA Program benefits (better outcomes for grant applicants and recipients, reduction in red tape, and efficiencies for government).
12. The design and governance of the SGGA Program were not effective. The design of the SGGA Program was not supported by a sound evidence base. Governance arrangements were established, but did not support achievement of program outcomes, benefits and deliverables. Planning was not seen through to completion, impacting the achievement of deliverables. There was a benefits realisation framework, but it was not applied. The SGGA Program could not demonstrate the achievement of intended outcomes due to a lack of measurable indicators, baselines and targets. In relation to core deliverables, DSS and Industry built two grants hubs, but did not deliver a single whole-of-government grants administration process (six different workflows operated instead), a data warehouse or market testing.
13. The build and operation of the grants hubs were partly effective. While consistency and effectiveness in grants administration have somewhat improved, there are deficiencies in relation to usage of the hubs for the full grants lifecycle, collaboration between the hubs and client entities, and data management. The hubs have not developed an appropriate performance framework to measure the benefits. There is limited evidence that the forecast benefits of the SGGA Program have been achieved.
Supporting findings
Program design and governance
14. Evidence-based advice was not provided to government when establishing the SGGA Program. In the absence of reliable baseline information, benefits were presented to government without a reasonable basis for establishing how they would be realised. Cost studies completed in 2017 and 2018 highlighted shortcomings in the reliability of the advice to government. (See paragraphs 2.2 to 2.17)
15. The governance arrangements that were established did not effectively support the achievement of SGGA Program objectives. Governance arrangements did not always operate as intended, and changes to arrangements were not always clearly articulated. The SGGA Program implementation plan was last updated in July 2017. There was no authority for a change made to some deliverables. (See paragraphs 2.18 to 2.30)
16. A benefits realisation strategy was developed. Inconsistent with the intention of the SGGA Program implementation plan, performance indicators were not measurable and there were no baselines or targets that would clearly demonstrate the achievement of benefits and outcomes. (See paragraphs 2.31 to 2.50)
17. The achievement of SGGA Program deliverables has not been clearly established. The SGGA Program largely delivered one of four core deliverables. Two hubs, each with an ICT grants management system, were delivered. However, there were delays in the transition of grant programs to the hubs and standardised business processes were not adopted. The three other core deliverables— a single grants administration process, a data warehouse and market testing — have not been achieved. (See paragraphs 2.51 to 2.78)
Build and operation of the Grants Hubs
18. Although a benefits framework and performance indicators were designed to measure effectiveness and efficiency, the hubs did not adopt a common set of measures to support SGGA Program benefit measurement. Many performance indicators were not measurable or lacked targets. There were no baselines established for performance indicators at the outset to allow judgements to be made about the achievement of benefits. The hubs cannot clearly demonstrate that their establishment has led to more effective or efficient grants administration. (See paragraphs 3.4 to 3.19)
19. The hubs have partly supported more effective grants administration. Where data was available, BGH and CGH generally demonstrated compliance with selected legislative requirements. Business processes and workflows were established but poor data quality impedes an assessment of whether processes and workflows were followed. There are significant data management and quality issues. Client usage of services across the grants lifecycle is uneven. Evaluation services are rarely used. Service standards have been developed but are only reported by CGH. Collaboration with and support of client entities in the grant design and hub usage phases is uneven. (See paragraphs 3.20 to 3.67)
20. There is insufficient evidence to demonstrate that the hubs have supported more efficient grants administration. A key objective of the SGGA Program — to establish a standardised whole-of-government approach to streamline grants administration — has not been achieved. Although the hubs undertook regular reviews of costing models to better reflect the cost of services in prices, the hubs cannot demonstrate cost recovery. (See paragraphs 3.68 to 3.79)
Recommendation no. 1
Paragraph 2.43
Department of Finance and the hubs agree a methodology to capture and report performance information that demonstrates the efficiency and effectiveness of grants administration through the hubs.
Department of Finance response: Agreed.
Department of Industry, Science, Energy and Resources response: Agreed.
Department of Social Services response: Agreed.
Recommendation no. 2
Paragraph 2.77
Department of Finance develop and agree a future plan for the operation of grants hubs, and where this plan differs from the SGGA Program funding proposal, seek authority for changes from government.
Department of Finance response: Agreed.
Recommendation no. 3
Paragraph 3.37
To assist in the achievement of the intended benefits of the SGGA Program, Department of Finance and the hubs establish a whole-of-government grants administration and payments dataset and implement arrangements to assure the quality of the data.
Department of Finance response: Agreed.
Department of Industry, Science, Energy and Resources response: Agreed.
Department of Social Services response: Agreed.
Summary of entity responses
21. Summary responses from the audited entities, where provided, are below. Full responses from all audited entities are included in Appendix 1.
Department of Finance
The Department of Finance (the Department) agrees to the three recommendations of the report of the Australian National Audit Office, Operation of the Grants Hubs, and has been progressing work in consultation with the Department of Social Services, the Department of Industry, Science, Energy and Resources and other policy entities to refine the whole of government approach to Commonwealth grants administration. This work builds on the achievements delivered under the Streamlining Government Grants Administration (SGGA) program and addresses the issues outlined in the three recommendations.
The Department notes the positive impact of the SGGA program and the benefits it delivered, which have been highlighted in a number of independent reviews and more recently in an independent user experience survey in which grant applicants and recipients noted improvements in their experiences with Commonwealth grants administration.
To improve Commonwealth grants administration going forward, the Department, in collaboration with the grants hubs, is addressing the following areas identified in the ANAO report: better use of whole of government data and improved data quality; improved benefits measurement; further enhancing user experiences in the design and delivery of grants; refreshing governance arrangements; and improving the administrative cost and efficiency of grants administration.
Department of Industry, Science, Energy and Resources
The Department of Industry, Science, Energy and Resources acknowledges the Australian National Audit Office’s report on the operation of grants hubs.
The department accepts the recommendations, as they relate to DISER in its capacity of managing the Business Grants Hub under the whole of government Streamlining Government Grants Administration Program.
The department notes the ANAO's conclusion that the build and operation of the grants hubs were partly effective, recognising improvements in the consistency and effectiveness of grants administration. The department welcomes the ANAO's survey finding that more than 80 per cent of Business Grants Hub (BGH) client entity staff were satisfied with the BGH. We will continue to work with client agencies to improve the user experience for both policy entities and businesses alike.
The original intent of the SGGA Program included, inter alia, delivering an improved service experience for businesses and community organisations that apply for Commonwealth grants. This has been delivered through streamlining the number of different interactions, with different agencies and of differing complexity, that businesses and community groups are required to engage in. The Program was also intended to create transparency around the advertising and awarding of grant opportunities. The department considers that these foundational intentions have been delivered.
The department thanks the ANAO for its report, and commits to working with the Department of Finance and the Community Grants Hub to explore options for addressing the recommendations made in the report.
Department of Social Services
The Department of Social Services (the department) acknowledges the insights of the Australian National Audit Office’s report on the Operation of the Grants Hubs.
Since its inception on 1 July 2016, the Community Grants Hub (CGH) has delivered grants on behalf of the department and on-boarded eight other in-scope agencies. In 2020–21, the CGH managed 562 grant programs and $11 billion in funding, and is the Commonwealth’s largest shared service provider for grants administration. The CGH has an established Service Offer to support the delivery of consistent grant processes across the portfolios it supports.
The department supports the recommendations made in the report, including ongoing efforts to improve whole-of-government grants administration arrangements to better serve client agencies and grant recipients. The recommendations align with work already underway to strengthen effective and efficient grants administration at the whole-of-government level.
The department is committed to working with the Department of Finance, as the lead policy agency for grants administration, to strengthen the way it reports against efficiency and effectiveness measures, and towards establishing a whole-of-government dataset for grants administration.
Digital Transformation Agency
The Digital Transformation Agency (DTA) acknowledges the findings contained in the extract of the audit report provided for comment.
While no recommendations have been directed to the DTA through this audit, we nonetheless recognise the importance of carrying forward the insights it contains to inform the design and delivery of future digital investments by the Australian Government.
Since the Streamlining Government Grants Administration program concluded, the DTA has led the implementation of a range of reforms under the stewardship of the Secretaries Digital Committee to strengthen coordination and oversight of digital investments. This has most recently included implementing the Australian Government’s Digital and ICT Investment Oversight Framework.
The DTA will carefully consider the findings and recommendations of this audit once it has been tabled in the Parliament and take steps to identify all learnings and insights and, to the extent they have not already been addressed, feed them into administration of the Oversight Framework and its associated policies and standards.
Key messages from this audit for all Australian Government entities
Below is a summary of key messages, including instances of good practice, which have been identified in this audit and may be relevant for the operations of other Australian Government entities.
Governance and risk management
Program design
Records management
1. Background
Introduction
Streamlining Government Grants Administration Program
1.1 In the 2015–16 Budget, the Streamlining Government Grants Administration (SGGA) Program was introduced as part of the Digital Transformation Agenda—Stage One measure. The measure provided $106.8 million over four years to support implementation.6 The objective of the SGGA Program was to deliver simpler, more consistent and efficient grants administration across government. This approach would enable government to deliver grants more effectively at lower cost and risk.
1.2 To achieve this objective, four core deliverables were established (Table 1.1).
| Deliverables |
| --- |
| Two grants hubs. Each hub would maintain its own grants management information and communications technology (ICT) system and adopt standardised business processes for grants administration. Migration of grant programs to the hubs would commence in July 2016 and be completed by September 2017. |
| A single whole-of-government grants administration business process and workflows. |
| A whole-of-government grants data warehouse. |
| Market testing of grants administration.b |
Note a: Participating entities provided more than 90 per cent of the total value of Australian Government grant funding and comprised: Department of Agriculture; Attorney-General’s Department; Department of Defence (including Defence Materiel Organisation); Department of Education and Training; Department of Employment; Department of the Environment; Department of Health; Department of Industry and Science; Department of Infrastructure and Regional Development; Department of the Prime Minister and Cabinet; Department of Social Services; and Department of Veterans’ Affairs. The Department of Communications and the Arts was added in December 2015. Subsequent machinery of government changes and the inclusion of entities have led to modifications in the number of participating entities over time.
Note b: A market test involves considering alternative delivery models including approaching the market to determine if another organisation could provide a function more efficiently (having regard to cost, benefits and risks).
Source: Summary of April 2015 SGGA Program funding proposal, and SGGA Program implementation plan.
1.3 Industry and DSS were responsible for building and operating the Business Grants Hub (BGH) and Community Grants Hub (CGH), respectively.7 The hubs would implement standardised business processes, with oversight from the Department of Finance (Finance). Finance was responsible for SGGA Program governance. The interim Digital Transformation Office (DTO) was responsible for delivery of a data warehouse.8 The Digital Transformation Agency (DTA) was responsible for providing support to Finance and overseeing user experience deliverables.9
1.4 Effective implementation of this initiative was intended to lead to:
- improved policy outcomes by better targeting grants;
- an improved experience for grant applicants and recipients;
- a reduction in red tape;
- efficiencies for government entities administering grant programs; and
- improved transparency of, and capability to analyse, whole-of-government grants administration and payment data.
1.5 Total annual financial benefits were estimated to be around $400 million.
Program operation and funding
1.6 Funding allocated by government for the implementation of the program and related ICT systems initiatives included $106.8 million allocated over four years in the 2015–16 Budget for the SGGA Program and $35.2 million over two years in the 2017–18 Budget from the Public Service Modernisation Fund (see Appendix 3).10 In total, $157.8 million was allocated over six years from 2013–14. In April 2015 the government agreed that hub operational costs would be recovered through a user-pays cost recovery model based on a unit-priced service catalogue.11
1.7 The hubs operate as a shared service arrangement for grants administration.12 Under an arrangement with a hub13, the client entity14 retains policy control for its grants programs. The accountable authority of the client entity must establish an appropriate internal control framework and manage risks, and is responsible for grants administration compliance with the Public Governance, Performance and Accountability Act 2013 (PGPA Act), including the Commonwealth Grants Rules and Guidelines (CGRGs).15
1.8 The hub provides grants administration services to the client entity, the nature and extent of which are selected by the client entity from the service catalogue. The service catalogue was to offer end-to-end grants administration services of design, selection, establishment, management and evaluation (also referred to as the grants lifecycle).
1.9 Both Industry and DSS commenced building their hubs in 2015, and the hubs commenced operation from 1 July 2016. Each hub used a single separate grants management ICT system (see Appendix 4).
Participation in the SGGA Program
1.10 Australian Government Grants – Briefing, Reporting, Evaluating and Election Commitments (Resource Management Guide 412) encourages all non-corporate Commonwealth entities to leverage whole-of-government initiatives when developing grant policy proposals, and requires participating entities to consume grants administration services from either the BGH or CGH.
1.11 From October 2016 the government issued amended Budget Process Operational Rules (BPORs) and Estimates Memoranda16 that set out requirements for participating entities. All New Policy Proposals (NPPs) for grant programs commencing from and including Mid-Year Economic and Fiscal Outlook 2016–17 were to be implemented via the hubs.17 From November 2017 existing grant programs that did not have an exemption or deferment were required to transition to the hubs by 31 March 2019.18 The hubs were required to implement all in-scope grant programs by 30 June 2019.
1.12 GrantConnect19 was launched in February 2017, with mandatory reporting of grant opportunities from April 201720 and grants awarded from 31 December 2017.21 In 2018–19 and 2019–20, 28 entities reported grant opportunities and grants awarded on GrantConnect.
1.13 In November and December 2020, the ANAO surveyed these entities to determine the number that had entered into agreements with a grants administration service provider (Table 1.2). Grant administration service providers include BGH, CGH, National Health and Medical Research Council (NHMRC)22, Australian Research Council (ARC)23 and SmartyGrants.24
| Service provider | Number of entities citing an agreement | Surveyed entities |
| --- | --- | --- |
| BGH | 9 | Department of Agriculture, Water and the Environment (DAWE); Austrade; Department of Defence (Defence); Industrya; Department of Education, Skills and Employment (DESE); Department of Health (Health); Department of Home Affairs (Home Affairs); Department of Infrastructure, Transport, Regional Development and Communications; and the Department of the Treasury |
| CGH | 10 | Attorney-General's Department; DAWE; DSSa; Department of Veterans' Affairs; DESE; Health; Home Affairs; National Disability Insurance Agencyb; National Indigenous Australians Agency; and the Department of the Prime Minister and Cabinet |
| NHMRC | 3 | Cancer Australia, Health and NHMRCa |
| Other | 2 | Australian Communications and Media Authority (ACMA)c and Defenced |
| Entities with no service provider | 10 | ARCe; Australian Securities and Investments Commission; Australian Taxation Office; Department of Foreign Affairs and Trade; Finance; Great Barrier Reef Marine Park Authority; National Blood Authority; NHMRCe; Organ and Tissue Authority; Safe Work Australia; Wine Australiaf |
Note a: Industry, DSS and NHMRC provide grants administration services to grant programs for which their entity has policy responsibility. These entities do not enter into head agreements or schedules with policy areas.
Note b: DSS advised the ANAO in February 2022 that the National Disability Insurance Agency did not have an agreement with CGH in 2021–22.
Note c: ACMA advised the ANAO that it received an exemption in January 2018 from the moratorium that prohibited acquisition of grants management ICT systems. ACMA used the SmartyGrants ICT system to deliver grant programs in 2018–19 and 2019–20.
Note d: ARC advised the ANAO that it has been delivering the Per- and Poly-Fluoroalkyl Substances (PFAS) program as a special research initiative under an agreement with the Department of Defence since 2018.
Note e: ARC and NHMRC administer their own grants.
Note f: Wine Australia is a Corporate Commonwealth Entity (CCE). The CGRGs apply to CCEs where they undertake grants administration on behalf of the Commonwealth.
Source: ANAO survey.
1.14 Table 1.3 shows an increase in the number of grant programs and awards, and the value of grants payments, between 2018–19 and 2019–20, as reported to the ANAO by ARC, BGH, CGH and NHMRC. The service providers advised the ANAO that in 2019–20 they administered 768 grant programs and managed 36,279 grants awarded, and made grants payments of $8.4 billion.
| | ARC 2018–19 | ARC 2019–20 | BGH 2018–19 | BGH 2019–20 | CGHb 2018–19 | CGHb 2019–20 | NHMRC 2018–19 | NHMRC 2019–20 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Active programsc d | 1 | 0 | 43 | 116 | 578 | 650 | 2 | 2 |
| Active grants awardedc d | 4 | 4 | 13,845 | 12,932 | 19,832 | 22,987 | 149 | 356 |
| Total value of grants payments (millions)c | $5 | $4 | $769 | $1,277 | $6,735 | $6,919 | $54 | $216 |
Note a: Data in this table has not been validated by the ANAO and the ANAO has not determined if entities have interpreted the data request differently.
Note b: For CGH the data represents grant programs where CGH was responsible for administering the manage phase of the grants lifecycle.
Note c: Between November and December 2020, the ANAO requested benchmark advice from ARC, BGH, CGH and NHMRC: 'For 2018–19 and 2019–20, where your entity receives payment for any grant administration services and makes payments to recipients, please provide the number of active grant programs, active grant awards funded by your entity, and total value of grant payments to grant recipients.' The ANAO specified this should include all grants administered by the hub, including DSS grants administered by the hub and grants administered by the hub on behalf of client entities.
Note d: Active means that the grant program or grants awarded commenced prior to the end of the financial year and have not been closed or completed. For example, if a planned report, final payment, acquittal or evaluation has not yet occurred the program or award is considered to be active.
Source: ARC, Industry, DSS and NHMRC advice to ANAO in December 2020 based on a standard request.
Rationale for undertaking the audit
1.15 Between 1 July 2016 and 9 February 2021, the 15 entities participating in the SGGA Program published on GrantConnect more than $44.9 billion in grant funding across 79,651 grants, representing around 85 per cent of total reported funding and awards.27 The government has invested $157.8 million since 2013–14 to support the SGGA Program and related ICT initiatives.
1.16 The ANAO has not previously audited the effectiveness of the SGGA Program, BGH or CGH. Recent audits of grants programs have identified issues with the management of grants through the hubs. The audit will provide assurance to the Parliament about the extent to which the hubs have achieved SGGA Program objectives.
Audit approach
Audit objective, criteria and scope
1.17 The audit assessed whether the Streamlining Government Grants Administration (SGGA) Program improved the effective and efficient delivery of grants administration.
1.18 To form a conclusion against the audit objective, the audit adopted two high-level criteria:
- Was there robust design and governance of the SGGA Program to support achievement of intended outcomes?
- Have the hubs been built and operated to deliver SGGA Program objectives?
1.19 The audit focused on the roles of Finance, the interim DTO and the DTA in establishing the design and governance of the SGGA Program, and on Industry's and DSS's build and operation of the hubs as shared service arrangements for the administration of grant programs. The audit did not examine in detail the administration of individual grant programs or their outcomes, or grant recipients' experience.
Audit methodology
1.20 In undertaking the audit, the ANAO:
- examined documentation collected from Finance and DTA relating to the design and governance of the SGGA Program;
- examined documentation collected from Industry and DSS relating to the build and operation of the hubs;
- examined relevant sections of the PGPA Act and Rules, the CGRGs and Finance guidance material;
- analysed data extracted by Finance from GrantConnect, and by Industry and DSS from the BGH and CGH grants management ICT systems;
- surveyed 28 Commonwealth entities that reported grant opportunities or grants awarded in 2018–19 and 2019–20 on GrantConnect (see Appendix 5 for an overview of the purpose and responses to the ANAO Survey); and
- met with Industry, DSS and Finance staff.
1.21 The audit was conducted in accordance with ANAO auditing standards at a cost to the ANAO of $1,090,000.
1.22 The team members for this audit were Tracey Martin, Natalie Maras, Chay Kulatunge, Supriya Benjamin, Stephenson Li, Dung Chu, Alicia Vaughan, Runal Velso, Christine Chalmers and Peta Martyn.
2. Program design and governance
Areas examined
This chapter examines whether the design and governance of the Streamlining Government Grants Administration (SGGA) Program supported the achievement of intended deliverables and benefits.
Conclusion
The design and governance of the SGGA Program were not effective. The design of the SGGA Program was not supported by a sound evidence base. Governance arrangements were established, but did not support achievement of program outcomes, benefits and deliverables. Planning was not seen through to completion, impacting the achievement of deliverables. There was a benefits realisation framework, but it was not applied. The SGGA Program could not demonstrate the achievement of intended outcomes due to a lack of measurable indicators, baselines and targets. In relation to core deliverables, DSS and Industry built two grants hubs, but did not deliver a single whole-of-government grants administration process (six different workflows operated instead), a data warehouse or market testing.
Areas for improvement
The ANAO made two recommendations aimed at agreeing a performance measurement methodology that demonstrates the achievement of intended outcomes; and developing and agreeing a future plan for the operation of grants hubs.
2.1 To determine whether the design and governance of the SGGA Program supported the achievement of intended outcomes the audit considered whether:
- evidence-based advice was provided to government when establishing the SGGA Program;
- governance and planning supported the achievement of objectives;
- program benefits were measured; and
- key program deliverables were achieved.
Was evidence-based advice provided to government?
Evidence-based advice was not provided to government when establishing the SGGA Program. In the absence of reliable baseline information, benefits were presented to government without a reasonable basis for establishing how they would be realised. Cost studies completed in 2017 and 2018 highlighted shortcomings in the reliability of the advice to government.
Development of the SGGA Program funding proposal
2.2 Consideration of options to improve grants administration commenced in 2012. This included progressing various grants management streams of work such as funding for GrantConnect and the Department of Social Services' (DSS's) grants management information and communications technology (ICT) system, and transitioning Department of Health and Ageing funding programs to DSS's grants management ICT system. In October 2013, the Secretaries Board endorsed the Transforming and Modernising Government Programme, which included ten initiatives relating to improving grants administration.
2.3 In May 2014 the Secretaries ICT Governance Board28 decided that the Department of Finance (Finance) would lead a feasibility study to reduce the number of grants management ICT systems and draw together other improvements to grants administration.
2.4 Concurrently, the Efficiency Working Group29 considered options for grants administration improvements between August and September 2014. Options included reducing the number of entities administering grants, having a single entity managing grants, having a single grants management system, and outsourcing grants administration. Several entities noted that they had achieved some efficiencies already by centralising grants management within their entity.30
2.5 In September 2014 the Efficiency Working Group recommended to the Secretaries Committee on Transformation31 establishing a whole-of-government grants management system involving a standardised grants lifecycle, pre-qualification and a single ICT system, and consolidating administrative functions for grants in a single entity or small number of hubs.
2.6 In late 2014 an eGovernment Ministerial Taskforce on Digital Transformation was established to develop a whole-of-government plan to enhance digital delivery of government services. This Taskforce oversaw the development of the Digital Transformation Agenda and was supported by an eGovernment Steering Committee including representatives from Department of the Prime Minister and Cabinet (PM&C), Department of Communications, Finance, Industry and DSS.
2.7 In March 2015 the government decided the work of the Efficiency Working Group relating to improving grants administration would be included in the Digital Transformation Agenda.
2.8 Figure 2.1 provides an overview of meetings and decisions between October 2013 and April 2015 of the Secretaries Board32, the eGovernment Ministerial Taskforce on Digital Transformation, and various taskforces, boards, committees and working groups, that led to a SGGA Program funding proposal.
Source: ANAO summary of Digital Transformation Agency and Department of Finance documentation, funding proposals and Government decisions relating to improvements in grants administration and the Digital Transformation Agenda.
2.9 In 2013 and 2014 Finance undertook surveys of Commonwealth entities and their grant activity, and of grant recipients and their experiences when searching and applying for grants, to inform a pre-feasibility report and scoping study for GrantConnect. In 2014 and 2015 Finance conducted further surveys to capture information about grants expenditure and ICT systems for a feasibility study.33 Finance was unable to provide the ANAO with any documentation for these surveys.
2.10 The eGovernment Steering Committee met in March and April 2015 to draw together the grants administration and ICT components into a single funding proposal.
2.11 In the final meeting of the eGovernment Steering Committee on 2 April 2015, the interim Digital Transformation Office (DTO) presented information on grant cost types and drivers to provide an indication of baseline costs related to grants administration (see Figure 2.2). The information shown in this presentation was not reliable.
- The ANAO was unable to reproduce the average grant costs depicted, based on the information in the presentation.
- Although the interim DTO’s presentation purported that it relied on information from Finance surveys, the data shown was not consistent with findings of the 2013 and 2014 pre-feasibility study or GrantConnect scoping study, which did not capture information about costs and related to a different financial year.
- The presentation states that an incomplete data set was used for indicative purposes only to provide a proportional cost comparison.
- Other graphs in the same presentation indicated the data related to 10 granting entities, which included entities that were not affected, and excluded some entities that were affected, by the SGGA Program.
Note: A note to the diagram contained in the original material provided to ANAO reads: ‘raw data from Finance ICT surveys. This does not represent a complete data set and is used for indicative purposes only for a proportional comparison of costs.’
Source: Presentation to eGovernment Steering Committee, 2 April 2015.
2.12 By centralising grants information, standardising administrative processes and consolidating ICT, the interim DTO identified six areas of estimated financial benefit totalling $370 million a year (Figure 2.3).34 This comprised $300 million a year from administered funding, $50 million a year from program administration, and $20 million a year from ICT systems.
Note: A note to the diagram contained in the original material provided to ANAO reads: ‘The $200M ICT platform consolidation cost is based on early decommissioning of legacy systems. The $20M cost reflects a replacement as part of natural ICT refresh.’
Source: Presentation to eGovernment Steering Committee, 2 April 2015.
2.13 On 24 April 2015 the Minister for Communications and the Minister for Social Services finalised a funding proposal for the SGGA Program component of the Digital Transformation Agenda.
2.14 Table 2.1 sets out the SGGA Program benefits identified in the funding proposal. Financial benefits were estimated to exceed $400 million annually, with 85 per cent to be achieved by December 2017. The funding proposal indicated that benefits would be quantified as part of detailed transition planning with granting entities. However, the funding proposal did not establish baselines, a methodology for measuring benefits or performance measures to demonstrate achievement of outcomes.
| Benefit area | Benefits to be realised | Financial benefit |
| --- | --- | --- |
| 1. Better outcomes for usersa | A. Better targeting of grants (achieved through making grants easier to find and better information) will improve policy outcomes. | A one per cent improvement in targeting would lead to benefits of around $300 million a year |
| | B. An improved client experience will be delivered by making it easier to find and apply for grants. | Contributes to around $32.7 million reduced annual regulatory burden |
| 2. Reduction in red tape | C. A consistent grants process across government including a common authentication system and the ability to re-use applicant registration and pre-qualification information (for example, demonstrating financial viability) will deliver red tape reduction for individuals and businesses. | |
| 3. Efficiencies for Government | D. Standardised and streamlined administrative processes and improved scheduling and productivity of specialised staff in the hub shared services centres will lead to potential efficiencies. | Potential efficiencies of approximately $50 million annually (based on a 20 per cent reduction in administrative costs) |
| | E. Capability to analyse whole-of-government grants administration and payments data. | None specified |
| | F. A reduction in capital investment in grants systems over 10 years (depending on the outcome of market test) through further consolidation of ICT systems. | Not quantified but subsequently estimated to be either or both $20 million a yearb and $100 million over 10 yearsc |
| Total | | $400 million a year |
Note a: Better outcomes for users were to be achieved through the implementation of the SGGA Program and grants.gov.au (GrantConnect).
Note b: Based on the draft August 2015 SGGA Program Implementation Plan. On 2 April 2015, the interim DTO estimated financial benefits of $20 million a year arising from whole-of-government integration and consolidation of ICT when presenting to the eGovernment Steering Committee.
Note c: Based on SGGA Program implementation plans between August 2015 and July 2017, which included a long-term outcome of avoidance of capital costs through the streamlining of 20 ICT systems to two or potentially one system. The implementation plans indicate the capital avoided estimate was based on benchmarks from a similar program to consolidate parliamentary workflow systems to a single government platform.
Source: ANAO summary of SGGA Program funding proposal.
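As a simple arithmetic cross-check (the components and the stated total are from Table 2.1; the addition is ours), the quantified benefits are broadly consistent with the stated total:

$$\$300\text{m} + \$32.7\text{m} + \$50\text{m} + \$20\text{m} = \$402.7\text{m} \approx \$400\text{m a year}$$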
2.15 In March 2017, two years after the decision to commence the SGGA Program, Finance completed a baseline study of grants administration costs for 2014–15. The 2017 cost baseline study presented, for the entities participating in the SGGA Program35, total expenditure on grant programs, administration, ICT and full-time equivalent (FTE) staffing, and showed costs by grants lifecycle phase (Table 2.2). Finance repeated elements of the baseline study for 2015–16 and 2016–17 and reported the results to the SGGA Reference Group in May 2018.36 Finance observed that entities experienced difficulty completing the exercise, as cost information for grants administration was not readily available.
| Baseline annual estimates ($m) | Funding proposal (April 2015) | Finance cost study (2014–15) | Finance cost study (2015–16) | Finance cost study (2016–17) |
| --- | --- | --- | --- | --- |
| Total grants funding | 30,000 | 8,000 | 9,000 | 11,000 |
| Administration costs | 300 | 350 | 296 | 306 |
| Administration costs, excluding ICT | 250 | 326 | 266 | 275 |
| ICT costs | 50 | 15 | 30 | 31 |
Note: The information in the SGGA Program funding proposal in April 2015 was based on information from 10 entities, whereas the baseline cost studies were based on information from 13 participating entities.
Source: SGGA Program funding proposal, and Finance cost baseline reports and documentation.
2.16 The March 2017 study indicated that the initial policy advice had overestimated the ICT proportion of total grants administration costs by $35 million. The overestimation impacted on the assumption that the program would deliver $100 million in savings through the ICT moratorium.37 Similarly, the funding proposal incorporated total grant funding of $30 billion to derive a saving of $300 million but, as indicated in Table 2.2, actual grants funding was significantly less than estimated. The ANAO has previously observed that a single grant program can significantly increase the total grant funding awarded in a financial year, leading to significant variation from year to year.38 This highlights the risk of basing estimates on a single year of data.
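To illustrate the scale of these gaps (our arithmetic using the figures in Tables 2.1 and 2.2, not calculations drawn from the report), the ICT baseline overestimate follows from the proposal and 2014–15 study figures, and the one per cent targeting improvement yields far less on the actual funding base than on the assumed one:

$$\$50\text{m} - \$15\text{m} = \$35\text{m}, \qquad 0.01 \times \$30\text{b} = \$300\text{m a year}, \quad \text{whereas} \quad 0.01 \times \$8\text{b} = \$80\text{m a year}$$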
2.17 Government was not advised of the changes in expectations regarding benefits to be derived from the SGGA Program.
Did governance and planning support the achievement of program objectives?
The governance arrangements that were established did not effectively support the achievement of SGGA Program objectives. Governance arrangements did not always operate as intended, and changes to arrangements were not always clearly articulated. The SGGA Program implementation plan was last updated in July 2017. There was no authority for a change made to some deliverables.
SGGA Program governance
2.18 Finance adopted typical governance structures and roles for the SGGA Program, which included:39
- a senior responsible officer (SRO) from Finance (at Senior Executive Service — SES— Band 2 level), supported by a program manager from Finance and a Finance Program Management Office;40
- seven sponsoring committees (at the Secretary and Deputy Secretary level) that were reported to on an as required basis;41
- a Governance Board, established in June 2015, chaired by the SRO and with SES Band 2 members from Finance, the interim DTO, the newly established grants hubs (the Business Grants Hub [BGH] and Community Grants Hub [CGH]), and participating entities;
- Program Management Group42 and Grants Hubs Catch-ups;43
- an SGGA Reference Group chaired by Finance, with members at SES Band 1 level; and
- at times, other committees and groups to foster collaboration between Finance, the interim DTO and Digital Transformation Agency (DTA), the National Health and Medical Research Council (NHMRC), the Australian Research Council (ARC), the new hubs and client entities.
2.19 The Governance Board’s role was to provide broad oversight of SGGA Program implementation44, including approving Program artefacts and deliverables.45
2.20 The governance arrangements changed over time. These changes were reflected in versions of the SGGA Program implementation plan until July 2017. After July 2017 the SGGA Program implementation plan was not kept up to date and changes to governance arrangements were conveyed through an April 2018 governance review, briefings to sponsoring committees and the 2019 terms of reference for the SGGA Reference Group. Changes to the key decision-making and implementation oversight committee supporting the SRO (from the Governance Board to the Program Management Group and Grants Hub Catch-ups in 2018 and to the SGGA Reference Group in 2019) were not clearly articulated at the time to the relevant governance bodies.
2.21 Figure 2.4 shows selected SGGA Program governance arrangements in place in August 2015, July 2017 and July 2020. Figure 2.4 does not reflect all changes that occurred between 2015 and 2020.
Source: draft August 2015 SGGA Program Implementation Plan, July 2017 SGGA Program Implementation Plan, and ANAO summary of 2020 meeting records and terms of reference documents.
2.22 Some of the changes to governance arrangements were made in response to the recommendations of reviews, including gateway reviews46 (see Figure 2.6), completed between 2016 and 2020. A summary of reviews and governance-related recommendations is presented in Table 2.3.
| Review | Summary findings and recommendations |
| --- | --- |
| February 2016 First-stage Gateway Review | The review report found there was a governance framework in place but that it needed to work more effectively. Participating entities sought greater rigour, clarity and accountability in governance arrangements, including the changed roles of DTO and Finance, and Finance taking a stronger leadership role in driving agendas in meetings. The review made two recommendations relating to the Governance Board's focus on benefits and transition, and to SGGA Reference Group responsibilities and participation. |
| August 2017 First mid-stage Gateway Review | The review report noted that good progress had been made since the previous review, with entities complimenting the range of governance documents and frameworks developed by Finance, and generally positive feedback about the operation of the Governance Board and SGGA Reference Group. No recommendations were made to change the governance framework. |
| April 2018 Finance Governance Review | The review was initiated at a December 2017 meeting of the Governance Board in response to the maturation of the SGGA Program. The review observed that early governance arrangements reflected the need for intense engagement and collaboration with the hubs and client entities at the start-up phase, and proposed changes to reflect the maturity of the Program. |
| March 2019 Second mid-stage Gateway Review | The review report noted that governance arrangements needed to be developed post-transition of grant programs to the hubs, recommending Finance establish a small group of stakeholders to determine governance arrangements. |
| October 2020 PricewaterhouseCoopers (PwC) Post Implementation Review | The review found that there was no uniform governance model in place between grants hubs and policy partners. The report recommended Finance lead a review to re-examine and confirm the governance arrangements, including the respective roles and responsibilities of the grants hubs, policy partners and Finance. |
| June 2021 End-stage Gateway Review | The review concluded that the second mid-stage gateway review governance recommendation had been partially addressed. It noted there was evidence discussions had occurred regarding governance; however, appropriate program governance arrangements were not in place. Concerns were expressed about, among other things, risk management and authorities for decisions and changes. |
Source: Review reports.
2.23 In addition, the ANAO identified further improvements that could have been made to governance arrangements.
- The governance arrangements would have benefited from further senior management involvement to drive the achievement of SGGA Program objectives, deliverables and outcomes.
- Responsibilities were established for most key positions, committees and entities. Terms of reference could have been developed for the Program Management Group and Grants Hubs Catch-ups, as this would have created a shared understanding of the purpose and role of these bodies.
- Meetings should have occurred as scheduled, and records should have been kept, in the case of the Governance Board and SGGA Reference Group.
SGGA Program planning
2.24 The SGGA Program funding proposal included details of deliverables, milestones and responsibility for establishing governance arrangements.
2.25 The Finance Program Management Office was responsible for developing and maintaining an implementation plan, which was approved via the Governance Board. The SGGA Reference Group also had responsibility for reviewing updates to the implementation plan.47
2.26 The implementation plan was developed for use by Finance, DTO, DTA, the hubs, participating entities and members of the Governance Board. The plan was intended to provide an overview of the program and describe the mechanisms to track and control progress and was to be supported by separate planning documents for specific projects, such as a data warehouse.
2.27 The plan and supporting documentation covered: the objectives and policy context; a statement of benefits and program outcomes; deliverables; governance arrangements; risk and issues management; and monitoring, review and evaluation.
2.28 The implementation plan was considered to be a ‘living document’ to accommodate an iterative approach to the program of work. Over time, there were changes to many aspects of the plan, including reassignment of responsibilities; changes to deliverables and timeframes; reclassification of benefits; and addition of long-term outcomes. Changes to benefits and deliverables were generally agreed as part of ongoing review by the SRO, Governance Board or SGGA Reference Group. The plan was generally consistent with advice to government although some long-term outcomes (such as ICT consolidation, avoidance of capital costs and business process standardisation) were added and these were not specified in advice to government.
2.29 The July 2017 SGGA Program Implementation Plan outlined the status of deliverables and made some changes to key milestones and deliverables, some of which were not authorised by government (Table 2.4). The Governance Board and SGGA Reference Group did not have the appropriate authority to change deliverables. Deliverables were not monitored after July 2017. Finance advised the ANAO in February 2022 that while the implementation plan was not updated from July 2017, deliverables were monitored through governance arrangements and the Public Service Modernisation Fund reporting.
| Key deliverable | Key milestones in April 2015 funding proposal | Milestone due dates | Delivery status in July 2017 SGGA Program Implementation Plan | Revised milestone due dates |
| --- | --- | --- | --- | --- |
| Develop a whole-of-government grants administration business process and workflows | Endorsed by Secretaries Committee on Transformation | March 2016 | Governance Board endorsed December 2016 | – |
| | Implemented by participating entities | December 2016 | December 2016 | – |
| Agency transition to grants hubs | Secretaries Committee on Transformation agrees entity transition schedule to hubs | March 2016 | Transition intentions report provided to Secretaries Board September 2016 | – |
| | Participating entities commence transition to hubs | July 2016 | – | July 2017b |
| | Participating entities' new grant programs transitioned to the hubsa | September 2017 | – | June 2019b |
| Establish grants hubs | Transition plans approved | September 2015 | – | November 2015d |
| | Develop service offering and catalogue | November 2015 | November 2015 | – |
| | Prequalification capability operational | March 2016 | July 2016 | – |
| | Interface between grants hubs and GrantConnect operational | June 2016 | Removed as a milestonec | – |
| | Grant hubs established and operational | June 2016 | July 2016 | – |
| Establish a data warehouse | Standardise data elements | March 2016 | – | October 2017 |
| | Single data warehouse operational and integrated with administration hub systems and GrantConnect | June 2016 | Removed as a milestonee | – |
| | Single data warehouse integrated with non-hub systems | June 2017 | Removed as a milestonee | – |
| Undertake market testingf | Planning and analysis of market test | September 2017 | – | September 2017 |
| | Complete market testing for feasibility of single ICT system or outsourced administrative services | December 2017 | – | December 2018 |
| Governance and stakeholder managementg | Whole-of-government governance arrangements for grants administration agreed and established | June 2015 | June 2015 | – |
| | Collaborative Head Agreement drafted and agreed | September 2015 | December 2015 | – |
| | Stakeholder management plan agreed | June 2015 | September 2015 | – |
Note a: This milestone assumes client entities’ business process transformation is complete.
Note b: Transition schedule and completion revised to reflect changes agreed in the 2017–18 budget measure Public Service Modernisation Fund.
Note c: Phase one of Data Warehouse Scoping Study (October 2016) found the interface between the Hubs and GrantConnect for enhanced data analytics and reporting is not required.
Note d: The July 2017 SGGA Program Implementation Plan noted grants hubs were formally launched on 1 July 2016 and business processes standardised by December 2016.
Note e: Deliverable milestones for the data warehouse changed in the July 2017 SGGA Program Implementation Plan to include a Phase One Data Warehouse Scoping Study and a Phase one Prototype to be completed in July 2017, and reporting to government on the outcomes of market testing, and SGGA Program outcomes and benefits in the 2019–20 Budget Process.
Note f: Dependent on the outcome of the market test further milestones included commencing transition to a single ICT system or administrative services by September 2018 and completing this transition by December 2019.
Note g: Additional governance and stakeholder deliverable milestones were included in the July 2017 SGGA Program Implementation Plan including agreeing a high-level design in October 2016, a benefits realisation strategy in March 2017, and planning to agree a communications strategy in June 2017 and complete program reviews in October 2017, July 2018 and August 2019.
Source: SGGA Program funding proposal and July 2017 SGGA Program Implementation Plan.
2.30 Gateway reviews (see Figure 2.6) made findings in relation to implementation planning.
- The 2016 first-stage gateway review found that many critical controls and frameworks described in the implementation plan had not been initiated.
- The 2016 review found that there were inconsistent approaches to risk management by the overarching SGGA Program and participating entities.
- The 2017 first mid-stage gateway review observed that areas requiring attention included the clarity of program scope and outcomes, onboarding of client entities and emerging timeframe pressures.
- The 2019 second mid-stage gateway review noted that significant risks were untreated and had the potential to affect commitment to, and timing and sustainability of, the SGGA Program.
- The 2021 end-stage gateway review concluded a program management methodology and standard practices had not been consistently applied to drive the delivery of the program.
Were program benefits measured and reported?
A benefits realisation strategy was developed. Inconsistent with the intention of the SGGA Program implementation plan, performance indicators were not measurable and there were no baselines or targets that would clearly demonstrate the achievement of benefits and outcomes.
2.31 The SGGA Program funding proposal identified three major benefits from the SGGA Program: better outcomes for users, reduction in red tape and efficiencies for government (see Table 2.1). To demonstrate these benefits, the SGGA Program required a framework for performance measurement, baseline information to compare performance against, and data and information about client entity experiences.
SGGA Program benefits realisation framework
2.32 No benefit measures were included in the SGGA Program funding proposal.
2.33 There was an intention to develop and maintain, in consultation with stakeholders, a ‘benefits realisation framework’ to identify the quantitative and qualitative benefits of the SGGA Program. The framework was to describe relevant measures, baselines, targets and schedules for realisation. Benefits reporting was to be provided to the Governance Board and Secretaries Committee on Transformation.
2.34 In February 2016 the first-stage gateway review recommended that the SGGA Program commence, as a priority, the outcomes and benefits activities indicated in the implementation plan.
2.35 Three benefit realisation frameworks have been endorsed or approved since the commencement of the SGGA Program: in December 2016, March 2017 and October 2018. These were endorsed by the Governance Board or SRO. A logic map was used to link benefits to outcomes and measures. Figure 2.5 provides an overview of the development of the frameworks and the measurement of benefits.
Source: ANAO summary of Finance documentation.
2.36 The frameworks covered four core benefits: improved policy outcomes, efficiencies for government, improved user experience, and enhanced capability to use whole-of-government grants data. This represented a change in benefit areas from those identified in the funding proposal.48
2.37 Ten benefit measures were identified in the 2016 framework. This was increased to 25 in the 2017 framework and reduced to 13 in the 2018 framework.49 The 2017 and 2018 frameworks identified, for each measure, benefit owners who had responsibility for measuring the benefit.50 However, the indicators were often not measurable51, as methodologies had not been finalised and agreed with entities and were incomplete. The frameworks did not specify baselines, included targets only in 2017, and did not include a monitoring and reporting plan.52
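For context, a measurable indicator pairs a methodology with a baseline and a target. A hypothetical sketch of such a definition is below; all field names and values are illustrative and are not drawn from the SGGA frameworks:

```python
# Illustrative structure of a complete, measurable benefit measure of the kind
# paragraph 2.37 finds was missing. All values are hypothetical.
benefit_measure = {
    "benefit": "Efficiencies for government",
    "indicator": "Average administration cost per grant awarded",
    "methodology": "Total hub administration cost / grants awarded, per financial year",
    "baseline": {"year": "2016-17", "value_aud": 3_100},  # hypothetical
    "target": {"year": "2019-20", "value_aud": 2_500},    # hypothetical
    "owner": "Hub (with Finance)",
    "reporting": "Biannual report to the Governance Board",
}
```

Absent any one of these elements, reported results cannot be compared against a starting point or judged against an intended level of improvement.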
2.38 The end-stage gateway review completed in June 2021 concluded:
A benefits realisation framework was developed but benefits do not appear to have been actively managed or used to drive the Program, and the current benefits are not clearly aligned to the original [funding proposal].
2.39 The review made a recommendation to improve benefits management in future stages of the SGGA Program.53 The review also recommended that the program prepare a closure report that covered the realisation of benefits.
Establishing baseline costs and monitoring costs
2.40 In successive years Finance sought to establish baselines for the SGGA Program, including the costing exercises in 2017 and 2018 (see paragraph 2.15) and benefits information collection from client entities in 2017. The outputs of these activities were not used to establish baselines for reporting against the benefits framework. The 2017 benefits information collection asked entities about grants administration and ICT savings realised. In total across 13 entities, savings of less than $1 million were reported and some entities reported no savings.
2.41 Finance did not continue the costing exercise in subsequent years. Planned annual cost efficiencies and savings were not measured and monitored. A contractor (ASG) was engaged to undertake surveys of client entities in 2019 and 2020. In the 2019 survey of 12 entities, all 12 indicated no impact or a negative impact from the hubs on administration costs. In the 2020 survey of 12 entities, 10 indicated no impact or a negative impact.
2.42 In 2020, to address concerns about the cost of hub services, Finance commissioned PwC to assess whether the hubs were more expensive than entities providing their own grants administration services. Client entities had difficulty providing robust cost information, indicating uncertainty around where to source the costing data and what costs to include, and often excluding overheads and surge staffing. Based on scenarios and estimates provided by the hubs and client entities, the PwC theoretical assessment found that, for the sample of opportunities analysed, the cost of grants administration on average was 29 per cent less for the hubs than for client entities providing their own grants administration. BGH had higher costs in some scenarios than CGH.
Recommendation no.1
2.43 The Department of Finance and the hubs agree a methodology to capture and report performance information that demonstrates the efficiency and effectiveness of grants administration through the hubs.
Department of Finance response: Agreed.
Department of Industry, Science, Energy and Resources response: Agreed.
2.44 The Department’s current methodology was developed in conjunction with the Department of Finance and designed to feed up into broader SGGA metrics. In light of the ANAO’s findings, the department will work with Finance and the Community Grants Hub to review the current approach.
Department of Social Services response: Agreed.
2.45 The Department of Finance, as the responsible policy agency for SGGA, has commenced work to deliver against this recommendation. The Department of Social Services is actively contributing to these efforts, and is supportive of a consistent methodology to report performance information in grants administration.
Measuring client experiences
2.46 Separately, DTA, Industry and Finance undertook user experience studies.
- DTA developed a diagnostic tool; however, this work did not progress.
- In 2018 Industry engaged Portable to interview 20 users to establish current experience and opportunities for improvement. A range of potential improvements were suggested including measuring user effort and cost.
- Finance engaged PwC to undertake user experience research. A report was provided to Finance in November 2021. Based on survey responses from 92 grant applicants and recipients: 57 per cent were satisfied with the accuracy of pre-populated grants data; 51 per cent were satisfied with the consistency of experience when applying for grants; and 41 per cent were satisfied with any advice they were given regarding additional grant opportunities for which they were eligible.54 A range of recommendations were made, including a single platform for grant opportunities, streamlining application processes and greater standardisation of application processes.
2.47 The ANAO surveyed client entity staff managing grant programs about the impact of using the hubs in relation to key intended benefits and outcomes of the SGGA Program (see Appendix 5). The survey measured the perception of benefits and outcomes in terms of cost, efficiency and effectiveness of grants administration; better grants experience for applicants and recipients; streamlining grants business processes; consistency of administration; better policy outcomes; availability of consistent and quality whole-of-government data; compliance with whole-of-government participation requirements; and evidence-based policy making. Surveyed entities did not indicate that the SGGA Program had led to the achievement of these positive outcomes and benefits.
Reporting benefits to government
2.48 The SGGA Program April 2015 funding proposal indicated that the interim DTO would report to government on implementation progress. Finance last reported to government in 2017; at this time program outcomes and benefits had not been achieved.55
2.49 In May 2017, as part of the 2017–18 budget measure Public Service Modernisation Fund, it was agreed that the Minister for Finance would report to government by the 2019–20 Budget on the next stage of the SGGA Program, including a proposal for savings to be taken from entities participating in the program.56 The report was to include the findings of the market test, together with the outcomes and benefits report and recommendations for stage two of the SGGA Program. This report was not provided because Finance considered the comeback to be extinguished (see paragraph 2.73).
2.50 To report back on and support further action in relation to the Digital Transformation Agenda, DTA was to provide a final report to government, in the 2021–22 Mid-Year Economic and Fiscal Outlook context, assessing government investment in digital and ICT over the previous five years. In November 2021, DTA advised government that the DSS ICT investment project to deliver a single grants process across departments had delivered its outcomes57 but that there was no benefits realisation plan in place.58 DTA observed that shared and common services can be successful but rely on effective governance.
Were key program deliverables achieved?
The achievement of SGGA Program deliverables has not been clearly established. The SGGA Program largely delivered one of four core deliverables. Two hubs, each with an ICT grants management system, were delivered. However, there were delays in the transition of grant programs to the hubs and standardised business processes were not adopted. The three other core deliverables — a single grants administration process, a data warehouse and market testing — have not been achieved.
2.51 SGGA Program core deliverables are set out in Table 1.1.
2.52 The SGGA Program did not deliver:
- Deliverable 1 — a single whole-of-government grants administration process with six different workflows (see paragraphs 3.68 to 3.72);
- Deliverable 3 — a data warehouse capability to improve cross agency reporting (see paragraphs 2.61 to 2.66); or
- Deliverable 4 — a market test (see paragraphs 2.67 to 2.75).59
2.53 Deliverable 2 was largely achieved. BGH and CGH provide grants administration services to participating entities. Each hub maintains its own grants management ICT system. However, the migration of grant programs to the hubs was delayed beyond the required completion date of September 2017 (see Table 2.4) and the hubs have not adopted a single standardised business process.
2.54 Several deliverables, including the data warehouse, were removed from the July 2017 SGGA Program Implementation Plan without appropriate authority.
2.55 Gateway reviews assessed progress towards completion of SGGA Program deliverables. Figure 2.6 shows actual and planned timing for the two mid-stage and the end-stage gateway reviews, as well as other commissioned reviews. The results from these reviews were variously presented to the SRO, the Governance Board, the SGGA Reference Group and, on occasion, sponsoring committees.
2.56 The first three gateway reviews indicated that from the outset there were major issues requiring urgent action to ensure successful delivery of the program. Where major issues and risks were identified, recommendations were made to address them. The committees monitored implementation of recommendations. While a number of issues were addressed, some matters, such as benefits realisation, the market test and the data warehouse, arose as major issues or risks in most of the gateway reviews, indicating that at times the action taken was insufficient to manage the risk to the delivery of the program. The end-stage gateway review noted that a number of additional reviews were outsourced to a variety of consultants, but it was not clear how they had been utilised.
Notes: Gateway reviews apply a five-tiered rating system set out in RMG 106, paragraph 97 (table) where:
Green indicates successful delivery of the program/project to time, cost, quality standards and benefits realisation appears highly likely, and there are no major outstanding issues that at this stage appear to threaten delivery significantly.
Green-Amber indicates successful delivery of the program/project to time, cost, quality standards and benefits realisation appears probable; however, constant attention will be needed to ensure risks do not become major issues threatening delivery.
Amber indicates successful delivery of the program/project to time, cost, quality standards and benefits realisation appears feasible, but significant issues already exist requiring management attention. These need to be addressed promptly.
Amber-Red indicates successful delivery of the program/project to time, cost, quality standards and benefits realisation is in doubt, with major issues apparent in a number of key areas. Urgent action is needed to address these.
Red indicates successful delivery of the program/project appears to be unachievable. There are major issues on program/project definition, schedule, budget, quality or benefits delivery. The program/project may need to be re-baselined and/or its overall viability re-assessed.
Source: ANAO summary of SGGA Program documentation.
2.57 A PwC post implementation review commissioned by Finance and reported in October 2020 concluded that the SGGA Program had mostly achieved two program objectives60 and partially achieved two others.61 However, the program objectives and deliverables described in the PwC post implementation review were not the same as the objectives or deliverables in the SGGA Program funding proposal.
2.58 The June 2021 end-stage gateway review noted that the outcomes and benefits have been through several iterations since the original funding proposal and observed that many of the iterations had not been measured or monitored. The review concluded that:
- in relation to deliverables — the data warehouse was not achieved;
- in relation to benefits — better outcomes for users, reduction in red tape, and efficiencies for government were partially achieved; and
- in relation to objectives — streamlined processes were achieved.
2.59 Observations from the end-stage gateway review report for each benefit area and a selection of deliverables are presented in Table 2.5. Often the evidence set out in the review report was anecdotal or limited in scope, and did not sufficiently demonstrate the deliverables, benefits or objectives realised, including how streamlined processes were achieved.
2.60 Two core deliverables from the SGGA Program funding proposal to be delivered by Finance — a data warehouse and a market test — are examined below.
| Benefit area | Ratinga | Benefits, objectives and deliverables to be realised through the SGGA Program | Review observations |
| --- | --- | --- | --- |
| 1. Better outcomes for users | Partial ▲ | A. Better targeting of grants (achieved through making grants easier to find and better information) will improve policy outcomes. | Consolidated data to support this is not fully available and there is little evidence of departmental use of the data that is available for this purpose. |
| | | B. An improved client experience will be delivered by making it easier to find and apply for grants. | The review team accepts the client experience should have improved given the improved application system functionality; however, this is yet to be measured and reported on. |
| 2. Reduction in red tape | Partial ▲ | C. A consistent grants process across government, including a common authentication system and the ability to re-use applicant registration and pre-qualification information (for example, demonstrating financial viability), will deliver red tape reduction for individuals and businesses. | CGH was developing an authentication system using AUSkey; however, AUSkey was discontinued. CGH plans to use MyGov when level 3 authentication becomes available. BGH mostly uses ABNs for authentication. CGH is able to do some prefilling; BGH is able to prefill from the ABN and from the applicant profile. Standard templates are in place for grant applications and grant agreements. CGH uses 90 per cent standard templates. |
| 3. Efficiencies for Government | Partial ▲ | D. Standardised and streamlined administrative processes and improved scheduling and productivity of specialised staff in the hub shared services centres will lead to potential efficiencies of approximately $50 million annually (based on a 20 per cent reduction in administrative costs). | There is anecdotal evidence of efficiency gains but these are neither quantified nor documented. $6–7 million in staff costs were avoided following the transfer of Health grants to CGH. Capability increase, such as increased workload without increasing staff numbers, was noted by other agencies. A cost benefit benchmarking exercise was done in 2016 (by Synergy) but the measure was not repeated. Recent PwC reports argue that grants are now generally administered for a reduced cost. |
| | | E. Capability to analyse whole-of-government grants administration and payments data. | Consolidated whole-of-government data is not readily available to facilitate this. |
| | | F. A reduction in capital investment in grants systems over 10 years (depending on the outcome of the market test) through further consolidation of ICT systems. | This has been achieved given no funding for non-hub systems since the SGGA Program began.b |
| Streamlined processes | Achieved ◆ | Paragraph 1.1 shows this as a Program objective. | A two-stage approach was adopted. Stage one: standardise business processes and consolidate systems through development of, and migration to, two administrative hubs. This has been achieved. A single grants platform would offer significant benefits; it has been agreed at ministerial level that the next phase would be addressed as a separate program. Having two hubs with different service models and prices has had a positive effect on client entity engagement, to the point where 90–95 per cent of grants are processed through the hubs. |
| Data warehouse | Not achieved ■ | Table 1.1 shows this as deliverable 3. | A prototype was developed but the feature was descoped in favour of the use of GrantConnect. |
Note a: Benefits were assessed and rated for the benefit areas and for two deliverables, rather than for the benefits to be realised as depicted in column three of the table.
Note b: The ANAO notes that this does not take account of the funding provided to the hubs (see Appendix 3).
Source: ANAO summary of end-stage gateway review.
Data warehouse
2.61 Establishing a data warehouse to support analysis of grants data was one of the four core deliverables of the SGGA program. In April 2015, the government was advised that the introduction of a data warehouse62 would deliver efficiencies for government by providing capability to analyse whole-of-government grants administration and payment data to better manage and target grants. Better targeting grants was linked to better outcomes for users and better policy outcomes.
2.62 In April 2015 the government was advised that:
- the government did not have the ability either to provide a whole-of-government view of grants administration and payments or to analyse grants data;
- implementation of a data warehouse was to build upon existing capability and provide a whole-of-government analytic capability; and
- the interim DTO was to deliver the data warehouse at a cost of $2.4 million63 by June 2016 for hub data and by June 2017 for whole-of-government (and possibly other jurisdiction) data.
2.63 The government transferred responsibility for the data warehouse to Finance in December 2015. Finance engaged Callida Consulting to conduct a scoping study, which recommended in October 2016 the use of existing systems and whole-of-government data capabilities64 to provide the data warehouse capability. In response to the scoping study recommendation, the Governance Board noted that a separate data warehouse was not considered necessary and that the consolidation of GrantConnect and the hubs’ grants management systems should provide all the information required to achieve SGGA Program objectives, although central control of the data was necessary. The Governance Board agreed that a virtual data warehouse would be established and that it would assume additional responsibilities associated with ensuring the effective use of whole-of-government grants data.
2.64 Between February 2017 and April 2018, a project approach and roadmap were agreed; a ‘Discovery Phase’ report and the prototype stage were completed; memoranda of understanding were signed; and a working group involving Finance and the hubs was established. CGH was appointed technical lead65, and a draft project management plan was developed. Delivery was to occur in four stages, with all work to be completed at a cost of $3.7 million66 by 30 June 2019. This was three years later than the original advice to government about delivery of an operational data warehouse combining hub and GrantConnect data.
2.65 In July 2020 the SGGA Reference Group was advised that core functionality for the data warehouse had not been delivered, that there was a lack of interface between different systems, and that incomplete and inadequate data affected the accuracy and currency of reporting. The implications were that recipients could be double funded due to gaps in common source data, and that data was not being used to better manage and target grants funding. In October 2020 the PwC post-implementation review found that the hubs used different grants management systems involving variations in data and formats. These arrangements did not support simple grants management and reporting through a single system.
2.66 In June 2021, the end-stage gateway review concluded that the data warehouse had not been achieved.
Market test
2.67 In April 2015, as part of the 2015–16 Budget measure Digital Transformation Agenda — Stage One, the government agreed to undertake a market test as part of the SGGA Program. Government was advised that the market test could assess grants management ICT systems and grants administration services; whether to expand the program beyond the 12 main granting entities; and/or innovative arrangements for grants administration. The outcome of the market test would inform transition to a single grants management ICT system or grants administration service provider from July 2018.
2.68 Prior to undertaking the market test, it was necessary for standard business processes to be adopted and grants administration to be consolidated to deliver a mature contestable service by September 2017. All versions of the implementation plan in place between April 2015 and June 2016 planned to complete the market test by December 2017.
2.69 In May 2017, as part of the 2017–18 Budget measure Public Service Modernisation Fund, the government agreed to provide further funding to the hubs to support a more rapid consolidation of government grants administration for six participating entities. The Minister for Finance was to report back to government by no later than the 2019–20 Budget with the market test results.
2.70 In August 2017, the first mid-stage gateway review concluded that market testing was unlikely to meet the schedule due to a lack of clarity around scope, the timing of transition of programs to the hubs and the form of the approach to market. The review recommended the SGGA Program clarify the scope, method, pricing model and proposed timing for market testing. In September 2017 Finance advised the Minister that there was insufficient time to undertake a market test according to the agreed schedule. Finance recommended and the Minister agreed to delay market testing until 2019–20.
2.71 In November 2017, Finance proposed two contestability initiatives to the Minister. The first was to include the NHMRC as an interim hub to introduce competitive tension. The second was to conduct a contestability scoping study and report back to government in the 2018–19 Mid-Year Economic and Fiscal Outlook. The Minister for Finance agreed to these initiatives in November 2017. In December 2017 the Prime Minister agreed to these arrangements, including deferring market testing to beyond the 2019–20 Budget, on the understanding that the initial benefits of the program would not be realised until 30 June 2019.
2.72 Between May 2018 and December 2019, two additional reviews commissioned by Finance of the SGGA Program’s readiness for contestability and options for market testing were undertaken.
2.73 Finance received advice from PM&C in July 2019 that the SGGA Program Update comeback to government had been extinguished. Finance advised the ANAO that this eliminated the requirement to market test. It is not clear to the ANAO why the decision to extinguish the report back was interpreted by Finance to mean that the market test did not need to be undertaken.
2.74 Finance’s subsequent reporting on SGGA Program progress to governance bodies did not include an update on market testing and the Minister for Finance also was not informed about progress towards market testing.
2.75 In June 2021, the end-stage gateway review noted that:
- government had agreed not to progress market testing due to a lack of contestable options in the market67;
- a recommendation from the March 2019 second mid-stage gateway review to develop options and scope for the market testing activities had been fully addressed; and
- the work completed to date provided a sound base for the second phase of the market testing envisaged in the original government consideration of the SGGA Program.
2.76 In summary, of the four core deliverables outlined in Table 1.1, only one has been largely delivered. Plans for delivering the remaining deliverables have not been agreed with government. Finance advised in November 2021 that no Ministerial agreement or authority has been sought regarding the future of the SGGA Program.
Recommendation no.2
2.77 The Department of Finance develop and agree a future plan for the operation of grants hubs and, where this plan differs from the SGGA Program funding proposal, seek authority for changes from government.
Department of Finance response: Agreed.
Department of Industry, Science, Energy and Resources response: Noted.
Department of Social Services response: Noted.
2.78 The Department of Finance, as the responsible policy agency for SGGA, is leading this work. The Department of Social Services is actively contributing to these efforts, and is supportive of a whole-of-government dataset for grants administration.
3. Build and operation of Grants Hubs
Areas examined
This chapter examines whether the hubs have been built and operated to achieve the Streamlining Government Grants Administration (SGGA) Program objectives by delivering more effective and efficient grants administration.
Conclusion
The build and operation of the grants hubs were partly effective. While consistency and effectiveness in grants administration have somewhat improved, there are deficiencies in relation to usage of the hubs for the full grants lifecycle, collaboration between the hubs and client entities, and data management. The hubs have not developed an appropriate performance framework to measure the benefits. There is limited evidence that the forecast benefits of the SGGA Program have been achieved.
Areas for improvement
The ANAO made one recommendation aimed at the Department of Finance (Finance) and the hubs regarding establishing a whole-of-government grants administration dataset and assuring data quality. The ANAO also identified a range of opportunities for the hubs to support more effective grants administration at paragraphs 3.28 and 3.44.
3.1 The hubs were expected to be operational by 1 July 2016. The Department of Industry, Science, Energy and Resources (Industry) and the Department of Social Services (DSS) commenced building the Business Grants Hub (BGH) and Community Grants Hub (CGH), respectively, in 2015. The hubs commenced operation from 1 July 2016.
3.2 In November 2021, BGH advised that there were about 300 staff delivering BGH grants administration services. CGH had a staffing profile of 921 as at 21 October 2021.68
3.3 The objective of the SGGA Program was to deliver simpler, more consistent and efficient grants administration across government. This approach would enable government to deliver grants more effectively at lower cost and risk. The audit considered whether:
- the hubs have assessed the effectiveness and efficiency of grants administration;
- there is more effective grants administration; and
- there is more efficient grants administration.
Have the hubs assessed the effectiveness and efficiency of grants administration?
Although a benefits framework and performance indicators designed to measure effectiveness and efficiency were in place, the hubs did not adopt a common set of measures to support SGGA Program benefit measurement. Many performance indicators were not measurable or lacked targets. No baselines were established for performance indicators at the outset that would allow judgements to be made about the achievement of benefits. The hubs cannot clearly demonstrate that their establishment has led to more effective or efficient grants administration.
3.4 The ANAO examined the measures established and reported by the hubs to assess effectiveness and efficiency.
Hubs effectiveness measurement
3.5 The July 2017 SGGA Program Implementation Plan sets out that it is the responsibility of the hubs to assist with the measurement of SGGA Program benefit outcomes by collecting and reporting data in relation to agreed performance measures for the hubs.
3.6 The first step in establishing an effective performance framework is to determine the intended outcomes, or benefits, of the program. The hubs did not have benefit frameworks, measures or baselines in place when they commenced operation that would allow them to demonstrate the hubs’ impact over time. Between June 2018 and June 2020 BGH developed three benefits realisation frameworks. CGH planned, developed and refined key components of its benefits realisation framework between September 2015 and May 2018.
3.7 Figure 3.1 provides an overview of the development and implementation of benefits realisation frameworks and performance measures.
Source: ANAO summary of SGGA Program, CGH and BGH documentation.
3.8 Table 3.1 compares the final BGH and CGH benefits frameworks to the SGGA Program Benefits Realisation Framework from October 2018. This shows the frameworks aligned, with one exception: BGH benefit outcomes do not address the SGGA Program benefit area of improved policy outcomes.69
| SGGA Program benefit area | SGGA Program benefit outcome, October 2018 | BGH benefit outcome, June 2020 | CGH benefit outcome, April-May 2018 |
| --- | --- | --- | --- |
| Improved policy outcomes | A. Improved evidence-based policymaking through access to high quality dataa | Benefit area not included in the BGH Benefit Realisation Framework | A.1 and B.1 More reliable, quality and accessible grants data |
| | B. Improved alignment with and responsiveness to applicable whole-of-government policies (such as privacy, cyber security, evaluation and risk management) | | |
| Improved user experience | C. Reduced administrative burden on grant applicants and recipients through better systems and process design | C.1 and D.1 Improved customer satisfaction; C.1 and D.1 Improved employee satisfaction | C.1 and D.1 Reduced red tape, improved access to CGH services and ease of use |
| | D. Grants applicants and recipients experience an improved user experience | | |
| Efficiencies for government | E. Reduced administrative costs to Government through better use of resources | E.1 Improved systems | E.1 and F.1 Reduced grant processing costs; E.2 and F.2 More agile, scalable, and capable workforce; E.3 and F.3 Increased productivity through common processes & automation; E.4 and F.4 More responsive, consistent, predictable and manageable grant service; E.5 and F.5 Reduced red tape, improved access to CGH services and ease of use |
| | F. Productivity improvements in grants administration through better systems and processes | F.1 Streamlined service delivery; F.2 Reduced manual processing | |
| Enhanced capability to use whole-of-government grants data | G. Improved availability and quality of whole-of-government grants datab | G.1 Increased data access; G.2 Increased data quality | G.1 More reliable, quality and accessible grants hub data; G.2 Finance is the benefit owner for records within whole-of-government data sets meeting quality standards. |
Note a: In the SGGA Program Benefits Framework 2018 Finance is the benefit owner for this benefit outcome in relation to whole-of-government grants data informing government decision-making. Consuming entities and hubs are benefits owners to the extent that administration by the hubs leads to better designed grants programs.
Note b: In the SGGA Program Benefits Framework 2018 Finance is the benefit owner for this benefit outcome. Consuming entities and hubs are benefits owners to the extent that they contribute to this measure through meeting their reporting responsibilities for in-scope grants in the whole-of-government grants data set.
Source: ANAO summary of SGGA Program, BGH and CGH benefit realisation documentation.
3.9 The hubs have not adopted the common set of measures developed for the SGGA Program. Over time, the BGH and CGH frameworks contained 33 and 62 performance measures, respectively, which included both effectiveness and efficiency measures.
- BGH developed 13 effectiveness measures through its benefits framework, with its most recent benefit framework comprising 12 effectiveness measures.70 BGH established a baseline for 11 of the measures and, between 2017–18 and 2020–21, reported on between seven and 10, depending on the year. Not all measures had targets. For the four to eight measures that had targets, at most one target was met in any year.
- CGH initially developed 24 effectiveness measures, with its most recent benefit framework containing 16 effectiveness measures.71 CGH established a baseline for eight of the 16 measures. In 2017–18 and 2018–19 CGH reported on four measures; each had targets and targets were met for three in 2017–18 and four in 2018–19.
3.10 Between 2016 and 2018 CGH measures changed each year, limiting the ability to demonstrate improvements over time. Similarly, BGH measures changed (to a lesser extent) between June 2018 and June 2020.
3.11 From 2018–19 CGH developed hub business plans that included a purpose statement, priorities, key activities, key performance indicators and targets. There were two measures of effectiveness in 2019–20 and three in 2020–21.72 CGH did not report against these measures. The AusIndustry Division’s business plans in 2017–18 and 2018–19 included BGH priorities and key activities but did not include performance measures.
3.12 Neither hub consistently monitored or reported on benefit framework measures between 2016 and 2021.
- Prior to the introduction of the June 2020 BGH Benefits Realisation Framework, BGH did not establish reporting requirements for benefits measures. From June 2020 BGH introduced a requirement to undertake benefit measurement biannually. BGH reported on 74 per cent of benefit measures in July 2020, and 69 per cent in April 2021.73
- CGH planned to monitor and report against benefit measures on a six-monthly basis. In practice, CGH reported on, or established baselines for, fewer than half of its measures and did not report on any benefit measures in any year except 2018–19. In January 2019 CGH ceased using the benefits realisation framework to demonstrate performance.
3.13 The ANAO assessed the appropriateness of CGH and BGH effectiveness indicators and found that almost half were not measurable.74
- Seven of the 13 BGH effectiveness indicators were measurable. The methodology for measuring four effectiveness measures changed, impacting the hub’s ability to accurately convey performance over time.
- For both BGH and CGH, the methodologies do not always appropriately capture the intended measure.75
- Five of 24 CGH effectiveness measures and the business plan effectiveness measures did not have specified methodologies.
3.14 Due to the lack of measurability, changes in measures over time and a lack of monitoring, the hubs could not consistently demonstrate the achievement of intended benefits.
Hub efficiency measurement
3.15 Resource Management Guide (RMG) 131 indicates that efficiency is generally measured as the price of producing a unit of output, expressed as a ratio of inputs to outputs. A process is efficient where the production cost is minimised for a certain quality of output, or outputs are maximised for a given volume of input.76
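On that definition, a unit-cost measure for a hub could take the following form. This is an illustrative construction with hypothetical figures, not a formula prescribed by RMG 131:

```latex
% Efficiency as a ratio of inputs to outputs (illustrative form only)
\[
  \text{unit cost} = \frac{\text{total grants administration cost (inputs)}}
                          {\text{number of grants administered (outputs)}}
\]
% Hypothetical worked example: a hub that spends $25 million to administer
% 10,000 grants has a unit cost of $2,500 per grant. Efficiency improves if
% the same quality of output is delivered at a lower unit cost.
\[
  \frac{\$25{,}000{,}000}{10{,}000} = \$2{,}500 \text{ per grant}
\]
```

Establishing such a measure requires a baseline unit cost and a consistent costing methodology, which paragraphs 3.18 to 3.19 indicate were not in place.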
3.16 Requirements for BGH and CGH efficiency measures were established as part of the benefits realisation frameworks of the hubs.
- Between 2017–18 and 2019–20 BGH established 19 efficiency measures (including measures of timeliness).77 The most recent benefit framework contained 15 efficiency measures. BGH established a baseline for 10 of the 15 measures. A varying subset of measures was reported between 2017–18 and 2020–21. For the 12 measures that had targets, no more than two targets were met in any year.78
- Between 2016–17 and 2020–21 CGH established 33 efficiency measures, with its most recent benefit profile containing 10 efficiency measures (including timeliness measures).79 CGH established a baseline for seven of the 10 measures. Reporting on six efficiency measures was presented in 2018–19; all reported measures had targets and four measures met their target. CGH did not report on efficiency measures in the other financial years in which the hub has operated.
- CGH business plans also included one measure of efficiency in 2019–20 and 2020–21: ‘grants administration services are delivered in a cost-efficient manner on a full cost recovery basis’. This was the only efficiency measure as at November 2021. CGH did not report against this measure.
3.17 The ANAO assessed the appropriateness of BGH and CGH efficiency measures and found that the indicators were often not measurable.
- Seven of the 19 BGH indicators were measurable. Twelve did not have a measurement methodology, and the methodology for one measure changed, limiting the ability to assess efficiency over time.80
- Twenty-one of the 33 CGH measures were measurable. Twelve did not have a methodology.81 The 2019–20 and 2020–21 business plan efficiency measure methodology did not address the intent of the measure.82
3.18 The hubs and client entities participated in Finance benchmarking activities, including a 2014–15 cost baseline study and a cost review completed by PwC in June 2020. Both of these studies highlighted difficulties for entities in sourcing and defining cost data. The reports identified other factors impacting cost benchmarking, such as significant variation in grants administration approaches within and across entities, potentially impacting the robustness of any comparisons.
3.19 In summary, as a consequence of deficiencies in efficiency measurement and benchmarking, the hubs were unable to clearly demonstrate efficiency outcomes.
Have the hubs supported more effective grants administration?
The hubs have partly supported more effective grants administration. Where data was available, BGH and CGH generally demonstrated compliance with selected legislative requirements. Business processes and workflows were established, but poor data quality impedes an assessment of whether processes and workflows were followed. There are significant data management and quality issues. Client usage of services across the grants lifecycle is uneven, and evaluation services are rarely used. Service standards have been developed but are reported only by CGH. Collaboration with, and support of, client entities in the grant design and hub usage phases is uneven.
3.20 To determine whether the hubs have supported more effective grants administration, the ANAO examined the effectiveness of the hubs in relation to: compliance with legislative requirements; establishing business workflows; managing grants data; entity usage of the hubs; meeting service level standards; and supporting client entities.
Compliance with legislative requirements
3.21 The ANAO tested levels of compliance for selected requirements under the Public Governance, Performance and Accountability Act 2013 (PGPA Act), Commonwealth Grant Rules and Guidelines (CGRGs), and resource management guides.83 This analysis used data extracted from the hubs’ grants management information and communications technology (ICT) systems and from GrantConnect and, where relevant, relied on the work of ANAO financial statement audits in 2019–20 and 2020–21, including systems control testing.
3.22 Table 3.2 shows that, for BGH services, there were high levels of compliance with selected requirements.84
| Requirement tested | BGH |
| --- | --- |
| A. Make a relevant commitment, enter into an agreement and administer an agreement: only an appropriate delegate can make the final decision whether to approve an application for funding and authorise a payment to be made (section 23 and section 110(1)(a) of the PGPA Act, and paragraph 36 of RMG 400). | ANAO financial statements audit testing concluded that for the 2019–20 and 2020–21 financial years relevant controls were operating satisfactorily and no exceptions were identified.a |
| B. Commitments of relevant money: a grant payment cannot be made under an arrangement until an arrangement has been entered into by the accountable authority (or an appropriate delegate) on behalf of the entity (subsection 23(1) of the PGPA Act). | BGH complied with this requirement for 99.7 per cent of unique applications (195 of 61,145 appear in the payments and reports dataset but do not appear in the agreement status dataset). |
| C. Transparency of grants awarded from a grant opportunity: as of 1 July 2020, the grant opportunity identification number (GO ID) is required to be published as part of the grant award report (paragraph 24 of RMG 421). Exceptions to reporting the GO ID include where grants are provided on a one-off or ad hoc basis, an exemption is provided by the Minister for Finance, or the Minister makes a decision not to publish grant guidelines. | BGH complied with this requirement for 99.9 per cent of executed grant agreements (2146 of 2149 with a start date on or after 1 July 2020 reported by BGH on GrantConnect) that could be linked to GrantConnect. |
| D. Reporting grants awarded: from 31 December 2017, an entity must report, on GrantConnect, information on individual grants no later than 21 calendar days after the grant agreement for the grant takes effect (paragraph 5.3 of the CGRGs). The date of effect will depend on the particular arrangement: it can be the date on which a grant agreement is signed or a specified starting date. RMG 421 paragraph 23 states that where there is no grant agreement, officials may decide and document why they have chosen a specific date of effect or start date, which may relate to the date of the first invoice or payment. | Of the grant agreements reported on GrantConnect that could be linked, BGH complied with this requirement for 94 per cent (15,390 of 16,338 unique applications reported by BGH on GrantConnect within 21 days). The 948 (six per cent) non-compliant records were reported on GrantConnect, on average, 51 days after the start date. There were 237 BGH grant agreements reported on GrantConnect that had not recorded the status of ‘executed’ in the hub grants management ICT system. |
| E. Confidentiality provisions: an entity must identify whether a grant agreement contains any confidentiality provisions and provide the reasons for those provisions when reporting a grant award on GrantConnect (paragraph 5.5 of the CGRGs, and paragraph 28 of RMG 421). Two broad types of confidentiality-related clauses are used in grant agreements: general clauses referencing confidentiality in legislation, and special confidentiality provisions which protect the confidentiality of all or part of the grant agreement, or of information obtained or generated in relation to the grant project (paragraph 27 of RMG 421). | All (16,368) of the BGH executed grant agreements that could be linked to grant awards reported on GrantConnect included confidentiality provisions. However, this reporting was not sufficiently supported by the hub grants management ICT system, including for the one grant award reported as having a provision for confidentiality of the grant output. For all unique executed grant agreements in the agreement status dataset, the confidentiality input and confidentiality output options were set to ‘not applicable’, which is the default option. As the default option was retained in all cases, it is not clear whether the question had been considered and the field deliberately completed in the system. There was no record in the system of the agreement that was reported on GrantConnect as having a confidential output. |
| F. Power to administer an arrangement: the total grant amount paid must not exceed the total approved grant amount in the grant agreement (section 23 of the PGPA Act and paragraph 36 of RMG 400). | ANAO financial statements audit testing concluded that for the 2019–20 and 2020–21 financial years relevant controls were operating satisfactorily; it gained reasonable assurance in relation to grants rights and obligations, occurrence, accuracy, classification and expenses balance, and did not identify exceptions.a |
| J. Power to administer an arrangement: payments are in accordance with executed agreements; the right amount is going to the right person at the right time (section 23 of the PGPA Act and paragraph 36 of RMG 400). | ANAO financial statements audit testing concluded that for the 2019–20 and 2020–21 financial years relevant controls were operating satisfactorily; it gained reasonable assurance in relation to grants rights and obligations, occurrence, accuracy, classification and expenses balance, and did not identify exceptions or issues.a |
Note a: This result was based on an examination of hub documentation, such as authority to proceed with grant opportunity/funding round, spending minute approval, grant agreement, invoices and acquittal letters.
Source: ANAO analysis based on Industry grants administration system data.
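The tests in Table 3.2 depend on linking records across the hub extracts and GrantConnect award reports. The sketch below illustrates the kind of cross-dataset checks described for requirements B and D; the file and column names (application_id, agreement_status, date_of_effect, published_date) are hypothetical, as the actual extract schemas are not set out in this report.

```python
import pandas as pd

# Hypothetical extracts; file and column names are illustrative only.
agreements = pd.read_csv("agreement_status.csv")    # application_id, agreement_status, date_of_effect
payments = pd.read_csv("payments_and_reports.csv")  # application_id, amount_paid
awards = pd.read_csv("grantconnect_awards.csv")     # application_id, published_date

# Requirement B: a payment should not exist without an executed agreement.
executed_ids = agreements.loc[agreements["agreement_status"] == "executed", "application_id"]
orphan_payments = payments.loc[~payments["application_id"].isin(executed_ids)]
print(f"Payments without an executed agreement: {orphan_payments['application_id'].nunique()}")

# Requirement D: awards must be reported on GrantConnect within 21 calendar
# days of the grant agreement taking effect.
linked = awards.merge(agreements, on="application_id", how="inner")
delay_days = (
    pd.to_datetime(linked["published_date"]) - pd.to_datetime(linked["date_of_effect"])
).dt.days
print(f"Awards reported later than 21 days: {(delay_days > 21).sum()}")
```

Checks of this kind are only as reliable as the linkage between datasets; the unlinkable records noted in Table 3.2 fall outside the tested populations.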
3.23 For legislative requirements A, B, F and J, ANAO financial statement audits concluded that for 2019–20 and 2020–21, relevant CGH controls were operating satisfactorily, with one exception.85 For reasons outlined in paragraph 3.36, the ANAO 2021–22 Interim Report on Key Financial Controls of Major Entities will further examine CGH compliance with legislative requirements.
3.24 Consistent with Finance’s Shared Service Arrangements Guide: 04/2017 Governance (Assurance) Framework, the hubs provide assurance over the operation of their systems through formal representations and independent assurance statements on the operation of their control frameworks.86 These assurances include identification of breaches of legislative requirements and business processes. Hub assurances are required because when hubs administer grants, the client entities’ accountable authorities retain ultimate responsibility for having appropriate systems relating to risk and internal controls in place to support the proper use and management of the relevant grant funds.
3.25 From 2017–18 both hubs have provided annual management assurance letters (MALs) to their client entities.87 While the type of assurance provided in the respective hubs’ 2019–20 management assurance letters is broadly the same, there are some differences, as shown in Table 3.3.
| Assurance area | BGH assurances | CGH assurances |
| --- | --- | --- |
| Legislative obligations, including CGRGs | Whether obligations have been complied with | Indirect assurancea |
| Finance lawb | Whether there have been any significant breaches | Whether there have been any significant breachesc |
| Hub assurance framework | Grants have been delivered consistent with the framework | None |
| Partnership agreements | Grants have been delivered consistent with partnership agreement obligations | None |
| Records and reporting | Records and reporting accurately reflect grants activity and transactions | Records and reporting accurately reflect grants activity and transactions, and no financial material has been withheld from the client entity |
| Fraud | Any fraud or suspected fraud has been disclosed to the client entity | None |
| Control framework | Any significant deficiencies in the control framework have been disclosed to the client entity | None |
| Oversight | None | Governance and advisory bodies have provided oversight of assurance arrangements |
| Effectiveness of controls framework | None | Effectiveness of the CGH Controls Testing Framework has been assessed |
Note a: The MAL indicates the financial information provided by the hub to the client entity is true and fair in accordance with the requirements of the PGPA Act and PGPA Financial Reporting Rule 2015, and that financial accounts and records have been maintained and can be generated in accordance with the PGPA Act and Financial Reporting Rule and other legislation. DSS advised the ANAO that the CGH MALs ‘provides assurance’ that the hub ‘adheres to the requirements of the PGPA Act …[and that] … the CGRGs, being a legislative instrument under section 105C of the [PGPA Act] are therefore also being adhered to.’
Note b: Reportable breaches include ‘high volume, high value and/or systemic instances of non-compliance’ of the CGRGs. Department of Finance, Notification of significant non-compliance with the finance law (RMG 214) [Internet], Finance, updated September 2020, paragraph 10, available from https://www.finance.gov.au/government/managing-commonwealth-resources/notification-significant-non-compliance-finance-law-rmg-214 [accessed March 2022].
Note c: CGH reporting of breaches includes reporting instances of breaches of the CGRGs. MALs did not consistently include a report on Finance Law breaches: none of the MALs in 2017–18 (seven) and 2018–19 (nine) included Finance Law breaches; two of nine MALs did not include these in 2019–20; and one of nine MALs did not include these in 2020–21.
Source: ANAO analysis of BGH and CGH 2019–20 management assurance letters.
3.26 In 2020–21 BGH changed the nature of assurance provided. Rather than providing assurance, the 2020–21 BGH management assurance letters reported the results of testing against a sample of key obligations. The intent was to provide confidence that the client entity’s granting programs were being managed in compliance with frameworks. In 2020–21 the nature of CGH management assurance letters was generally the same as in 2019–20.
3.27 Between 2017–18 and 2019–20, one significant breach was identified for a BGH-managed grant program.88 Other common BGH deficiencies related to risk management and late publication of grants (a breach of the CGRGs). In 2020–21 Industry reported 25 instances of publication of grants outside the 21-day reporting period, and 113 instances of non-compliance arising from independent testing. For CGH, 302 breaches were identified in 2019–20. The late publication of grants outside the 21-day reporting period accounted for 292 out of 302 deficiencies.89 A further 68 breaches were identified in 2020–21. Two related to financial delegation deficiencies, and 66 related to late publication of grants.90
3.28 There were opportunities for BGH and CGH to improve reporting in management assurance letters.
- BGH could advise client entities of remedial action in response to breaches and deficiencies.
- CGH could report control weaknesses identified through independent testing to client entities.91
- CGH could provide explicit assurance of compliance with the CGRGs.
Establishment of business processes and workflows
3.29 The SGGA Program implementation plan assigned responsibility for development of standard business processes, including six workflow processes, to the hubs. To underpin these business processes and workflows, the hubs needed to develop policy, procedures and templates. These policy and procedures form part of each hub’s internal control framework, which includes measures to ensure compliance with finance law.92
3.30 The hubs developed standard business processes and workflows for grants management and these were supported by internal policy and procedures documents.
- BGH policy and procedures were available to staff on DocHub (the Industry core recordkeeping system). In March 2021 there were 258 resources comprising internal policy, procedures, templates and process maps.
- CGH policy and procedures were available on HubNet (a web-based portal), with links provided to relevant procedures, guidance material and process maps. At March 2021, there were 307 such links, of which 232 were accessible and 75 were not. Multiple records sometimes had the same or similar names in DSS’s recordkeeping system, which made it difficult to identify the right reference where links to documents did not work. Some CGH staff indicated that relevant or current documents and records were not accessible on HubNet. In early 2021 CGH undertook a project to refresh and update HubNet, and CGH policies and procedures were migrated to a new team site in September 2021.
3.31 The ANAO examined the BGH grants management data extract for evidence that 17 key business processes and workflows established in policies and procedures had been met (see Appendix 6). The ANAO was unable to conduct a similar analysis of the CGH data extract for the reasons identified at paragraph 3.36.
3.32 For six of the 17 BGH processes examined, poor data quality impeded an assessment of whether business processes and workflows were followed.93 Where the ANAO was able to undertake the analysis, there were inconsistencies with processes and workflows. These inconsistencies were widespread (between 40 and 75 per cent of relevant records) for five of the 17 business processes examined.94 More isolated inconsistencies (between one instance and 25 per cent of relevant records) were observed across 11 of the 17 business processes examined.95 The ICT system did not have system controls to identify outlier responses or to prevent errors or practices inconsistent with business processes.96
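A system control of the kind paragraph 3.32 finds missing would reject entries that breach the standard workflow at the point of data entry, rather than leaving inconsistencies to be found in later analysis. The sketch below is illustrative only; the status names and permitted transitions are assumptions, not the hubs’ actual workflow.

```python
# Illustrative workflow control: status names and permitted transitions are
# assumed for the example, not taken from the hubs' grants management systems.
ALLOWED_TRANSITIONS = {
    "submitted": {"under_assessment"},
    "under_assessment": {"recommended", "unsuccessful"},
    "recommended": {"approved", "unsuccessful"},
    "approved": {"agreement_executed"},
    "agreement_executed": {"payment_made"},
}

def change_status(current: str, new: str) -> str:
    """Return the new status, or raise if the change skips a workflow step."""
    if new not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"invalid transition: {current!r} -> {new!r}")
    return new

change_status("approved", "agreement_executed")  # permitted by the workflow

try:
    # A payment recorded against an application that was never executed is
    # rejected at entry instead of surfacing later as an audit exception.
    change_status("submitted", "payment_made")
except ValueError as err:
    print(err)
```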
Management of grants data
3.33 Under the SGGA Program, it was intended that each hub would maintain its own grants management ICT system but adopt common business processes and workflows across the two hubs. These arrangements were to support a whole-of-government data warehouse capability. The data warehouse was intended to improve cross entity reporting and enable data analytics of grants administration and payments.
3.34 On 1 December 2020, the ANAO requested a single dataset of grants transactions between 1 July 2016 and 30 November 2020. BGH and CGH provided the ANAO with data extracts in December 2020 (for applications received by BGH between 1 July 2016 and 30 November 2020) and February 2021 (for applications created by CGH between 1 July 2016 and 19 February 2021).
3.35 The ANAO found data quality issues with the extracts.
- The number of BGH applications recorded in the data for each lifecycle phase was not consistent with expectations from the standard business workflows. Additional anomalies in BGH data are outlined in Appendix 6.
- CGH constructed queries to generate the datasets from its grants management ICT system. The manner in which it did this introduced errors.97
- The hubs were unable to provide sufficient information to give the ANAO assurance over the completeness and accuracy of the data, and whether there was information loss during the extraction and delivery process. Although there is some automated population of data fields, the hubs do not have quality assurance arrangements for testing the completeness and accuracy of data in the grants management ICT systems across programs and entities (an illustrative set of such checks is sketched after this list).
- Missing or insufficient data dictionaries, and the use of non-mandatory fields for client entity data entry, also reduce the usability of the data.
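The quality assurance gap noted above can be made concrete. Below is a minimal sketch, in Python with pandas, of basic completeness and uniqueness checks that could be run over a grants data extract; the dataset and column names are hypothetical, not the hubs' actual schemas.

```python
import pandas as pd

# Hypothetical extract; column names are illustrative, not the hubs' schemas.
extract = pd.DataFrame({
    "application_id": ["A1", "A2", "A2", "A4"],        # A2 appears twice
    "program_name": ["P1", "P1", "P1", None],          # non-mandatory field
    "agreement_executed": ["2019-01-10", None, "2019-02-01", "2019-03-05"],
})

# Completeness: proportion of blank values in each field.
blank_rates = extract.isna().mean()

# Uniqueness: one record expected per application.
duplicate_ids = extract["application_id"].duplicated().sum()

print(blank_rates)
print(f"duplicate application ids: {duplicate_ids}")
```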
3.36 Subsequent to the provision of the datasets, CGH advised the ANAO that its extract included non-hub related grant records. The ANAO determined that the CGH data extract was not suitable for analysis. The ANAO 2021–22 Interim Report on Key Financial Controls of Major Entities will further examine CGH compliance with business processes and legislative requirements.
Recommendation no.3
3.37 To assist in the achievement of the intended benefits of the SGGA Program, the Department of Finance and the hubs establish a whole-of-government grants administration and payments dataset and implement arrangements to assure the quality of the data.
Department of Finance response: Agreed.
Department of Industry, Science, Energy and Resources response: Agreed.
3.38 Establishing a common grants administration and payments dataset is likely to have significant implications on how grants are currently managed and delivered through the Business Grants Hub. The department agrees to work with the Department of Finance and Community Grants Hub to investigate options to further streamline whole-of-government arrangements.
Department of Social Services response: Agreed.
3.39 The Department of Finance, as the responsible policy agency for SGGA, is leading this work. The Department of Social Services is actively contributing to these efforts, and is supportive of a whole-of-government dataset for grants administration.
Hub usage
3.40 The SGGA Program implementation plan and Estimates Memorandum 2017/40 Whole-of-Government Grant Administration Arrangements set out that participating entities98 would adopt end-to-end grants administration services offered by the hubs across the grants lifecycle for all existing grant programs and NPPs.99
3.41 Schedules to head agreements with client entities form the basis for specifying services to be delivered by a hub. To ensure there is sufficient clarity of service uptake, schedules should clearly specify services to be provided to client entities under the agreement.
3.42 The ANAO examined the schedules to head agreements for both hubs.
- BGH — Of 130 schedules to head agreements, 15 contained no details of services to be provided. In addition, 73 did not include the final quote.
- CGH — Of 53 schedules to head agreements, nine contained no details of services to be offered. In addition, 30 did not include the final quote. In December 2020, CGH advised that the schedules to head agreements provided generic services offered and final services to be provided are outlined in the accepted quote document, which is attached to a schedule.
- In summary, 54 of the 183 BGH and CGH schedules contained no information identifying the services to be provided.
3.43 The ANAO examined the BGH and CGH agreements to determine if the agreements specified a fee for service and compared this to actual cost recovery through invoicing. Neither hub consistently included fees for service in agreements, with BGH100 more likely than CGH101 to include this information. The absence of information about fees for service in agreements impacts the hubs’ and client entities’ ability to determine whether invoices reflected agreed prices. Where fees for service were included in agreements, invoices could generally be reconciled to agreements for BGH.
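To show why recorded fees matter for this reconciliation, the sketch below compares invoiced amounts against agreed fees where an agreed fee exists; the records and field names are invented for illustration.

```python
import pandas as pd

# Hypothetical schedules and invoices; a reconciliation is only possible
# where the agreement schedule actually records an agreed fee.
agreements = pd.DataFrame({"schedule_id": ["S1", "S2", "S3"],
                           "agreed_fee": [12000.0, None, 8000.0]})
invoices = pd.DataFrame({"schedule_id": ["S1", "S2", "S3"],
                         "invoiced": [12000.0, 9500.0, 8750.0]})

merged = invoices.merge(agreements, on="schedule_id")

def status(row):
    if pd.isna(row["agreed_fee"]):
        return "no agreed fee recorded"          # cannot be reconciled
    return "matches" if row["invoiced"] == row["agreed_fee"] else "mismatch"

merged["status"] = merged.apply(status, axis=1)
print(merged)
```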
3.44 To understand the effectiveness of its grants administration services, a hub needs to understand the uptake of those services. CGH agreements should clearly specify the services to be provided to client entities under the agreement; where this information is contained in an accepted quote document, the final agreement should include, or adequately reference, all associated accepted quote documents.
3.45 Table 3.4 shows how often services relating to a particular grants lifecycle phase were specified in client entity agreement documents.
Percentage of agreements with services specified or offered

| | Design | Select | Establish | Manage | Evaluate |
|---|---|---|---|---|---|
| BGH | 100% | 98% | 100% | 100% | 8% |
| CGH | 75% | 73% | 70% | 70% | 57% |
Source: ANAO analysis based on DSS and Industry client entity agreement documentation between 1 July 2016 and 30 July 2020.
3.46 The hubs advised the ANAO that reasons for non-usage of services may include legislation preventing delegation of elements of the select phase by the client entity (for example, selection decisions for the Medical Research Future Fund), or a lack of appropriate expertise in the hubs.102 Neither hub had sought feedback from client entities about why the evaluate function was not used. The hubs advised that possible reasons might include:
- evaluation services are considered to be expensive;
- evaluation services require expertise not currently available in the hubs;
- evaluation services are provided by hub entities’ evaluation functions or outsourced; and
- evaluation is most relevant to new programs, and most of the CGH’s work is with established grant programs.
3.47 Both hubs decided to remove evaluation services from their standard service offer in 2021–22. This is inconsistent with the original SGGA Program intention. BGH considered that evaluation activities were best placed with the policy owner, citing limited take-up of evaluation by client entities and the cost of evaluation-ready services exceeding quoted costs by 56 per cent; it also noted to the ANAO that its Evaluation Unit had not received additional staff to provide evaluation services.103
3.48 The SGGA Program funding proposal envisaged that by attaining sufficient scale through all in-scope programs using hubs service offerings, the hubs would reduce the cost of grants administration. Given the limited take up of core CGH services, and of evaluation services for both hubs, there is a risk that expected cost efficiencies intended to be achieved through the hubs will not be realised.
Meeting service level standards
3.49 CGH agreements with client entities establish service level standards for the hub’s performance. The service level standards define targets for eight system availability service level areas and five grant round administration service level areas.104 CGH reports on these standards in Monthly Partnership Reports.
3.50 The ANAO examined 103 CGH monthly partnership reports for eight client entities from September 2016 to September 2020.
- When reported, the targets for four of the eight system availability service level areas were met in 100 per cent of reports105, and the remaining four targets were met in 85 to 95 per cent of reports.106
- When reported, the target for one of the five grant round administration service level areas was met in 100 per cent of the reports107, and the remaining four targets were met in 86 to 94 per cent of reports.108
3.51 BGH service level standards are established in the BGH service catalogue for each phase of the grants lifecycle, and include targets.109 BGH does not report against its service level standards.
3.52 The ANAO surveyed client entity staff managing grant programs about satisfaction with hub services (see Appendix 5).
- Overall, respondents were generally more satisfied with BGH than CGH services.
- BGH respondents were more satisfied with design, manage and establish services, than with select or evaluate services.
- CGH respondents were more satisfied with design and select services, than with establish, manage and evaluate services.
3.53 Reasons for dissatisfaction with hub services included cost110; quality of services111; lack of information or delays in providing information; perceived siloed services in CGH with lack of interaction between teams from various phases and within phases; lack of clarity on the respective responsibilities of the hubs and the client entity; duplication of effort between the hubs and the client entities112; perceived increased administrative burden associated with using the hubs; and services offered not aligning with grant program requirements.
3.54 With regard to compliance with the CGRGs, meeting finance law requirements and the effectiveness of risk management, fewer than 20 per cent of grant program managers were dissatisfied with each hub's performance. With regard to meeting grant program timelines, managing grant program risks, clarity of roles and responsibilities, and governance arrangements, fewer than 20 per cent of program managers were dissatisfied with BGH's performance, and between 30 and 40 per cent were dissatisfied with CGH's performance.
Supporting client entities
3.55 BGH and CGH have published vision or purpose statements on their websites which emphasise the importance of working with policy partners, Australian Government agencies, grant applicants and grant recipients.
Design phase
3.56 Budget Process Operational Rules (BPORs) and Estimates Memoranda established requirements for consultation between hubs and participating entities when developing grant policy proposals.
- All implementation (including administration and ICT) costs across the five stages of the grants lifecycle must be prepared in consultation with the grants hubs.
- Hubs and participating entities must reach agreement on implementation costs and present the outcome of the agreement in the New Policy Proposal (NPP).113
- Participating entities must provide to the hub the final NPP submitted to government by the client entity.114
- Where an NPP is agreed to by government, hubs and client entities must execute formal agreements for the provision of grants administration services.115
3.57 From the outset, both hubs included consultation in their range of engagement and design services to client entities — through a Design and Liaison Team and Program Managers in the BGH and Client Managers in the CGH. This included establishing quoting processes for determining NPP costs.
3.58 Finance identified 105 NPPs relating to grant programs since 2016. The ANAO examined these NPPs to determine whether they identified the hubs as affected entities and specified the administration costs attributable to the hubs. The NPPs did not consistently refer to the hubs or specify the administration costs attributable to them.
3.59 The ANAO examined 145 costing agreements provided by Finance with a view to following them through to hub agreements. The ANAO was unable to definitively match any costing agreement with any program specified in hub agreements, because Finance, BGH and CGH use different naming conventions for programs and the quote registers contained insufficient information to allow matching. Matching was further impeded because some costing agreements did not identify a hub or include hub costs, and most did not include costs from a hub quote. An illustration of the naming problem is sketched below.
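As an illustration of the matching problem (not the method the ANAO applied), the sketch below uses simple string similarity to pair differently named program records across two registers; all names are invented.

```python
from difflib import SequenceMatcher

# Invented names showing differing conventions across registers.
costing_agreement_names = ["Regional Employment Trials Program",
                           "Community Health Grants"]
hub_agreement_names = ["Regional Employment Trials",
                       "Grants - Community Health 2018-19"]

def similarity(a: str, b: str) -> float:
    """Case-insensitive character-level similarity between two names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# For each costing agreement, report the closest hub agreement name.
for name in costing_agreement_names:
    best = max(hub_agreement_names, key=lambda h: similarity(name, h))
    print(f"{name!r} -> {best!r} (score {similarity(name, best):.2f})")
```

Matching of this kind can only suggest candidate pairs; without a shared identifier or a consistent naming convention, definitive matching remains impossible, which is the difficulty described above.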
3.60 In the survey of client entity staff managing grant programs, the ANAO found that one of 13 BGH client entity staff surveyed was dissatisfied with the level of consultation during the initial design phase of the grants lifecycle, compared to nine of 30 CGH client entity staff (see Appendix 5). Two of 21 BGH client entity staff were dissatisfied with the consultation when establishing agreements, compared with 12 of 31 CGH client entity staff.
Use of hub services
3.61 The SGGA Program implementation plan sets out responsibilities of the hubs including the development of a service offering and catalogue; transition plans and readiness checklist for establishing individual grants in a hub environment; and a common set of costing and pricing models across the grants lifecycle.
3.62 In late 2015 the Governance Board endorsed a Service Catalogue Framework116, Costing Methodology117 and Transition Principles. These framework documents were intended to guide the development of hub documentation. Table 3.5 shows the month and year in which SGGA Program framework documents and hub supporting tools were developed, if at all.
| Supporting tool | SGGA Program framework document | BGH | CGH |
|---|---|---|---|
| Transition plan template | December 2015; March 2017 | Not applicablea | Not applicable |
| Common service offering and catalogue | October 2015 | Not developed | Not developed |
| Transition readiness checklist | Not applicable | December 2018 | Not developed |
| Common costing and pricing modelb | October 2015 | Not developed | Not developed |
| Price lists | Not applicable | July 2020c | February 2017 |
| Client entity portal/platform | Not applicable | By April 2018 | February 2017 |
Note a: Finance provided client entities with a Transition Plan Template for preparing and communicating an entity’s Grants Administration Transition Plan. Completion of the Transition Plan was primarily the responsibility of the client entities; collaboration with the hubs was required to provide quotes and determine the feasibility, timeframe and requirements for transition of each program. The hubs used the transition plans to develop transition schedules.
Note b: Each hub developed its own costing and pricing model commencing in the 2016–17 financial year. The BGH costing model is not made available to client entities and is used to generate quotes. The CGH costing model was the basis for a services rate card (price list) that is available to client entities.
Note c: The price list was included in a presentation about the BGH cost model to Finance in July 2020.
Source: ANAO based on Finance, DSS and Industry documents.
3.63 While service offerings and catalogues were developed by the hubs, they were not based on whole-of-government business processes. The format of other tools did not support price comparisons by entities. BGH did not publish a price list on its public facing website or its portal. A common costing and pricing model was not developed. The approach to costing and pricing models did not support the SGGA Program to demonstrate the realisation of financial benefits or support effective market testing.
3.64 The hubs advised the ANAO in November 2021 that a common costing model is not appropriate as there is a fundamental difference in the types of grant programs being delivered by each hub, and because each hub is underpinned by differing ICT systems, processes and departmental structures. The hubs advised that their different operating models have evolved from what was originally envisioned by the SGGA Program, based on client requirements and lessons learnt. The hubs’ position is inconsistent with advice included in the SGGA Program funding proposal, which said grant programs could be implemented using one of six standard patterns (workflows).
3.65 Self-service portals were to be developed by the hubs to facilitate client entities' access to information. Although development of the BGH portal commenced in 2018, the portal was incomplete in March 2021, with the majority of pages containing no material for client entities, or content that had not been periodically updated. In November 2021 Industry advised the ANAO that it had not updated the portal because its primary mechanism for engagement with external parties is a dedicated account manager.
3.66 CGH’s portal, HubNet, hosts extensive guidance and explanatory material. In 2020 CGH commenced a review of HubNet. At September 2021 CGH planned to refresh the HubNet site.
3.67 The ANAO surveyed client entity staff managing grant programs about the adequacy of hub support and guidance. More than 80 per cent of BGH client entity staff were satisfied, compared with less than 40 per cent of CGH client entity staff. In relation to the CGH, the most common concerns were the clarity, accuracy and reliability of guidance.
Have the hubs supported more efficient grants administration?
There is insufficient evidence to demonstrate that the hubs have supported more efficient grants administration. A key objective of the SGGA Program — to establish a standardised whole-of-government approach to streamline grants administration — has not been achieved. Although the hubs undertook regular reviews of costing models to better reflect the cost of services in prices, the hubs cannot demonstrate cost recovery.
3.68 The SGGA Program implementation plan states that the objective of the SGGA Program was to establish a whole-of-government approach to streamlining grants administration, taking into consideration user needs. The implementation plan acknowledges that the great majority of the benefits for government are to be achieved by streamlining administrative processes118 rather than rationalising the grants ICT environment. Standardised business processes in combination with improved scheduling and productivity of specialised staff in hub shared services centres were intended to lead to potential efficiencies of approximately $50 million annually (representing a 20 per cent reduction in administrative costs). Finance completed or commissioned benefits measurement exploring cost efficiencies but did not establish efficiencies were achieved (see paragraphs 2.40 to 2.42).
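For scale, the implementation plan's own figures imply a baseline that is not stated explicitly: if a $50 million annual efficiency represents a 20 per cent reduction in administrative costs, the implied annual administrative cost base B is around $250 million. A worked statement of that arithmetic, under the assumption that both figures refer to the same cost base:

```latex
0.20 \times B = \$50\text{m per year}
\quad\Longrightarrow\quad
B = \frac{\$50\text{m}}{0.20} = \$250\text{m per year}
```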
Standardisation of streamlined grants administration processes
3.69 The implementation plan assigned responsibility to the hubs for the development of standardised whole-of-government business process and workflow patterns.119 The July 2017 SGGA Program Implementation Plan indicates that in December 2016 the SGGA Governance Board endorsed a whole-of-government grant program delivery model that set out the standardised business processes, although SGGA Governance Board meeting records do not document this.
3.70 To support streamlining, the hubs designed business processes around the grant lifecycle phases, worked to increase automation, assigned responsibility for overseeing streamlining to internal governance committees, and reviewed and improved descriptions of grant phases and business processes while reducing workflow steps. At times these activities have not resulted in the intended improvements120, including agreed common business processes.
3.71 It was the intention that each hub maintain its own grants management ICT system (Table 1.1). The primary grants management systems are GPS (CGH) and BGM (BGH). However, both hubs rely on attachments and records stored in multiple systems (Table 3.6).
| Purpose | CGH | BGH |
|---|---|---|
| Grants administration | GPS | eSGMS, replaced by BGM on 1 March 2017 |
| Document storage for entity interactions with hub | Arc | BGM, email |
| Grant payments | SAP | Techone |
| Hub staff information (such as standard operating procedures) | HubNet | DocHub |
| Monitor programs | Localised trackers (SharePoint) | BGM |
| Program reporting | Data Exchange | BGM, Power BI |
| General reporting | Qlik | Lighthouse, Enterprise Beyond |
| Grant publication | GrantConnect, CGH website | GrantConnect, Business.gov.au website |
| Client relationship management | ARC, email, MSM | CRM, BGM, email |
| Client portal access (platform managed by Industry) | Communitygrants.gov.au, GrantConnect | business.gov.au, GrantConnect |
Source: BGH and CGH advice and documentation.
3.72 The multiplicity of systems used by both BGH and CGH is inconsistent with the SGGA Program funding proposal, which specified that the use of a single grants management ICT system by each hub was critical in driving business process change and achieving cost efficiencies.
Pricing of grants administration services
3.73 In April 2015 government agreed that the operational costs for hub grants administration services would be recovered using a user-pays cost recovery model based on a unit-priced service catalogue. To support cost recovery, the SGGA Program implementation plan requires the hubs to develop a common service catalogue and pricing model, maintain cost information and report costs to Finance as required.
3.74 Both hubs sought to understand grants management costs before developing a pricing structure. PwC was engaged by Industry and DSS to analyse the grants management processes of the two entities and assist with the development of a high-level costing approach. The objective was to ensure that implementation of the two grants hubs was supported by a consistent understanding of cost drivers and an agreed approach to pricing. The results of the PwC analysis were presented to the hubs in August 2015. The report noted that neither department was capturing the end-to-end costs of the grants administration process and that, as a result, pricing was unlikely to fully recover costs.
3.75 BGH initially developed a costing model based on estimates of the effort involved in delivering each phase of the grants lifecycle.121 Following a February 2020 management-initiated review, this was changed to a 'self-serve' model122 based on activity volumes rather than estimates of effort, which was viewed as potentially improving certainty about agreed costs. BGH developed a new pricing schedule to commence from 1 July 2020.
3.76 As at February 2021, BGH managed programs under both the old and new approaches, according to the costing model in place when each agreement was entered into.
3.77 CGH developed and maintained an activity-based costing model. DSS believes there have been improvements over time in the confidence that can be placed in the underlying data.
3.78 Over time there have been adjustments to achieve greater cost recovery.
- In April 2017 CGH identified 25 hub-related activities that had not been included in the original 147 activities listed in the service catalogue. Subsequently, CGH adjusted its service catalogue, price list and costing model.
- In 2020 BGH determined that it was undercharging for evaluation services. It also found that costing did not include the cost of the ICT systems. The costing model was adjusted to include the cost of ICT systems.
3.79 Due to differences in the two grants hubs’ pricing approaches, including different units to measure volume of hub services, it is not possible to produce a reliable overall price comparison.
Appendices
Appendix 1 Entity responses
Appendix 2 Improvements observed by the ANAO
1. The existence of independent external audit, and the accompanying potential for scrutiny, improves performance. Improvements in administrative and management practices usually occur: in anticipation of ANAO audit activity; during an audit engagement; as interim findings are made; and/or after the audit has been completed and formal findings are communicated.
2. The Joint Committee of Public Accounts and Audit (JCPAA) has encouraged the ANAO to consider ways in which the ANAO could capture and describe some of these impacts. The ANAO’s 2021–22 Corporate Plan states that the ANAO’s annual performance statements will provide a narrative that will consider, amongst other matters, analysis of key improvements made by entities during a performance audit process based on information included in tabled performance audit reports.
3. Performance audits involve close engagement between the ANAO and the audited entity as well as other stakeholders involved in the program or activity being audited. Throughout the audit engagement, the ANAO outlines to the entity the preliminary audit findings, conclusions and potential audit recommendations. This ensures that final recommendations are appropriately targeted and encourages entities to take early remedial action on any identified matters during the course of an audit. Remedial actions entities may take during the audit include:
- strengthening governance arrangements;
- introducing or revising policies, strategies, guidelines or administrative processes; and
- initiating reviews or investigations.
4. In this context, the below actions were observed by the ANAO during the course of the audit. It is not clear whether these actions and/or the timing of these actions were planned in response to proposed or actual audit activity. The ANAO has not sought to obtain assurance over the source of these actions or whether they have been appropriately implemented.
5. Department of Finance actions included completing an End Stage Gateway Review in June 2021 and engaging PricewaterhouseCoopers to undertake user experience research to support benefits realisation measurement in 2021. In February 2022, Finance advised the ANAO that it had engaged services to further develop business process mapping.
6. Department of Industry, Science, Energy and Resources actions included updating hub grants administration workflows in early 2021 and, in October 2020, reviewing its service offerings and recommending changes to the service catalogue for external entities.
7. Department of Social Services (DSS) actions included:
- commencing a review of its portal, HubNet, in 2020 and in September 2021 migrating the hub policies and procedures to a new SharePoint team site which included an online manual for grants administration;
- bringing together assurance activities in an overarching document to consolidate its framework in December 2020; and
- reviewing the hub’s service offering in 2021 and making subsequent changes to its service catalogue and price lists.
8. Digital Transformation Agency actions included reporting to government in November 2021 on the outcomes of the DSS information and communications technology investment project to deliver a single grants process across departments.
Appendix 3 Funding for the SGGA Program and related initiatives
1. The funding amounts in Table A.1 include direct funding to the SGGA Program as well as funding for information and communications technology (ICT) initiatives (including gateway reviews) and other initiatives integral to delivery of the SGGA Program.
| Entity | Total funding ($m) | Funding details |
|---|---|---|
| Interim Digital Transformation Office (DTO) | – | SGGA Program: in 2015–16, $4.7 million over four years to develop the high-level design and data warehouse. The full appropriated amount was subsequently transferred to the Department of Finance (Finance) through the Mid-Year Economic and Fiscal Outlook and Portfolio Additional Estimates Statements (MYEFO and PAES) 2015–16 process. |
| Finance | 17.7 | Grant systems initiatives: in 2013–14, $9.1 million over four years (including operating funding of $3.6 million over four years) to implement GrantConnect, a web-based whole-of-government electronic grants advertising, application and reporting system for use by entities and grant applicants as a single point of reference.a In 2014–15, $0.5 million over four years to complete a gateway review process on the Department of Social Services' (DSS's) second pass business case for a grants management ICT system. SGGA Program: in 2015–16, $2.8 million over four years to provide overall program governance and undertake a market test, and $0.6 million for gateway review; in the MYEFO and PAES 2015–16 process DTO transferred $4.7 million to Finance to deliver a common ICT platform, to improve reporting and to support the delivery of program benefits through the SGGA Program and GrantConnectb, such as better outcomes for users and improved user experience. |
| Department of Industry, Science, Energy and Resources | 15.6 | Grant systems initiative: in the 2013–14 Budget, development of a grants management platform was funded via the Single Business Service initiative; the value of funding was not specified but was estimated as $3.5 million in the April 2015 SGGA Program funding proposal. Public Service Modernisation Fund (transition to hubs): in 2017–18, $12.1 million over two years to develop capacity and functionality of the Business Grants Hub grants management system and accelerate transition of grant programs to the hub. |
| DSS | 124.5 | Grant systems initiative: in 2014–15, $2.1 million to scope a second pass for a grants management ICT system. SGGA Program: in 2015–16, $99.3 million over four years to establish a grants hub and stabilise and modernise its grants management ICT system. Public Service Modernisation Fund (transition to hubs): in 2017–18, $23.1 million over two years to accelerate transition of 41 grant programs from six entities to the hubs (representing 74 per cent of grant programs to be administered by the hubs). |
| All agencies | 157.8 | Funding announced between 2013–14 and 2017–18, to be provided over a six-year period. |
Note a: Initially referred to as the Australian Government Grants System, grants.gov.au was rebranded as GrantConnect in 2017. GrantConnect is the Australian Government’s grants information system. It provides centralised publication of forecast and current Australian Government grant opportunities and grants awarded. GrantConnect [Internet], Finance, available from https://www.grants.gov.au/ [accessed March 2022]. GrantConnect is discussed at paragraph 1.12.
Note b: Between 1 July 2013 and 25 September 2020, Finance estimates the development and maintenance costs for GrantConnect to be $10.9 million.
Source: Summary of related budget measures between 2013–14 and 2017–18, and program planning documentation.
Appendix 4 Hub grants management information and communications technology systems
| | Business Grants Hub | Community Grants Hub |
|---|---|---|
| Grant management information and communications technology (ICT) system | eSGMS — implemented in 2009; the system was planned to be decommissioned by 31 December 2020, with all grants that remained active after that date to be migrated to BGM where required functionality existed. BGM — implementation commenced in 2017. eSGMS and BGM are supported by a Lighthouse reporting tool. | FaHCSIA Online Funding Management System (FOFMS)a — system enhancements commenced in 2015–16 and were planned to continue through to 30 June 2018. FOFMS was rebranded as GPS on 4 February 2019. GPS is supported by a data analytic reporting tool. |
| Grant management ICT system users as at 29 October 2021 | BGM — 880 active licences for internal Department of Industry, Science, Energy and Resources (Industry) users. eSGMS — 1054b active users. | GPS — 855 Community Grants Hub (CGH) users and 2718 client entity usersc. |
| Grant portal users as at 29 October 2021 | 79,473 active portal accounts for customers and applicantsb | 1,413d grant recipient portal userse |
Note a: FaHCSIA was the former Australian Department of Families, Housing, Community Services and Indigenous Affairs.
Note b: Users of eSGMS and the Business Grants Hub portal may have multiple access or account types, resulting in inflated user numbers.
Note c: Client entities include Department of Agriculture, Water and the Environment, Attorney-General’s Department, Department of Education, Skills and Employment, Department of Health, Department of Home Affairs, Department of the Prime Minister and Cabinet, Department of Social Services (DSS), Department of Veterans’ Affairs and National Disability Insurance Agency.
Note d: 442 organisations were on-boarded in 2018. CGH offered 701 grant recipient organisations access to the grant recipient portal between December 2020 and March 2021, and 96 of these organisations accepted the offer.
Note e: CGH has different access points (systems/portals) depending on whether the user is an applicant or an existing grant recipient. The total number of applications received from 2018–19 through to 2021–22 is 64,667. These applications were not received through the Grant Recipient Portal.
Source: Summary of Industry and DSS advice and documentation.
Appendix 5 ANAO survey
1. Between November and December 2020, the ANAO surveyed all 28 entities that published grant opportunities and/or awards on GrantConnect in 2018–19 and/or 2019–20, including entities that use and do not use hub services.
2. The survey sought entities’ views to inform audit findings directed towards the Department of Finance and the hubs. The survey had two parts:
- Part 1 was to be completed by both users and non-users of hub services at the entity level. A single response was sought for this part of the survey.
- Part 2 was directed to users of either or both the Business Grants Hub (BGH) and Community Grants Hub (CGH) services, and was to be completed by at least one grant program area from each entity. In some cases, the ANAO received responses from more than one grant program area in an entity.
Grant Program Area(s): If your entity uses one or both of the hubs, what does your entity view as the main impacts of using the hubs? Please select all that apply (increase, decrease, no change, no view). See paragraph 2.47 of this report.

| Impact | Number respondinga (Total / BGH / CGH) | Number increase (Total / BGH / CGH) | Number decrease (Total / BGH / CGH) |
|---|---|---|---|
| Cost of grants administrationb | 52 / 17 / 35 | 34 / 6 / 28 | 5 / 4 / 1 |
| Efficiency of grants administration | 53 / 17 / 36 | 10 / 5 / 5 | 30 / 8 / 22 |
| Effectiveness of grants administration | 51 / 17 / 34 | 11 / 5 / 6 | 20 / 4 / 16 |
| Better grants experience for clients (grant applicants and recipients) | 52 / 16 / 36 | 6 / 3 / 3 | 21 / 5 / 16 |
| Streamlining of grants business processes, patterns and workflows | 49 / 16 / 33 | 9 / 4 / 5 | 24 / 7 / 17 |
| Consistency of grants administration activities | 51 / 16 / 35 | 17 / 5 / 12 | 11 / 1 / 10 |
| Better grants policy outcomes | 51 / 16 / 35 | 6 / 2 / 4 | 14 / 2 / 12 |
| Availability of consistent grants data across entities | 46 / 15 / 31 | 12 / 3 / 9 | 17 / 4 / 13 |
| Quality of whole-of-government grants data | 46 / 16 / 30 | 7 / 3 / 4 | 5 / 3 / 2 |
| Compliance with whole-of-government participation requirements | 52 / 17 / 35 | 16 / 4 / 12 | 1 / 0 / 1 |
| Evidence based policymaking, evaluation and risk management | 44 / 13 / 31 | 6 / 1 / 5 | 5 / 1 / 4 |

Grant Program Area(s): For different stages of the grants administration lifecycle, to what extent is your entity satisfied or dissatisfied with hub services? Please select all that apply (very satisfied, satisfied, neither satisfied nor dissatisfied, dissatisfied, very dissatisfied). See paragraph 3.52 of this report.

| Lifecycle stage | Number respondingc (Total / BGH / CGH) | Number satisfied or very satisfied (Total / BGH / CGH) | Number dissatisfied or very dissatisfied (Total / BGH / CGH) |
|---|---|---|---|
| Design of grant opportunities and activities | 45 / 14 / 31 | 24 / 10 / 14 | 16 / 3 / 13 |
| Assessment and selection of grantees | 35 / 15 / 20 | 15 / 8 / 7 | 14 / 5 / 9 |
| Establishment of grants | 50 / 17 / 33 | 18 / 11 / 7 | 22 / 2 / 20 |
| Ongoing management of grantees and grant activities | 49 / 15 / 34 | 20 / 11 / 9 | 19 / 2 / 17 |
| Evaluation of grant opportunities and activities | 16 / 7 / 9 | 6 / 4 / 2 | 4 / 2 / 2 |
| Other | 9 / 2 / 7 | 1 / 0 / 1 | 7 / 2 / 5 |

Grant Program Area(s): Please indicate the extent of your entity’s agreement or disagreement with the following statements about hub services. For each statement, please select a response for each hub, where relevant (strongly agree, agree, neither agree nor disagree, unsure, disagree, strongly disagree). See paragraph 3.54 of this report.

| Statement | Number respondingd (Total / BGH / CGH) | Number agree or strongly agree (Total / BGH / CGH) | Number disagree or strongly disagree (Total / BGH / CGH) |
|---|---|---|---|
| The hub supports my entity to meet CGRG requirements | 77 / 33 / 44 | 39 / 16 / 23 | 6 / 0 / 6 |
| The hub supports my entity to meet finance law requirements | 76 / 33 / 43 | 33 / 12 / 21 | 6 / 0 / 6 |
| The hub services support my entity to meet program timelines | 76 / 33 / 43 | 20 / 9 / 11 | 23 / 6 / 17 |
| The hub services help my entity to manage grant program risks | 73 / 33 / 40 | 19 / 9 / 10 | 20 / 4 / 16 |
| The hub has effective risk management with respect to my entity’s grant program | 76 / 32 / 44 | 19 / 9 / 10 | 12 / 3 / 9 |
| The hub costings for grants administration services reflect the prices advertised | 49 / 6 / 43 | 10 / 2 / 8 | 15 / 2 / 13 |
| The hub roles and responsibilities are clear | 75 / 31 / 44 | 21 / 11 / 10 | 20 / 2 / 18 |
| The hub governance arrangements ensure that my entity’s needs are met | 77 / 33 / 44 | 18 / 9 / 9 | 19 / 4 / 15 |

Grant Program Area(s): To what extent is your entity satisfied or dissatisfied that it was appropriately consulted by the hub in the initial stages of designing the grant program(s)? Please select a response for each hub (very satisfied, satisfied, neither satisfied nor dissatisfied, dissatisfied, very dissatisfied). See paragraph 3.60 of this report.

| | Number respondingc (Total / BGH / CGH) | Number satisfied or very satisfied (Total / BGH / CGH) | Number dissatisfied or very dissatisfied (Total / BGH / CGH) |
|---|---|---|---|
| Appropriately consulted during initial design phase | 43 / 13 / 30 | 24 / 10 / 14 | 10 / 1 / 9 |

Grant Program Area(s): To what extent is your entity satisfied or dissatisfied that it was appropriately consulted by the hub when establishing the agreement for the provision of services for a grant program(s)? Please select a response for each hub (very satisfied, satisfied, neither satisfied nor dissatisfied, dissatisfied, very dissatisfied). See paragraph 3.60 of this report.

| | Number respondingc (Total / BGH / CGH) | Number satisfied or very satisfied (Total / BGH / CGH) | Number dissatisfied or very dissatisfied (Total / BGH / CGH) |
|---|---|---|---|
| Appropriately consulted when establishing the agreement for provision of services | 52 / 21 / 31 | 24 / 15 / 9 | 14 / 2 / 12 |

Grant Program Area(s): What is your entity’s view about the adequacy of hub support and guidance (telephone, email, web chat, published guidance and templates)? Please select all that apply (more than adequate, adequate, no view, partly adequate, not adequate). See paragraph 3.67 of this report.

| | Number respondinge (Total / BGH / CGH) | Number adequate or more than adequate (Total / BGH / CGH) | Number partly adequate or not adequate (Total / BGH / CGH) |
|---|---|---|---|
| Adequacy of hub support and guidance | 53 / 17 / 36 | 26 / 14 / 12 | 25 / 3 / 22 |

Grant Program Area(s): If your entity’s view about the adequacy of hub support and guidance was ‘Not Adequate’ or ‘Partly Adequate’, what could be improved? Please select all that apply. See paragraph 3.67 of this report.

| Area for improvement | Number stating that area could be improved (Total / BGH / CGH) |
|---|---|
| Accessibility of support and/or guidance | 14 / 3 / 11 |
| Responsiveness of hub officers | 9 / 0 / 9 |
| Clarity of guidance given | 25 / 2 / 23 |
| Relevance of support and/or guidance | 15 / 2 / 13 |
| Accuracy and reliability of guidance | 19 / 1 / 18 |
| Completeness of guidance | 18 / 3 / 15 |
| Other (please specify) | 12 / 2 / 10 |
Note a: The ‘Number responding’ is the total of all responses to the question and includes responses where ‘no change’ or ‘no view’ were selected.
Note b: Cost of grants administration includes hub service costs, administrative costs of dealing with the hubs and total costs of overseeing the grant program.
Note c: The ‘Number responding’ is the total of all responses to the question and includes responses where ‘neither satisfied nor dissatisfied’ was selected.
Note d: The ‘Number responding’ is the total of all responses to the question and includes responses where ‘neither agree nor disagree’ was selected.
Note e: The ‘Number responding’ is the total of all responses to the question and includes responses where ‘no view’ was selected.
Source: ANAO survey of Australian Government granting entities and grant program areas.
Appendix 6 Consistency of Business Grants Hub practices with a selection of business processes and workflows
| Lifecycle phase | Business process | Practice consistent with processes | Consistency rating |
|---|---|---|---|
| A. Select | Application received prior to grant assessment decision. (Business Grants Hub [BGH] did not establish a timeframe within which to complete this process.) | Instances were observed of assessment decisions made prior to receipt of applications (up to 306 days), and of applications assessed up to four years after the application was received. On average, assessment decisions were made 10 weeks after applications were received. | – |
| B. Select | Applicant assessed as eligible before a merit assessment, if required, occurs. | Records for the timing of eligibility and merit assessments were often incomplete, limiting the usefulness of the BGH data for demonstrating that business processes and workflows had been met. Where records were available, limited instances were observed of business processes and workflows not being followed. There were no instances of an application being assessed as ineligible and then found suitable, although some ineligible applications progressed to the merit assessment stage (1142) and some eligibility assessments occurred after merit assessments (261). | Insufficient records to test; Eligibility assessed before merit (where records available) |
| C. Select | No single officer should recommend an application for funding and approve the application for funding. | In 2020–21, effective controls were in place to prevent a recommender from approving an application for funding in the grants management information and communications technology (ICT) system. However, user access controls allowed delegates who were not the delegate for a grant program to approve funding. | Separation of duties; Appropriate delegate |
| D. Select | A recommendation must be made prior to a decision to fund an application. Where the delegate rejects the recommendation, this may indicate there are concerns about the quality of eligibility and merit assessments. | Recommendations were generally made before decisions, and the delegate largely accepted the recommendation made. Some records indicate that the audit log may be compromised, including where changes were made prior to an initial decision, or up to three years after an initial recommendation. | Recommendation accepted; Records compromised |
| E. Select | A recommendation must be made prior to a decision on the amount to fund an application. A change in the recommended amount may indicate there are concerns about the quality of eligibility and merit assessments. | The delegate rarely changed the recommended funding amount. In the rare instances where it was changed (28), there were a few outliers with a large increase or decrease in the value of the funding: the approved amount ranged from $856,000 less than recommended to $1,543,482 more than recommended. | Recommendation accepted |
| F. Select | A record is made of whether an applicant fully, partially or does not meet the selection criteria to support ministerial briefings. | BGH systems generally recorded an eligibility and merit outcome to support reporting to the Minister, where required. BGH captures the eligibility and merit assessments in two datasets, and there were large differences between the outcomes recorded in each: compared to the eligibility dataset, there were 7965 applications where a different outcome was recorded in the merit dataset. | Record of eligibility and merit outcome; Consistency between datasets |
| G. Select and establish | A grant agreement cannot be entered into prior to the delegate approval of which applications will receive funding. A grant agreement cannot start prior to delegate approval and execution of an agreement. The funding decision and start (execution) date are reported on GrantConnect. (BGH did not establish a timeframe within which to complete this process.) | BGH reporting on GrantConnect indicates that grant agreements do not commence prior to delegate approval, but can commence at the same time as delegate approval. 6566 of 22,934 (29 per cent) executed agreements from BGH’s grants management ICT system could not be matched to GrantConnect. More than half of the grant agreements reported on GrantConnect started 18 days after approval; in some cases, the delay between approval and agreement start dates was up to three years. | Delegate approval is not after agreement start date; Consistency of grants awarded records between BGH and GrantConnect |
| H. Establish | A grant agreement cannot be accepted (executed) before it is issued. Applicants generally have 30 days to accept issued grant agreements, unless an alternative due date is specified. Once the due date has passed a grant agreement offer may be withdrawn. | Records for the timing of issue and acceptance of grant agreements were occasionally incomplete, and there were instances where executed, issued and due date records were inconsistent with agreement status. Where records were available, most applicants accepted grant funding agreements prior to the due date, or within 30 days of the issue date. Instances were observed where grant agreements were issued after the due date, accepted before they were issued, or accepted months before the due date (where an issue date could not be established). More than 46 per cent of grant agreements were accepted on the same day they were issued. For 720 of 22,934 applications (three per cent), the applicant accepted the grant agreement 12 weeks to three years after the due date. | – |
| I. Establish | Grant agreements cannot end before the start date. | In a small number of instances grant agreements finished before they commenced. | – |
| J. Select, establish and manage | An agreement execution date cannot occur before an application decision date, and grant payment transaction dates cannot occur before the agreement execution date. | Of the 40,891 unique applications contained in the eligibility assessment dataset, 18,113 contained blanks in one or more date or purchase order number fields in the eligibility, agreement status and purchase order datasets. Where records were available, decision, execution and transaction dates for around 50 per cent of applications that recorded these dates (11,309 of 22,778) did not occur in a sequence consistent with BGH internal business processes and workflows; 11,469 demonstrated the expected date sequence. | Completeness of records; Grant payments are not before agreement execution date, which is not before application decision date |
| K. Establish and manage | Purchase orders are only issued for successful applicants (grantees) who have accepted an agreement, and this is accurately recorded in the grants management ICT system. A grant agreement cannot progress to the manage phase or have a payment made without a purchase order. | Effective controls were in place and operating satisfactorily in the financial system for 2019–20 and 2020–21 (see lifecycle phase C for the appropriate delegate assessment). A comparison of records from BGH’s financial management system and grants management ICT system indicates the records are incomplete and do not sufficiently correspond with each other. For example, the purchase order records from the financial management system contain purchase orders for at most 52 per cent of executed grant agreements, and 80 per cent of purchase orders from the financial management system could be matched to executed agreements in the grants management ICT system. For less than a quarter of the applications in the payment dataset, the purchase order was created after the assessment decision date and executed agreement date. | Controls in the financial system; Consistency of purchase order records between financial system and grants management system; Purchase order created after agreement executed |
| L. Establish and manage | A risk rating assessment (of the grantee and their project) is required when establishing an agreement, and a risk rating review and update is to occur when reviewing progress reports. An audit is most likely to occur where grantees pose a high risk or are part of moderate and complex programs. An audit cannot occur until an agreement is executed. | For more than half of the executed grant agreements, BGH recorded an appropriate risk rating; the remainder retained the default data field setting of ‘no risk rating’. Data did not indicate that risk ratings were reviewed in accordance with business processes. Few grant agreements (less than 1.5 per cent) were rated as medium or high risk. High risk grant agreements were likely to have grants management activities applied to manage the risk (including audits), whereas medium risk grant agreements were most likely to manage risk through progress reports and associated payments. The majority of audits appear to be required as part of a routine sample of grant recipients, rather than in response to a grant agreement risk rating. | Risk rating; Review of risk rating; Audit and reporting risk based |
| M. Manage | A grant applicant must make a request for a variation to a grant agreement before it is assessed by BGH. Following an assessment BGH will decide to approve or not approve the variation. (BGH did not establish a timeframe within which to complete this process.) | A variation was sought for 46 per cent (10,511 of 22,934) of unique executed grant agreements. Some variation decisions were made 12 weeks before a request was made, and a significant proportion of variation requests (37 per cent) were not for an executed grant agreement and did not have an application number. Variation decisions were made, on average, within 12 weeks of being requested, although 58 per cent of pending decisions were more than 12 months old and a further 26 per cent were between three and 12 months old. | – |
| N. Manage | When accepting a grant agreement the grantee agrees to reporting obligations, including to provide reports in accordance with due dates established in agreements. Where reports are not received by the due date, BGH requests the grantee to provide the overdue report within a specified time (21 days); this may lead to termination of the agreement or other late report action. | The majority of milestone reports were provided by grantees before the due date. 29 per cent were provided after the due date, 17 per cent more than 21 days after the due date, and three per cent 12 weeks after the due date. At least one report was recorded as provided 45 years early. | – |
| O. Manage | A review of a progress report cannot occur before a report is received. Grant payments may be contingent on progress reports. | On average BGH completes reviews of progress reports within two weeks of receiving the report. System data records review dates for some reports that precede the date the reports were received (in some cases more than six months before), while other progress reports were recorded as being reviewed 10 years after they had been received. | – |
| P. Manage | A final assessment cannot be completed before a grant commences and a grantee submits a final report. A project is not complete if a final assessment has not been completed; therefore the final assessment should occur before the project end date. (BGH did not establish a timeframe within which to complete this process.) | Almost all of the applications included in the project outcomes dataset had a completed final assessment. Some final assessments were completed before the project end date (in some cases five years before). There were also instances of a final assessment being completed before a grant commenced, or where there was not an executed agreement. On average it took BGH between 10 and 14 weeks to complete an assessment after the project end date; some final reports were not completed until three years after the project end date. | – |
| Q. Manage | A project outcome rating is required to be documented as part of the BGH final project report assessment. Payments should not exceed the amount documented in the grant agreement. | BGH recorded project outcomes for 62 per cent of all applications in the project outcomes dataset. 98.6 per cent of applications without an outcome recorded in that dataset had a final assessment date recorded, indicating an outcome rating should have been recorded. 74 per cent of applications with no outcome recorded received payment of the full agreed grant amount, including 21 instances where more than the full grant amount was paid, and there were five instances with no agreed grant amount but a recorded final assessment date. There was inconsistency between the established dataset and the project outcomes dataset: 65 per cent of executed agreements appeared in the project outcomes dataset, and 18 per cent of applications in the project outcomes dataset did not have an executed agreement. Where a rating was recorded for an executed grant agreement, 99 per cent of grant outcomes were successful or mainly successful. Generally, successful and mainly successful outcomes were more likely to lead to the grant being fully paid, while partially successful and failed outcomes were more likely to lead to a payment less than the agreed amount. | Project outcome rating is documented; Consistency of records between datasets; Payments do not exceed full agreed grant amount |
Source: ANAO analysis of BGH data.
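The date-sequence tests summarised at rows G and J can be illustrated with a short sketch. The data and column names below are hypothetical, not the BGH extract; the sketch simply shows the form of the completeness and sequence checks described.

```python
import pandas as pd

# Hypothetical records; column names are illustrative, not the BGH schema.
apps = pd.DataFrame({
    "decision_date": ["2018-05-01", "2018-06-10", None],
    "execution_date": ["2018-05-20", "2018-06-01", "2018-07-01"],
    "first_payment": ["2018-06-01", "2018-06-15", "2018-07-10"],
})
apps = apps.apply(pd.to_datetime)

# Completeness: records missing any of the three dates cannot be tested.
complete = apps.dropna()

# Sequence test: decision, then execution, then first payment.
in_sequence = ((complete["decision_date"] <= complete["execution_date"]) &
               (complete["execution_date"] <= complete["first_payment"]))
print(f"{int(in_sequence.sum())} of {len(complete)} complete records "
      f"follow the expected date sequence")
```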
Footnotes
1 Australian Government Grants – Briefing, Reporting, Evaluating and Election Commitments (Resource Management Guide 412) encourages all non-corporate Commonwealth entities to leverage whole-of-government initiatives when developing grant policy proposals.
2 As outlined in paragraph 1.12 entities were required to report grants awarded on GrantConnect from 31 December 2017. This means that the figures presented do not represent all grant funding and awards for the period 1 July 2016 to 9 February 2021.
3 The number of participating entities changed over time, reflecting the addition of entities such as the Department of Home Affairs and the National Indigenous Australians Agency, and machinery of government modifications. Participating entities at the commencement of the SGGA Program are listed at Table 1.1, note a.
4 The hubs do not administer all grants awarded published by participating entities on GrantConnect.
5 The recent audits include Auditor-General Report No.2 2018–19 Administration of the Data Retention Industry Grants Program, Auditor-General Report No.12 2019–20 Award of Funding Under the Regional Jobs and Investment Packages and Auditor-General Report No.45 2019–20 Management of Agreements for Disability Employment Services.
6 Budget papers Australian Government, Budget Measures, Budget Paper No.2: 2015–16, Commonwealth of Australia, Canberra, 2015, pp. 68 and 69.
7 At the time of the budget measure the then Department of Industry and Science was responsible for building and operating the BGH. Since this time the Department has been subject to two machinery of government changes, where each new Department became responsible for the build and operation of the hub. For the purposes of the report the Department will be referred to as the Department of Industry, Science, Energy and Resources (Industry).
8 Initial responsibilities included high level design, a data warehouse and oversight of user experience. In December 2015 responsibility for high level design and the data warehouse transferred from DTO to Finance, and in 2016 the associated funding was also transferred. Finance was responsible for improved whole-of-government reporting, searching and registration of applicants.
9 At the time the budget measure was introduced, DTA was the interim Digital Transformation Office (DTO). The DTO was formally established as an executive agency reporting to the Minister for Communications in July 2015. DTO became the DTA on 27 October 2016.
10 The Commonwealth of Australia, Budget 2017–18, Budget Measures, Budget Paper No.2 2017–18 [Internet], Department of the Treasury, 9 May 2017, Public Service Modernisation Fund—agency sustainability budget measure, p. 76, available from https://archive.budget.gov.au/2017-18/bp2/bp2.pdf [accessed March 2022].
11 No additional funding was proposed for the participating entities to meet the cost of migrating grant programs to the hubs and hub grants management information and communications technology (ICT) systems. Subsequently, funding was provided to six entities to transition to the hubs through the 2017–18 Public Service Modernisation Fund—agency sustainability budget measure.
12 Shared service arrangements support the Australian Public Service (APS) by providing common services across entities through centres of excellence (hubs) to drive efficiencies through increased scale and adopting best practice approaches. Department of Finance, Shared Services [Internet], available from https://www.finance.gov.au/government/setting-commonwealth-entity/shared-services [accessed October 2021].
13 An entity’s accountable authority or delegate may enter into arrangements with the hubs to provide grants administration services and enter into funding agreements with grant recipients in accordance with Public Governance, Performance and Accountability Act 2013 (PGPA Act) section 23.
14 A client entity includes participating entities (see Table 1.1, note a) and other entities that use hub services.
15 The CGRGs are issued by the Finance Minister under section 105C of the PGPA Act. The CGRGs establish the Australian Government grants policy framework (which applies to all non-corporate Commonwealth entities). They set out mandatory requirements and better practice and are supported by resource management guides that establish requirements and expectations for aspects of grants administration.
16 BPORs are endorsed annually by government and establish mandatory requirements, unless otherwise agreed by government. BPORs also establish that adherence to Estimates Memoranda (that give guidance, advice and instruction on Budget matters) is mandatory.
17 Estimates Memorandum 2016/38 — Whole-of-Government Grant Administration Arrangements, paragraph 4. BPORs Rule 1.9 (October 2016). BPORs Rule 1.10 (August 2017) superseded BPORs Rule 1.9 (October 2016).
18 Estimates Memorandum 2017/40, paragraph 5.
19 GrantConnect is the Australian Government’s grants information system. It provides centralised publication of forecast and current Australian Government grant opportunities and grants awarded. GrantConnect [Internet], Finance, available from https://www.grants.gov.au/ [accessed March 2022].
20 The requirement to publish grant opportunities is set out in the CGRGs, paragraph 5.2. This requirement was introduced in the 2017 version of the CGRGs which came into effect on 30 August 2017.
21 The requirement to publish grants awarded on GrantConnect is set out in the CGRGs, paragraph 5.3. Finance, CGRGs [Internet], 2017, available from https://www.legislation.gov.au/ [accessed March 2022].
22 In December 2017 the NHMRC was included as a third interim hub. NHMRC was included for the purposes of providing grants administration services for the Medical Research Future Fund and other health research grants that were the responsibility of the Department of Health. The inclusion of NHMRC on a temporary basis was considered necessary as technical specialist capabilities were required to deliver clinical trial research grants. For the purposes of this report references to the hub entities generally exclude NHMRC.
23 ARC commenced providing grant administration services to the Office of National Intelligence (ONI) in 2020–21. ONI, Research funding to address intelligence and national security threats [Internet], ONI, available from https://www.oni.gov.au/research-funding-address-intelligence-and-national-security-threats [accessed March 2022]. For the purposes of this report references to the hub entities exclude ARC.
24 SmartyGrants offers cloud-based self-service grants administration software and other services including consulting, administration, legal and evaluation. SmartyGrants [Internet], available from https://smartygrants.com.au/ [accessed March 2022].
25 As outlined in paragraph 1.12, entities were required to report grants awarded on GrantConnect from 31 December 2017. This means that the figures presented do not represent all grant funding and awards for the period 1 July 2016 to 9 February 2021.
26 The number of participating entities changed over time, reflecting the addition of entities such as the Department of Home Affairs and the National Indigenous Australians Agency, and machinery of government changes.
27 The hubs do not administer all of the grants awarded that are published by participating entities on GrantConnect.
28 The role of the Secretaries ICT Governance Board is to set whole-of-government strategies on the use of ICT across the Australian Public Service (APS).
29 The Efficiency Working Group’s role was to investigate opportunities for longer-term sustainable efficiency initiatives, taking into account current whole-of-government initiatives underway. The Efficiency Working Group was to report to the Secretaries Board and government, as appropriate, on specific measures including efficiency proposals that focus on cost reduction from a whole-of-government perspective. It was co-chaired by Finance and PM&C. Membership included deputy secretaries from all departments and the Australian Taxation Office.
30 Entities that had centralised or streamlined aspects of grants administration included DSS, Health, Industry and Employment.
31 The Secretaries Committee on Transformation provided key strategic oversight of public sector reform, including back-office structural activities overseen by the deputy secretaries Efficiency Working Group and the service delivery reforms being undertaken by the Digital Transformation Agency. Membership consisted of the secretaries of all departments of state, the APS Commissioner and the Commissioner of Taxation.
32 The Secretaries Board was responsible for delivery of the Australian Public Service (APS) reform program. The Secretaries Board membership consists of the secretaries of all departments of state, the Australian Public Service Commissioner and the Commissioner of Taxation.
33 In May 2014, the Secretaries ICT Governance Board decided that Finance would lead a feasibility study to reduce the number of grants management ICT systems and draw together other improvements to grants administration. Finance was to report back to the Secretaries ICT Governance Board with an interim feasibility study report in September 2014 and a final report in June 2015. The feasibility study was not available to the ANAO.
34 These benefits would come at a cost of $80 million to $260 million depending on the timing of ICT system consolidation.
35 PM&C and the Department of Health did not respond to the baseline study; regression analysis was used to estimate costs for these entities.
36 The numbers of programs, applications and grants awarded were not reported.
37 As part of the Digital Transformation Agenda, the government agreed to a moratorium on investment in grants management ICT systems in April 2015.
38 Auditor-General Report No.7 2021–22, Australian Government Grants Reporting, paragraph 2.11.
39 Typical governance structures and roles are outlined in Australian Government Assurance Reviews (Resource Management Guide 106) and the PM&C Guide to implementation planning.
40 The interim DTO also maintained a Program Management Office for the Digital Transformation Agenda that was also referred to as the Portfolio Management Office.
41 The sponsoring committees include committees that were in place at the commencement of the SGGA Program (the Secretaries Board, Secretaries Committee on Transformation, the Efficiency Working Group, and the Digital Transformation Agenda Steering Committee) and committees established after the commencement of the SGGA Program (Deputy Secretaries Provider Forum, Deputy Secretaries Grants Advisory Forum, and Sponsorship Committee).
42 The purpose of the Program Management Group was to discuss implementation and operational issues of the SGGA Program as it moved through the transition phase. Program Management Group membership included representation from Finance, the BGH and the CGH, and a DTA observer.
43 Following the 2018 Governance Review the Program Management Group was renamed the Grants Hub Catch-ups, with meeting frequency changing from bi-monthly to an ‘as needed’ basis. From July 2018 the Grants Hub Catch-ups were to focus on strategic items affecting the streamlining of government grants. The Grants Hub Catch-ups were made up of SES Band 2 officers from Finance, the BGH, the CGH, NHMRC and ARC.
44 From August 2016 the Governance Board was to provide strategic leadership of the program, consistent with government objectives, and to ensure an appropriate level of performance.
45 Core Program deliverables include whole-of-government grants administration business processes and workflows, grants administration hubs, a data warehouse and a market test of grants administration and ICT. Governance and stakeholder management deliverables include a program implementation plan, Collaborative Head Agreement, stakeholder management plan, high-level design for the current (‘as is’) and future states, benefits realisation strategy, gateway reviews, and a risk and issue management plan.
46 Since 2011 gateway reviews have been applied to programs due to the complexity and implementation challenges associated with program delivery, particularly cross-portfolio programs. The purpose of the reviews is to strengthen existing governance and assurance practices, and increase program management capability across the government. The gateway review process is outlined in RMG 106. Department of Finance, Australian Government Assurance Reviews (RMG 106), July 2017.
47 The Governance Board and SGGA Reference Group records did not clearly record the endorsement or approval of versions of the implementation plan.
48 Benefit areas changed from ‘better outcomes for users’ and ‘reduction in red tape’ to ‘improved policy outcomes’ and ‘improved user experience’. A new benefit area, ‘enhanced capability to use whole-of-government grants data’, was added (previously a benefit to be realised in relation to efficiencies for government).
49 In September 2017, Finance advised the Governance Board that the reduction in measures from the 2017 strategy to 2018 framework was to focus on key success factors and expected improvements in user experience.
50 The hubs were identified as benefit owners, or co-owners, for the majority of measures, with client entities and Finance having individual or joint responsibility for a smaller number of measures.
51 In applying the ‘measurable’ criterion, the ANAO assessed whether the measures: use sources of information and methodologies that are reliable and verifiable; and provide an unbiased basis for the measurement and assessment of the program’s performance.
52 BGH and CGH developed hub-specific benefit realisation frameworks; the implementation of these frameworks is discussed in Chapter 3 in relation to hub effectiveness and efficiency. These frameworks were not used to support measurement of overall benefits from the SGGA Program.
53 The recommendation included making improvements by: ensuring that outcomes/benefits are used to drive the Program; defining clear and consistent measures that can be easily tracked; assigning a benefits owner; determining an appropriate reporting frequency; ensuring that any changes are documented and signed-off by the relevant authority; and ensuring consistency and continuity in the benefits monitoring work.
54 Survey responses also indicated that: in relation to the use of eligibility information to alert grant applicants to other grant opportunities, 60 per cent had a good or great experience when finding a grant and 69 per cent had a good or great experience when understanding eligibility; in relation to different aspects of the application system and related processes, user satisfaction ranged from 41 to 59 per cent; and in relation to the management system and related processes, user satisfaction ranged from 53 to 57 per cent.
55 Finance advised the ANAO in February 2022 that ministerial level briefings had been provided following this period; however, supporting documentation was not provided and the ANAO did not confirm this.
56 The SGGA Program was expected to deliver a range of savings, such as a reduction in administration costs of $50 million a year and a reduction in ICT investment costs of $20 million a year; in total, financial benefits of approximately $400 million a year were anticipated (see paragraph 1.5).
57 The project had achieved outcomes in terms of efficiencies, improvements and providing an adequate return on investment. However, the project did not meet original expectations, and it did not avoid duplication, overlap and lack of coordination.
58 The SGGA Program funding proposal only provided funding for ICT systems to DSS; that is, not to Industry.
59 See Table 1.1, note b and the discussion of market testing commencing at paragraph 2.67.
60 The objectives that were assessed as mostly achieved were to: standardise and scale grants management processes across government; and streamline grants administration and reduce work for grant applicants.
61 The objectives that were assessed as partially achieved were to: improve the efficiency of grants delivery for applicants, recipients and government; and replace multiple existing systems to produce a scalable common grants management process across government.
62 Advice included that this capability had the potential to be extended to states, territories and local government.
63 Capital investment in the data warehouse was $2.4 million, which did not meet the threshold for an ICT two-pass review.
64 These capabilities included: Data.gov.au — the central source of Australian open government data which supports data modelling, analytics and visualisation; NationalMap — an online map-based tool to allow easy access to spatial data from Australian government agencies; and Data61’s computing platform for secure access and analysis of whole-of-government linked datasets.
65 Finance remained responsible for the data warehouse and led the project. CGH advised the ANAO in November 2021 that it was technical lead for stage one only (the prototype) and, at Finance’s direction, this project ceased at the end of stage one due to governance, scope and data privacy issues.
66 In March 2019, the SGGA Reference Group was advised that $1.5 million of the virtual data warehouse budget remained.
67 Finance could not provide the ANAO with evidence of government agreement to this.
68 Health transferred 285 staff to DSS in September 2018 as part of a machinery of government change.
69 Industry advised the ANAO in November 2021 that in designing the SGGA Benefits Management Framework, the definition and measure of ‘improved policy outcomes’ was broader than just BGH. This would be coordinated through the SGGA Benefits Management Framework and measured by Finance.
70 Effectiveness measures for BGH include: an increase in ability to access and report on whole-of-government program data; an increase in customer satisfaction with usability of the portal; a reduction in perceived application processing time from customer perspective as a result of automated application processing tasks; and an improved customer experience with issue resolution.
71 Effectiveness measures for CGH include: percentage compliance with data quality targets defined in the Data Quality Framework; percentage of grant programs that follow standard service offering; percentage of client agency satisfaction with hub services; increase in the number of in-scope grants programs managed through the hub; and standard reports used as-is (as opposed to report customisation).
72 In 2018–19 the business plan contained 18 performance indicators; these were reduced to three measures in 2019–20, including two effectiveness measures: grants are paid to the correct organisation, at the correct rate, and in a timely manner that enables the grant recipient to deliver the agreed outcomes; and the performance of grant recipients is actively managed against the grant agreement, each organisation is assigned a risk rating against which their performance is monitored and issues are escalated to policy owners for decision in a timely manner. In 2020–21 a third effectiveness measure was added: engage with grant recipients and other stakeholders for the purpose of assessing performance and gathering insights to assure service delivery is meeting policy intent.
73 Reasons for not reporting measures in April 2021 included that: three measures were to be measured annually; customer surveys were delayed due to COVID-19; functionality was not operational and therefore could not be measured; and reporting capability had not been developed.
74 In applying the ‘measurable’ criterion, the ANAO assessed whether the measures were: reliable and verifiable — use sources of information and methodologies that are reliable and verifiable; and free from bias — provide an unbiased basis for the measurement and assessment of the program’s performance.
75 For example, the methodology used to measure ‘an increase in capability of staff to deliver services’ is the ‘number of staff working hubbed programs as a proportion of all AusIndustry staff’. For a measure relating to engagement with grant recipients and other stakeholders for the purpose of assessing performance against outcomes, CGH measured whether: all locational intelligence in GPS Interactions was recorded using the appropriate hashtag; and all incidents that have the potential to adversely impact the department are escalated within 24 hours of notification to the relevant Group Manager, Branch Manager and policy area.
76 Department of Finance, Developing good performance information (RMG 131) [Internet], Finance, May 2020, paragraph 63, available from https://www.finance.gov.au/government/managing-commonwealth-resources/developing-good-performance-information-rmg-131 [accessed March 2022].
77 Efficiency measures for BGH include: a reduction in cost of configuring new programs; a reduction in time and cost to design a new grant opportunity; a reduction in time taken to complete selection process; a reduction in time taken to process applications due to automation; an increase in the proportion of programs that fit the standard service offer; and a reduction in time for the customer to complete progress report.
78 The measures that were reported in 2019–20 had targets such as a 10, 15 or 75 per cent reduction compared to a baseline; the baselines were intended to be used in these instances to assess whether the target was met. For two of the measures that had targets, performance against target could not be determined: one did not have a baseline, and the other had a baseline that was no longer relevant as the methodology had changed.
79 Efficiency measures for CGH include: percentage reduction in the cost of a standard or typical quote; reduction in average completion times by service category (such as completing an assessment); number of full-time equivalent staff required to manage grants (under the Program Delivery Model (PDM)) per 1000 grant agreements; number of grant rounds per year per number of grants administration full-time equivalent staff; and percentage of grant applications that are submitted using pre-populated applicant details.
80 The methodology for measuring ‘a reduction in time and cost to design a new grant opportunity’ changed from reporting the percentage reduction (or increase) in both time and cost to design a new grant opportunity to reporting the number of hours to design a new grant opportunity.
81 Methodologies were not established for a number of performance measures relating to reductions in cost or time.
82 To measure whether grants administration services are delivered in a cost-efficient manner on a full cost-recovery basis, CGH measured the variance between the cost estimate and delivery expense, and the reduction in the cost of administering grants as a proportion of administered grant expenditure.
83 See Department of Finance, Commitment of Relevant Money (RMG 400), updated October 2020, and Publishing and reporting Grants and GrantConnect (RMG 421), updated December 2020.
84 In most cases, some unique applications or executed agreements were excluded from testing as there was insufficient information to undertake the test.
85 The exception involved the identification of an inappropriate delegate signing a grant agreement, as part of the 2019–20 interim financial statements control testing. This exception did not reoccur in subsequent testing.
86 The ANAO did not assess the completeness and accuracy of reporting in the hubs management assurance letters (MALs) or associated independent testing, or compliance of individual grant programs with CGRG mandatory requirements. Footnote 5 notes other ANAO audits that have identified issues with the management of grants through the hubs.
87 DSS provides client entities with management representation letters; for the purposes of this report these letters are referred to as MALs. BGH issued 10 MALs in 2017–18, seven in 2018–19 and nine in 2019–20. CGH issued six MALs in 2017–18, seven in 2018–19 and nine in 2019–20.
88 This deficiency involved a grant agreement for the Small Business Bushfire Support Line (delivered on behalf of the Treasury) being executed five days prior to the delegation instrument being provided (although payments were not made until after delegations were provided).
89 The 10 other deficiencies – notified to the Department of Health – involved a failure to meet the relevant delegation: five instances in which the delegation was not met as the authority to make or vary an arrangement for more than $100 million rests at EL2 and above, and five instances in which payments were not released as required by the grant agreement.
90 The financial delegation breaches involved payments made prior to the relevant agreements being executed for Department of Health grants. Various causes for late publication of grants were identified including instances where mandatory reporting data (such as grant opportunity identification number or grant selection type) was not recorded.
91 CGH advised the ANAO in November 2021 that broader deficiencies are reported via other mechanisms (not the MALs) such as Portfolio Board meetings and the CGH Community of Practice. The Portfolio Boards are bilateral forums intended to provide strategic oversight and guidance, including by resolving issues and potential risks, for all grant rounds administered through the CGH. Board meetings were chaired by an SES Band 2 or 1 officer, and membership was generally at the SES Band 1 and EL levels. Boards are in place with seven client entities and generally meet six to eight times a year.
92 Subsection 16(b) of the Public Governance, Performance and Accountability Act 2013 establishes a general duty for accountable authorities to establish and maintain an appropriate system of internal control, including implementing measures directed at ensuring officials comply with finance law. Paragraph 13.5 of the Commonwealth Grants Rules and Guidelines (CGRGs) states that accountable authorities should establish appropriate internal control mechanisms for grants administration. Paragraph 6.3 of the CGRGs states that accountable authorities and officials must put in place practices and procedures that address the seven key principles of grants administration. Finance law is considered to include accountable authority instructions, delegations, fraud control frameworks and the CGRGs. Department of Finance, Duties of accountable authorities, General Duties (RMG 200) [Internet], Finance, updated April 2021, available from https://www.finance.gov.au/government/managing-commonwealth-resources/managing-risk-internal-accountability/duties/duties/duties-accountable-authorities-rmg-200 [accessed March 2022]. Requirements for grants administration are also established in resource management guides issued by Finance.
93 For example: BGH records of the timing of eligibility and merit assessments were often incomplete; there were 6566 executed grant agreement records in BGH data that could not be matched to GrantConnect; 20 per cent of purchase order records from the financial management information system could not be matched to executed grant agreements, and 48 per cent of executed agreements could not be matched to purchase order records; and eligibility and merit outcomes were recorded in two datasets, which recorded different eligibility and merit outcomes for 7965 applications.
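The unmatched counts in footnote 93 reflect cross-matching one dataset against another and counting records with no counterpart. The following is a minimal illustrative sketch only of that kind of matching; it is not the ANAO’s method, and pandas, the file names and the ‘agreement_id’ join key are all assumptions introduced for illustration.

```python
import pandas as pd

# Hypothetical extracts: hub records of executed agreements and the
# grants-awarded notices published on GrantConnect (file and column
# names are assumed, not drawn from the audit).
agreements = pd.read_csv("executed_agreements.csv")
grantconnect = pd.read_csv("grantconnect_awards.csv").drop_duplicates("agreement_id")

# Left-merge the hub records against the published awards; the merge
# indicator flags hub records with no published counterpart.
merged = agreements.merge(grantconnect, on="agreement_id", how="left", indicator=True)
unmatched = int((merged["_merge"] == "left_only").sum())

print(f"{unmatched} of {len(agreements)} executed agreement records "
      f"({unmatched / len(agreements):.0%}) could not be matched")
```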
94 Examples of widespread inconsistencies included: the creation of purchase orders and payment of grant recipients occurring out of sequence with decisions on applications and execution of agreements; and for executed agreements a failure to record risk ratings, reviews of risk ratings, and grant outcomes.
95 Examples included instances where: grant assessment decisions were made up to 306 days before an application was received; grant agreements were issued after the due date, accepted before they were issued or finished before they commenced; progress reports were reviewed before they were received or up to 10 years after they were received; and final grant outcomes were assessed before the grant commenced or where there was not an executed agreement.
96 For example, user access controls allowed delegates who were not the delegate for a grant program to approve funding.
97 CGH used a two-step process to create each data extract report: it separately extracted the applications that had an Activity Id and those that had no data entered in the Activity Id field, and then concatenated these datasets to form the data extract report for the ANAO. Given that there is a degree of data manipulation during this process to append the data, there is a potential risk of systematic error.
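To make the appending risk concrete, here is a minimal sketch of a split-and-concatenate extract of the kind footnote 97 describes, together with a simple row-count check that would detect lost or duplicated records. Pandas and all file and column names are assumptions for illustration, not details of CGH’s actual system.

```python
import pandas as pd

# Hypothetical source extract; the 'Activity Id' column name is assumed.
applications = pd.read_csv("applications.csv")

# Step 1: split records on whether 'Activity Id' is populated.
with_id = applications[applications["Activity Id"].notna()]
without_id = applications[applications["Activity Id"].isna()]

# Step 2: concatenate the two extracts into a single report.
report = pd.concat([with_id, without_id], ignore_index=True)

# A completeness check of the kind that would mitigate the systematic
# error risk noted above: every source row appears exactly once.
assert len(report) == len(applications), "rows lost or duplicated during concatenation"
```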
98 Participating entities were initially the top 12 Commonwealth granting entities (see Table 1.1, note a). Subsequent machinery of government changes and the inclusion of additional entities have changed the number of participating entities over time.
99 Except for new policy proposals where an exemption or deferment from delivery by a hub has been provided by the Minister for Finance.
100 For the BGH agreements, 43 per cent did not contain fees for service, and 56 per cent did not include the costing summary (which is the basis on which agreed services are charged).
101 For the CGH agreements, 67 per cent did not contain fees for service, and 92 per cent did not include price lists.
102 For example, the National Health and Medical Research Council (NHMRC) interim hub for medical research grants was established in November 2017.
103 Industry advised the ANAO in November 2021 that limited staffing did not impact the selection of evaluation services by client entities.
104 System availability service levels relate to DSS’s systems availability, incident management and disaster recovery. Grant round administration service levels relate to quoting and grant round planning, design, selection, establishment and management.
105 These service level standard areas related to: availability of DSS systems and applications to enable client entities to receive services; information on the creation and changing of organisational records; reporting on whether the client entity has been affected by priority incidents (the client entity provides an indication of the incident’s priority), as well as reporting on initial response time and the timeline for a workaround or permanent fix for an incident; and, in the event of a disaster occurring, CGH making appropriately stored and backed up client entity grants administration data available in a form that allows the client entity to resume grant business within specified timeframes.
106 These service level standard areas related to: arrangement of a standard application form, updates to program records and relevant published materials (the service level standard was met in 95 per cent of reports); providing the client entities with a list of active system users and their system roles (85 per cent); a summary of the types and totals of service support requests received through the helpdesk, relating to digital platform system enquiries and service requests (89 per cent); and information on client entity requests for system alerts and resolution of outages (94 per cent).
107 This service level standard area related to the grant round opening on the date specified in the grant round management plan.
108 These service level standard areas relate to: provision of a quote for hub services and the re-issue of an expired quote (the service level standard was met in 87 per cent of reports); following the first design meeting, CGH providing the client entity with a grant round management plan (86 per cent); reporting on vendor checks for a funding round, provision of an assessment report, grant agreements issued to successful applicants, and reporting individual executed grants on GrantConnect (90 per cent); and information on performance reports, deliverables assessed, financial assurance over payments, and insights on unexpended funds (94 per cent).
109 During the engage phase, service levels relate to providing costings, service proposals and service schedules within specified timeframes. During the establish phase, service levels relate to timely execution of grant agreements with no errors, and publishing grants awarded on GrantConnect within required timeframes.
110 A number of client entities raised concerns about higher and additional costs in relation to fixing errors in application forms, varying grant agreements, and the high cost of administration relative to low-value grant programs (for example, a grant program with an annual administered appropriation of $145,000 was quoted $107,382 for the 2021–22 grant round, where the client entity proposed to undertake the assessment and selection processes).
111 Survey responses noted CGH errors in grant agreements, application forms and payments made, or delays in making payments. One client entity noted that where it administers grants itself it would acquit 100 per cent of grants, but this was not possible when using the hubs due to the costs imposed by the hubs on the entity.
112 Client entities noted that there was excessive oversight needed of hub decisions during select, establish and manage phases due to a lack of understanding of grant program requirements or technical expertise.
113 BPORs Rules 1.9, 4.1 and 4.2 (October 2016), BPORs Rule 3.8 (September 2019) and Estimates Memorandum 2016/38, paragraph 4.
114 The Estimates Memorandum consultation requirement is: where an NPP includes a grant proposal, it must be prepared in consultation with the grants hubs to ensure implementation costs and staff impacts are identified consistent with the BPORs. BGH business processes reflect this requirement, but CGH does not require the client entity to provide the hub with the final NPP that was submitted to government.
115 Commonwealth Grants Rules and Guidelines paragraph 4.8 sets out that when grants are to be administered by a third party the relevant accountable authority must ensure the arrangement is in writing.
116 The Service Catalogue Framework listed grants administration activities by lifecycle phase that were to be used by client entities across the two hubs, reflecting standardised whole-of-government grants administration processes that were to be adopted by client entities. The hubs could build on the framework to include value added services, bundled services, unit prices and service levels. The framework formed the basis for costing services.
117 The Costing Methodology was to be used by the hubs to develop service offerings and prices. A common costing and pricing model supports the SGGA Program in quantifying financial benefits, and supports client entities in comparing grants hub prices for common services.
118 In documentation of the SGGA Program and the grants hubs, the terms ‘administrative processes’ and ‘business processes’ are used interchangeably.
119 The SGGA Program defines processes as a linked sequence of activities that delivers an expected outcome, and workflow patterns as a decision tree approach to selecting one of six grant approaches.
120 In April–June 2019 BGH developed grants administration process maps. An internal BGH report in December 2019 recognised that the maps had not been adopted as planned due to a lack of change management activity. BGH created new process maps in December 2020. BGH advised the ANAO in November 2021 that these process maps were approved in May 2021.
121 BGH advised the ANAO that it updates the costing model on a six-monthly basis to accommodate Finance changes to prices relating to salary, overhead rates and indexation.
122 Previously, in 2017, BGH undertook a project to develop a self-serve costing model; the model was not released. The intention was for client entities to develop costs and include them in short form NPPs prior to engaging with the BGH, and to provide more transparency to the client entity on the costing assumptions.