The objective of this audit was to assess the effectiveness of the Australian Research Council’s (ARC’s) administration of the National Competitive Grants Program (NCGP).

Summary and recommendations

Background

1. The Australian Research Council (ARC) is a Commonwealth entity established under the Australian Research Council Act 2001 that advises the Australian Government on research matters, administers the National Competitive Grants Program (NCGP) and has responsibility for Excellence in Research for Australia and Engagement and Impact assessments. The ARC’s purpose is to grow knowledge and innovation for the benefit of the Australian community through funding the highest quality research, assessing the quality, engagement and impact of research, and providing advice to Government on research matters.

2. The NCGP funds fundamental and applied research and research training across all disciplines, except clinical medical research, to eligible organisations. The NCGP comprises 13 schemes under two programs: the Discovery program, which primarily focuses on supporting individuals and small teams to undertake fundamental research; and the Linkage program, which focuses on creating links between university researchers, industry partners and other community organisations to undertake applied research.

3. From 2016 to 2018 the ARC received 17,556 applications across all NCGP schemes. Over this period, the ARC commenced funding for 3,493 grants with a total value of $1.99 billion.

Rationale for undertaking the audit

4. Government-funded research contributes to improving Australia’s social and cultural environment and international competitiveness, and to ensuring that the economy can embrace changing economic and technological conditions, deliver future prosperity and support the wellbeing of all Australians. The NCGP is one of the key sources of research funding, with an allocation of $791.3 million in 2019–20. The audit provides assurance to Parliament about whether this significant source of research and development funding is being administered by the ARC in an effective manner and is meeting the objectives set for the NCGP by the Government.

Audit objective and criteria

5. The objective of the audit was to assess the effectiveness of the ARC’s administration of the NCGP. To form a conclusion against the audit objective, the ANAO adopted the following criteria:

  • Do the NCGP guidelines align with the Commonwealth Grants Rules and Guidelines (CGRGs) and government objectives in relation to research and innovation?
  • Do the ARC’s grant assessment processes comply with the NCGP guidelines?
  • Has the ARC implemented effective monitoring, assurance, evaluation and reporting arrangements for the NCGP?

Conclusion

6. The ARC’s administration of the NCGP is effective, except that its performance indicators and monitoring and assurance arrangements should be strengthened.

7. The NCGP guidelines align with the CGRGs and the government’s research and innovation objectives, and have recently increased the focus on government priorities. The guidelines clearly outline the elements of the NCGP and are effectively communicated to stakeholders.

8. The ARC has mature and effective processes in place to assess grants, manage conflicts of interest and provide funding recommendations that comply with the NCGP guidelines.

9. The ARC’s arrangements to measure the performance of the NCGP, monitor and evaluate the program and provide assurance that administering organisations comply with funding requirements are largely effective. Most of the ARC’s key performance indicators (KPIs) are relevant, but not all are reliable. Its assurance arrangements could be more risk-based to provide greater assurance that administering organisations comply with grant agreement requirements and the program is achieving its objectives.

Supporting findings

Guidelines and communications

10. NCGP guidelines are consistent with all mandatory elements of the CGRGs and also incorporate better practice elements. Consistent with the CGRGs, the guidelines include eligibility and assessment criteria, approval requirements and performance reporting.

11. The focus of the NCGP is to support the highest quality research and research training through a competitive grant process. The guidelines broadly align with the Government’s objectives, including the Science and Research Priorities, Australia’s National Science Statement and National Innovation and Science Agenda.

12. The NCGP guidelines clearly outline the program governance arrangements, including detailed program and scheme-specific eligibility and assessment criteria, assessment processes, decision-making, and reporting and performance requirements.

13. The ARC has implemented strategies to effectively communicate with key stakeholders. The ARC uses a variety of communications activities, targeting different audiences and key stakeholders, to provide information on the NCGP guidelines and related program elements, as well as NCGP-funded research outcomes.

Assessment and funding decisions

14. The ARC has appropriate assessment, eligibility, scrutiny and appeal processes to support well informed and transparent grant assessment and funding decisions.

15. The ARC has established appropriate arrangements to manage actual and perceived conflicts of interest in the NCGP process.

16. The ARC provided the Minister for Education with clear advice and funding recommendations consistent with the requirements of the ARC Act, the CGRGs and the NCGP guidelines.

Performance and reporting

17. The performance management framework established by the ARC for the NCGP is largely appropriate. Most of the KPIs are relevant but they require further refinement to be reliable and do not include any efficiency measures.

18. Internal reporting about the NCGP focuses on point-in-time data and current status, but does not include historical or contextual information. Including this type of information in internal management reporting would improve the ARC’s capacity to effectively monitor and administer the NCGP.

19. The ARC has appropriate monitoring and assurance arrangements for the NCGP. However, implementation could be improved by ensuring that the arrangements are more risk-based and effectively contribute to the ARC’s assurance that administering organisations comply with grant agreement requirements and that NCGP risks are being managed appropriately.

20. The ARC assesses the effectiveness of the NCGP through various activities, including evaluations, and has used the results of these activities to inform program improvements. In 2018 the ARC established a new evaluation strategy and plan, which resulted in two evaluations in 2018–19.

Recommendations

Recommendation no.1

Paragraph 2.8

The Australian Research Council review the practice of issuing NCGP guidelines annually.

Australian Research Council response: Agreed.

Recommendation no.2

Paragraph 4.12

The Australian Research Council ensure that its KPIs for the NCGP are reliable and include efficiency measures.

Australian Research Council response: Agreed.

Recommendation no.3

Paragraph 4.55

The Australian Research Council ensure that its monitoring and assurance activities, in particular institutional reviews, are risk-based and contribute to the Australian Research Council’s assurance that NCGP objectives are being achieved.

Australian Research Council response: Agreed.

Summary of entity response

21. The proposed report was provided to the Australian Research Council. The ARC did not provide a summary response. Its full response is reproduced at Appendix 1.

Key messages from this audit for all Australian Government entities

Below is a summary of key messages, including instances of good practice, identified in this audit that may be relevant to the operations of other Australian Government entities.

Governance and risk management

  • Grant application, assessment, assurance, monitoring and reporting requirements in grant programs should be proportional to identified risks, in accordance with the CGRGs.

Grants

  • By engaging subject matter experts throughout the NCGP assessment process, the ARC has been able to harness expert advice on complex and highly specialised research topics.

Records management

  • The ARC administers the NCGP using a bespoke end-to-end Research Management System. Where cost effective, IT infrastructure for grants assessment and grants management should be fit-for-purpose. Systems that embed and validate high volume and transparent processes, such as those used by the ARC for managing potential conflicts of interest within the research sector, can deliver enhanced efficiency and assurance.

1. Background

1.1 The Australian Government supports university research through a range of funding mechanisms, primarily competitive grants and research block grants. Funding is awarded by a number of government entities to successful applicants to undertake specific research projects. Research block grants are allocated to universities by the Department of Education to support the training of students undertaking higher degrees by research and the systemic costs of universities in conducting research.

National Competitive Grants Program

1.2 The Australian Research Council (ARC) is a Commonwealth entity established under the Australian Research Council Act 2001 that advises the Australian Government on research matters, administers the National Competitive Grants Program (NCGP) and has responsibility for Excellence in Research for Australia and Engagement and Impact assessments. The ARC’s purpose is to grow knowledge and innovation for the benefit of the Australian community through funding the highest quality research, assessing the quality, engagement and impact of research, and providing advice to Government on research matters.

1.3 The NCGP funds fundamental and applied research and research training across all disciplines, except clinical medical research, to eligible organisations. There were 43 NCGP eligible organisations as at 31 May 2019. The NCGP comprises two programs:

  • Discovery — primarily focuses on supporting individuals and small teams to undertake fundamental research; and
  • Linkage — focuses on creating links between university researchers, industry partners and other community organisations to undertake applied research.

1.4 Table 1.1 outlines the schemes within the Discovery and Linkage programs. Further details about the NCGP and its component schemes can be found on the ARC’s website (www.arc.gov.au).

Table 1.1: NCGP schemes

Discovery program

  • Discovery Projects
  • Australian Laureate Fellowships
  • Future Fellowships
  • Discovery Indigenous
  • Discovery Early Career Researcher Award

Linkage program

  • ARC Centres of Excellence
  • Linkage Projects
  • Industrial Transformation Research Program — comprising Industrial Transformation Training Centres and Industrial Transformation Research Hubs
  • Linkage Infrastructure, Equipment and Facilities
  • Linkage Learned Academies Special Projects
  • Special Research Initiatives
  • Supporting Responses to Commonwealth Science Council Priorities

Source: www.arc.gov.au.

1.5 From 2016 to 2018 the ARC received 17,556 applications from administering organisations across all of the NCGP schemes. Over this period, the ARC commenced funding for 3,493 grants with a total value of $1.99 billion. For funding commencing in 2018, the ARC received 5,686 NCGP applications and awarded 1,127 grants. For funding commencing in 2019, as at 26 April 2019 the ARC had received 5,497 applications and had awarded 971 grants. Success rates for NCGP grants vary each year depending on the number of applications received, the quality of those applications, total funding available and size of the grants awarded. From 2016 to 2018 the average success rate across all NCGP schemes was around 20 per cent. Figure 1.1 shows the value of NCGP grants with funding commencing in 2016, 2017 and 2018.
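The roughly 20 per cent average success rate is consistent with the aggregate application and grant counts quoted above. The short Python sketch below reproduces the calculation; the variable names are illustrative only and are not drawn from ARC systems.

    # Reproduces the approximate average NCGP success rate for 2016 to 2018
    # from the aggregate figures quoted in paragraph 1.5.
    applications = 17_556      # NCGP applications received, 2016 to 2018
    grants_commenced = 3_493   # grants with funding commencing over the same period

    success_rate = grants_commenced / applications
    print(f"{success_rate:.1%}")   # 19.9%, i.e. around 20 per cent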

Figure 1.1: Value of NCGP grants by commencing year, 2016 to 2018

 

Note: The increase in Linkage Program grants in 2017 and subsequent decrease is largely due to the impact of awarding ARC Centres of Excellence grants in 2017 ($283.5 million).

Source: ANAO analysis of ARC data.

1.6 Figure 1.2 and Figure 1.3 show the value and number of grants disaggregated by the two NCGP programs, Discovery and Linkage, and by their component schemes.

Figure 1.2: Total value and number of NCGP grants, 2016 to 2018 — Discovery program

 

Source: ANAO analysis of ARC data.

Figure 1.3: Total value and number of NCGP grants, 2016 to 2018 — Linkage program

 

Legend: CoE = ARC Centre of Excellence; ITTC = Industrial Transformation Training Centres; LP = Linkage Projects; SRI = Special Research Initiatives; LIEF = Linkage Infrastructure, Equipment and Facilities; LLASP = Linkage Learned Academies Special Projects; ITRH = Industrial Transformation Research Hubs; CS = Commonwealth Science Council Priorities.

Source: ANAO analysis of ARC data.

1.7 Appendix 2 outlines NCGP grants by institution from 2016 to 2018. The number of grants awarded to individual institutions ranged from one to 386 and the value of grant funding ranged from $170,000 to $264.3 million. The top three recipients of NCGP grants were: the University of New South Wales (386 grants valued at $264.3 million); the University of Queensland (378 grants valued at $230.5 million); and the University of Melbourne (381 grants valued at $222.8 million).

Grants administration

1.8 The Commonwealth Grants Rules and Guidelines (CGRGs), issued under the Public Governance, Performance and Accountability Act 2013, establish the overarching grants policy framework and articulate the expectations for all non-corporate Commonwealth entities in relation to grants administration.1 Under this framework, non-corporate Commonwealth entities undertake grants administration based on the mandatory requirements and key principles of grants administration in the CGRGs. The ARC is subject to the CGRGs.

1.9 Figure 1.4 outlines the key elements of the ARC’s process for establishing, assessing and approving grants and monitoring compliance with grant requirements.

Figure 1.4: Overview of the ARC’s NCGP process

 

This graphic provides a flowchart overview of the Australian Research Council’s NCGP process, commencing with establishment of the grant opportunity (including the issuing of grant guidelines, calling for applications, and establishment of a Selection Advisory Committee), followed by the assessment of proposals, funding recommendations and ministerial approval, and monitoring of administering organisations’ compliance with grant requirements.

 

Source: ANAO analysis of ARC documentation.

Recent developments

1.10 In November 2018 the House of Representatives Standing Committee on Employment, Education and Training published its report into Australian Government Funding Arrangements for non-NHMRC Research. The Committee made 15 recommendations, with eight of those directly relevant to the NCGP. The Government has yet to respond to this report.2

1.11 On 31 October 2018 the Minister for Education announced a review of the 2015 National Science and Research priorities as they applied to the NCGP. On 19 February 2019 the Minister announced the terms of reference for a panel of experts, led by the ARC’s Chief Executive Officer, to undertake this review. The panel is expected to report to the Government in July 2019.

Rationale for undertaking the audit

1.12 Government-funded research contributes to improving Australia’s social and cultural environment and international competitiveness, and to ensuring that the economy can embrace changing economic and technological conditions, deliver future prosperity and support the wellbeing of all Australians. The NCGP is one of the key sources of research funding, with an allocation of $791.3 million in 2019–20. The audit provides assurance to Parliament about whether this significant source of research and development funding is being administered by the ARC in an effective manner and is meeting the objectives set for the NCGP by the Government.

Audit approach

Audit objective, criteria and scope

1.13 The objective of the audit was to assess the effectiveness of the ARC’s administration of the NCGP. To form a conclusion against the audit objective, the ANAO adopted the following criteria:

  • Do the NCGP guidelines align with the CGRGs and government objectives in relation to research and innovation? (Chapter 2)
  • Do the ARC’s grant assessment processes comply with the NCGP guidelines? (Chapter 3)
  • Has the ARC implemented effective monitoring, assurance, evaluation and reporting arrangements for the NCGP? (Chapter 4)

Audit methodology

1.14 In undertaking the audit, the ANAO:

  • reviewed and analysed ARC records, including grant guidelines, standard operating procedures, work instructions, strategic documents, policies, reviews and reports;
  • conducted an analysis of a random sample of 94 grants funded in 2016, 2017 and 2018, a random sample of 59 grants that did not receive funding, and the three highest value ARC Centres of Excellence grants of the nine funded in 2017. This analysis included a review of ministerial grant approval documents;
  • analysed program data captured in the ARC’s Research Management System and data warehouse;
  • interviewed relevant ARC personnel and a sample of universities and peak bodies; and
  • sought contributions from all NCGP eligible organisations and the public. Ten submissions were received, all from eligible organisations.

1.15 The audit was conducted in accordance with the ANAO Auditing Standards at a cost to the ANAO of approximately $444,000. The team members for this audit were Brendan Gaudry, Michael Commens, Alana Tolman, Lynette Tyrrell, Mary Huang and Deborah Jackson.

2. Guidelines and communications

Areas examined

The Commonwealth Grants Rules and Guidelines (CGRGs) establish the overarching grants policy framework and articulate the expectations for all non-corporate Commonwealth entities in relation to grants administration. The objective of grants administration is to promote proper use and management of public resources through collaboration with the non-government sector, such as industry, small business and the not-for-profit sector, to achieve government policy outcomes. The ANAO examined whether the Australian Research Council’s (ARC’s) National Competitive Grants Program (NCGP) guidelines align with the CGRGs and government objectives in relation to research and innovation.

Conclusion

The NCGP guidelines align with the CGRGs and the government’s research and innovation objectives, and have recently increased the focus on government priorities. The guidelines clearly outline the elements of the NCGP and are effectively communicated to stakeholders.

Area for improvement

The ANAO has made one recommendation addressing the annual issue of NCGP guidelines.

2.1 This chapter examines the extent to which NCGP guidelines align with the CGRGs and the processes used by the ARC in developing and communicating the guidelines.3 Chapters three and four examine the ARC’s application of the guidelines, including grant assessment, approval, reporting and assurance processes and procedures.

2.2 The ANAO reviewed the NCGP guidelines released by the ARC from 2015 to 2018, as set out in Table 2.1.

Table 2.1: Guidelines reviewed by the ANAO

Program/Scheme | Released | Funding commences
Discovery | 2015 | 2016 and 2017
Discovery | 2016 | 2016 and 2017
Future Fellowships | 2016 | 2016
Discovery | 2017 | 2018 and 2019
Linkage | 2015 | 2015, 2016 and 2017
ARC Centres of Excellence | 2015 | 2017
Linkage | 2016 | 2017 and 2018
Linkage | 2017 | 2018 and 2019
ARC Centres of Excellence | 2018 | 2020

Notes: The ANAO did not examine: Learned Academies Special Projects; Linkages Special Research Initiatives; and Supporting Responses to Commonwealth Science Council Priorities.
Discovery guidelines include: Discovery Projects, Discovery Indigenous, Future Fellowships (except in 2016), Discovery Early Career Researcher Award and Australian Laureate Fellowships.
Linkage guidelines include: ARC Centres of Excellence, Linkage Projects, Linkage Infrastructure, Equipment and Facilities, Industrial Transformation Research Hubs and Industrial Transformation Training Centres.

Source: ANAO analysis.

Are the NCGP guidelines consistent with the Commonwealth Grants Rules and Guidelines?

NCGP guidelines are consistent with all mandatory elements of the CGRGs and also incorporate better practice elements. Consistent with the CGRGs, the guidelines include eligibility and assessment criteria, approval requirements and performance reporting.

2.3 The CGRGs require officials to develop guidelines for all new grant opportunities and issue revised guidelines where significant changes have been made to a grant opportunity. The development of clear, consistent and well documented guidelines helps to ensure consistent and efficient grants administration.

2.4 Guidelines and supporting documentation should include grant objectives, eligibility criteria, proposal assessment criteria, weighting of assessment criteria, the approval process, indicative reporting and acquittal requirements, and a description of complaint handling, review and freedom of information mechanisms. The Department of Finance has issued a better practice checklist to assist entities to develop guidelines.4

2.5 The ARC employs a multi-stage process for the development of the NCGP guidelines and associated supporting materials that is governed by a standard operating procedure (SOP) and related work instructions. The ANAO found that, in developing guidelines for the Discovery and Linkage Programs, the ARC complied with the processes outlined in the SOP and work instructions. The ANAO also observed that:

  • The ARC scheduled feedback sessions with Selection Advisory Committee (SAC) members following SAC meetings. Feedback relating to proposal5 forms, funding allocations, project limits, budget requirements, committee meeting processes and NCGP grant guideline content is recorded following committee meetings.
  • The ARC developed draft guidelines for grant opportunities in consultation with internal and external stakeholders. This involved sending emails, holding workshops and conducting surveys designed to garner feedback for proposed guideline changes during drafting.
  • The ARC completed risk assessments for all guidelines developed for grants commencing in 2016, 2017, 2018, 2019 and 2020, assessing each set of guidelines as ‘low risk’. The ARC also sought, and received, confirmation each round from the Department of Finance and the Department of the Prime Minister and Cabinet that the assessment of ‘low risk’ was appropriate.
  • The ARC undertook an internal clearance process prior to submitting the guidelines to the Minister for Education for approval. The ARC provided appropriate guidance to the Minister when seeking approval of the guidelines.
  • Prior to the release of the 2017 CGRGs, NCGP guidelines were published on the ARC’s website. NCGP guidelines issued in 2018 were published on GrantConnect (www.grants.gov.au).

2.6 The ARC developed guidelines, obtained Ministerial approval and made the guidelines publicly available in accordance with the mandatory requirements of the CGRGs. The guidelines also aligned with better practice guidance and the key principles for grants administration outlined in the CGRGs, including in relation to eligibility and assessment criteria, the approval process and reporting and performance requirements. The guidelines also met all applicable elements of the Department of Finance’s Grant guidelines — Better practice checklist.

2.7 ANAO analysis identified only minimal changes in assessment, eligibility and reporting requirements between the guidelines issued for funding commencing from 2016 to 2020. The main changes made between funding rounds related to definitions and the clarification of terminology used in the guidelines. Given that the changes between rounds are minimal, the ARC should consider publishing new NCGP guidelines only when necessary; for example, when significant changes are made to the grant eligibility or assessment criteria. The ARC advised the ANAO that, as part of the streamlining review (see paragraphs 4.63 to 4.65 for details), it is revising its approach to reviewing and publishing the guidelines.

Recommendation no.1

2.8 The Australian Research Council review the practice of issuing NCGP guidelines annually.

Australian Research Council response: Agreed.

2.9 The ARC will review the practice of issuing NCGP guidelines annually with a view to having multi-year guidelines.

Do the objectives outlined in the NCGP guidelines align with the government’s objectives in relation to research and innovation?

The focus of the NCGP is to support the highest quality research and research training through a competitive grant process. The guidelines broadly align with the Government’s objectives, including the Science and Research Priorities, Australia’s National Science Statement and National Innovation and Science Agenda.

Government objectives

2.10 The government’s objectives, which are relevant to the NCGP, are outlined in a number of documents and policy papers, including in the Science and Research Priorities, the National Innovation and Science Agenda (NISA) and Australia’s National Science Statement.

2.11 In May 2015 the government established nine Science and Research Priorities, and corresponding Practical Research Challenges. The priorities are: food; soil and water; transport; cybersecurity; energy; resources; advanced manufacturing; environmental change; and health. The government expects that, over time, the implementation of these priorities will result in an increased proportion of Australian Government research investment being allocated to areas of critical need and national importance.

2.12 In December 2015 the Minister for Industry, Innovation and Science announced the NISA — a policy statement on innovation and science.6 The NISA measures include grant programs, tax incentives, education initiatives, and data and digital changes and are framed around four main focus areas, referred to as pillars: culture and capital; collaboration; talent and skills; and government as an exemplar.

2.13 In 2017 the government released Australia’s National Science Statement. The statement sets out the government’s vision for an Australian society engaged in and enriched by science through the achievement of four objectives: engaging all Australians with science; building our scientific capability and skills; producing new research knowledge and technologies; and improving and enriching Australians’ lives through science and research.

NCGP objectives

2.14 The focus of the NCGP is to support ‘the highest-quality fundamental and applied research and research training through national competition’.7 The program’s objective is to support:

  • high quality research leading to the discovery of new ideas and the advancement of knowledge;
  • facilities and equipment that researchers need to be internationally competitive;
  • researchers at different stages of their careers, including training and skills development of the next generation of researchers; and
  • incentives for Australia’s most talented researchers to work in partnership with leading researchers throughout the national innovation system and internationally, and to form alliances with Australian industry.

2.15 At the individual program level, the Discovery Program aims to deliver outcomes that benefit Australia and build Australia’s research capacity through support for:

  • excellent, internationally competitive research by individuals and teams;
  • research training and career opportunities for the best Australian and international researchers;
  • international collaboration; and
  • research in priority areas.8

2.16 The objectives of the Linkage Program are to deliver outcomes of benefit to Australia and build Australia’s research and innovation capacity through support for:

  • collaborative research between university-based researchers and researchers in other sectors;
  • research training and career opportunities that enable Australian and international researchers and research students to work with industry and other end-users; and
  • research in priority areas.9

2.17 Scheme specific objectives are also set out in the guidelines. The objectives of Discovery Projects, the largest of the NCGP schemes, are to:

  • support excellent basic and applied research by individuals and teams;
  • encourage high-quality research and research training;
  • enhance international collaboration in research;
  • expand Australia’s knowledge base and research capability; and
  • enhance the scale and focus of research in the Science and Research Priorities.

2.18 These program and scheme objectives are broadly consistent with the government’s objectives and are also aligned with the objective of the NCGP. For example, collaboration, one of the four NISA pillars, is reflected in the program objectives, and is an eligibility requirement and part of the selection sub-criteria for most Linkage Program schemes (discussed in more detail later in this chapter).10 The nine Science and Research Priorities are clearly referenced in the guidelines for the Discovery and Linkage Programs and the scheme-specific assessment criteria for the Discovery Program. The Linkage Program, Linkage Projects and ARC Centres of Excellence guidelines directly reference the Science and Research Priorities in the selection sub-criteria.

2.19 When applying for a NCGP grant, applicants select whether the project aligns with one of the government’s nine Science and Research Priorities. The ARC advised the ANAO that the government’s Science and Research Priorities are taken into account during the assessment process, although nomination of a Science and Research Priority is not mandatory. While the priorities are reflected in the guidelines, the guidance provided to assessors does not include the requirement to assess proposals against the Science and Research Priorities and weightings have not been assigned to the priority areas. The ARC noted that, while referencing these priorities in the guidelines indicates that the priorities are valued, they are but one factor taken into account during the assessment process.

2.20 Figure 2.1 shows the amount of funding allocated to the Science and Research Priorities in 2016, 2017 and 2018. Guidelines for most schemes commencing in 2016 were released prior to the announcement of the Science and Research Priorities. This accounts for the comparatively low rate of funding allocated to Science and Research Priorities in 2016.

Figure 2.1: Percentage of NCGP funding allocated to Science and Research Priorities, 2016 to 2018

 

Source: ARC data.

Do the NCGP guidelines clearly outline the program governance arrangements?

The NCGP guidelines clearly outline the program governance arrangements, including detailed program and scheme-specific eligibility and assessment criteria, assessment processes, decision-making, and reporting and performance requirements.

2.21 Approved and published NCGP guidelines provide the basis for program governance, including eligibility requirements, assessment criteria and process, decision-making, and reporting and performance requirements, which are discussed below.

Eligibility requirements

2.22 Proposals for Discovery and Linkage Programs can only be accepted from eligible organisations and must be submitted through the research office of an eligible organisation, as defined in the guidelines. The eligible organisation submitting the proposal becomes the administering organisation. As at 31 May 2019, 42 Australian universities and the Australian Institute of Aboriginal and Torres Strait Islander Studies were listed as eligible organisations.11

2.23 The NCGP guidelines clearly outline general and scheme-specific eligibility criteria and define program-specific terminology used in the eligibility criteria, such as Chief Investigator (CI), Partner Investigator, Candidate and Centre Director.12

2.24 The Discovery Program general eligibility criteria that apply to all five Discovery schemes are clear and detailed. For example, general eligibility criteria for Discovery Program proposals include:

  • at the time of submission all researchers must have met all obligations relating to previously funded projects including the satisfactory completion of progress and final reports;
  • all researchers must take responsibility for the authorship and intellectual content of the research proposal;
  • a researcher cannot concurrently hold more than one ARC fellowship (Future Fellowships and Australian Laureate Fellowships) or Discovery Early Career Researcher Award, and a holder of an ARC Fellowship or ARC Award cannot concurrently hold a fellowship from another Commonwealth funding agency; and
  • within the Discovery Program, a researcher can be funded for a maximum of two projects as a Chief Investigator (CI) or one ARC fellowship or ARC award and one project as a CI.

2.25 The Linkage Program guidelines include clear and comprehensive general program eligibility requirements for the four schemes covered by the guidelines, as well as detailed scheme-specific eligibility criteria.13 General eligibility requirements for Linkage schemes include:

  • the eligible organisation that submits the proposal will be the administering organisation and all other eligible organisations listed on the proposal will be other eligible organisations; and
  • the administering organisation and each other eligible organisation on the proposal must demonstrate a significant contribution of cash and/or in-kind or other material resources to the project, having regard to the total cost of the Project and the relative contribution of any CI(s) at the organisation.

2.26 Linkage Program guidelines state that partnerships, including partnerships with industry, business and community organisations, are an eligibility requirement for all Linkage Program schemes. The guidelines also detail general eligibility requirements for the different named participants in Linkage proposals. These include eligibility requirements for ARC Centres of Excellence directors, research hub directors, training centre directors, CIs and partner investigators. Linkage Program eligibility requirements also include limits on the number of funded projects individuals can participate in.

2.27 Discovery and Linkage Program guidelines note that the ARC can determine whether a research proposal meets eligibility requirements at any stage during the assessment of the proposal. If the ARC assesses a proposal as being ineligible, the proposal may not progress to the assessment stage. The guidelines also set out that the ARC cannot recommend for funding a proposal that has been through the initial assessment process but is subsequently deemed to be ineligible by the NCGP Eligibility Committee.14

Assessment criteria

2.28 Discovery Program guidelines include clear and comprehensive assessment criteria and criteria weightings for all five Discovery schemes. Assessment criteria and weightings differ depending on the scheme. Comprehensive assessment sub-criteria are also included, which provide more detailed guidance to assist candidates in addressing the assessment criteria. For example, the Discovery Future Fellowship scheme has four assessment criteria. These are: candidate (40 per cent); project quality (35 per cent); feasibility and strategic alignment (10 per cent); and benefit and collaboration (15 per cent). Each of the four criteria has detailed sub-criteria. Examples of the sub-criteria questions for the ‘project quality’ criteria include:

  • Does the research address a significant problem?
  • Will the proposed research maximise economic, environmental, social and/or cultural benefit to Australia?
  • Does the proposed project address Science and Research Priorities?
  • Is the conceptual/theoretical framework innovative and original?

2.29 Linkage Program guidelines set out clear and comprehensive assessment criteria, including weightings, for the four Linkage schemes covered by the guidelines. For example, Linkage Projects has four assessment criteria: project quality and innovation (25 per cent); feasibility (20 per cent); investigator(s) (25 per cent); and benefit (30 per cent). Each of the four criteria includes detailed sub-criteria. Examples of the sub-criteria questions for the ‘benefit’ criteria include:

  • Will the proposed research encourage and develop strategic research alliances between the higher education organisation(s) and other organisation(s)?
  • Will the proposed research maximise economic, commercial, environmental and/or social benefit to Australia?
  • Does the project represent value for money?
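To illustrate how weighted assessment criteria of this kind operate, the sketch below combines per-criterion scores into a single weighted score using the Linkage Projects weightings quoted above. This is a minimal sketch only: the report does not describe how the ARC’s Research Management System actually aggregates assessor ratings, so the 0–100 scoring scale and the weighted-sum assumption are illustrative rather than a description of the ARC’s method.

    # Minimal sketch: combine per-criterion scores using the Linkage Projects
    # weightings quoted in paragraph 2.29. The 0-100 scoring scale and the
    # weighted-sum aggregation are assumptions for illustration only.
    WEIGHTS = {
        "project quality and innovation": 0.25,
        "feasibility": 0.20,
        "investigator(s)": 0.25,
        "benefit": 0.30,
    }

    def weighted_score(criterion_scores: dict[str, float]) -> float:
        """Combine per-criterion scores (0-100) into a single weighted score."""
        assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weightings sum to 100 per cent
        return sum(WEIGHTS[c] * s for c, s in criterion_scores.items())

    # Hypothetical proposal scored out of 100 on each criterion.
    example = {
        "project quality and innovation": 85,
        "feasibility": 70,
        "investigator(s)": 90,
        "benefit": 75,
    }
    print(weighted_score(example))  # 80.25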

2.30 On 27 November 2018 the Minister for Education announced details of a new National Interest Test to be implemented by the ARC.15 The test applies to all NCGP selection rounds that have opened for applications since 31 October 2018. In January 2019 the ARC developed a procedure that describes the assessment processes associated with the National Interest Test. The Linkage Projects funding round, which closed in February 2019 and is under assessment, is the first to apply the test.

Assessment process

2.31 The guidelines outline in some detail the general assessment process to be followed when conducting assessments of all NCGP research proposals, including that:

  • all proposals for all schemes must be submitted through the ARC’s Research Management System (RMS), unless otherwise directed by the ARC, and must meet format and content requirements of the RMS16;
  • the ARC oversees the assessment of research proposals and makes funding recommendations to the Minister;
  • all proposals will be considered against the eligibility criteria for the relevant schemes;
  • proposals may be assigned to external assessors to assess and report on the proposal against the assessment criteria;
  • proposals are ranked relative to other proposals and a budget is recommended by the SAC on the basis of the proposal, assessors’ reports and any rejoinder submitted by the applicant (see paragraphs 3.14 to 3.19 for further details on the SAC);
  • an administering organisation is given the opportunity to respond to assessors’ written comments (the rejoinder process). Administering organisations may also nominate persons whom they do not wish to assess a proposal by submitting a ‘request not to assess’ form; and
  • if a conflict of interest exists or arises subsequently, the administering organisation must have documented processes in place for managing the conflict of interest for the duration of the project. The guidelines further state that these processes must comply with the Australian Code for the Responsible Conduct of Research 2007, the ARC’s Conflict of Interest and Confidentiality Policy and any relevant successor documents.

Decision-making

2.32 Decision-making requirements and arrangements are also clearly articulated in the NCGP guidelines, including that:

  • the ARC CEO, following recommendations from the SAC and in accordance with the ARC Act, recommends to the Minister which proposals should receive funding, the amount of funding and the duration of the recommended project, as well as which proposals should not be approved for funding;
  • the Minister will determine which proposals will be approved, the amount of funding and the timing of funding to be paid to administering organisations for approved proposals;
  • in accordance with section 60 of the ARC Act, the Minister must not approve for funding any proposal that fails to meet the eligibility criteria set out in the guidelines; and
  • appeals will only be considered against administrative processes and not against SAC decisions, assessor ratings, comments or the assessment outcome.

Reporting and performance requirements

2.33 NCGP guidelines clearly set out reporting and performance requirements for each scheme (reporting requirements differ between NCGP programs and schemes).

2.34 The Discovery Program guidelines clearly outline the general reporting requirements that apply to all Discovery Program schemes. The requirements include end-of-year reports, progress reports and end-of-project final reports. Administering organisations are required to submit end-of-year reports for all funded proposals by 31 March in the year following each calendar year that funding was awarded. Progress reports by exception are to be completed if significant issues are affecting the progress of the project. Final reports are to be submitted within 12 months of final payment or within 12 months of the final ARC-approved project end date. In addition to the general reporting requirements, administering organisations for Australian Laureate Fellowships are required to submit mid-term case studies for each project.

2.35 Similarly, Linkage Program guidelines set out general reporting requirements for funded proposals. End-of-year reports, progress reports by exception and final reports are required from administering organisations for all funded projects in accordance with timeframes set out in the guidelines. Linkage Program guidelines also detail scheme-specific reporting requirements. For example:

  • administering organisations must submit key performance indicators and annual targets for the Industrial Transformation Research Hubs and Industrial Transformation Training Centres and report against these targets in annual progress reports;
  • administering organisations must submit progress reports in year three for a four or five year Linkage Infrastructure, Equipment and Facilities project; and
  • annual reports for ARC Centres of Excellence must cover financial operations and research performance. Administering organisations must also report on a range of key performance indicators common to all ARC Centres of Excellence. Targets for key performance indicators must be approved by the ARC and reported against in annual progress reports.

2.36 Report forms for end-of-year reporting, progress reports by exception and final reports are available in the RMS, with further instructions on the RMS reporting processes provided on the ARC’s website and in instructions to staff.

Have the NCGP guidelines and related program elements been effectively communicated to key stakeholders?

The ARC has implemented strategies to effectively communicate with key stakeholders. The ARC uses a variety of communications activities, targeting different audiences and key stakeholders, to provide information on the NCGP guidelines and related program elements, as well as NCGP-funded research outcomes.

2.37 Prior to the release of the 2017 CGRGs, guidelines were required to be published on an entity or other whole-of-government website. The 2017 CGRGs specified that guidelines are to be published on GrantConnect. The CGRGs also state that entities should communicate effectively with potential grantees and key stakeholders.

2.38 Guidelines released prior to the 2017 CGRGs were published on the ARC’s website. The ARC’s website also includes:

  • a grants calendar that outlines timelines for applications for all NCGP schemes; and
  • a ‘key changes’ section, alongside new guidelines, which includes clarifications and changes to definitions, selection criteria weightings, project eligibility and grant administration processes.

2.39 ARC Centres of Excellence guidelines released in 2018, which are the only guidelines issued since the introduction of the 2017 CGRGs, were published on GrantConnect.

Communications strategies and guidance

2.40 The ARC has developed a variety of strategies and plans to support its communications and engagement with stakeholders and the public, including:

  • External communications strategy, May 2016 — designed to provide information and guidance about communications channels, opportunities and values. Its objectives are to ensure: there is clear guidance about the communication channels that will be used to keep the higher education and research sector informed and up-to-date regarding ARC core business; communication channels are clearly articulated; and communications are conducted in a professional, consistent, coherent and logical manner.
  • Industry communications strategy 2016–17 — provides a framework for the ARC’s engagement with industry and other stakeholders about industry-related research matters. It identifies key industry groups and aims to inform industry on various NCGP schemes, as well as the potential outcomes that could be enhanced through industry participation.
  • Consultation framework, developed in February 2017 and updated in February 2019 — the objectives of the framework include strengthening partnerships between the ARC and key stakeholders and improving the ARC’s capacity to identify, collect and analyse non-financial performance measures.
  • ARC social media policy, October 2017 — provides guidance and establishes parameters for effective and appropriate use of social media including key social media messaging relating to NCGP funding outcomes, program information, ARC publications and media releases.
  • Outreach plan for program partnerships, 2018 — outlines key outreach activities, including the ARC Major Investments Forum, induction meetings for Industrial Transformation Research Program directors and new laureates, training workshops for new research office staff, and a series of meetings with directors and business managers of ARC Centres of Excellence.
  • Outreach and engagement strategy, October 2018 — provides overarching guidance for achieving an efficient, strategic and balanced program of outreach and engagement incorporating operational, promotional and strategic messaging.

2.41 To guide direct communications with stakeholders, the ARC has established a client service charter that outlines: the services that the ARC provides; service standards for engaging with clients; and how the ARC will manage its website and any information provided by clients. The ARC also has a Standard Operating Procedure (SOP) to inform and guide ARC staff on the management of email and telephone enquiries. The ARC uses an IT platform for tracking, prioritising and reporting on email and telephone queries. The procedure sets out the processes for logging email and telephone queries, prioritising queries, developing and clearing responses to queries and extracting reports.

Communications activities

2.42 Reports on outreach activities are prepared at the conclusion of each calendar year and highlight outreach activities undertaken by the ARC.17 The ARC’s 2018 outreach report is a high level statistical report focusing on the number of engagements with universities by state and the type of engagement, including operational, strategic and promotional engagement.

2.43 The ARC has developed and disseminated a range of materials and conducted stakeholder engagement activities as outlined in its strategies, targeting groups identified and using communications channels and messaging that are consistent with its overarching communications strategies. This includes:

  • stakeholder presentations, which provide an overview of the NCGP lifecycle (from the development of guidelines to final reports) as well as a more detailed summary of all NCGP schemes, including historical data on success rates and diversity among applicants;
  • the ARC website, which includes a wide variety of comprehensive information on the NCGP, including current and previous NCGP guidelines, stories on funded research proposals, reminders and announcements relating to grant opportunities;
  • a range of NCGP brochures, which present an overview of NCGP schemes, including scheme objectives, funding periods and level of funding available. The brochures also include an outline of the ARC, its legislative authority, purpose and role in assessing the quality, engagement and impact of Australian research;
  • the Making a Difference annual publication, which highlights the outcomes from specific NCGP-funded research projects and provides an overview of each NCGP scheme and its high level objectives;
  • research highlights posters, published most months, which showcase NCGP-funded research projects. Quick Response (QR) codes that drive traffic to the ARC’s website and its Twitter account feature prominently on all posters reviewed by the ANAO18; and
  • ARChway newsletters, which are released three times each year. Each newsletter includes a brief message from the CEO, news and events including conferences attended or hosted by the ARC, articles from grant recipients, information on ARC-funded research as well as links to relevant media releases, and links to the ARC’s website for important dates on NCGP schemes.

2.44 The messaging in the communications reviewed by the ANAO was consistent with the overarching communications and outreach strategies. The ANAO also observed ARC staff using the IT platform to track and record direct communications with stakeholders. Their use conformed to the processes set out in the SOP.

2.45 Traditional and online communications have been integrated through the use of QR codes and the promotion of the ARC’s Twitter handle. As at 29 May 2019 the ARC’s Twitter account had 17,200 followers and had posted 1,912 tweets. The ARC uses Twitter, along with other media such as its website, to inform followers of the release of NCGP guidelines, the opening and closing times for grant rounds, opening of rejoinders, and to promote ARC-funded research. For example, in March 2019, the ARC posted: ‘closing soon’ reminders for a number of NCGP schemes; and ARC-funded research highlights.

2.46 From 26 March 2018 to 15 April 2019 the ARC recorded 1,473 email and telephone enquiries.19 Matters raised by stakeholders included: grants application forms; eligibility queries; conflict of interest; and questions relating to the NCGP guidelines. ANAO analysis of a selection of email correspondence showed that the ARC provided clear and concise responses to email queries, including links to relevant reference material.

2.47 The ANAO sought feedback from universities and relevant peak bodies on a number of issues relating to the ARC’s administration of the NCGP, including its communications. Overall, university stakeholders were strongly supportive of the ARC and indicated that the ARC’s administration of the NCGP was effective. Stakeholders also advised the ANAO that improved engagement with the universities around the timing of grant opportunities would help avoid grant proposal submission dates falling during peak academic periods (February to March).

3. Assessment and funding decisions

Areas examined

To effectively administer a grants program and ensure that grants are awarded to those activities that best satisfy the objectives of the grant opportunity, an entity needs to have in place robust processes to assess grant applications and inform funding recommendations. This chapter examines whether the Australian Research Council (ARC) has established processes and systems to assess grants and make funding recommendations that comply with the National Competitive Grants Program (NCGP) guidelines.

Conclusion

The ARC has mature and effective processes in place to assess grants, manage conflicts of interest and provide funding recommendations that comply with the NCGP guidelines.

Have appropriate arrangements been implemented to ensure grant assessments and funding decisions are well informed and transparent?

The ARC has appropriate assessment, eligibility, scrutiny and appeal processes to support well informed and transparent grant assessment and funding decisions.

3.1 Assessment criteria are defined in the Commonwealth Grants Rules and Guidelines (CGRGs) as the specified principles or standards against which applications will be judged. These criteria are also used to assess the merits of proposals and, in the case of a competitive grant opportunity, to determine application rankings.

3.2 The ANAO tested: a representative, random sample of 94 funded and 59 unfunded grants; three high value ARC Centres of Excellence grants; and the eleven grants recommended to and not funded by the Minister for Education in 2018. The testing included compliance with eligibility, application, assessment, funding agreement, reporting and ministerial approval brief requirements outlined in the NCGP guidelines.

3.3 The ARC provides guidance on conducting assessments in its standard operating procedures (SOPs), assessor handbooks and on the ARC’s webpage.20 Assessors are encouraged to provide high quality assessments that are objective, detailed, sufficient, fair, meaningful and balanced. The SOP that covers assessments states that NCGP proposals are assessed against the scheme-specific assessment criteria through a peer review process to ensure that only the highest quality research proposals are recommended for funding. The NCGP assessment and approvals processes are summarised in Figure 3.1.

Figure 3.1: NCGP assessment and approvals processes

 

This flowchart graphic provides an overview of the Australian Research Council’s NCGP assessment and approvals processes. It is a slightly more detailed version of Figure 1.4, focusing on the grant assessment process following the closure of applications.

 

Note a: Individuals applying for ARC funding can nominate persons whom they do not wish to assess their proposal by submitting a ‘request not to assess’ form.

Source: ANAO analysis of ARC documentation.

Grant proposals

3.4 In terms of the application process, the ANAO tested whether proposals (referred to in the Research Management System [RMS] as the application) were submitted to and certified by the proposed administering organisation’s research office and, where required, whether the administering organisation’s supporting statement had been submitted. These statements were required to address specific criteria relating to: the organisation’s research history and strengths; resources available to the applicant; and the opportunity for the applicant to demonstrate their suitability for further research work. They were also to be signed by the organisation’s Deputy Vice-Chancellor of Research (or equivalent). All grants reviewed by the ANAO had been submitted and certified by their respective Deputy Vice-Chancellor of Research or research office delegate and had a suitable supporting statement.

3.5 The ANAO also tested whether proposals addressed the relevant selection criteria for each scheme. ANAO analysis found that all proposals: addressed the scheme-specific selection criteria; and identified expected outcomes and/or objectives, as required. In addition, all applications were submitted on time.

Assessments

3.6 Peer review plays a critical role in the assessment of NCGP proposals. Assessors are responsible for assessing NCGP proposals against the relevant scheme selection criteria and contributing to the process of scoring and ranking research proposals. The College of Experts assists the ARC in recruiting and assigning assessors.21 For each NCGP scheme round, in most cases General Assessors are selected from the College of Experts to form a Selection Advisory Committee (SAC) to oversee the peer review process. The General Assessors are chosen to provide expertise based on the requirements of the scheme.

3.7 Following the close of submissions for grant funding, the ARC assigns each proposal within a NCGP scheme to a lead General Assessor, usually an expert in the proposal’s academic field, with additional General Assessors allocated depending on the specific requirements of the NCGP scheme. For example, the ARC Assessor Handbook states that two General Assessors are to be allocated to Discovery Projects and Discovery Early Career Researcher Award proposals, while three General Assessors are generally allocated to Discovery Indigenous and Linkage Projects proposals.

3.8 The lead General Assessor is often responsible for selecting Detailed Assessors from the ARC’s assessor database who are assigned on the basis of their expertise and field of research. The number of Detailed Assessors varies between NCGP schemes, with four Detailed Assessors and four reserves usually allocated to Discovery Projects, Discovery Indigenous and Linkage Projects proposals.

3.9 Detailed Assessors review allocated proposals against the scheme selection criteria and provide a written assessment and ratings on a scale from A to E for each of the selection criteria for each proposal. Following the completion of detailed assessments, General Assessors review and assess applications against the relevant criteria, providing a single draft preliminary overall score using the rating scale set out in Table 3.1. Detailed Assessors’ comments are then provided anonymously to the applicant, with the applicant given two weeks to respond to any concerns raised by assessors via a rejoinder submission.

Table 3.1: NCGP rating scale

Rating scale | Criteria | Recommendation
A | Outstanding: Of the highest quality and at the forefront of research in the field. Approximately 10% of applications should receive ratings in this band | Recommended unconditionally
B | Excellent: Of high quality and strongly competitive. Approximately 15% of applications should receive ratings in this band | Strongly support recommendation of funding
C | Very Good: Interesting, sound and compelling. Approximately 20% of applications should receive ratings in this band | Support recommendation of funding with reservation
D | Good: Sound, but lacks a compelling element. Approximately 35% of applications are likely to fall into this band | Unsupportive of recommendation for funding
E | Uncompetitive: Uncompetitive and has significant weaknesses. Approximately 20% of applications are likely to fall into this band | Not recommended for funding

Source: ARC, Assessor Handbook, 7 March 2018, p. 7.

3.10 After the rejoinder process has closed, based on a review of the Detailed Assessor comments and scores and the applicant’s rejoinder submission, General Assessors can revise their preliminary ratings before submitting final ratings for each proposal in the RMS. Following the submission of final ratings, the RMS will produce a ranked list of proposals for review by the SAC.
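The aggregation of final ratings into a ranked list is performed by the RMS; this audit does not describe the system’s internal scoring method. The following minimal sketch illustrates one way such a ranking could work, assuming a hypothetical numeric mapping of the A to E scale and a simple average of General Assessor ratings; both assumptions are for illustration only and are not drawn from ARC documentation.

```python
# Minimal ranking sketch. RATING_VALUES and the use of a plain average are
# assumptions for illustration; the RMS's actual aggregation method is not
# described in this audit.
RATING_VALUES = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1}


def rank_proposals(final_ratings: dict) -> list:
    """Rank proposals by the mean of their General Assessors' final ratings.

    `final_ratings` maps a proposal identifier to the list of A to E ratings
    submitted by its General Assessors after the rejoinder process has closed.
    Returns (proposal_id, score) pairs, highest score first.
    """
    scored = {
        proposal_id: sum(RATING_VALUES[r] for r in ratings) / len(ratings)
        for proposal_id, ratings in final_ratings.items()
    }
    return sorted(scored.items(), key=lambda item: item[1], reverse=True)


if __name__ == "__main__":
    example = {"P-001": ["A", "B"], "P-002": ["C", "B"], "P-003": ["D", "E"]}
    for proposal_id, score in rank_proposals(example):
        print(proposal_id, round(score, 2))
```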

3.11 ANAO analysis of funded and unfunded grants found that in all cases assessors (both general and detailed) were assigned in accordance with requirements and written assessments were completed with ratings recorded against proposals using the scale set out in Table 3.1.

3.12 The ARC has also developed a SOP that details the process for identifying and managing inappropriate assessments. The SOP defines inappropriate assessments, which include assessments that are very brief or use generic comments across multiple assessments, contain comments that do not support the scores or compare applications, or include comments that could be perceived as discriminatory, defamatory or irrelevant. In 2017 the ARC introduced a process of spot checking proposal assessments. The ARC has developed work instructions that set out that all assessment text is screened using search strings that seek to identify potential concerns with eligibility, plagiarism, conflict of interest, scores and ratings. Reports are generated following the text search and assessments flagged as potentially inappropriate during the spot checks are reviewed by the ARC. The ARC advised that where inappropriate assessments are confirmed, it may request an assessor to amend the assessment or, in rare circumstances, remove the assessment from the peer review process.
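The ARC’s actual search strings and work instructions are not reproduced in this report. The sketch below shows, under that caveat, how assessment text might be screened against a set of hypothetical search strings covering the categories of concern described above; all pattern names and expressions are illustrative assumptions.

```python
# Illustrative text-screening sketch. The search strings below are hypothetical
# stand-ins for the ARC's work instructions, which are not reproduced here.
import re

SEARCH_STRINGS = {
    "eligibility": re.compile(r"\b(eligib\w*|ineligib\w*)\b", re.IGNORECASE),
    "plagiarism": re.compile(r"\bplagiari\w*\b", re.IGNORECASE),
    "conflict of interest": re.compile(r"\bconflict of interest\b", re.IGNORECASE),
    "comparison with other applications": re.compile(
        r"\b(other|another)\s+(application|proposal)s?\b", re.IGNORECASE
    ),
}


def spot_check(assessments: dict) -> dict:
    """Return assessment IDs mapped to the categories of concern they trigger.

    `assessments` maps an assessment identifier to its free-text comments.
    Flagged assessments would then be reviewed manually by ARC staff.
    """
    flagged = {}
    for assessment_id, text in assessments.items():
        hits = [label for label, pattern in SEARCH_STRINGS.items() if pattern.search(text)]
        if hits:
            flagged[assessment_id] = hits
    return flagged
```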

3.13 The ANAO reviewed spot checks from Discovery Projects 2018, where 20 assessments were identified and reviewed by the ARC for inappropriate content. The most common issue giving rise to inappropriate assessments was commentary by assessors on eligibility and plagiarism, which is not part of the assessor’s role.

Selection Advisory Committee

3.14 Selection Advisory Committee meetings are held following the completion of detailed and general assessments of proposals. The ARC uses SACs to review and consider applications for funding and to make recommendations to the ARC’s Chief Executive Officer (CEO). SACs are normally made up of members drawn from the College of Experts, with different configurations based on the NCGP scheme being assessed. For example, the SACs for Discovery Projects and Discovery Early Career Researcher Award schemes are configured into five panels covering five broad disciplinary areas: Biological Sciences and Biotechnology; Engineering, Information and Computing Sciences; Mathematics, Physics, Chemistry and Earth Sciences; Social, Behavioural and Economic Sciences; and Humanities and Creative Arts. ARC Centres of Excellence, Industrial Transformation Research Hubs, Industrial Transformation Training Centres and Australian Laureate Fellowships SACs are configured into a single interdisciplinary panel.

3.15 The roles and obligations of SAC members, as well as the sequence of events and required documentation for SAC meetings, are outlined in a Selection Advisory Committee Meeting SOP and work instructions.

3.16 The SAC Committee Chair is responsible for leading the committee through the process to recommend proposals for funding and calling the committee to a vote on proposals within the uncertainty band or where there is disagreement between committee members.22 The SAC provides recommendations to the CEO who, in accordance with Section 52 of the ARC Act, submits recommendations, including recommended budgets, to the Minister for consideration.

3.17 The ANAO reviewed a selection of SAC documents, including schedules, instructions to committee members, committee run sheets and voting records. The documents were comprehensive and consistent with requirements set out in the work instructions and SOPs.

3.18 The ANAO also observed a SAC meeting. At the meeting, the ANAO observed discussions on the shortlisting of proposals and whether to recommend proposals for funding. These discussions were led by the lead General Assessor, as set out in the SOP. The ANAO also observed committee votes on shortlisting proposals and on recommending proposals for funding. Voting was conducted using the SAC meeting application and the outcome of votes was recorded by ARC staff in attendance at the meeting. The meeting was conducted in accordance with the process outlined in the work instructions and SOP.

3.19 ANAO analysis of funded grants found that in all cases the SAC ranked and recommended each grant for funding. All funded grants reviewed by the ANAO were also recommended for funding by the ARC CEO.

Eligibility

3.20 The ARC is responsible for assessing whether a proposal meets the eligibility requirements listed in the guidelines. The assessment of a proposal’s eligibility may be undertaken at any time following the close of submissions.

3.21 The ANAO tested whether applicants were eligible for the grant opportunities for which they were applying. These eligibility requirements vary between grant opportunities but often comprise requirements relating to: number of existing NCGP funded projects; PhD conferral date; employment status at an eligible organisation; submission of final reports for previously funded projects; and the percentage of time the researcher intended to spend on the proposed project.

3.22 The ANAO’s testing confirmed that all applications were submitted by eligible organisations and that all applicants awarded funds were eligible.
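Because eligibility requirements vary between grant opportunities, the following sketch is purely illustrative: it checks an applicant against the kinds of requirements listed in paragraph 3.21, using hypothetical thresholds (project limit, PhD conferral window, minimum time commitment) that do not correspond to any particular NCGP scheme.

```python
# Illustrative eligibility-check sketch. The thresholds (project limit, PhD
# conferral window, minimum time commitment) are hypothetical and do not
# correspond to any specific NCGP scheme's rules.
from dataclasses import dataclass
from datetime import date


@dataclass
class Applicant:
    active_ncgp_projects: int
    phd_conferral_date: date
    employed_at_eligible_org: bool
    outstanding_final_reports: int
    time_on_project_pct: float


def eligibility_issues(applicant: Applicant, reference_date: date, *,
                       max_projects: int = 2, conferral_window_years: int = 5,
                       min_time_pct: float = 20.0) -> list:
    """Return the (assumed) eligibility requirements that the applicant fails."""
    issues = []
    if applicant.active_ncgp_projects >= max_projects:
        issues.append("exceeds the limit on existing NCGP funded projects")
    years_since_phd = (reference_date - applicant.phd_conferral_date).days / 365.25
    if years_since_phd > conferral_window_years:
        issues.append("PhD conferred outside the assumed eligibility window")
    if not applicant.employed_at_eligible_org:
        issues.append("not employed at an eligible organisation")
    if applicant.outstanding_final_reports > 0:
        issues.append("outstanding final report(s) for previously funded projects")
    if applicant.time_on_project_pct < min_time_pct:
        issues.append("insufficient time committed to the proposed project")
    return issues
```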

Eligibility and scrutiny committees

3.23 The role of the NCGP Eligibility Committee is to assess potential eligibility issues that have been identified but not resolved by the ARC and to provide recommendations to the CEO as to whether an application is eligible or ineligible for funding.

3.24 The ANAO’s analysis of Eligibility Committee documents and recommendations to the CEO arising from committee meetings for Discovery Projects commencing in the past four years is shown in Table 3.2. All ineligibility recommendations were approved by the CEO.

Table 3.2: Eligibility Committee outcomes

Year funding commenced | Proposals reviewed | Proposals recommended as ineligible | Reason for ineligibility
2016 | 181 | 10 | 10: medical nature of the research
2017 | 261 | 10 | 10: medical nature of the research
2018 | 781 | 11 | 5: Chief Investigator exceeded project limits; 5: medical nature of the research; 1: outstanding final report
2019 | 192 (196 eligibility issues) | 58 | 38: outstanding final reports; 16: medical nature of the research; 3: Chief Investigator did not meet eligibility requirements; 1: incomplete information in the proposal

Source: ANAO analysis of ARC documents.

3.25 The role of the Scrutiny Committee is to oversee and examine the integrity of the assessment process in relation to grant applications involving SAC members and ARC employees. The Scrutiny Committee should examine the integrity of the assessment process after every SAC meeting where a ‘member application’ was submitted. A member application is defined in the Scrutiny Committee SOP as an application from a SAC member who is responsible for assessing the grant opportunity or an application submitted by or involving an ARC staff member.

3.26 The ANAO reviewed a selection of Scrutiny Committee documents, including: emails to committee members; committee minutes and approvals; committee data sheets; and protocol documents, which set out the role and obligations of the Scrutiny Committee. The analysis found that Scrutiny Committee meetings were conducted and documented in accordance with the Scrutiny Committee SOP. Minutes from the committee were provided to the ARC CEO, and all minutes reviewed by the ANAO stated that the Scrutiny Committee was satisfied that the ARC assessment process was fair for all member proposals, with no evidence of favourable or unfavourable bias in the assessment. Details on the number of committee member proposals reviewed by the committee and comparative success rates are set out in Table 3.3.

Table 3.3: Scrutiny committee outcomes

NCGP scheme | No. of member proposals | Member success rate (%) | Overall scheme success rate (%)
Discovery Projects 2016 | 59 | 40.7 | 17.7
Discovery Indigenous 2016 | 0 | N/A | 32.3
Linkage Infrastructure, Equipment and Facilities 2016 | 2 | 100.0 | 31.8
Discovery Projects 2017 | 59 | 54.2 | 17.8
Discovery Indigenous 2017 | 1 | 100.0 | 35.5
Linkage Infrastructure, Equipment and Facilities 2017 | 8 | 50.0 | 30.2
Discovery Projects 2018 | 52 | 42.3 | 19.2
Discovery Indigenous 2018 | 1 | 100.0 | 34.2
Linkage Infrastructure, Equipment and Facilities 2018 | 5 | 40.0 | 29.2

Source: ARC data.

Appeals

3.27 The ARC has developed processes, outlined in the SOPs, for managing appeals from unsuccessful NCGP applicants. Appeals must be lodged through the administering organisation’s research office and be authorised by their Deputy Vice-Chancellor (Research) or equivalent within 28 days of the NCGP funding announcement.

3.28 Appeals, once received by the ARC, are managed by the NCGP Appeals Committee. The committee can only consider appeals against administrative process issues, consistent with the appeals process outlined in the NCGP guidelines. If an error is identified by the committee, a decision is made as to whether the error led to a defect in decision-making that adversely affected the appellant’s application. For each appeal reviewed by the committee there are three potential outcomes:

  • appeal dismissed;
  • appeal allowed but not in the fundable range; or
  • appeal allowed and application is in the fundable range.

3.29 Very few appeals (less than one per cent of proposals) are received by the ARC. Recent outcomes from NCGP appeals are outlined in Table 3.4.

Table 3.4: NCGP appeals outcomes, 2015 to 2018

Year | Total proposals | Appeals received | Appeals dismissed | Appeals allowed (not in fundable range) | Appeals allowed (in fundable range)
2015 | 6,414 | 12 | 5 | 7 | 0
2016 | 6,200 | 5 | 5 | 0 | 0
2017 | 5,771 | 7 | 6 | 1 | 0
2018 | 5,228 | 12 | 8 | 4 | 0

Source: ANAO analysis of ARC documentation.

Have appropriate arrangements been implemented to manage actual or perceived conflicts of interest by NCGP assessors?

The ARC has established appropriate arrangements to manage actual and perceived conflicts of interest in the NCGP process.

3.30 The CGRGs state that officials should establish transparent processes that help manage misconceptions and the potential for personal or related party gain. Accountable authorities should ensure that entity policy and management processes for conflict of interest are published to support probity and transparency. Accountable authorities should also put in place appropriate mechanisms for identifying and managing potential conflicts of interest for grant opportunities.

3.31 NCGP guidelines state that each participant or organisation named in an NCGP application must declare to the administering organisation at the date of submission any conflict of interest that exists or is likely to arise in relation to any aspect of the proposal. Further, if a conflict of interest exists or arises, the administering organisation must have documented processes in place for managing the conflict of interest for the duration of the project. Such processes must comply with the Australian Code for the Responsible Conduct of Research 2007, the ARC conflict of interest and confidentiality policy and any relevant successor documents.

3.32 The ARC has a conflict of interest and confidentiality policy. The policy states that it is designed to ensure that all material personal interests are disclosed and that conflicts of interest are identified and managed in a rigorous and transparent way to ensure the integrity, legitimacy, impartiality and fairness of ARC processes. It applies to ARC staff, committee members, assessors, consultants and any other parties engaged by the ARC for the provision of service, as well as administering organisations, funding applicants and funded researchers. The policy is published on the ARC’s website, consistent with the CGRG requirements.

3.33 The ARC requires all employees, contractors and committee members to disclose any material personal interests within four weeks of commencement of employment (or, for committee members, when contracted) and then annually, and to update that information as soon as possible if any significant changes occur to their or their immediate family or partner’s interests.

3.34 The ANAO reviewed employee material interest declarations from all 13 ARC senior executives. As at 18 April 2019 all material interest declarations were completed and recorded in accordance with requirements. The ANAO also reviewed material interest and confidentiality agreement documentation for all 116 General Assessors involved in the assessment of the ANAO’s random sample of approved grants. In all cases General Assessor material interest and confidentiality agreements were on file.

Managing conflicts of interest

3.35 The Research Management System (RMS) manages conflicts of interest by recording and tracking individual assessor and participant relationships. The RMS prevents assessors being assigned to a proposal where there is a known relationship with any named participant or organisation on the proposal. Table 3.5 sets out the business rules for identifying and managing conflicts of interest.

Table 3.5: Conflict of interest rules

Rule used to exclude an assessor | Inputs used | Limit
The assessor worked for an administering or participating organisation on the proposal. | Employment history | Valid until two years have passed from the cessation of employment with that organisation.
The assessor has a direct ‘Personal’ relationship with at least one person participating on the proposal. | Direct relationships and assignment rejection | Persistent
The assessor has a direct ‘Professional’ relationship with at least one person participating on the proposal. | Assignment rejection | Valid until four years have passed from the date the relationship was recorded in RMS.
The assessor has a direct ‘Professional’ relationship with at least one person participating on the proposal. | Direct relationships | Valid until four years have passed from the relationship end date. If no end date is entered then the relationship is ‘current’ and the rule is persistent.
The assessor has a direct relationship with at least one organisation administering or participating on the proposal. | Assignment rejection | Valid until two years have passed from the date the relationship was recorded in RMS.
The assessor has a direct relationship with at least one organisation administering or participating on the proposal. | Direct relationships | Valid until two years have passed from the relationship end date. If no end date is entered then the relationship is ‘current’ and the rule is persistent.

Source: ARC conflict of interest rules.
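The business rules in Table 3.5 lend themselves to simple date arithmetic. The sketch below expresses each rule as a function, assuming hypothetical field names and treating a missing end date as a persistent conflict, consistent with the limits described in the table; it is not a description of the RMS implementation.

```python
# Illustrative encoding of the Table 3.5 business rules. Field names and date
# handling are assumptions; a missing end date is treated as a persistent
# conflict, as described in the table.
from datetime import date
from typing import Optional

DAYS_PER_YEAR = 365


def employment_conflict(cessation_date: Optional[date], today: date) -> bool:
    """Prior employment with a participating organisation: excluded until two
    years after employment ceased (persistent while still employed)."""
    if cessation_date is None:
        return True
    return (today - cessation_date).days < 2 * DAYS_PER_YEAR


def personal_relationship_conflict() -> bool:
    """A direct 'Personal' relationship with a participant is persistent."""
    return True


def professional_relationship_conflict(recorded_or_end_date: Optional[date],
                                        today: date) -> bool:
    """A direct 'Professional' relationship: excluded for four years from the
    recorded or end date; persistent if no end date is entered."""
    if recorded_or_end_date is None:
        return True
    return (today - recorded_or_end_date).days < 4 * DAYS_PER_YEAR


def organisation_relationship_conflict(recorded_or_end_date: Optional[date],
                                        today: date) -> bool:
    """A direct relationship with a participating organisation: excluded for two
    years from the recorded or end date; persistent if no end date is entered."""
    if recorded_or_end_date is None:
        return True
    return (today - recorded_or_end_date).days < 2 * DAYS_PER_YEAR
```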

3.36 Before accepting individual requests to assess ARC-funded proposals, assessors receive an automatic prompt in the RMS that sets out their obligations relating to conflict of interest. Assessors are required to agree to, and comply with, the relevant policies on confidentiality and conflict of interest prior to accepting assessment assignments (see Figure 3.2).

Figure 3.2: Assessor conflict of interest declaration

 

[Image: screen capture of the RMS prompt setting out an assessor’s conflict of interest and confidentiality obligations, which must be agreed to before an assessment assignment can be accepted.]

 

Source: Screen capture from RMS.

3.37 After accepting the conflict of interest declaration requirements, assessors are able to view the full proposal and its participants. Upon viewing the proposal, assessors who identify a conflict are able to reject the assignment. In rejecting the assignment, assessors identify the conflict of interest and the RMS records the relationship and prevents future assignments where the declared conflict exists, in accordance with the conflict of interest business rules set out in Table 3.5.

3.38 During SAC meetings, when a proposal is brought forward for discussion where one or more of the attending SAC members is conflicted, the RMS meeting application will inform all committee members of the conflict. The conflicted committee member will also receive a message requesting that they leave the room (see Figure 3.3).

3.39 The ANAO observed the operation of conflict of interest measures during a SAC meeting. During this meeting, conflicted committee members were identified and asked by the committee chair to leave the room. The RMS lockout screen (see Figure 3.3) was displayed on the personal screens of committee members for the duration of the discussion and vote on the research proposal. Committee members left the room for the duration of the discussion, returning only once voting had concluded.

Figure 3.3: RMS meeting application lockout screen

 

[Image: screen capture of the RMS meeting application lockout screen displayed to conflicted SAC members while the relevant proposal is discussed and voted on.]

 

Source: Screen capture from RMS.

3.40 The ANAO examined the NCGP conflict of interest process and supporting systems and found them to be robust, with well documented processes and business rules. The process for General and Detailed Assessors is automated and embedded within the RMS. As all grant proposals are submitted through the RMS, all applicant and assessor conflicts of interest are dealt with through this system, and are compliant with the relevant NCGP conflict of interest SOPs and work instructions developed by the ARC.

Has the ARC provided the Minister for Education clear advice and funding recommendations regarding NCGP grants?

The ARC provided the Minister for Education clear advice and funding recommendations consistent with requirements of the ARC Act, the CGRGs and the NCGP guidelines.

3.41 Under the CGRGs, officials must provide written advice to ministers where ministers exercise the role of an approver. This advice must, at a minimum:

  • explicitly state that the spending proposal being considered for approval is a ‘grant’;
  • provide information on the applicable requirements of the Public Governance, Performance and Accountability Act 2013 (the PGPA Act) and Rule and the CGRGs (particularly any ministerial reporting obligations), including the legal authority for the grant;
  • outline the application and selection process followed, including the selection criteria, that were used to select potential grantees; and
  • include the merits of the proposed grant or grants relative to the grant opportunity guidelines and the key principle of achieving value with relevant money.

3.42 The NCGP guidelines state that the ARC CEO will submit recommendations regarding all applications to the Minister, recommending which proposals should and should not be approved for funding and the amount and timing of funding for approved proposals. The guidelines also state that under the ARC Act, the Minister must not approve for funding any proposal that fails to meet the eligibility criteria set out in the guidelines. Section 51 of the ARC Act states that the instrument of approval signed by the Minister must include the following information:

  • name of the approved organisation;
  • description of the approved research program;
  • name and title of the person leading the approved research program; and
  • amount of funding to be provided.

3.43 The ANAO analysed 45 ministerial submissions and found that they complied with the requirements outlined above, including that the submissions:

  • explicitly stated that the requested approval was for NCGP grants and outlined the merits of the grants. For example, a submission covering the NCGP Discovery programs highlighted the broad benefits of the programs: ‘Supporting high quality research, essential to Australia’s innovation system, for the development of new ideas, job creation, economic growth and an enhanced quality of life in Australia’;
  • set out the requirements of the PGPA Act, the CGRGs, requirements under sections 51–53 of the ARC Act and a statutory requirements and financial issues attachment that provided additional detail on the ARC Act and the PGPA Act; and
  • outlined the selection process and the selection criteria used, noting elements of the selection criteria such as value for money, economic, environmental and/or social benefit to Australia and addressing Science and Research Priorities, and attached the full scheme-specific selection criteria to all submissions.

3.44 The ARC also provided to the Minister:

  • a schedule of all proposals recommended for funding for all grant opportunities commencing in 2016, 2017 and 2018, which included the name and title of the lead investigator, the administering organisation, the name of the partner organisation for Linkage Program proposals, an annual budget for each year of funding and a brief description (about 100 words) of the research activity; and
  • detailed selection reports for all grant opportunities commencing in 2016, 2017 and 2018, which outlined the selection process, opening and closing dates, high level selection criteria and weightings and the assessment process. Selection reports also noted outcomes by discipline, administering organisation and Science and Research Priorities and included information on career age, gender diversity and international collaboration.23 For example, 488 of the recommended Discovery Projects 2017 proposals foreshadowed 965 instances of collaboration with researchers in 58 overseas locations.

3.45 For NCGP grants commencing in 2016, 2017 and 2018 the Minister accepted the ARC’s recommendations for all eligible applications not recommended for funding and most applications recommended for funding. The Minister did not fund 11 grants recommended by the ARC — six Discovery Projects 2018, three Discovery Early Career Researcher Awards 2018, and two Future Fellowships 2018. The Minister did not fund any projects that were not recommended for funding.

3.46 Proposals that are approved by the Minister require executed grant agreements before monies can be released by the ARC. All grants tested by the ANAO had funding agreements in place. ANAO analysis of funded grants found that 93 (or 99 per cent) were awarded less funding than the amount requested in the proposal.

3.47 Table 3.6 shows the ANAO’s analysis of ministerial submissions and approvals, including the percentage of budget approved relative to the amount applied for. This ranges from 60 per cent of the budget requested for Linkage Infrastructure, Equipment and Facilities 2017 proposals to 100 per cent for Industrial Transformation Research Hubs 2018.

Table 3.6: Ministerial submissions for grants approved, 2016–18

Scheme and year | Submitted | Approved | Success rate (%) | Total funds approved ($000s) | Percentage of funds requested (%) | Research priority (no. and %)

Discovery Program Schemes
Discovery Early Career Researcher Award
  • 2016 | 1220 | 200 | 16.4 | 70,737 | 91.4 | N/A
  • 2017 | 1197 | 200 | 16.7 | 71,701 | 90.5 | 124 (62.0%)
  • 2018 | 1212 | 197 | 16.3 | 70,940 | 89.8 | 115 (58.4%)
Australian Laureate Fellowships
  • 2016 | 124 | 16 | 12.9 | 44,123 | Not reported | 13 (81.0%)
  • 2017 | 112 | 17 | 15.2 | 47,026 | Not reported | 12 (70.6%)
  • 2018 | 134 | 16 | 11.9 | 46,414 | Not reported | 12 (75.0%)
Discovery Projects
  • 2016 | 3584 | 635 | 17.7 | 244,935 | 65.7 | N/A
  • 2017 | 3540 | 630 | 17.8 | 234,661 | 65.8 | 369 (58.6%)
  • 2018 | 3136 | 594 | 18.9 | 225,600 | 68.2 | 331 (55.7%)
Discovery Indigenous
  • 2016 | 31 | 10 | 32.3 | 4,059 | 74.6 | N/A
  • 2017 | 31 | 11 | 35.5 | 4,635 | 64.8 | 9 (81.8%)
  • 2018 | 38 | 13 | 34.2 | 7,210 | 70.7 | 9 (69.2%)
Future Fellowships
  • 2016 | 324 | 100 | 30.9 | 77,024 | 88.3 | 56 (56.0%)
  • 2017 | 294 | 91 | 31.0 | 77,004 | 93.3 | 45 (49.5%)
  • 2018 | 509 | 100 | 19.6 | 84,704 | 93.4 | 55 (55.0%)

Linkage Program Schemes
Linkage Projects
  • 2016 Round One | 742 | 231 | 31.1 | 81,218 | 71.8 | 193 (83.9%)
  • 2016 Round Two (Continuous) | 225 | 89 | 39.6 | 34,144 | 76.9 | 73 (82.0%)
  • 2017 (Continuous) | 417 | 132 | 31.7 | 53,330 | 81.7 | 112 (84.8%)
  • 2018 (Continuous)a | – | – | – | – | – | –
Linkage Infrastructure, Equipment and Facilities
  • 2016 | 173 | 54 | 31.2 | 37,974 | 78.5 | N/A
  • 2017 | 179 | 48 | 26.8 | 28,629 | 60.0 | 41 (85.4%)
  • 2018 | 171 | 50 | 29.2 | 28,576 | 84.8 | 37 (74%)
Industrial Transformation Research Hubs
  • 2017 | 5 | 3 | 60.0 | 9,604 | 90.6 | 3 (100%)
  • 2018 | 9 | 4 | 44.4 | 17,972 | 100.0 | 4 (100%)
Industrial Transformation Training Centres
  • 2016 | 27 | 6 | 22.2 | 22,044 | 86.8 | 6 (100%)
  • 2017 | 26 | 9 | 34.6 | 36,990 | 93.3 | 9 (100%)
  • 2018 | 28 | 7 | 25.0 | 28,922 | 93.5 | 7 (100%)
ARC Centres of Excellence 2017 | 20 | 9 | 45.0 | 283,500 | 93.1 | 9 (100%)
Linkage Learned Academies Special Projects 2017 | 10 | 5 | 50.0 | 1,150 | 75.0 | 2 (40%)
Per- and Poly-Fluoroalkyl Substances Remediation Program 2018b | 31 | 9 | 29.0 | 8,166 | 83.8 | N/A

Note a: Linkage Projects 2018 (Continuous) round had not been completed as at 1 June 2019.

Note b: Per- and Poly-Fluoroalkyl Substances Remediation Program 2018 is a special research initiative program.

Source: ANAO analysis of ministerial submissions.

4. Performance and reporting

Areas examined

In line with the Commonwealth Resource Management Framework, when administering programs it is important that entities maintain an outcomes orientation and effectively monitor the progress of grants to ensure that objectives are being achieved and relevant money is being appropriately managed. The audit examined whether the Australian Research Council (ARC) has implemented effective monitoring, assurance, evaluation and reporting arrangements for the National Competitive Grants Program (NCGP).

Conclusion

The ARC’s arrangements to measure the performance of the NCGP, monitor and evaluate the program and provide assurance that administering organisations comply with funding requirements are largely effective. Most of the ARC’s key performance indicators (KPIs) are relevant, but not all are reliable. Its assurance arrangements could be more risk-based to provide greater assurance that administering organisations comply with grant agreement requirements and the program is achieving its objectives.

Areas for improvement

The ANAO made two recommendations aimed at improving the KPIs relevant to the NCGP and its monitoring and assurance activities.

Has an appropriate performance management framework for the NCGP been developed?

The performance management framework established by the ARC for the NCGP is largely appropriate. Most of the KPIs are relevant but require further refinement to be reliable and do not include any efficiency measures.

4.1 The Public Governance, Performance and Accountability Act 2013 (PGPA Act) provides the basis for the Commonwealth’s performance framework (the framework). The framework consists of the PGPA Act, the accompanying Public Governance, Performance and Accountability Rule 2014 and guidance issued by the Department of Finance. An important element of the framework is the clear alignment of the entity’s purpose(s) to its planned outcomes, and that the entity’s performance information provides the Parliament and the public with a basis for assessing the entity’s progress towards achieving its purpose(s).

4.2 The ANAO examined the ARC’s performance management framework and governance arrangements for the NCGP. The ARC’s performance governance arrangements outlined in its Planning and Reporting Framework document are comprehensive. The framework outlines the key elements of ARC activities and processes associated with meeting the Commonwealth’s performance framework, including key responsibilities, risk management and communication. The ARC’s 2018 Performance Measurement Framework provides details of the performance measures, outlined in the corporate plan, and guidance for preparing the performance measures and the annual performance statements. The document is informative and consolidates the various components of the ARC’s performance management framework. The framework states that the ARC can control its activities, outputs and immediate outcomes, but can only influence long-term outcomes such as Australia’s capacity to respond to emerging priorities and the benefits to Australia from research discoveries and developments.

4.3 The ANAO also assessed whether the performance criteria specified in the ARC’s annual performance statements align with those outlined in the corporate plan and the Portfolio Budget Statements (PBS) and were appropriate. The appropriateness of the performance criteria presented by entities in the PBS, corporate plans and performance statements is critical to fulfilling the transparency and meaningfulness aims of the framework. While criteria that set a minimum standard for the quality of performance information are not defined in the PGPA Act, the Department of Finance has provided guidance to entities on the characteristics of ‘good’ performance information — relevant, reliable and complete.24 In the absence of formal criteria in the PGPA Act, the ANAO drew on the Department of Finance’s guidance, and other relevant reference points, to develop audit criteria for assessing the appropriateness of performance information. These criteria are used consistently in ANAO audits and were applied in this audit.

4.4 Two of the characteristics of appropriate performance criteria are relevance and reliability. Table 4.1 outlines how ‘relevant’ and ‘reliable’ are described in Department of Finance guidance and how the ANAO assesses these elements. For reviews of activities that do not relate to an entire entity, the ANAO examines the adequacy of performance measures within the activity area rather than their completeness (discussed from paragraph 4.9).

Table 4.1: Assessing relevance and reliability

 

Characteristic | RMG 131 | ANAO criteria
Relevant | Performance information should clearly state who benefits and how they benefit from the entity’s activities. | Benefit: clearly indicates who will benefit and how from the entity’s activities. Focus: should address a significant aspect/s of the purpose. Understandable: should provide sufficient information in a clear and concise manner.
Reliable | Performance information should use information sources and methodologies that are fit-for-purpose and verifiable. | Measurable: performance indicator should use information sources and methodologies that are fit for purpose. Free from bias: free from bias and, where possible, benchmarked against similar activities.

Source: Department of Finance, Resource Management Guide No. 131 Developing good performance information and related Quick Reference Guide.

4.5 The ARC Corporate Plan 2018–19 lists 20 KPIs linked to the NCGP and the ARC’s administration of the program. Of the 20 KPIs, the ANAO assessed 17 as being relevant. For the remaining three indicators, there was insufficient information to enable the performance indicator to be easily understood. Seven performance indicators were assessed as being reliable, eight as partly reliable and five indicators were assessed as being not reliable due to insufficiently described measurement methods or insufficiently defined targets.

4.6 A reader’s understanding of the ARC’s expected performance for the KPIs assessed as not being fully reliable would be enhanced by including in the corporate plan an explanation of the nature of funding cycles, other possible one-off factors and a benchmark or baseline for comparative purposes. A benchmark was included in the ARC’s Corporate Plan 2017–18 for 14 of the 23 KPIs but was not included in the Corporate Plan 2018–19. An entity’s corporate plan is designed for external accountability. Where baselines are published and the methodology underpinning the measures is clear, a reader can form a consistent expectation about performance and whether an entity is achieving its purpose.

4.7 Nine of the 20 KPIs do not have targets. The use of targets helps define performance and establishes expectations around an entity’s performance. The ARC’s corporate plan states that some activity and output metrics do not have targets because they are demand-driven. For example, the ‘number of proposals submitted to the ARC for funding’ is not known in advance and can vary from year to year depending on several factors, such as the amount of funds available and the availability of researchers. While the ARC regards the nine measures as demand-driven, targets would assist readers to gain a more complete understanding of the breadth of the ARC’s activity and desired levels of performance, and enable the ARC to identify emerging trends and potential issues.

4.8 The ARC considers that targets are not appropriate for a number of objectives under the NCGP. In its view, the ARC enables the achievement of policy objectives (for example, international collaboration and research training) by providing appropriate support and including these elements in selection criteria but does not dictate the extent to which research projects involve these elements. As a result of the ANAO’s audit, the ARC has indicated its intention to improve the clarity of some of its performance measures.

4.9 The ANAO also examined the adequacy of the KPIs. Performance indicators are considered to be adequate where they:

  • collectively address the purpose of the activities identified in the corporate plan — in this case, Key Activity 1: Funding the highest quality research; and
  • provide a balance between: effectiveness and efficiency indicators, quantitative and qualitative data, and short, medium and long-term performance.

4.10 To determine whether the ARC’s KPIs are adequate, the ANAO compared the intended outcomes for Key Activity 1 with the related performance indicators. This analysis found that 14 of the 16 strategies and actions aligned with the NCGP performance indicators. The exceptions were: strategy 1.8 ‘Cultivate a system-wide culture of research integrity’, which was not covered by any KPI; and strategy 1.7 ‘Maintain efficient and effective post-award management of ARC-funded grants (monitoring progress, conducting reviews and overseeing reporting requirements)’, for which post-award processes were not fully covered by the published KPIs, with only the achievement of ARC-funded research project objectives measured. The word ‘efficient’ was added to strategy 1.7 in the 2018–19 corporate plan. The ARC’s corporate plan does not include any efficiency indicators for the NCGP.

4.11 The ANAO also found that the indicators are sufficiently balanced, including that:

  • 13 performance indicators were direct measures of effectiveness, while the remaining seven indicators measured either inputs or outputs; and
  • one performance indicator was qualitative, with the remaining 19 indicators quantitative.

The ANAO noted that the ARC presents these performance measures as an appendix to its corporate plans, rather than as an integral component of the plan. The ANAO also found that the ARC’s reporting against its performance statement in the ARC Annual Report 2017–18 is accurate, except for one KPI. The performance statement says that the target for ‘Proportion of ARC-funded research projects that meet their objectives’ was achieved in 2017–18. However, the ARC used data on final reports yet to be approved by the ARC when determining this result. The ARC advised the ANAO that reporting against this performance indicator was impacted by delays in the processing of final reports submitted by grant recipients.

Recommendation no.2

4.12 The Australian Research Council ensure that its KPIs for the NCGP are reliable and include measures of efficiency.

Australian Research Council response: Agreed.

4.13 The ARC will review its KPIs to ensure they are reliable in accordance with the Resource Management Guidelines issued by the Department of Finance, and look at efficiency targets where appropriate. The ARC has commenced an assessment of KPIs from other similar granting agencies both within Australia and internationally. Preliminary review has indicated that similar granting agencies, which fund long-term research, have found it difficult to develop efficiency targets. Internationally there appears to be no single indicator or evaluation method that adequately captures the results of research and development. The ARC is happy to work with the ANAO to develop efficiency targets for its KPIs.

Does the ARC provide appropriate internal reporting about the NCGP?

Internal reporting about the NCGP focuses on point-in-time data and current status but does not include historical or contextual information. Including this type of information in internal management reporting would improve the ARC’s capacity to effectively monitor and administer the NCGP.

4.14 Putting in place appropriate internal reporting processes assists management to effectively oversee grant processes and make informed decisions about the administration of a grants program, including associated risks and the performance of an entity’s operations.

4.15 An internal audit report on post–award grant management (June 2018) noted that ‘there is no systematic reporting undertaken by the Post-Award Team regarding its management of post-award grants’.25 The ARC advised that it does not currently produce reports on a regular basis to monitor the reasons for variations to grant agreements, but produces reports on an ad hoc basis. The ARC also advised that the Research Management System (RMS) did not have the full reporting functionality to enable its post-award team to do business intelligence reporting but that a new business intelligence tool has recently been developed to address this.

4.16 Since June 2018 the Program, Strategy and Executive Committee has been in receipt of a fortnightly dashboard report on the status of high level activities from each NCGP-linked team. The fortnightly reports have progressively been enhanced and now present key program statistics for Major Investments and post-award processes, including the number of:

  • final reports due each month and overdue reports for Major Investments;
  • final reports assessed by the ARC for Major Investments;
  • final reports closed in RMS for Major Investments;
  • event briefs that are requested from parliamentary;
  • variations submitted;
  • variations approved for the calendar year;
  • final reports submitted to the ARC;
  • final reports overdue; and
  • final reports approved for the calendar year.26

4.17 The report records actions, current status and key dates by the Major Investments and post-award teams and also some specific activities within the NCGP Discovery and Linkage programs.

4.18 The ANAO reviewed a selection of fortnightly reports provided to the committee and noted that they focus primarily on point-in-time performance and report current status (in progress, progressing well or delayed). While these dashboard reports include key NCGP statistics in tabular and graphic form, they do not include historical or trend data to contextualise current performance. For example, statistics for grant variations present the number of variations submitted and the total number approved for the calendar year to date. As this is the ARC’s only internal NCGP reporting mechanism, the absence of historical data makes it difficult to determine whether outcomes are as expected or indicate an issue that warrants management attention.

4.19 There is also no information provided in these reports on the reasons for variations. Capturing this information might enable ARC management to implement processes to reduce the number of variations submitted.27 Further, the number of final reports that are ‘overdue for submission to the ARC’ is captured in the dashboard report but no reasons are provided for the delays. Also, the report does not contain details about the time taken to review and approve final reports, an issue noted in paragraphs 4.23 to 4.28. Monitoring the time to approve final reports would allow the ARC to identify and address potential issues with report approval in a timely manner.28

4.20 In response to the ANAO’s data requests, the ARC developed a suite of reporting tools. The ARC informed the ANAO that it will use the newly developed reporting tools to improve the quality of information it considers when managing the NCGP.

Have appropriate monitoring and assurance arrangements for the NCGP been established?

The ARC has appropriate monitoring and assurance arrangements for the NCGP. However, implementation could be improved by ensuring that the arrangements are more risk-based and effectively contribute to the ARC’s assurance that administering organisations comply with grant agreement requirements and that NCGP risks are being managed appropriately.

4.21 The ARC has in place a number of monitoring and assurance mechanisms for the NCGP, including:

  • reporting from grant recipients and grant variation processes;
  • institutional reviews;
  • Major Investments reviews;
  • research integrity activities;
  • NCGP Grants Management Assurance Framework; and
  • evaluation activities (discussed in the next section).

Reporting

4.22 The ARC has in place a variety of reporting requirements that assist it to monitor compliance by administering organisations with grant agreements. These include:

  • End-of-year reports — assessed to ensure, for example, that: funds have been spent each year in accordance with the salary rates and/or any variations submitted to the ARC; justifications for any changes are provided; and funds transfers into the following financial year are within the allowable timeframe. The ARC will not accept an end-of-year report until the necessary action is followed up by the institution or a variation is submitted. End-of-year reporting is due by 31 March of the following year.
  • Progress reports — by exception. These collect information regarding any significant issues affecting the progress of a project during the calendar year.
  • Progress reports — Major Investments. All Major Investment projects are required to submit progress and annual reports, as specified in their grant agreements.29
  • Progress Reports — Australian Laureate Fellowships and Linkage Infrastructure, Equipment and Facilities. Each of these projects, funded for four to five years, must submit a progress report in year three of funding.
  • Final Reports — required to be completed for each NCGP-funded project and must be submitted within 12 months of the project end date. These reports are to be reviewed by ARC staff to ensure that project outcomes and outputs are achieved and to verify if there were any issues that were not reported previously. The process also provides an opportunity to check that all variations have been approved by the ARC, any serious issues are reported and the project has been satisfactorily completed in accordance with the grant agreement.

4.23 Between 2016 and end-May 2019, the ARC received 26,279 end-of-year reports and 5,352 final reports.

4.24 In September 2018 the ARC established a taskforce to focus on processing a backlog of final reports. A streamlined assessment process was also set up to assess final report forms. At that time there were approximately 2,640 final reports waiting to be processed by the ARC. The oldest reports dated from early 2016. The taskforce recommended, and the ARC agreed, that a risk-based approach be taken to processing final reports going forward. That is, high value grants (over $600,000) should receive the highest level of scrutiny, based on a risk matrix for the processing of final reports.

4.25 Analysis undertaken by the ANAO on the quantum and processing times for all final reports received by the ARC found that the average processing time for ARC review peaked at 464 days in 2016 and had fallen to 137 days in 2018.30 From 2016 to 2018, the number of final reports received remained largely the same (see Figure 4.1). The ANAO’s financial statements audit did not identify any material issues in the sample of final reports tested. In this audit, the ANAO did not form a conclusion about final report compliance as the grants are still active and final reports are not yet required.

Figure 4.1: Number of final reports received and ARC processing time, 2016 to 2018

 

Note: The year refers to the year in which the final report was received. Processing time is the cumulative time that the final report is with the ARC for assessment.

Source: ANAO analysis of ARC data.
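The processing-time analysis in paragraph 4.25 and Figure 4.1 can be reproduced with straightforward grouping and averaging. The sketch below assumes a simplified data extract in which each final report carries its received date and the cumulative number of days it was with the ARC for assessment; the sample values are illustrative only and chosen to mirror the averages reported above.

```python
# Minimal reproduction of the grouping-and-averaging step behind Figure 4.1,
# using illustrative sample values chosen to match the averages quoted above.
from collections import defaultdict
from datetime import date
from statistics import mean


def average_processing_days(reports: list) -> dict:
    """Group final reports by the year received and return the average ARC
    processing time (days) per year. Each report is a (received_date, days) pair."""
    by_year = defaultdict(list)
    for received, processing_days in reports:
        by_year[received.year].append(processing_days)
    return {year: round(mean(days), 1) for year, days in sorted(by_year.items())}


if __name__ == "__main__":
    sample = [(date(2016, 3, 1), 470), (date(2016, 9, 12), 458),
              (date(2018, 2, 5), 140), (date(2018, 7, 20), 134)]
    print(average_processing_days(sample))  # {2016: 464.0, 2018: 137.0}
```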

4.26 Analysis undertaken by the ANAO found that the proportion of end-of-year reports received late from administering organisations increased from 28 per cent (2,024 reports) in the 2015 funding year to 71 per cent (4,936 reports) in 2016 and then decreased to 41 per cent (2,600 reports) in 2017. As indicated in Figure 4.2, the lateness of end-of-year reports received averaged 22.7 days from 2015 to 2017.

Figure 4.2: End-of-year report status, 2016 to 2019

 

Note: The year refers to the funding year that is the subject of the end-of-year report. The data for 2018 is for reports that were received in the five months to 31 May 2019.

Source: ANAO analysis of ARC data.

4.27 Stakeholder feedback to the ANAO on the ARC’s monitoring mechanisms included that: requests for extensions submitted in end-of-year reports took too long to be processed by the ARC, leaving researchers in ‘limbo’; and it can take up to two years for the ARC to review and seek amendments to submitted final reports.31

4.28 By monitoring the submission and timeliness of its review of end-of-year and final reports, the ARC would be better placed to undertake actions to reduce the number of late end-of-year reports and improve the timeliness of its review and approval of final reports. This should result in improved efficiency and compliance with reporting requirements.

Variations

4.29 NCGP grant agreements specify requirements for project variations to be submitted to the ARC at any time during the conduct of a project where there are changes to the project, both financial and non-financial. Variations are assessed by ARC staff against the relevant grant agreement for the project and approved by the ARC delegate.

4.30 Stakeholder feedback to the ANAO on the ARC’s variation processes included that: variations take too long to process and are too onerous, with around a third of grants requiring variations, often involving moving immaterial amounts of money between line items; a one-line budget would circumvent many issues surrounding variations and end-of-project reporting; and universities must submit a variation request for any movement between line items, including a letter from the Deputy Vice-Chancellor (Research) and the Chief Investigator and agreement from every team member. This was considered to be onerous for what are often immaterial budget variations.32 Administering organisations also commented that it was not clear to them how the ARC assesses variation requests in the absence of a revised detailed budget.33

4.31 The ANAO’s analysis found that the number of variations submitted by administering organisations peaked at 1,841 in 2017, with the average processing time taken by the ARC decreasing from 43 days in 2017 to 16 days as at 31 May 2019. The ARC has identified a number of improvements to streamline the variations process and is in the process of implementing them. For example, in December 2018 the ARC reduced the number of attachments needed to submit a variation request.

Institutional reviews

4.32 The objective of institutional reviews is to determine whether an administering organisation is effectively managing NCGP-funded research projects to ensure:

  • they are operating in accordance with the terms of the grant agreements;
  • appropriate financial, records management and project management systems are in place; and
  • research project activities correlate to those in the approved research proposal.

4.33 The ARC advised the ANAO that the existence of institutional reviews acts as a mechanism that deters institutions from non-compliance with grant funding requirements.

4.34 The reviews involve ARC staff visiting the selected institutions and examining documentation relating to a sample of grants funded by the ARC. Prior to the site visits, the selected institutions are required to provide checklists, guidelines, procedures and evidence of monitoring processes that demonstrate that their research projects are operating in accordance with the terms of the grant agreement and are compliant with a range of ARC policies.

4.35 The ARC completed two institutional reviews each year from 2011–12 to 2015–16 but did not complete any reviews in 2016–17. Three institutional reviews were completed in 2017–18 and two were scheduled in 2018–19.

4.36 In selecting the 2017–18 institutional review candidates from the group of 43 eligible organisations, four criteria were considered: institutions identified on the ARC’s Fraud Incidents Register; institutions that had previously been reviewed as part of an ARC monitoring activity; business intelligence shared between ARC teams; and correspondence received from institutions that indicate that a review may be beneficial. Advice to the CEO proposed that five institutions be considered for review, with the CEO requested to approve three of the five candidates. The request did not include any analysis or rationale for the selection, apart from a list of previous institutional reviews conducted from 2011–12 to 2015–16. One of the institutions selected was reviewed in 2012–13. In the absence of any rationale being provided to the CEO, it is unclear why this institution was selected in 2017–18 given that the conduct of previous reviews was one of the four criteria used in selecting the institutions for review.

4.37 Findings from the three institutional reviews conducted in 2017–18 included: a number of projects that commenced before necessary documents had been finalised; and expenditure on some items that were not allowed and were therefore in breach of the grant agreement. For one review, the review team was not able to source requested documents.

4.38 In selecting the 2018–19 institutional review candidates, the process was more robust. Twelve criteria were used to select two institutions for review, with a detailed assessment against the criteria included in advice to the CEO.34 These reviews had not been finalised at the time of the ANAO’s audit.

4.39 The ARC advised the ANAO that its approach to ensuring implementation of review recommendations is collaborative and based on negotiation with the institution. The ARC also advised the ANAO that if it becomes aware of non-compliance with grant agreements, it takes steps to correct the issue. While non-compliance can impact eligibility for future funding, the ARC advised that the findings of an institutional review have not been significant enough to have impacted on the eligibility of the institution to apply for future funding.

4.40 The ARC Audit Committee discussed institutional reviews in November 2018. The minutes of the meeting noted:

  • future institutional review reports should include the institution’s responses and asked that a process be established to monitor how requested actions have been responded to;
  • potential reputational risks to the ARC posed by the length of time universities had to respond to instances of non-compliance identified by institutional reviews;
  • the importance of training for ARC staff undertaking institutional reviews to ensure an awareness of audit principles and consistency;
  • whether ARC responses to instances of non-compliance were sufficiently strong; and
  • a suggestion that the ARC should determine appropriate points of escalation in the severity of the agency’s responses.

4.41 In May 2019 ARC’s Senior Management Group endorsed the inclusion of the compliance framework/reviews in its 2019–20 internal audit program.

4.42 University stakeholder feedback to the ANAO, although limited given the number of reviews undertaken, questioned their value. One concern was that the large amount of information provided to the ARC in advance of the site visit did not appear to have been reviewed before the visit. Another concern was that it took the ARC more than six months to provide a written report on review outcomes and the report provided little value to the university under review. Another stakeholder noted that while the process is fair, it focused on low level issues.

4.43 Overall, the institutional reviews:

  • have limited coverage — usually two reviews a year (from a pool of 43 administering organisations). Including the two undertaken in 2018–19, fourteen reviews will have been undertaken by the ARC since 2011–12, with one university subject to two reviews during this period;
  • are largely compliance-based and narrow in scope;
  • provide limited value-add given their narrow scope, lack of material findings and follow-up of recommendations by the ARC; and
  • are not well regarded by universities in terms of adding value or providing lessons learnt.

Major investment reviews

Industrial Transformation Research Hubs reviews

4.44 The ARC undertakes ad hoc reviews of Industrial Transformation Research Hubs (Hubs) to ensure that research progress is satisfactory, that the Hubs are meeting the ARC’s expectations in relation to operations and national focus, and to provide feedback to Hubs on their performance. The purpose of the reviews is to assess each Hub’s progress since establishment and to identify any key issues for the remainder of its funding period under its funding agreement.

4.45 In 2018–19 three Hubs were selected for review. One Hub was selected due to concerns about the large number of partnering organisations and its ability to manage and deliver on the Hub objectives. Another was selected due to significant issues over the past several years. The third Hub was selected at random and because the university had not previously been reviewed.

4.46 The reviews involved consideration of submission material and included a one-day site visit to each Hub by a two-member panel. The review reports all concluded that the Hubs had made good progress in meeting their objectives. The reviews contained eight, 10 and 14 broad-ranging recommendations, respectively, which related to: performance reporting and the effectiveness and efficiency of the Hub’s administrative, financial and operational performance; and a Hub’s performance against its specific objectives.

ARC Centres of Excellence reviews

4.47 ARC Centres of Excellence mid-term reviews consider whether the Centres are meeting the ARC’s expectations in relation to the scheme objectives, and the operations and research performance of the Centre and provide an avenue for independent feedback to Centres on their research performance. The 12 ARC Centres of Excellence funded in 2014 were reviewed in 2017–18 by three-member panels, with key Centre stakeholders invited to provide a submission to the ARC and comment on the review report when finalised. The same approach used for the Hub reviews was used in the Centre reviews: site visits and a review of written submissions from the administering organisation, the Centre and invited stakeholders.

4.48 The ANAO reviewed six of the 12 ARC Centres of Excellence review reports. All six reports noted that the Centres had strongly pursued their objectives to undertake innovative and potentially transformational research, and were achieving significant (two Centres), outstanding (three Centres) or excellent (one Centre) levels of success against many of the scheme objectives. On average, each report contained 11 recommendations specific to that Centre.

Research integrity activities

4.49 Institutions are responsible for investigating research integrity-related complaints and concerns within the context of a research integrity framework, supported by the ARC Research Integrity and Research Misconduct Policy (revised in June 2019). Compliance with the policy is mandated through grant guidelines and grant agreements.

4.50 Each quarter, reports are provided to ARC management about research integrity issues. The ARC receives notifications about potential integrity issues from its assurance activities, members of the public and administering organisations. The ARC Research Integrity Report for 1 January to 31 March 2019 noted that the ARC was monitoring 11 matters carried over from the previous quarter, had been notified of nine new potential research integrity matters and had resolved seven matters. As such, of the 20 research integrity matters (new and ongoing) considered in the period, 13 matters remained active as at 31 March 2019.
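The quarter’s figures reconcile arithmetically: 11 matters carried over plus nine new matters gives 20 considered, and seven resolutions leaves 13 active. A minimal sketch of that check, written in Python for illustration only (the function name is hypothetical and not part of any ARC system), is set out below.

  # Illustrative reconciliation of research integrity matters for a quarter,
  # using the figures reported for 1 January to 31 March 2019.
  def reconcile_matters(carried_over, new_matters, resolved):
      """Return (matters considered, matters still active) for the quarter."""
      considered = carried_over + new_matters
      active = considered - resolved
      return considered, active

  considered, active = reconcile_matters(carried_over=11, new_matters=9, resolved=7)
  assert (considered, active) == (20, 13)  # matches the figures in the quarterly report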

NCGP Grants Management Assurance Framework

4.51 In response to an internal audit report on post-award management, in February 2019 the ARC documented its strategies for reducing the risk of administering organisations not complying with NCGP obligations and developed a new NCGP Grants Management Assurance Framework.

4.52 The risks identified in the framework address: timely and efficient grant payments; financial management; budget, scope, personnel and organisational variations; participating/partner organisation defaults; conflicts of interest; reporting; and Major Investments. As at February 2019, two of the nine risks were rated as moderate and seven were rated as low. These program risks are distinct from the entity-wide risks that are captured in the ARC Strategic Risks Register 2018–19.

4.53 According to the new framework, the ARC will undertake an annual assessment to gain assurance that administering organisations are fulfilling their grant obligations under the Commonwealth Grants Rules and Guidelines.

Managing the results of monitoring and assurance activities

4.54 The ARC actions the outcomes of its monitoring and assurance activities in accordance with its procedures. For example, the ANAO observed the end-of-year report review process and the template-based review of a final report, and noted that both were undertaken in accordance with the relevant SOP and work instructions. The ANAO also observed the variations process undertaken by the ARC. Variations are reviewed by three ARC officers and then passed to the delegate for approval. The ARC advised the ANAO that this four-person review and approval process assists it to retain corporate knowledge and business intelligence across various levels.

4.55 The results and key learnings from the range of assurance mechanisms are shared within the ARC and with the university sector. As noted in paragraphs 2.40 to 2.43, the ARC also has a variety of strategies and activities that support its communications and engagement with stakeholders that often also have an educational function. The ANAO reviewed a number of ARC presentations to forums and meetings and noted that common issues arising from various post-award assurance activities were discussed.

Recommendation no.3

4.56 The Australian Research Council ensures that its monitoring and assurance activities, in particular institutional reviews, are risk-based and contribute to the Australian Research Council’s assurance that NCGP objectives are being achieved.

Australian Research Council response: Agreed.

4.57 The ARC will review its suite of monitoring and assurance activities to provide the necessary assurances that the NCGP objectives are being achieved. The ARC has already initiated this through commissioning an internal audit to review the NCGP compliance assurance framework using a risk-based approach. The review will look at the full suite of compliance and assurance mechanisms available to ensure that its monitoring and assurance framework is risk-based.

Does the ARC assess the effectiveness of the program and use the results to inform program improvements?

The ARC assesses the effectiveness of the NCGP through various activities, including evaluations, and has used the results of these activities to inform program improvements. In 2018 the ARC established a new evaluation strategy and plan, which resulted in two evaluations in 2018–19.

Evaluation

4.58 Prior to the establishment of a formal evaluation strategy and plan in 2018, improvements to the NCGP were informed by the review and assurance processes described in the previous section and a small number of evaluations. The latter included evaluations of Linkage Projects (2011), Future Fellowships Funding Scheme (2013) and ARC support for Indigenous researchers and Indigenous research (2017).

4.59 In mid-2018 the ARC developed an Evaluation Strategy and a Strategic Evaluation Plan 2019–2023. The purpose of the strategy is to guide and strengthen the ARC’s NCGP evaluation activity and to reinforce evaluative thinking within ARC policy and program processes. The strategy notes that, to be most effective, evaluation should be built into program design and undertaken as an integral component of program management, and that monitoring and review processes embedded in ARC policy and program management will help to measure program performance. According to the strategy, evaluation topics may include:

  • effectiveness in achieving individual funding scheme objectives and delivering intended outputs, outcomes and benefits;
  • effectiveness of ARC programs and policies in addressing specific thematic issues or whole-of-government priorities;
  • alignment of the NCGP with other research funding programs;
  • efficiency of funding scheme administration; and
  • the appropriateness and effectiveness of ARC policies on research matters.

4.60 The evaluation plan outlines a program of evaluation activity over four years from 2019, including the rationale for topic selection. The plan is intended to be reviewed annually to maintain its responsiveness to emerging ARC policy and program priorities, including in NCGP administration and implementation of broader government objectives.

4.61 Two evaluations were undertaken in 2018–19: implementation of the continuous Linkage Projects process35; and Industrial Transformation Research Program process and priorities. Evaluation methods used in the two evaluations included analysis of stakeholder surveys and interviews, internal program management data from RMS, and information from the ARC’s regular outreach activity, surveys, reviews, program/project reporting and assurance activities. Three evaluations are proposed for 2019–20.

4.62 The evaluation of the implementation of the continuous Linkage Projects process has been completed, with the report making five recommendations to improve implementation effectiveness and efficiency, for applicants and the ARC. The Industrial Transformation Research Program evaluation report was finalised in June 2019 and made four recommendations to improve the implementation of the program.

Administrative streamlining

4.63 On 10 September 2018 the ARC agreed to establish a Streamlining NCGP Working Group. The group’s terms of reference, revised in February 2019, state that its purpose is to coordinate the conduct of activities across the agency aimed at streamlining NCGP grant administration processes. These activities have included: work to progress the ARC’s July 2018 commitment to administering organisations to streamline various administrative processes; analysis of the responses to the Parliamentary inquiry that relate to NCGP administration36; and consultation with the College of Experts to collect feedback on the application form and the information essential to assessing the quality of an application.

4.64 The terms of reference noted that the working group would review grant administration processes across the grant lifecycle, including grant guidelines, application forms, the assessment process and post-award activities. The project aims to improve the efficiency and effectiveness of NCGP grants administration by simplifying or eliminating unnecessary activities for applicants, improving consistency across documentation and processes including application assessment and improving clarity around how and why processes are undertaken and information is sought. The project will use information collected from multiple sources, including a College of Experts survey conducted in 2018 and a meeting with the College in April 2019, and a workshop with members of the Australasian Research Managers Society held in November 2018.

4.65 The Working Group has identified a range of streamlining opportunities, including in grant guidelines, applications, assessment, agreement variations, progress reports, final reports and SAC processes. The working group’s project plan outlines timelines for a first tranche of activities, which commenced in February 2019, to be completed by June 2019. Although a second tranche of activities was proposed for July to December 2019, the ARC advised that the Working Group met on 5 July and agreed that additional streamlining opportunities were now best addressed by line areas as part of normal business.

2017 benchmarking activity

4.66 In May 2017 the ARC wrote to 19 international funding agencies seeking information on their approaches to application and assessment processes. In particular, the ARC sought input about the implementation of continuous application processes, as well as peer review and assessment processes more generally. ARC analysis of the ten responses received indicated that the ARC’s application and assessment processes are similar to those used internationally.

Improving the NCGP

4.67 Based on its review, evaluation and streamlining activities, the ARC has implemented a number of changes aimed at improving the NCGP and its administration. For example, the ARC has:

  • adopted the whole-of-government grant guidelines template;
  • standardised grant guidelines and grant agreements;
  • applied common provisions across ARC grant schemes;
  • streamlined the process for eligibility checks for researchers;
  • streamlined final report forms for all grant schemes with the auto-population of RMS information and outputs;
  • streamlined requirements relating to variations to grant agreements; and
  • enhanced the RMS to allow research officers visibility of overdue final reports.

Appendices

Appendix 1 Entity response

 

ARC response letter

 

Appendix 2 NCGP grants by institution, 2016 to 2018

Administering organisation | Number of grants funded | Grant funding ($000)
The University of New South Wales | 386 | 264,289
The University of Queensland | 378 | 230,466
The University of Melbourne | 381 | 222,754
Monash University | 335 | 207,307
The Australian National University | 272 | 172,959
The University of Sydney | 273 | 134,398
University of Wollongong | 93 | 73,532
The University of Western Australia | 149 | 72,741
The University of Adelaide | 139 | 67,074
Swinburne University of Technology | 48 | 58,694
Queensland University of Technology | 95 | 48,666
Curtin University | 91 | 45,810
Deakin University | 74 | 45,167
Macquarie University | 97 | 43,155
Griffith University | 80 | 40,304
University of Technology Sydney | 96 | 36,126
RMIT University | 72 | 33,340
The University of Newcastle | 69 | 31,731
University of Tasmania | 65 | 31,253
La Trobe University | 37 | 19,496
University of South Australia | 40 | 17,297
Western Sydney University | 42 | 16,353
Flinders University | 40 | 15,808
Australian Catholic University | 24 | 9,733
James Cook University | 16 | 7,129
Murdoch University | 14 | 6,624
University of the Sunshine Coast | 12 | 5,204
Southern Cross University | 11 | 4,929
University of Canberra | 10 | 4,810
University of New England | 11 | 4,466
Victoria University | 8 | 2,840
Charles Darwin University | 5 | 2,519
Charles Sturt University | 7 | 2,178
Edith Cowan University | 5 | 1,637
Central Queensland University | 5 | 1,562
University of Southern Queensland | 3 | 968
Federation University Australia | 2 | 561
The University of Notre Dame Australia | 1 | 439
Australian Council of Learned Academies | 2 | 418
Australian Academy of Technology and Engineering | 1 | 345
Australian Academy of Science | 2 | 328
Australian Academy of the Humanities | 1 | 306
Academy of the Social Sciences in Australia | 1 | 170
Total | 3,493 | 1,985,886

Source: ARC data.
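As an indicative use of the table, the short Python sketch below calculates the share of total 2016 to 2018 NCGP funding awarded to the six largest recipients. The values are taken directly from the table above; the calculation is illustrative only and is not part of the ARC’s or the ANAO’s analysis.

  # Share of 2016-2018 NCGP funding ($'000) awarded to the six largest recipients,
  # using values from the table above.
  top_six = [264_289, 230_466, 222_754, 207_307, 172_959, 134_398]
  total_funding = 1_985_886
  share = sum(top_six) / total_funding
  print(f"Top six institutions received {share:.1%} of NCGP funding")  # approximately 62%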

Footnotes

1 Department of Finance, Commonwealth Grants Rules and Guidelines, 2014 and 2017 editions.

2 Government responses to House of Representatives committee reports are required within six months of tabling.

3 The ARC uses the term grant guidelines and, previously, funding rules to describe the guidelines for the programs/schemes. In this report, the ANAO uses guidelines.

4 Department of Finance. Grant guidelines – Better practice checklist, https://www.finance.gov.au/sites/default/files/Checklist.pdf [accessed 4 April 2019].

5 A proposal is an application for grant funding.

6 Department of the Prime Minister and Cabinet, National Innovation and Science Agenda Report, 2015.

8 ARC, Funding Rules for schemes under the Discovery Program (2017 Edition), p. 10.

9 ARC, Funding Rules for schemes under the Linkage Program (2017 Edition), p. 12.

10 The Industrial Transformation Research Hubs and Industrial Transformation Training Centres include the Industrial Transformation Priorities instead of the Science and Research Priorities. Current Industrial Transformation Priorities include: Advanced Manufacturing; Cyber Security; Food and Agribusiness; Medical Technologies and Pharmaceuticals; Mining Equipment, Technology and Services; and Oil, Gas and Energy Resources.

11 The 42 excludes Learned Academies, which are eligible organisations for Learned Academies Special Projects and Supporting Responses to Commonwealth Science Council Priorities.

12 The NCGP guidelines use the terms ‘investigator’ and ‘candidate’. Chief investigators and partner investigators are participant researchers on most schemes, except for fellowship and laureate schemes where researchers are referred to as candidates. To ease readability, the term ‘researcher’ is used in this report unless specificity is required.

13 Guidelines for ARC Centres of Excellence, Special Research Initiatives and Supporting Responses to Commonwealth Science Council Priorities projects are issued separately and have different scheme-specific eligibility criteria.

14 The role of the NCGP Eligibility Committee is to review and provide recommendations on eligibility in cases where eligibility is unclear.

15 Under the National Interest Test, the ‘benefit and impact statement’ text box on ARC application forms has been replaced with a ‘national interest test statement’ text box. Under the test, applicants are asked to explain ‘… the extent to which the research contributes to Australia’s national interest through its potential to have economic, commercial, environmental, social or cultural benefits to the Australian community.’

16 The RMS is a bespoke research management platform built, and progressively modified, by the ARC.

17 The ARC also records its outreach and engagement activities in its IT system.

18 Quick Response (QR) codes are machine-readable barcodes containing data for a locator, identifier or tracker that points to a website or application.

19 This excludes ARC Helpdesk phone and email enquiries related to the RMS.

20 The ARC provides an assessor handbook and an ARC Centres of Excellence assessor handbook for General and Detailed Assessors.

21 ARC documents note that the College of Experts comprises a wide range of experienced and highly qualified people of international standing, drawn from academia, industry and public sector research organisations. Members are appointed for up to three years and are announced annually following a nomination and shortlisting process. As at 10 May 2019, 194 College of Experts members were contracted to the ARC.

22 Uncertainty band refers to proposals being ranked within a defined range above and below the notional funding line. The number of proposals in the uncertainty band will vary depending on the size of the scheme. The notional funding line is the estimated point in the ranked list of proposals where scheme funding would be completely allocated.

23 Career age is calculated as years since the completion of a PhD.

24 Department of Finance, Quick Reference Guide – RMG 131 Developing good performance information, September 2016.

25 Australian Research Council, Internal Audit of Post-award Grant Management, 2017/18 Internal Audit, 28 June 2018, p. 8.

26 Reporting is discussed in more detail in paragraphs 4.22 to 4.28.

27 From 2016 to 31 May 2019, 4,889 variations had been submitted by administering organisations for approval.

28 Final reports are required to be completed for each NCGP-funded project and must be submitted within 12 months of the project end date. The final report is used to determine whether the project has been satisfactorily completed.

29 NCGP Major Investments are: ARC Centres of Excellence; Special Research Initiatives; the Industrial Transformation Research Program (which includes Industrial Transformation Research Hubs and Industrial Transformation Training Centres); and Linkage Learned Academies Special Projects.

30 The number of days is counted from initial submission of final reports by universities to the ARC and includes time where these reports are resubmitted back to universities for editing and further response. As at 31 May 2019 the average processing time for the ARC’s review of final reports was 26.3 days.

31 The ANAO received ten submissions; eight commented on ARC’s reporting requirements.

32 Four of the ten submissions referred to the variations process.

33 Following grant approval, administering organisations are not required to provide to the ARC revised budgets detailing how the funding will be distributed.

34 The 12 criteria included the four criteria noted in paragraph 4.37 and: value of ARC grant funding to the institution; number and type of grants; institutions identified through the ARC’s research integrity and misconduct dealings; notifications received from the Department of Education and Training as a result of its institutional financial statements reviews; and the cohort of institutions being reviewed by ARC and other government departments on an annual basis.

35 Since 2016 Linkage Projects applications can be submitted at any time, with funding outcomes to be recommended to the Minister within six months of application submission.

36 House of Representatives Standing Committee on Employment, Education and Training, Australian Government Funding Arrangements for non-NHMRC Research, November 2018.