The objective of the audit was to assess the effectiveness of the Department of Health and Ageing's support for improved access to integrated GP and primary healthcare services through its administration of the Primary Care Infrastructure Grants (PCIG) program.

Summary

Introduction

1. The $117 million Primary Care Infrastructure Grants (PCIG) program was one of a number of measures announced by the Australian Government in the 2010–11 Budget intended to reform the primary healthcare sector1 as a means of addressing the long-term financial sustainability of the Australian health system.

2. The decision to reform primary care was informed by the National Primary Health Care Strategy2, which was released with the 2010–11 Budget. The report supporting the strategy acknowledged that ‘strengthening and improving the way in which primary healthcare is provided is vital in determining how well the health system responds to current and emerging pressures’.3 The report also highlighted the benefits of increasing the focus on primary healthcare within the overall health system, including:

  • reducing the incidence of chronic disease in the community, through preventative health measures;
  • minimising the number of hospital admissions and reducing their length by providing clinically appropriate care in the community; and
  • improving overall equity in healthcare by reducing disadvantage flowing from where people live, their ability to pay, or their particular health conditions.4

3. The strategy had noted that shortcomings in the existing physical infrastructure provided a barrier to the delivery of better primary healthcare. A particular shortcoming it identified was a lack of available space in many general practices, which was seen as inhibiting the delivery of a broader range of integrated services by general practitioners, practice nurses and other health professionals.

4. The Government funded the PCIG program over two competitive grant funding rounds in 2010 and 2011 to upgrade or extend the physical infrastructure of around 425 existing general practitioner (GP) facilities.5 In addition to funding extra or improved floor space at GP practices for the delivery of a broader range of services, the program was designed to boost the capacity of practices to accommodate increased levels of clinical training placements and training facilities, to support the future primary care workforce.

5. Grants for the two funding rounds were provided through three streams—A, B and C—with maximum grants for each stream being $150 000, $300 000 and $500 000 respectively.6 Organisations or individuals were required to be currently providing GP services at an existing facility to be potentially eligible for funding. A condition of the grants was that the health services and workforce training delivered through the relevant facility be maintained for a period of two to five years, depending on the funding stream.

6. The Department of Health and Ageing (DoHA) is responsible for administering the PCIG program. The department’s administrative responsibilities under the program include assessing applications and providing a merit-based, ranked list of applicants within each of the three funding streams for consideration by the Minister for Health and Ageing (Minister). Following the Minister’s policy approval7 of the quantum of funding for each stream—which had the effect of generating a shortlist of preferred applicants for each stream8—DoHA negotiated individual funding agreements with preferred applicants and, if these negotiations were successful, provided the necessary financial approvals to commit Commonwealth resources through the formal offer of grant funding to the applicants. Following execution of the relevant funding agreement, DoHA administered the awarded grant in line with the funding agreement. While the PCIG program involved two funding rounds, which have concluded, the administration of funding agreements will continue for some years.

7. As a competitive grants program, the PCIG is subject to the Commonwealth Grant Guidelines (CGGs).9 The CGGs state that the fundamental objective of grants administration is to ‘establish the means to efficiently, effectively and ethically administer Australian Government funding to approved recipients in accordance with government policy outcomes’.10 This objective is supported through the Australian Government’s financial management framework, which includes a requirement to make proper use of Commonwealth resources11, and the seven key principles for grants administration set out in the CGGs, which include the principle of achieving value with public money.12

Audit objective, criteria and scope

8. The objective of the audit was to assess the effectiveness of DoHA’s support for improved access to integrated GP and primary healthcare services through its administration of the PCIG program.

9. The audit examined DoHA’s performance in the establishment, initiation and administration of the program against relevant policy and legislative requirements for the expenditure of public money and the key principles contained in the CGGs.13 In this context, the audit examined whether DoHA had:

  • established and initiated the program so that it was fit for the purpose of supporting infrastructure grants initiatives intended to improve access to integrated GP and primary healthcare services;
  • appropriately assessed applications and provided appropriate advice to the Minister in relation to assessment outcomes and related matters;
  • effectively negotiated projects with shortlisted applicants, properly approved grants and promptly executed funding agreements;
  • effectively administered individual grants during the construction phase;
  • established a program evaluation strategy that incorporated a robust key performance indicator (KPI) reporting framework; and
  • incorporated major ‘lessons learned’ from the 2010 funding round in the 2011 round.

10. The audit examined DoHA’s assessment of applications and advice to the Minister on assessment outcomes for both the 2010 and 2011 rounds. Given that the primary evidence collection phase of the audit concluded in December 2011, the audit examined the funding agreement negotiation, financial approval, individual grant administration, and performance reporting processes for the 2010 round. However, recognising that DoHA made changes to the funding agreement negotiation process for the 2011 round, the ANAO examined whether these changes resulted in a reduction in the delays experienced in the 2010 round in finalising such negotiations and executing funding agreements.

Overall conclusion

11. The PCIG program forms part of the Australian Government’s reforms to the primary healthcare sector, which are intended to improve the sustainability of the healthcare system as a whole. By contributing $117 million towards the upgrade of around 425 primary healthcare facilities, the PCIG program aims to improve community access to integrated GP and other primary healthcare services as a means of reducing the incidence of chronic disease and minimising the use of more expensive hospital services.

12. DoHA is making steady progress in implementing the PCIG program in a manner consistent with the objectives set by the Government. The first round of PCIG grants, launched in 2010, has funded 214 facilities to date14 at a cost of $54.4 million15, while the second (and final) round launched in 2011 has funded 66 facilities at a cost of $19.1 million, with the potential to fund a total of over 190 facilities. Grant recipients have to date completed 72 projects.

13. In establishing the PCIG program, DoHA put in place many of the fundamentals for the effective administration of a competitive grants program, as set out in the Commonwealth Grant Guidelines (CGGs). The department facilitated the development of grant program guidelines within a very compressed timeframe16, which were approved by the Expenditure Review Committee of Cabinet and made publicly available, as then required by the CGGs.17 In the limited time available for planning the first funding round, DoHA also: undertook a risk assessment of the program; developed key performance indicators and incorporated some elements of an evaluation strategy; took measures to train program staff and introduced elements of quality assurance review for assessments; and drew on the knowledge of the more established GP Super Clinics program as well as other areas in the department with experience in administering infrastructure grants programs, including the internal Program Funding and Procurement Service.18

14. In launching and delivering the program, DoHA: adopted effective measures to raise awareness of the scheme to potential applicants; developed an appropriate approach to negotiating timelines, milestones and progress payments for grant funding agreements; and established a monitoring framework to administer grants from the execution of funding agreements to the completion of infrastructure construction works.

15. Furthermore, the department improved the overall effectiveness of program administration in the 2011 funding round, drawing on lessons learned from the 2010 round. Specifically, there were improvements in the assessment process, which resulted in a more consistent approach to scoring grant applications, and changes to the process for negotiating funding agreements, which expedited the execution of such agreements compared to the time taken in the 2010 round.

16. Notwithstanding these achievements, limitations were evident in the approach adopted by the department for assessing the value for public money19 offered by individual project proposals and the transparency of that approach. While the program guidelines set out the two selection criteria used to assess applications, they did not include any information about the relative weighting to be given to the criteria by the department, as these weightings were only finalised by DoHA on the day that applications for the 2010 round closed20, some two months after the 2010 guidelines were published.21 Moreover, the selection criteria were heavily weighted towards one of the key program objectives, the delivery of physical infrastructure, as compared to the other key program objective, improved access to new primary care services. While there is no requirement to inform applicants of the weightings to be given to criteria, applicants who are aware that differential weightings will be applied are better placed to shape their submissions accordingly.22

17. In the assessment process, a weighting of up to 64 per cent of the total available assessment score was assigned to projects that satisfied program objective 1—to upgrade or enhance existing facilities.23 In contrast, the next most heavily weighted element relating to program objective 2 (the potential to provide access to new services)24 was assigned a weighting of up to 26 per cent of the total available assessment score. This approach to weighting, which favoured physical infrastructure, was also evident in the department’s approach to program objective 5—the development of training facilities.25 While program objective 5 was allocated a relatively low weighting of nine per cent, projects satisfying this objective could potentially have their maximum funding increased by up to 66 per cent, from $300 000 to $500 000.26

18. The capacity of DoHA staff to assess value for money was also inhibited by an assessment process and program guidance that directed them to focus on the level of detail provided by applicants and on any ‘deficiencies’ in documentation27 (a compliance-driven approach), rather than to use that information to form a view on the relative value for public money offered by projects. Further, while the assessment process assigned 10 per cent of the final weighting to the ‘efficient and effective use of funds’, which are key financial framework requirements in assessing value for money and the proper use of Commonwealth resources, the questions asked of applicants to assess efficiency28 and effectiveness related mainly to aspects of effectiveness.29 The questions largely sought to elicit information from applicants about local health issues facing their practices, while the sole question that addressed efficiency focused on the relatively narrow issue of the availability of other sources of funding, and assigned a one per cent rating if applicants identified other funding sources.30

19. DoHA incorporated elements of a monitoring and evaluation framework in the course of developing the program, including the development of key performance indicators (KPIs) of project performance. However, the majority of KPIs lacked specificity and measurability. The department has advised that it is refining the existing KPIs for future reporting purposes. In this context, there would be benefit in DoHA considering the contribution that revised indicators could make to future evaluation activity, such as the extent to which they address the overall effectiveness of the program against its key objectives and outcomes (including improved access to integrated primary care services) and are not limited to the delivery of physical infrastructure. There would also be benefit in undertaking focused evaluation activity to assess program effectiveness, drawing on available information and reporting processes.

20. Recognising that no further PCIG funding rounds are proposed, the audit has made two recommendations designed to strengthen DoHA’s general administration of infrastructure grant programs, drawing on its experience in administering the PCIG program. The recommendations focus on: the explicit consideration and appropriate weighting of value for public money issues in grant assessment processes; and focused evaluation activity drawing on available information and reporting processes.

Key findings by chapter

Planning and Promoting the PCIG Program (Chapter 2)

21. Following the announcement of the program in the May 2010 Budget, DoHA was able to facilitate the development and approval of PCIG program guidelines by the Expenditure Review Committee of Cabinet for the 2010 round within very tight timeframes, in order to meet the Government’s requirement for an early opening for PCIG applications on 25 June 2010. However, as previously discussed, the 2010 program guidelines did not provide any information about the weightings that would apply to the selection criteria, since such weightings had not been decided at the time the guidelines were published. The weightings subsequently adopted by DoHA, which were not publicly disclosed in either round, heavily favoured the delivery of physical infrastructure over improved access to new primary care services.

22. Risk management was addressed as part of program planning, although some significant risks were not identified in the management plan. These included risks relating to tenure and recipients’ legal structures, which arose regularly in the subsequent administration of the program, often resulting in delays in the negotiation and execution of funding agreements in the 2010 round.

23. DoHA effectively promoted the PCIG program, and the activities used to attract applications were generally well targeted at the potential grant recipient population.

Assessing Grant Applications (Chapter 3)

24. DoHA faced challenges in assessing the relative value for public money of competing PCIG projects due to: the diversity in the type, scale and location of infrastructure; differences in the new services proposed; and differing clinical training placement and facilities scenarios. Beyond these contextual difficulties, however, limitations in the design and implementation of the assessment process inhibited the department’s capacity to assess value for money. As discussed previously, these limitations related to the weighting of selection criteria across the program outcomes, and the limited use made of information received from applications. The department’s assessment process and program guidance had a strong compliance orientation, encouraging assessors to focus on the level of detail provided by applicants and any ‘deficiencies’ in documentation, rather than making use of that information to form a view on the relative value for public money offered by projects.

25. Other shortcomings in DoHA’s processes and practices to assess the 2010 PCIG funding applications related to the consistency of assessments and the lack of a documented quality assurance review process. Based on an examination of a sample of 108 (from a total of 593) applications assessed by DoHA, the ANAO found inconsistent assessment practices in a significant proportion of the 2010 round assessments. While some of these inconsistencies were limited to matters such as assessment comments, the ANAO identified four examples in its sample of 2010 assessments where more consistent scoring would have affected the application’s shortlisting status, including in two cases where this would have moved applications from above to below the cut-off score for shortlisted applications.31

26. There was an improvement in the assessment of the 2011 PCIG applications, drawing on the lessons learned in the previous round. This was reflected in, among other things, a lower rate of inconsistencies being identified in the ANAO’s analysis of grant assessments from the 2011 round. Nonetheless, based on an examination of 105 of the 418 applications assessed by DoHA, the ANAO again identified four examples where more consistent scoring would have affected the application’s shortlisting status, including three cases where this would have moved applications from above to below the cut-off score for shortlisted applications. To help manage this risk, there would have been benefit in DoHA extending its quality assurance review of application assessments to focus on assessments close to the cut-off lines for each stream, and more fully documenting its review of assessments.

Negotiating, Approving and Executing Grants (Chapter 4)

27. DoHA’s advice to the Minister regarding PCIG assessment outcomes supported the Minister’s decision-making on the allocation of the pool of program funds between the three grant streams.32

28. DoHA’s processes and practices to negotiate the PCIG 2010 round funding agreements were generally sound, with its approach to negotiating timelines, milestones and progress payments for grant funding agreements reflecting the better practice principles set out in the CGGs.

29. However, in many cases there were substantial delays in finalising agreement negotiations and executing agreements, often arising from unanticipated issues, such as those relating to tenure and recipients’ legal structures. Based on data available as at December 2011, an average of around six months elapsed from the time of shortlisting the preferred applicants to executing the 2010 round funding agreements. Following changes made by DoHA in some of its negotiation processes for the 2011 round, the department has, as at April 2012, achieved a more rapid roll-out of executed funding agreements, with approximately double the number executed compared to the equivalent period in the 2010 round.

30. For the 2010 funding round, DoHA did not consistently comply with the mandatory grant reporting requirement to publish information on individual grants no later than seven working days after the funding agreement for the grant takes effect.33 DoHA advised that timeliness is likely to have improved following the introduction of new processes in September 2011 to simplify data entry and minimise the risk of not satisfying this mandatory requirement in the 2011 funding round.

Administering Grants to the Completion of Infrastructure Construction (Chapter 5)

31. DoHA established a monitoring framework to inform the administration of the PCIG program through to the completion of infrastructure works. Milestone progress reports provided by recipients, as required by the relevant funding agreements, have generally contained relevant and sufficiently detailed information to effectively monitor progress and are in some cases used to trigger progress payments. DoHA’s assessment of these milestone reports has been generally sound, although the department has adopted a ‘light touch’ approach to managing project delays and has tended to revise the existing funding agreements to accommodate delays rather than working more proactively with grant recipients to avoid or minimise delays. However, the ANAO’s examination of a sample of executed funding agreements over October and November 2011 indicated that such delays had not emerged as a significant problem across the program.34

32. DoHA’s administration of project financial reports has also been sound.35 That said, there were instances of omissions in, or delays in receiving, financial reports, which required follow-up by DoHA. This suggests that DoHA will need to carefully monitor such reporting in order to continue to provide assurance that Commonwealth funds have been used appropriately.

Developing Key Performance Indicators and Evaluating Program Performance (Chapter 6)

33. DoHA has incorporated elements of an evaluation framework into the development of the PCIG program. These elements included an articulation of the program objectives and outcomes, and associated KPIs. In addition, various reporting templates (including a project plan to assess proposed project-specific outcomes against program objectives, and a KPI template) were devised to assist with the evaluation of projects. However, no formal plan or strategy has yet been developed for the PCIG program setting out when and how the KPI reporting and other relevant information will be used to evaluate the overall performance of the program.

34. While the PCIG program KPIs developed by DoHA were generally relevant to the program objectives and were achievable, the majority of the program’s KPIs were not specific36 or measurable37, a shortcoming that had been noted in comments on the draft 2010 program guidelines provided to DoHA by the Department of Finance and Deregulation.

35. DoHA has provided recipients with templates to assist them to meet their reporting obligations. Based on a sample of 17 KPI reports available at the close of fieldwork, the ANAO observed that while many included reasonably detailed responses, the information did not generally facilitate the objective assessment of individual projects. As a result, aggregated KPI reporting was unlikely to provide quantitative information regarding the achievement of program outcomes.

36. DoHA advised that it is refining the existing KPIs for future reporting purposes. These refinements are to be ‘translated into the draft operational phase report templates that all applicants are required to report upon’. There would be merit in this revision taking into account the ‘SMART’ criteria38 and, as discussed previously, considering the contribution that revised indicators could make to future evaluation activity, such as the extent to which they address the overall effectiveness of the program against its key objectives and outcomes (including improved access to integrated primary care services) and are not limited to the delivery of physical infrastructure. There would also be benefit in undertaking focused evaluation activity to assess program effectiveness, drawing on available information and reporting processes.

DoHA response

The Department of Health and Ageing notes the audit report and agrees with the recommendations.

Footnotes

[1]   Primary healthcare is care provided by health professionals working in the community, as opposed to hospitals, institutions or specialist services. It is usually considered to include general practitioners, dentists and nurses working in private practices, community health services or Aboriginal Medical Services, allied health professionals (such as physiotherapists, dieticians and mental health counsellors) and pharmacists.

[2]   Department of Health and Ageing, Building a 21st Century Primary Health Care System: Australia’s First National Primary Health Care Strategy, Canberra, May 2010. The National Primary Health Care Strategy was developed from 2008 to 2010 to support future investment in, and reform of, the primary healthcare system.

[3]   Department of Health and Ageing, Report to Support Australia’s First National Primary Health Care Strategy, Canberra, 2009, p. 8.

[4]   ibid., p. 5.

[5]   The May 2010 Budget also contained a $245 million expansion of the existing GP Super Clinics program. That program was established in 2008 to provide multi-million dollar grants to establish large or medium-sized medical centres to provide ‘one-stop shop’ primary health services. Such centres could involve the construction of entirely new facilities and/or the refurbishment of existing facilities.

[6]   Projects funded under stream A were only required to satisfy the first two program objectives (those relating to upgrading or extending existing facilities, and providing access to new health services); those under stream B were required to satisfy the first two plus an additional two objectives (strengthening team-based approaches to care and providing extended opening hours) and those under stream C were required to satisfy all five program objectives (the first four plus increasing training facilities).

[7]   The term ‘policy approval’ is used to distinguish the Minister’s decisions regarding the allocation of funding between the grant streams ‘pools’ from that of a DoHA official providing financial approval of grant spending proposals under Regulation 9 of the Financial Management and Accountability Regulations 1997 (FMA Regulation 9).

[8]   Establishing an overall budget for each of the streams had the effect of drawing a line on the merit list for each stream, with projects above the line able to be funded, and projects below the line unable to be funded (unless a shortlisted project was ultimately not awarded a grant, thus releasing those funds for another grant).

[9]   Department of Finance and Deregulation, Commonwealth Grant Guidelines: Policies and Principles for Grants Administration, July 2009. The CGGs ‘establish the policy framework and articulate the Government’s expectations for all departments and agencies … subject to the Financial Management and Accountability Act 1997 (FMA Act) … and their officials, when performing duties in relation to grants administration’, paragraph 3.24.

[10]   ibid., p. 3.

[11]   ‘Proper use’ in this context means the ‘efficient, effective, economical and ethical use of Commonwealth resources that is not inconsistent with the policies of the Commonwealth’, as specified in section 44 of the FMA Act and FMA Regulation 9. Often, this is referred to as a ‘value for money’ test.

[12]   Department of Finance and Deregulation, op. cit., p. 30. The CGGs provide that the grants administration function itself should provide value for public money, as should the selection of grant recipients to deliver grant outcomes.

[13]   Where relevant, the audit also considered DoHA’s administration in the context of the ANAO Better Practice Guide—Implementing Better Practice Grants Administration, June 2010, Canberra.

[14]   As at 12 April 2012.

[15]   With another 14 funding agreements still under negotiation or awaiting formal approval.

[16]   While the program was announced in the May 2010 Budget, the Government required an early opening for PCIG applications on 25 June 2010.

[17]   On 27 September 2010, the Government decided that guidelines for new programs were to be submitted to the Expenditure Review Committee on a case-by-case basis.

[18]   The Program Funding and Procurement Service is a unit within DoHA established to contribute to program outcomes by advising program managers on best practice procurement, funding and contract management.

[19]   As mentioned in footnote 11, under the Commonwealth’s financial framework, the overall test for the ‘proper use’ of public money is the ‘efficient, effective, economical and ethical use of Commonwealth resources that is not inconsistent with the policies of the Commonwealth’, as specified in section 44 of the FMA Act and FMA Regulation 9. Often, this is referred to as a ‘value for money’ test. The CGGs provide that the objective of a grants appraisal process is to ‘select projects/activities that best represent value for public money in the context of the objectives and outcomes of the granting activity.’ Department of Finance and Deregulation, op. cit., p. 30.

[20]   Applications for the 2010 round closed on 20 August 2010.

[21]   Guidelines for the 2010 round were published on 25 June 2010.

[22]   The transparency, consistency and defensibility of the assessment process will be supported by the grant program guidelines making clear the extent, if any, to which nominated assessment criteria will be more heavily weighted (or favoured) in determining an application's overall assessment and, where relevant, relative ranking in comparison to competing applications: ANAO Better Practice Guide—Implementing Better Practice Grants Administration, June 2010, p. 66. DoHA advised that no information was contained in the 2010 Guidelines due to time constraints resulting from the Government’s decision to quickly launch the program. While the department did not face similar timing pressures in 2011, information on the weightings was not included in the 2011 guidelines.

[23]   The terms of objective 1 were: ‘to upgrade or extend existing facilities to provide space for additional general practitioners, nurses and allied health professionals and/or student[s] on clinical placements.’

[24]   The terms of objective 2 were: ‘to provide access to new services that meet local community health needs with a focus on preventative activities and better chronic disease management.’

[25]   The terms of objective 5 were: ‘to develop new, or enhance existing, clinical training facilities.’

[26]   Objective 5 was the single additional criterion required to raise the maximum funding amount potentially available from $300 000 (for projects satisfying stream B criteria) to $500 000 (for projects satisfying stream C criteria).

[27]   In the case of the capital works budget, while applicants were required to provide costings for a range of matters, the assessment guidance focused on the extent to which the budget contained ‘deficiencies’, with a maximum score of five assigned if the budget had ‘very minor or no deficiencies’ and a score of one out of five assigned if the budget had ‘major deficiencies/almost no detail’. This compliance-driven approach did not require assessors to form an opinion on value for public money based on a substantive assessment of the information provided by applicants.

[28]   Efficiency relates to maximising the ratio of outputs to inputs.

[29]   Effectiveness relates to the extent to which intended outcomes are achieved.

[30]   In contrast, the guidelines for the National Rural and Remote Health Infrastructure Program, also administered by DoHA, specified ‘demonstrated value for money’ as a selection criterion, weighted at 20 per cent of the total available assessment score. Assessors were required to consider whether the: project cost was effective relative to the gains to the community; outcomes of the project justified the funding investment; and the budget items had been fully costed and justified.

[31]   However, as a significant number of shortlisted projects were not ultimately awarded grants, projects that were just below the cut-off score were generally approached by DoHA to enter into negotiations for funding agreements.

[32]   A total of $64.5 million was available for the 2010 round, with the Minister deciding to allocate $14.9 million to stream A, $17.6 million to stream B, and $32 million to stream C. See also footnote 8.

[33]   Department of Finance and Deregulation, op. cit., clause 4.2. See also Finance Circular 2009/04.

[34]   The sample included 29 projects for which funding agreements had been executed. 

[35]   Nineteen financial reports were examined by the ANAO in reaching this finding. Some grant recipients had not received funding in the 2010–11 financial year, or construction had not proceeded to a stage to require progress reporting, and so they had not been required to submit any funding reports as of the close of audit fieldwork in early December 2011.

[36]   KPIs were not specific if they did not focus on results that can be attributed to the program.

[37]   KPIs were not measurable if they did not include quantifiable units that can be readily compared over time.

[38]   The ‘SMART’ criteria require KPIs to be: specific, measurable, achievable, relevant and timed.