The objective of this audit was to assess how effectively entities had developed and implemented appropriate KPIs to support stated program objectives.

Summary

Introduction

1. The Organisation for Economic Cooperation and Development (OECD) has observed that:

How government activities are measured matters. Citizens are entitled to understand how government works and how public revenues are used.[1]

2. Performance reporting regimes have been in place in many OECD countries, including Australia, since the mid-1980s. Over time, there has been a trend away from reporting on financial inputs towards models intended to provide a clearer picture of what governments achieve for their expenditure on inputs—in other words, the outcomes or impacts sought or expected by government.

3. Adequate performance information, particularly in relation to program effectiveness, allows managers to provide sound advice on the appropriateness, success, shortcomings and/or future directions of programs. This information also allows for informed decisions to be made on the allocation and use of program resources. Importantly, the Parliament's and the public's consideration of a program's performance, in relation to impact and cost effectiveness, relies heavily on reliable and appropriate performance information.

Outcomes and Outputs Framework: 1999–2000 to 2008–09

4. In Australia, an Outcomes and Outputs Framework was introduced as part of the 1999–2000 Commonwealth Budget process. Entities[2] were required to specify intended outcomes, and to measure and report on actual performance. The outcomes were the results or impacts of government policy measures on the Australian community. The Portfolio Budget Statements (PBSs) were required to include entities’ outcomes by output group, specifying the indicators and targets used to assess and monitor performance.

5. In 2006–07, the Australian National Audit Office (ANAO) examined the application of the Outcomes and Outputs Framework by government entities.[3] The audit concluded that the development of a comprehensive, relevant and informative regime of performance indicators, including cost-effective systems and processes to capture, monitor and report complete, accurate and relevant entity performance, continued to be challenging for many entities. In particular, the audit identified that many performance indicators did not enable an assessment to be made as to whether desired results were achieved, as the indicators did not incorporate targets, benchmarks or other details of the extent of achievement expected.[4]

6. Parliamentary committees also identified that a sustained effort and commitment by all entities was required to ensure a relevant, informative and useful range of performance indicators that can be tracked over time. In 2002 the Joint Committee of Public Accounts and Audit (JCPAA) made the following comment in regard to the aggregation of information as presented in entities’ PBSs:

The Committee considers that high levels of aggregation in some agency outputs is a major problem making it difficult for the Parliament and the community to track the level of funding of particular organisations and what they are doing with those funds.[5]

7. More recently, the JCPAA noted that: ‘Measuring key aspects of an agency’s performance is a critical part of the Government’s Outcomes Framework’.[6]

Outcomes and Programs Framework: from 2009–10

8. Beginning in 2009–10, all entities in the General Government Sector (GGS) were required to report in accordance with the Outcomes and Programs Framework. This resulted in a number of revisions being made to the budget and reporting framework. A central element of the new framework is that entities are required to identify and report against the programs that contribute to government outcomes over the Budget and forward years, rather than the output groups that contributed to government outcomes.[7]

9. Programs are now the building blocks of government budgeting and reporting, and are expected to provide a tangible link between government decisions, activities and their impacts (that is, the actual outcomes).[8] The new focus is intended to improve entity reporting, clearly demonstrating entities’ achievement against pre-defined program objectives.

10. Government policy relating to the Outcomes and Programs Framework is delivered and evaluated by the Department of Finance and Deregulation (Finance), and requires that the outcome statements used by entities in their PBSs are the most current statements agreed to by the Minister for Finance and Deregulation (Finance Minister), and are consistent with relevant Appropriation Bills.[9]

11. As part of Finance's role in administering the Outcomes and Programs Framework, it has developed policy guidance to support entities' transition to the new Outcomes and Programs Framework, and to amend their PBSs' information. The guidance includes information on developing effectiveness KPIs to be included in an entity's PBSs.[10]

12. Current literature provides a range of approaches to the successful development, implementation and review of Key Performance Indicators (KPIs). Because effectiveness KPIs are statements of the pre-defined and expected impacts of a program, it is important that they are:

  • specific—so as to focus on those results that can be attributed to the particular intervention/program;
  • measurable—include quantifiable units or targets that can be readily compared over time;
  • achievable—realistic when compared with baseline performance and the resources to be made available;
  • relevant—embody a direct link between the program’s objective and the respective effectiveness KPI; and
  • timed—include specific timeframes for completion.

13. Collectively, these characteristics are commonly known as the SMART criteria.[11]
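As an illustration only, the checklist above can be expressed as a simple diagnostic of the kind an entity might apply when reviewing draft KPIs against the SMART criteria. The data structure, function name and example indicator below are hypothetical, and are not drawn from any entity's actual PBS:

```python
# Illustrative sketch: representing a KPI and checking it against the
# SMART criteria described above. All names and example data are
# hypothetical, not taken from any entity's published PBS.

from dataclasses import dataclass


@dataclass
class KPI:
    statement: str    # the published indicator text
    specific: bool    # attributable to the particular program
    measurable: bool  # has quantifiable units or a target
    achievable: bool  # realistic against baseline and resources
    relevant: bool    # directly linked to the program objective
    timed: bool       # has a timeframe for completion


def smart_gaps(kpi: KPI) -> list[str]:
    """Return the SMART criteria that a KPI fails to meet."""
    checks = {
        "specific": kpi.specific,
        "measurable": kpi.measurable,
        "achievable": kpi.achievable,
        "relevant": kpi.relevant,
        "timed": kpi.timed,
    }
    return [name for name, met in checks.items() if not met]


# A descriptive, unmeasurable indicator of the kind the audit identified:
example = KPI(
    statement="Improved community awareness of program services",
    specific=False, measurable=False, achievable=False,
    relevant=True, timed=False,
)
print(smart_gaps(example))
```

Such a checklist does not replace judgement—whether a KPI is "achievable", for instance, still requires comparison with baseline performance—but it makes explicit which criteria a draft indicator has not yet addressed.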

14. In advice to entities on developing KPIs, Finance has recommended that entities use both qualitative and quantitative KPIs to measure program performance and, if a program’s objectives are quantitative in nature, agencies are encouraged to consider the use of targets. To the extent that this advice promotes the use of quantitative approaches and targets, it aligns with the ‘measurable’ characteristic of the SMART criteria.

Audit objective and scope

15. The objective of this audit was to assess how effectively entities had developed and implemented appropriate KPIs to support stated program objectives. To address the audit objective, the ANAO:

  • undertook a desktop review of the published effectiveness KPIs for 89 programs across 50 Financial Management and Accountability Act and Commonwealth Authorities and Companies Act entities within the General Government Sector (GGS);[12]
  • supplemented this desktop review with more detailed analysis of four entities—the Australian Customs and Border Protection Service (Customs); Fair Work Australia (FWA); the National Film and Sound Archive (NFSA); and the Department of Resources, Energy and Tourism (RET)—including the reporting of performance in each entity’s annual report; and
  • assessed the role of Finance in administering the Outcomes and Programs Framework, including the preparation of guidance material for entities.

Overall conclusion

16. Since the mid-1980s, reforms have been progressively introduced into the Australian Public Service with the specific aim of making it more responsive to community needs, more efficient and more accountable. A constant theme of the reforms has been that monitoring, reviewing and reporting on the effectiveness of government policies and programs, and the efficiency of their delivery, are expected to be part of an ongoing process that is undertaken in the ordinary course of business in the public sector to inform decisions about the impact of programs, their targeting and administration.

17. The importance of effective management at the program level is a particular focus of the Outcomes and Programs Framework, designed to improve the link between government outcomes and the responsibilities of entities for the delivery of effective and efficient services. The foundation for entity accountability and transparency in the context of the new framework is performance information, with the measures and targets presented initially in PBSs and the results provided in entities’ annual reports.

18. Over a number of years, ANAO audits have identified areas where the quality of performance information and its reporting in annual reports required improvement. While the Outcomes and Programs Framework is in its third year, the findings in this audit indicate that many entities continue to find it challenging to develop and implement KPIs; in particular, effectiveness KPIs that provide quantitative and measurable information, allowing for an informed and comprehensive assessment and reporting of achievements against stated objectives.

19. The ANAO’s review of the publicly available performance information, including effectiveness KPIs for 50 entities across the GGS, identified that most of the entities examined have scope to improve the development of effectiveness KPIs and the reporting against them, in some cases significantly so. Overall, a third of the entities reviewed had effectiveness KPIs that were appropriate in terms of being specific, measurable, achievable, relevant and timed, a third were mixed (often differing significantly at the program level), and a third required much further development.

20. Generally, where entities had clearly defined program objectives, this initial level of precision translated into meaningful and quantitative effectiveness KPIs. However, with only 45 per cent of the 89 programs reviewed specifying quantitative KPIs, and only 30 per cent including clearly stated targets, the majority of the KPIs reviewed were largely descriptive in nature and unmeasurable. As a result, they did not allow for an informed assessment of the achievement of program objectives.

21. The tendency for entities to rely on qualitative effectiveness KPIs reduces their ability to measure the results of program activities over time. A mix of effectiveness KPIs that places greater emphasis on quantitative information and targets would provide a more measurable basis for performance assessment. While quantitative indicators are not necessarily always more objective, their precision is beneficial in gaining agreement on the interpretation of evidence-driven data, and for this reason they are usually preferable. On the whole, quantitative indicators are also more easily recorded, more readily compared, and allow trends over time to be identified. Where appropriate, they can be supplemented with relevant qualitative information that provides insights into those factors responsible for the success, or otherwise, of a program.

22. In the annual reports reviewed by the ANAO, the information provided by entities to give an account of the impact of their programs was also largely qualitative and descriptive in nature, and generally consisted of listings of activities undertaken during the course of the year. Trends over time were not provided. Tracking and reporting against KPIs that include targets would provide a useful frame of reference for external stakeholders, and allow them to make an assessment of an entity’s progress toward achieving stated program objectives.

23. The way entities manage ongoing programs is central to the efficient delivery of government initiatives, and it is a requirement of the Outcomes and Programs Framework that entities allocate departmental expenses in support of program delivery to the relevant program. Used appropriately, such information allows informed decisions to be made about the efficient allocation and use of entity resources.

24. However, the audit identified that program support costs are not consistently identified in PBSs. While nearly all the entities reviewed included details of the deliverables associated with their programs, that is, the goods and services provided, 73 per cent of the programs examined did not specifically identify program support costs, as is required by Finance’s guidance.

25. The collection and use of information on costs associated with the delivery of individual programs is an important component of the Government’s Outcomes and Programs Framework. Over time, accurate and consistent identification of program support costs will provide a credible basis to allow assessments to be made about the cost of program administration and to inform an assessment of whether there are more efficient ways of achieving program objectives.

26. Increasingly, whole-of-government delivery of services requires departments to work together and across jurisdictions to develop budgeting and performance reporting arrangements that both meet the accountability obligations of individual departments and contribute to the collective achievement of whole-of-government outcomes.

27. In this context, a new federal financial framework has been in place since 1 January 2009, following the signing of the Intergovernmental Agreement on Federal Financial Relations. Under this new framework, the Commonwealth is directing funding to the achievement of outcomes and outputs through national agreements. At this stage in the evolution of the new framework, the reporting requirements for Commonwealth entity PBSs and those for national agreements do not always intersect, as their foci can be different. There is currently little guidance for entities on how to assess and report in PBSs or annual reports on the performance of programs funded under national agreements. To assist in ensuring appropriate linkages, the Finance guidance could be enhanced by including additional advice on how to incorporate the performance of programs funded under national agreements into PBSs and annual reports.

28. The ultimate objective in preparing quality performance information is to inform stakeholders and decision-makers of the extent to which Australian Government resources are being used efficiently and effectively in improving the outcomes for the community. Consistent with this position, an OECD study identified the need for a government to ‘quantify its promises and measure its actions in ways that allow citizens, managers and politicians to make meaningful decisions about increasingly complex state activities’.[13]

29. The findings of this audit suggest that it is timely for entities to refocus efforts to improve the quality and relevance of performance information and reporting for the benefit of the Government and the Parliament. In pursuing the further development of KPIs, it is important that entities take a strategic approach to the selection of indicators, both effectiveness and efficiency, so that information that will assist in evaluating program performance is reported over time. The real benefit of entities improving indicators of program effectiveness and efficiency is that government is better placed to allocate resources to programs that demonstrably make a difference to achieving policy outcomes, and entities are better placed to provide advice on policy options for government and to identify opportunities for improving program delivery.

Key findings

Key Performance Indicators (Chapter 2)

30. A required element of the Outcomes and Programs Framework is the use of effectiveness KPIs that enable the measurement and assessment of the achievement of program objectives in support of respective outcomes. Program KPIs should be designed to allow managers to provide sound advice on the appropriateness, success, shortcomings and/or future directions of programs.

Use of qualitative and quantitative KPIs, including targets

31. In its review of 50 entities of the GGS, the ANAO examined whether entities used an appropriate mix of qualitative and quantitative KPIs, including targets against which progress towards program objectives could be assessed. The majority of program KPIs reviewed were not constructed in such a way as to allow an assessment of likely progress, especially for those programs where targets were not identified. In particular:

  • 58 per cent of programs had qualitative indicators;
  • 45 per cent of programs had quantitative indicators (a small percentage had both qualitative and quantitative indicators); and
  • only 30 per cent of programs identified targets.

32. Of the four audited entities, NFSA and FWA included targets for KPIs, which allowed readers to assess progress independently of any statement by the entity. Customs and RET did not set targets for KPIs for the programs examined in this audit.

33. The greater use of quantitative KPIs, including the use of targets, is required to improve the basis for assessing entity performance. Targets express quantifiable performance levels or changes of level to be attained at a future date and, as such, allow entity managers and external stakeholders including the Parliament to determine the progress being made in meeting pre-determined program objectives.

34. Targets should be based on factors that entities can influence and relate directly to either the overall program objective or the specific factors that will lead to success. Commonly, information for some indicators will be readily available; information for others may take time to collect, in which case intermediate indicators should be considered. As for most areas of public administration, executive leadership is critical in emphasising the importance of performance information for decision-making, and in ensuring work is commissioned to assess the effectiveness of KPIs.

Underpinning Characteristics of KPIs

35. For the 89 programs included in its desktop review, the ANAO used the SMART criteria to assess whether the KPIs were specific, measurable, achievable, relevant and timed. The analysis of the 89 programs identified that approximately:

  • 40 per cent of the programs had non-specific KPIs, usually associated with the use of very broad program objectives;
  • 45 per cent of the programs had KPIs that were not measurable, usually associated with an absence of quantifiable units of measurement or targets;
  • 55 per cent of the programs had KPIs that were not clear as to whether they were achievable, usually associated with the KPI not being specific or measurable;
  • 10 per cent of the programs had KPIs that were not relevant, usually associated with not being clearly linked to the program’s objective or the broader respective outcome; and
  • 50 per cent of the programs had KPIs that were not timed, usually associated with no timeframe being specified for achieving the KPI.

36. The ANAO also used the SMART criteria to examine effectiveness KPIs for six programs—two programs from Customs; two programs from RET; one program from FWA; and one program from NFSA—to make an overall assessment of their appropriateness.

37. Although performance information of the four audited entities contained some of the elements required by the Outcomes and Programs Framework, each entity had aspects that required improvement. While each of the audited entities had developed KPIs to measure the effectiveness of the program in achieving a stated objective, there was wide variation in their usefulness as a basis for measuring and tracking performance over time. A particular weakness was the use of effectiveness KPIs that were activity-based rather than designed to measure the impact of a program. Generally where there was a lack of specificity in program objectives, this was associated with effectiveness KPIs that were also unclear and not measurable.

Annual reporting against KPIs

38. The performance information in the annual reports assessed by the ANAO was not sufficient to allow external stakeholders to understand the progress being made by entities in meeting their program objectives. When considering the effectiveness KPIs used to support 2009–10 annual reports, the four audited entities included reporting of achievements against KPIs, but generally reported on activities undertaken, with little analysis of the effectiveness of the program in meeting its stated objective or its contribution to the relevant outcome. Entity annual reports were more useful when they reported progress against quantitative and measurable KPIs.

39. The inclusion of trend data over time would also assist stakeholders to make a comparison and assess whether performance is on track, or is better or worse than previous years. Tracking performance over time, and including relevant historical data and commentary, is an established method for providing stakeholders with contextual information in which to assess whether the intended results were obtained. This approach was not apparent in the annual reports for the four audited entities.

Program deliverables and program support costs (Chapter 3)

40. Improving the operational efficiency of public sector entities has been a key driver of public sector reform over several decades. An important part of this reform process is ensuring that officials who scrutinise, manage and allocate resources direct their attention to the cost and efficiency of the resources used in achieving government objectives.

41. Finance guidance for entities on the preparation of the information to be contained in PBSs identifies a number of required elements including that:

  • entities are to outline the deliverables that will be produced over the budget and forward years to achieve the program objectives;
  • departmental expenses in support of program activities are to be allocated to ‘Program Support’; and
  • entities need to determine the units that will be used to show the quantity of program deliverables, and to determine the program’s efficiency when measured against the resourcing provided.[14]

42. Nearly all 50 entities reviewed by the ANAO included details of the deliverables associated with their programs in their PBSs. However, 73 per cent of the programs reviewed did not identify or otherwise reference program support cost information. Of the four audited entities, RET identified separately its departmental program support costs for its administered programs. Understanding the program support costs assists in the assessment of the cost of administering programs, and ascertaining whether there are more efficient ways of achieving program objectives.
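The link the Finance guidance draws between program support costs, deliverable quantities and efficiency can be illustrated with a simple unit-cost calculation. The function and all figures below are hypothetical, invented for illustration rather than drawn from any entity's PBS:

```python
# Illustrative sketch only: a hypothetical unit-cost efficiency measure,
# dividing a program's support costs by the quantity of deliverables.
# All figures are invented for illustration.

def unit_cost(program_support_cost: float, deliverable_quantity: int) -> float:
    """Cost per deliverable unit, a simple efficiency measure."""
    if deliverable_quantity <= 0:
        raise ValueError("deliverable quantity must be positive")
    return program_support_cost / deliverable_quantity


# Hypothetical program: $2.4m support costs, 8,000 applications processed.
cost = unit_cost(2_400_000, 8_000)
print(f"${cost:.2f} per application")  # $300.00 per application
```

A measure of this kind only becomes possible where, as the guidance requires, support costs are separately identified and deliverables are expressed in countable units; tracked over time, it can indicate whether program administration is becoming more or less efficient.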

43. For both FWA and NFSA, the total entity appropriation reflected the agencies’ cost of delivering one program. Similarly, for Customs, departmental costs were fully allocated across its five programs. These costs represented the program support costs for these programs, although this was not clearly stated. From the perspective of the reader, it would be beneficial for entities to more clearly reference the allocation of departmental expenses to programs as program support costs, consistent with the Finance Guidelines.

The role of the Department of Finance and Deregulation (Chapter 4)

44. The preparation of PBSs is undertaken by entities, using guidance provided by Finance. The following policy documents have been developed by Finance to support the Outcomes and Programs Framework:

  • Commonwealth Programs Policy and Approval Process, December 2008;
  • 2009–10 Budget Portfolio Budget Statements Constructors Kit;
  • Outcome Statements Policy and Approval Process, June 2009;
  • Guidance for the Preparation of the 2010–11 Portfolio Budget Statements, March 2010; and
  • Guidance for the Preparation of the 2011–12 Portfolio Budget Statements, March 2011.

45. To improve the usability of existing guidance, in October 2010 Finance separated its existing policy guidance into two documents—one that focuses on the mechanical construction and printing requirements of the PBSs, and a second that provides guidance on the application of the Outcomes and Programs Framework, and the development and implementation of KPIs.[15]

46. In previous versions of guidance to entities, Finance identified that consideration should be given to whether effectiveness KPIs and targets were specific, measureable, achievable, relevant and timed.[16] Given the findings of this audit that many entities are yet to develop and implement effectiveness KPIs that provide quantitative and measurable information, it would be beneficial if Finance revisited this previous guidance and suggested a diagnostic tool and methodology, such as the SMART criteria, to further assist entities to review and evaluate the usefulness of their KPIs. Any additional emphasis that could be given to the importance of this work would be beneficial.

47. In oversighting the development and use of outcome statements and performance information, Finance is also required to promote cross-government consistency. Under new federal financial arrangements, traditional accountability mechanisms are evolving as a shift to outcomes measurement requires performance indicators that link directly to outcomes, and a greater focus on the capture of robust and timely performance data.

48. Increasingly, a significant portion of Commonwealth funding is provided to State and Territory Governments through national agreements. These payments are included in the Department of the Treasury (Treasury) PBSs, and the PBSs for Treasury include KPIs associated with these payments. The Treasury KPIs for these payments are solely concerned with the process of providing the payments, and do not measure the objectives associated with the programs for which the payments are made.

49. While the funding provided for national agreements is included in Treasury’s PBSs, with a link to the relevant entity’s program, there is variability in the way entities include KPIs for those programs in their own PBSs. As such, reporting is often either at a very high level or, in some cases, non-existent. Given the growing importance of cross-government initiatives, there is a role for Finance to provide additional guidance for entities on how to reference performance reporting for programs delivered through national agreements.

50. Although the primary responsibility for the application of the Outcomes and Programs Framework rests with entities, Finance is expected to maintain an awareness and oversight of entities’ implementation. Government policy commits Finance to a number of activities, including a systematic program of evaluation of performance indicators against targets.

51. Finance advised that the Department will prepare a report to government comparing entities’ reported performance information from 2009–10 and 2010–11 annual reports with the KPIs included in the 2009–10 and 2010–11 PBSs, to assess the achievement of entities’ performance targets over time.

52. There is also scope for Finance to undertake reviews of other aspects of the implementation of the Outcomes and Programs Framework, particularly in terms of the development and implementation of effectiveness KPIs. Such reviews would serve to provide feedback to both government and entities on areas that would benefit from greater attention.

Summary of agencies’ responses

53. The agencies’ comments on the recommendations are contained in the body of the report following the relevant recommendation. Agencies’ responses are provided below.

Australian Customs and Border Protection Service

54. Customs and Border Protection welcomes the opportunity to contribute to the ANAO’s performance audit on the Development of Key Performance Indicators to Support the Outcomes and Programs Framework. Customs and Border Protection agrees with all three of the recommendations arising from the Audit.

Fair Work Australia

55. FWA welcomes the opportunity to participate and contribute in the ANAO’s Development and Implementation of Key Performance Indicators to Support the Outcomes and Programs Framework particularly as a new agency having commenced operations in July 2009.

56. FWA notes the views formed by the ANAO, agreeing in general with recommendations provided in the report. The report will be of value to the FWA in the regular review of our key performance indicators and in the reporting of performance.

National Film and Sound Archive

57. The National Film and Sound Archive was recently part of an Australian National Audit Office (ANAO) audit to assess how effectively entities had developed and implemented appropriate key performance indicators (KPIs) to support stated program objectives. The NFSA was selected as one of four agencies subject to detailed analysis.

58. Overall, the audit found that the NFSA has a robust performance framework, with SMART (specific, measurable, achievable, relevant and timely) performance indicators. The report recognises that the NFSA is one of the few agencies which uses both targeted qualitative and quantitative KPIs. Overall assessment of effectiveness KPIs illustrates how well the agency meets these criteria. The findings in relation to effectiveness KPIs state:

NFSA's effectiveness KPIs were specific, measurable (with identified targets), had the potential to be achieved, and were relevant and timed. The KPIs clearly articulated how the impact of the program objective would be assessed. As such, these KPIs were sufficient to demonstrate the intended contribution of NFSA's program 1.1 to the respective outcome.

59. The National Film and Sound Archive (NFSA) recognises the importance of adequate performance information in managing program effectiveness. We pride ourselves in having performance information which is specific, measurable, achievable, relevant and timely (SMART). The NFSA acknowledges that it can further improve annual reporting by including reasons why KPIs are not achieved. We also agree that our definition of what is considered to be an interaction with the national collection could be made more explicit.

60. We consider the report to be a fair representation of our performance framework, and support the audit findings. The NFSA will continue to undertake ongoing improvements to our framework, including the use of trend data and analysis in our annual reports.

Department of Resources, Energy and Tourism

61. RET note that there are significant challenges in a policy department for developing SMART criteria for this body of work. This difficulty impacts upon reporting for a significant component of work which RET undertakes. RET would welcome both the ANAO’s and Department of Finance and Deregulation’s views on how to improve KPI development in this area.

62. RET supports the application of the SMART criteria in the development of KPIs, and has implemented a process to improve the KPIs and other reporting for the Department. A complete review of RET’s KPIs has been undertaken, and these were reported in the 2010–11 Portfolio Additional Estimates Statements. There are challenges with the development of SMART KPIs for policy development and advice, which makes up a large portion of RET’s activities, and further advice and assistance in this area from the ANAO and the Department of Finance and Deregulation would be useful.

Department of Finance and Deregulation

63. Finance will look to undertake a review of the development and implementation of effectiveness KPIs.

64. As part of its ongoing review of guidance provided to agencies on performance reporting, Finance will consider including a diagnostic tool and methodology to assist agencies in reviewing and evaluating their KPIs.

65. In assessing which diagnostic tool and methodology to include in its guidance, Finance will consult the ANAO, given the ANAO’s anticipated role in reviewing agency compliance with their KPIs.

66. Finance will consider the inclusion of further guidance to agencies on how to reference performance reporting for programs delivered through national agreements as part of this review.

Footnotes

[1] Organisation for Economic Cooperation and Development, Measuring Government Activity, 2009.

[2]   Entities refers to those bodies within the General Government Sector governed by the Financial Management and Accountability Act 1997 or Commonwealth Authorities and Companies Act 1997 that are required to report publicly in accordance with the Outcomes and Programs Framework.

[3]  ANAO Audit Report No.23 2006–07, Application of the Outcomes and Outputs Framework.

[4]   Over a number of years a range of ANAO audits have identified areas where the quality of performance indicators and reporting against them could be improved. See, for example, Audit Report No.18 2001–02 Performance Information in Portfolio Budget Statements; and Audit Report No.11 2003–04 Annual Performance Reporting.

[5]   Joint Committee of Public Accounts and Audit, Report 388, Review of the Accrual Budget Documentation, June 2002, p. 41.

[6]   Joint Committee of Public Accounts and Audit, Report 419, Inquiry into the Auditor-General Act 1997, December 2010, p. 20.

[7]  A report by Senator Andrew Murray, Review of Operation Sunlight: overhauling budgetary transparency (the Murray Review), and the Government’s response to the 45 recommendations made in the review, was released in June 2008. Notably, the Government agreed to the implementation of Recommendation 9, stating that ‘PBSs are to include financial and non-financial information on agency programs with effect from the 2009‑10 Budget’.

[8]  Department of Finance and Deregulation, 2009–10 Budget Portfolio Budget Statements Constructors Kit, March 2009, p. 6.

[9]  Department of Finance and Deregulation, Guidance for the Preparation of the 2010–11 Portfolio Budget Statements, March 2010, p. 4.

[10]   In the context of the framework, KPIs focus on effectiveness, not efficiency.

[11]   Professor Rufus Black’s Review of the Defence Accountability Framework, January 2011, references the use of SMART, particularly where he recommends Defence establish a framework on which it can hold its personnel accountable for delivery of outcomes.

[12]   Details of the program selection approach are contained in Appendix 2.

[13] Organisation for Economic Cooperation and Development 2009, op. cit., p. 15.

[14] Department of Finance and Deregulation, Guidance for the Preparation of the 2011–12 Portfolio Budget Statements, March 2011, pp. 32, 35 and 36.

[15]  Department of Finance and Deregulation, Performance Information and Indicators, October 2010.

[16] Department of Finance and Deregulation, 2008–09 Portfolio Budget Statements Constructors Kit, March 2009, p. 93.