The objective of this audit was to form an opinion on the adequacy of selected agencies' approaches to monitoring and evaluating government programs and services delivered on the Internet, and to identify better practices and opportunities for improvement. To achieve this objective, the audit examined the websites and Internet-delivered services of five agencies.

Entities

  • Australian Securities and Investments Commission
  • Australian Taxation Office
  • Commonwealth Scientific and Industrial Research Organisation
  • Department of Employment and Workplace Relations
  • Health Insurance Commission
  • National Office for the Information Economy

Summary

Background

Government Internet services have a relatively short history. With few exceptions, such services did not exist ten years ago. In December 1997, recognising the Internet's potential to improve government services, the Federal Government committed the Commonwealth to delivering all appropriate services via the Internet by the end of 2001.

In 2000, the Federal Government reaffirmed the importance it placed on agencies using the Internet to improve services to the clients of government programs. Government Online—The Commonwealth's Strategy[1] made clear that provision of government online services was to include not only provision of information about agencies and their programs, but also the conduct of transactions between government agencies and members of the public or businesses. Internet services were to complement—not replace—existing written, telephone, fax and counter services, as well as improve the quality, availability, responsiveness and consistency of those services.

In support of this initiative, the ANAO produced a Better Practice Guide, Internet Delivery Decisions—A Government Program Manager's Guide, to assist program managers to use the Internet effectively. The Better Practice Guide also provides a useful framework and context for the other Internet-related audits planned by the ANAO, including an audit scheduled for 2004 on the efficiency and effectiveness of Internet-delivered services.

At the time of this audit, it was acknowledged internationally that limited performance information was being collected about Internet services. Consequently, there were few clear indications of the benefits to actual users. However, it was also generally recognised that, without some form of systematic performance measurement, the benefits of Internet services and the return on the significant related investments cannot be assessed. The ANAO therefore considered it timely and important to identify whether Commonwealth agencies were adequately monitoring and evaluating the effectiveness of their Internet services and those services' contribution to agency outputs and outcomes.

The audit approach

This audit examined the adequacy of selected agencies' approaches to monitoring and evaluation of Internet-delivered government programs and services. We also identified better practices and opportunities for improvement. Monitoring refers to the systematic collection of information to provide indications of how a program or service is performing. Evaluation is concerned with asking questions of that performance information (for example, about effectiveness, efficiency and appropriateness), so as to provide answers that assist those responsible to better manage and improve service delivery and to promote accountability for performance.

The audit included the following five agencies:

  • the Australian Securities and Investments Commission (ASIC);
  • the Australian Taxation Office (ATO);
  • the Commonwealth Scientific and Industrial Research Organisation (CSIRO);
  • the Department of Employment and Workplace Relations (DEWR); and
  • the Health Insurance Commission (HIC).

Given its special interest in the topic, the National Office for the Information Economy (NOIE) was provided with a copy of the proposed report for comment.

In each agency covered, the audit team looked most closely at two levels of Internet use:

  • agency website level—this refers to websites controlled by agencies. Websites were generally used to deliver information about the agency and, on some occasions, to deliver selected agency services. This part of the audit also included portals, which allow users to access collections of related government websites and services via a single online entry point. An example of a portal is Australian WorkPlace, managed by DEWR, which provides information about employment and workplace relations, government assistance, jobs, careers, training and wages; and
  • Internet-delivered service level—this level refers to the services delivered on the Internet by divisions or programs within each agency. These services may have their own website or be delivered through the agency website.

Broadly, the services the ANAO examined were delivered over the Internet in two main ways:

  • where any Internet user can browse and interact with the agency's data or databases. Examples of services delivered in this way included:
    • ASIC's FIDO News—a monthly newsletter that provides tips and warnings to consumers, and information about investing in shares and managed investments; and
    • CSIRO Enquiries—a national science enquiry service for both internal and external users. This service has been operating for over 60 years using a range of methods, which now includes an Internet option; and
  • where clients can interact with agency databases and exchange sensitive information. Examples of this form of delivery included:
    • the ATO's e-tax—a software package which can be downloaded and enables the taxpayer to lodge a return over the Internet. The ATO aimed to have 800 000 lodgements via e-tax for the 2002–2003 tax year;
    • CSIRO's Sentinel Hotspots—an Internet-based satellite mapping tool that provides timely fire location data to emergency service managers across Australia. 'Sentinel' was used extensively during the recent ACT, NSW and Victoria bushfire crises;
    • DEWR's Australian JobSearch, which enables Job Network members, jobseekers, employers and other agencies to meet their recruitment and employment needs;
    • DEWR's National Indigenous Cadetship Project. This service aims to improve the professional employment prospects of Indigenous Australians and has been available on the Internet for just over two years;
    • EXAD—an ASIC service that allows liquidators, receivers and administrators to lodge documents electronically for companies under external administration. Target users are some 800 liquidators in Australia;
    • Pharmaceutical Benefits Scheme Online (PBS Online)—an HIC initiative, which is still in the early stages of development and is designed to improve the efficiency of claims processing for pharmacies; and
    • the Australian Childhood Immunisation Register (ACIR)—HIC's national database containing information on the immunisation status of Australian children under the age of seven.

This report does not comment specifically on the performance of individual agencies. Rather, it presents the general findings across the agencies examined. As a result, readers should not assume a general finding necessarily applies to a particular agency examined. The report does, however, include examples of better practice found in the particular agencies examined, for the benefit of other public sector organisations.

Key findings

Administrative policies and responsibilities (Chapter 2)

None of the agencies had agency-level administrative policies or guidance specifically addressing requirements and responsibilities for both monitoring and evaluation of agency websites and Internet-delivered services.

All agencies examined had yet to develop agency-level policies addressing requirements and responsibilities for monitoring their Internet services, and none had policies for evaluation. The absence of agency policies or requirements for periodic evaluation meant there was no clear obligation to assess the overall effectiveness of Internet services, or the actual benefits to clients, even after extensive reorganisation of responsibility for website management had been put in place. However, in some instances, business planning processes at the program level provided a structured means of partially filling the monitoring gap, by specifying monitoring and reporting requirements for certain defined Internet-delivered services.

Overall, while agencies took various approaches to measuring the impact of Internet delivery, there were no examples of administrative policy or guidance that achieved all of the following:

  • applied at agency level;
  • provided a consistent approach to monitoring and evaluation of websites, portals and Internet-delivered services; and
  • allowed judgements to be made about agency (as opposed to program) performance.

The ANAO considers that a strategic, agency-level focus on Internet services is necessary because of the evolving nature of the Internet, the likely cost of investment, and the impact the Internet may have on future delivery strategies for government services. Such a focus should be supported by the establishment and communication of clear policies for the monitoring, reporting and evaluation of Internet services, including definition of roles and responsibilities, specification of a range of monitoring information, and scheduling of periodic evaluations. This, in turn, would facilitate consistent practices across the organisation and the collection of data useful for measuring and assessing agency performance. Given that the Internet's relatively recent introduction means there is limited experience of the best methods of providing services online, evaluation is essential as a basis for rigorous assessment of the most effective ways to provide quality Internet services in the future.

Most agencies were in the process of changing the governance and management arrangements for their agency websites.

Most agencies had identified areas for improvements in their websites, and were in the process of changing governance and management arrangements for their websites to ensure consistency of standards, design and presentation of content. Most agencies recognised this as providing a good opportunity to develop appropriate policies and to specify responsibilities for monitoring and evaluating their Internet services.

Agency websites and portals (Chapter 3)

Performance information collected on agency websites and portals was limited because of a lack of clear objectives and agreed performance indicators.

The agency websites and portals examined had not been established with clear objectives and associated performance indicators. In their absence, agencies relied largely on readily available web statistics as the main source of performance information. While these statistics provided summary views of site activity, they did not, by themselves, necessarily suggest areas for improvement. Managers could interpret the information only when the web statistics (trend analysis over time) were combined with user research that accounted for the observed patterns. Agencies were not yet using options within web statistics packages, for example, data on the pathways users take around a website, which would provide a richer, more layered understanding of site use.
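To illustrate the kind of pathway data referred to above, the following is a minimal sketch of how per-visitor pathways might be reconstructed from a standard web server access log. It is not drawn from any audited agency or named statistics package; the log format, the field positions and the 30-minute session rule are assumptions made purely for illustration.

    # Minimal sketch (illustrative assumptions): reconstruct visitor pathways
    # from a combined-format web access log, where field 0 is the client
    # address, field 3 the bracketed timestamp and field 6 the requested page.
    # Requests from the same address more than 30 minutes apart are treated
    # as separate visits.
    from collections import defaultdict
    from datetime import datetime, timedelta

    SESSION_GAP = timedelta(minutes=30)

    def pathways(log_lines):
        """Group page requests into per-visitor pathways (ordered page lists)."""
        last_seen = {}                  # client address -> time of last request
        open_paths = defaultdict(list)  # client address -> pathway in progress
        completed = []                  # finished pathways
        for line in log_lines:
            fields = line.split()
            client, page = fields[0], fields[6]
            when = datetime.strptime(fields[3].lstrip('['), '%d/%b/%Y:%H:%M:%S')
            # Close the previous pathway if the visitor has been idle too long.
            if client in last_seen and when - last_seen[client] > SESSION_GAP:
                completed.append(open_paths.pop(client))
            last_seen[client] = when
            open_paths[client].append(page)
        completed.extend(open_paths.values())
        return completed

Counting the most frequent pathways produced in this way shows, for example, which entry pages most often lead visitors to a given service; aggregate page-view totals alone cannot reveal this.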

There was limited user research in development of agency websites and portals.

None of the agencies had conducted detailed user research to inform the development and content of their websites or portals. However, the ANAO found that, following the establishment of agency websites, the majority of agencies audited had commissioned some detailed user research to inform redevelopment of their websites.

All agencies provided means for users to comment on the website, but monitoring of these comments was occurring mainly at the agency web manager level. This information was used chiefly for alerting agencies to problems such as links to other sites that no longer worked or to inaccurate content, rather than for gathering information about users and their experience of the website or portal.

Agencies were unable to demonstrate the contribution of websites and portals to their outcomes and outputs.

The ANAO found that no agency had evaluated the overall effectiveness of its websites and portals and their contribution to agency outputs and outcomes. The absence of strategic policy, combined with the absence of overall performance evaluation, means that agency investment in website development and redevelopment is not informed by evidence as to how effective these sites are in engaging with, and providing services to, the public in general and clients in particular.

In the main, the information reported to decision-makers responsible for the agency website and portals was limited to broad usage data and was not linked to objectives. Therefore, the contribution of agency websites and portals to agency outcomes and outputs was not demonstrated.

The ANAO considered that agency websites and portals should be subject to the same rigorous review as any other programs or activities that are considered key to the accomplishment of agency strategy and objectives.

Agencies have used the results of reviews to improve, restructure and redevelop their websites and portals.

The ANAO found that senior executives were responding to the results of website reviews. There were instances where reviews of agency websites provided the basis for complete redevelopment of those sites.

Notwithstanding this, there was little evidence that agencies were systematically considering performance information, other than user visits, at executive management level. This may be partly because most agencies were still determining their agency web strategy, organisational arrangements, and the consequent information, monitoring and reporting requirements.

Internet-delivered services (Chapter 4)

Most agencies had objectives for their Internet-delivered services and were monitoring a core set of information that showed achievement against those particular objectives.

The ANAO found that a core set of information for monitoring performance and use had been developed for most of the audited Internet-delivered services. In contrast to the limited information used for monitoring agency websites, the information collected and monitored for most services was adequate as it was aligned with pre-defined service objectives and encompassed a variety of sources and collection techniques. The ANAO considered that this contributed to appropriate and effective agency action to address the implications of data collected and enhance service delivery.

Managers of Internet services utilised existing agency business planning processes to report on their performance. The ANAO found that the service contribution to agency outcomes and outputs was best demonstrated when explicit reference was made in planning documents to the relationship between the service and higher-level agency policy and program objectives. Furthermore, reporting against service objectives informed decision-makers and other stakeholders of the broader role of the service and facilitated accountability and management review of its relevance and usefulness.

Most agencies had not conducted user research prior to establishing their Internet-delivered services, but they had commissioned several forms of user research once services were operating.

Once services were operating, agencies commissioned several forms of user research, including customer satisfaction surveys and market research of specific user groups, and applied the results to improve service delivery.

Agencies were reviewing aspects of service provision and acting on these reviews to improve service delivery.

In contrast with the situation for agency websites, action on performance information, particularly at program manager level, was evident for all services. This reflected the more extensive performance information collected for services and its closer alignment with service objectives.

From their reviews of performance information about the services, agencies addressed the following three issues:

  • technical and systems capacity issues highlighted by web statistics and system performance monitoring (a simple illustrative sketch follows this list);
  • content or information issues, raised through user comments and monitoring of use patterns; and
  • business process issues raised through monitoring client usage and process efficiency, for those services where clients conduct business transactions through the service.
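As a purely illustrative sketch of the first category, the following shows one way a service team might flag capacity problems from response-time monitoring data. The 90th-percentile measure and the four-second threshold are hypothetical choices, not values drawn from any audited agency.

    # Illustrative sketch only: flag a capacity problem when the slowest tenth
    # of requests breaches a service threshold. The percentile and the 4000 ms
    # threshold are hypothetical examples, not agency values.
    def capacity_alert(response_times_ms, threshold_ms=4000):
        """Return True if the 90th-percentile response time exceeds the threshold."""
        ordered = sorted(response_times_ms)
        p90 = ordered[int(0.9 * (len(ordered) - 1))]
        return p90 > threshold_ms

    # Example: returns True, because the slowest samples exceed four seconds.
    print(capacity_alert([850, 1200, 2100, 3900, 5200, 6100, 900, 1100, 1300, 700]))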

In the case of services which had been operating for sufficient time to warrant special studies or reviews, the recommendations from agency studies had either been implemented or were in the process of being implemented.

No agency in the audit had planned or conducted an overall evaluation of the effectiveness of Internet-delivered services.

As previously mentioned, most agencies had reviewed aspects of service provision. The results were used to improve these aspects of the service. However, no agency in the audit had planned or conducted an overall evaluation of the effectiveness of Internet-delivered services. As already discussed in relation to agency websites, the lack of agency policy on evaluation, combined with the absence of evaluation activity, means that the investment in Internet-delivered services is occurring with limited quantitative or qualitative evidence of the effectiveness of these services in meeting their objectives.

While this situation was understandable in the early days of Internet delivery, the Internet is now becoming a standard means of communication and, in many cases, of service delivery for all government agencies. The ANAO therefore considers that Internet delivery should be subject to the same rigorous review and evaluation as services delivered in any other manner.

Overall audit opinion

The audit concluded that the audited agencies did not have specific agency-level policies, including clear responsibilities, for the monitoring and evaluation of websites, portals and Internet-delivered services.

In relation to websites and portals, in most instances clear objectives had not been articulated, and little user research had been conducted. Agencies recognised this weakness. As a result, most agencies audited are putting in place measures to ensure a more holistic and strategic view of Internet services, and online communication with their clients.

However, the audit found that the situation is more positive at the individual Internet-delivered service or program level within agencies, where information is routinely being collected, analysed and applied to improve service delivery. Nevertheless, there is considerable scope for improvement in the quality of that information. The audit has identified some possible areas for action in its recommendations.

Overall, the ANAO concluded that agencies' current approaches to the monitoring and evaluation of Internet services were not adequate. However, in view of the short period in which Internet services have been a key feature of service delivery, this situation is not wholly unexpected. There was also evidence that agencies are now making considerable efforts to improve their monitoring and evaluation of this potentially key strategic tool.

Agency Responses Summary

ASIC, ATO, CSIRO, HIC and NOIE agreed with all recommendations. DEWR agreed with three recommendations and noted two. Full responses from ASIC, CSIRO, HIC and NOIE can be found in Appendix 1 of the report.

NOIE commented that it believed the proposed report would be valuable in focusing agencies' attention on the need for a strategic approach to service delivery, in particular to the Internet and emerging electronic service delivery channels.

NOIE noted that most agencies were in the process of ensuring consistency across their websites and portals. It suggested a new recommendation:

That agencies develop governance and management arrangements for their websites and portals to ensure consistency of standards, design and presentation of content across the agency.

The consistency of design and presentation of content is not addressed in this report. Nevertheless, the ANAO considers this to be a useful recommendation.

Footnotes

[1] Government Online—The Commonwealth's Strategy, Department of Communications, Information Technology and the Arts, April 2000.