Summary

Background

Australian Government policy is that agencies use the Internet to deliver all appropriate programs and services. This has led to considerable agency investment in Internet-based service delivery. ANAO, in this audit, examines whether agencies are measuring the efficiency and effectiveness of the services and programs they deliver through the Internet.

The Australian Government's Better Services, Better Government strategy outlined the broad directions and priorities for the future of e-government in Australia, and sought to maintain the momentum of agencies' actions under the Government's earlier policy Government Online. One of the key objectives of the Better Services, Better Government strategy was for agencies to achieve greater efficiency in providing services and a return on their investments in Internet-based service delivery. It also stated that investing in e-government should deliver tangible returns, whether they take the form of cost reductions, increased efficiency and productivity, or improved services to business and the broader community.

Australians' use of the Internet has increased over recent years, making the Internet an increasingly important channel for service delivery, particularly for government agencies. Australian Bureau of Statistics (ABS) surveys showed that 53 per cent of Australian households had Internet access at home in June 2003, compared to 16 per cent in 1998, while 71 per cent of businesses had Internet access.

The potential audience for government websites and online services is restricted to those who have access to, and the skills to use, the Internet. While over 80 per cent of people aged 16 years and over currently have access to the Internet from any location, Internet access is lower for some groups, particularly older people, who comprise a large proportion of the clients of government agencies. Some people and businesses also have only limited Internet access, because they use older computer technology, rely on a modem and a single telephone line, or live in an area with less coverage.

In this regard, it is important for agencies to improve their understanding of their target audiences' information and service needs, and to measure and provide reliable estimates of likely demand. These steps require agencies to measure Internet adoption and take-up, equipping them to develop more efficient and effective approaches to online service delivery.

In 2001, the ANAO developed a Better Practice Guide, Internet Delivery Decisions, which set out broad guidelines to assist program managers to use the Internet more efficiently and effectively when delivering programs and services. ANAO also uses the Better Practice Guide as a framework and context for audits that review agencies' Internet services.

In 2002–03, ANAO conducted an audit that examined the adequacy of agencies' approaches to monitoring and evaluation of Internet-delivered services. This current audit complements the previous audit and builds on its findings.

The audit approach

This audit was designed to identify the methods used by selected agencies to measure the efficiency and effectiveness of their delivery of services through the Internet, and to evaluate the adequacy of these methods. ANAO also identified better practices, lessons learned and opportunities for improvements.

The audit included the following six agencies:

  • the Australian Trade Commission (Austrade);
  • Centrelink;
  • the Child Support Agency (CSA);
  • the Department of Health and Ageing (Health and Ageing);
  • the Department of Veterans' Affairs (DVA); and
  • the National Archives of Australia (National Archives).

Given its special interest in the topic, a copy of the proposed report was provided to the Department of Finance and Administration (Finance) for comment by the Australian Government Information Management Office (AGIMO).

The audit team examined both the agency website and one service delivered on the Internet by each of the selected agencies. The six online services selected for the audit were:

  • Austrade's Export Market Development Grant (EMDG) online eligibility questionnaire, which allows clients to provide information about their business and receive a response about their eligibility to apply for a grant based on that information;
  • Centrelink's Update Families Income Estimate Service (UFIES), which allows families to advise Centrelink of their estimated annual income; 
  • CSA's ‘When Parents Change Address' form, which allows parents to change their address and contact details online;
  • Health and Ageing's HealthInsite, which provides Australians with access to reliable, high-quality health information;
  • DVA's World War Two (WW2) Nominal Roll, which offers online access to a database containing the details of service men and women who served in WW2; and
  • National Archives' eShop, which permits Internet users to purchase National Archives' publications.

This report does not comment specifically on the performance of individual agencies. Rather, it presents the general findings across the agencies examined. As a result, readers should not assume that a general finding necessarily applies to a particular agency examined. The report, however, includes examples of better practice found in the particular agencies audited, and outlines opportunities for improvement for the benefit of other public sector organisations.

Key findings

Managing e-government (Chapter 2)

All audited agencies had a strategy for their websites that set out management responsibilities and the agency's approach to its use of the Internet. Agencies had more coherent management arrangements for their websites and online services, and ANAO considered that this increased the likelihood that agencies were operating their websites more efficiently.

All of the selected agencies had developed policies and guidelines covering website development and management, and internal website design standards, for use by staff. ANAO suggests that other agencies ensure that their policies and guidelines on websites are widely disseminated and used.

Overall, most agencies had developed criteria to determine if a service was appropriate for online delivery. Agencies were consulting users and stakeholders about their needs prior to developing new online services. Most of the agencies were striving to offer users more online services that allowed a two-way flow of information between the agencies and users.

All agencies followed the Government's Online Information Service Obligations (OISOs), and had addressed the standards and guidelines for metadata, accessibility, usability, privacy and security. ANAO considered that this increased the likelihood that agency websites were accessible and user friendly. Users could easily access agencies' statements on the protection of website users' privacy and the security of their information.

All audited agencies designed their Internet services so that users with either older computers or newer technology could access them. This approach increased the potential number of customers. Agencies were also aware that some of their users relied on older technology, and balanced this against the potential for enhanced services made possible by newer technology.

Planning efficient and effective Internet services (Chapter 3)

All agencies selected in the audit had developed planning documents that directed their websites' most recent redevelopments or major redesigns. ANAO noted an improvement on the findings of the previous audit in that agencies had clear objectives for their websites, and had outlined the potential benefits to the agency and to clients. However, rather than consulting users before commencing a website redesign or redevelopment, agencies tended to involve users in testing the new design.

ANAO found that agencies generally did better in measuring the efficiency and effectiveness of the specific online services and programs selected than they did with their websites. Agencies had developed business cases or Budget proposals that specified the objectives of the online service and the expected benefits to the agency and potential users. Most agencies consulted these users to determine whether the services were feasible and how they should operate.

All agencies had prepared broad cost estimates for the development of their online services, and these were generally sufficient to gain program approval for the services. Most were able to provide the recurrent cost of maintaining the service. Two agencies could not provide the maintenance cost of the service, as they included this with the cost of usual website maintenance activities.

While agencies were able to provide estimates of the recurrent costs of their websites, they used different methods to calculate these costs and included a range of different items. Agencies had not conducted activity-based costing of their websites. This made it difficult to compare the costs of websites against each other. The major item in most agencies' recurrent website costs was salaries for the staff responsible for managing the website. IT cost information was limited and, where such costs were provided, most were relatively small.

Websites in agencies at similar stages of Internet service delivery displayed wide variations in costs. However, it was not apparent whether these differences were related to the stage of website development and/or the size of the agency, or to other factors not identified. Further, there was insufficient comparable data to determine whether cost differences were related to degrees of website efficiency and effectiveness.

Only one agency had conducted a cost-benefit analysis to determine whether the Internet was the most effective form of delivery for their online service. No agency had calculated an expected return on investment for providing the service. Despite having information on both costs and benefits, and having outlined this as one of the principles to be used in determining whether a particular service should be provided online, other agencies did not include a cost-benefit analysis in their business cases. ANAO considers that, where agencies can identify tangible benefits for the proposed service, they should be undertaking cost-benefit analyses to inform their planning for Internet service delivery.
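To illustrate the kind of analysis involved, the following sketch shows how a simple cost-benefit comparison and return on investment might be calculated for a proposed online service. The channel unit costs, transaction volumes and evaluation period are hypothetical assumptions for illustration only, and are not drawn from any of the audited agencies.

# Illustrative sketch only: all figures are hypothetical assumptions.
development_cost = 250_000        # one-off cost to build the online service ($)
annual_maintenance = 40_000       # recurrent hosting and maintenance cost ($ per year)
transactions_per_year = 60_000    # expected transactions shifted to the online channel
cost_per_online_txn = 1.50        # assumed unit cost of an online transaction ($)
cost_per_counter_txn = 8.00       # assumed unit cost of the same transaction over the counter ($)
years = 5                         # evaluation period

# Benefit: the unit-cost saving on each transaction handled online rather than over the counter.
annual_saving = transactions_per_year * (cost_per_counter_txn - cost_per_online_txn)

total_costs = development_cost + annual_maintenance * years
total_benefits = annual_saving * years
net_benefit = total_benefits - total_costs
roi = net_benefit / total_costs   # return on investment over the evaluation period

print(f"Net benefit over {years} years: ${net_benefit:,.0f}")
print(f"Return on investment: {roi:.0%}")

In practice, an agency would also need to discount future costs and benefits, and to weigh intangible benefits such as improved service quality, before relying on such figures in a business case.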

Measuring the performance of Internet services (Chapter 4)

While performance measures have been widely used for assessing traditional forms of government service delivery, ANAO found that agencies generally have been slower in developing performance measures to assess the delivery of services through the Internet.

All agencies included in this audit used various methods to collect a range of data for monitoring the performance of their websites and online services. Many, however, relied on this monitoring data to determine areas for improvement of their Internet service delivery. Only two agencies had developed specific performance indicators for their websites that measured performance against their website's objectives.

Three agencies had developed performance indicators for their online services. This meant that half of the agencies had not identified how the success of the program would be measured, such as by meeting estimated targets or achieving reduced costs. As well, while agencies included information on various e-government activities related to a number of their programs in their annual reports, few had reported externally on any specific performance indicators for their websites or online services.

ANAO considered that some agencies would have difficulty in determining appropriate performance indicators for their websites, because some of the websites' objectives or aims were very general or not clearly specified. ANAO noted, however, that agencies were already collecting much of the information required to develop adequate indicators to assess performance.

Despite including evaluation plans in their business cases, most agencies had not evaluated their website redevelopments or new online services, although most planned to. Further, agencies did not generally have an integrated monitoring and evaluation policy for their Internet service delivery.

Overall audit conclusion

The ANAO found that all of the selected agencies had developed a strategy that allocated responsibility for the management of agency websites and online services, and set out each agency's approach to its use of the Internet. ANAO considered that agencies' arrangements for more coherent website management were moving them towards more efficient operation of their websites.

ANAO found that agencies' monitoring of the effectiveness of their websites was adequate. They used a number of methods to collect information and used this data to continuously improve their websites and online services. However, agencies had difficulty in obtaining sound activity-based or management cost information for their websites, and the costs provided varied for similar services. Few had used cost-benefit analysis or determined productivity gains or returns on investment. Agencies were unable to report any efficiency savings through use of the Internet, as they had not evaluated their services. ANAO concluded, therefore, that most agencies had not developed adequate measures to determine whether the website was an efficient form of service delivery.

ANAO found that, in general, agencies did better in measuring the efficiency and effectiveness of the selected online services than they did with their websites. All agencies had used a range of methods to collect performance data and a number of agencies had determined some performance indicators for their online services. However, very few had collected information that enabled comparison of the efficiency of online service delivery against other service delivery channels.

ANAO noted that agencies could demonstrate their achievements against the Government's aims for e-government by providing improved services to their clients, business and the broader community. However, they were generally unable to determine whether their investments in e-government were delivering tangible returns, such as cost reductions or increased efficiency and productivity.

Overall, the ANAO concluded that agencies' methods were inadequate to assess whether their delivery of government services and programs through the Internet was efficient and effective. However, there was evidence that agencies included in the audit were making considerable efforts to improve, and there had been significant progress made in the past year.

Recommendations

ANAO made three recommendations, addressing the need for agencies to:

  • use cost-benefit analyses to support proposals for online services;
  • evaluate websites; and
  • integrate measurement of website performance with measurement of service delivery.

Opportunities for improvement

ANAO also made a number of suggestions as to how agencies could improve their management of e-government and their measurement of the efficiency and effectiveness of Internet service delivery. These are included in the Summary and Key Findings, and throughout the body of the report.

Agency responses

Agencies audited and consulted provided the following summary responses to the draft audit report.

Austrade Response

Austrade agrees with the key recommendations made by ANAO in this report and believes that implementation of these recommendations will assist in the progressive improvement of online services and their alignment with agencies' corporate directions.

While strongly supporting the integration of website measurement into the overall measurement of effectiveness of service delivery, Austrade sees benefit in continuing to measure the performance of websites in their own right, as the channels for delivery of information and services and as a distinct business activity.

Centrelink Response

Centrelink agrees with the recommendations made in the report.

CSA Response

CSA agrees with the recommendations and the opportunities for improvement identified in the report. CSA believes that the recommendations support the improved planning, implementation and evaluation of CSA's online service channel and promote consistency between the Internet service and other service delivery channels.

The major areas of focus for CSA will be to formalise existing arrangements, such as evaluation of online services and developing more focused objectives and performance indicators, to ensure greater transparency and accountability.

DVA Response

DVA agrees with the overall findings of this report. DVA is further developing its framework for the evaluation and measurement of efficiency and effectiveness of the main DVA website.

Health and Ageing Response

Health and Ageing supports the recommendations made in the audit. Both the Department's website and HealthInsite have recently been upgraded, based on the results of extensive evaluations and analysis of a range of performance indicators. Future development of the Health and Ageing website and HealthInsite will continue to be based on evaluations of the needs of users, performance indicators, costs and, where possible, comparisons with similar indicators for other service delivery channels.

National Archives Response

National Archives agrees with the recommendations made in the report.

Finance Response

Evaluation is a key component of the strategic management of any initiative and e-government is no exception. The report emphasises the importance of the management of e-government, from the planning of which services are appropriate to the evaluation and measurement of what constitutes effective and efficient services for both citizens and government. Opportunities for improvement exist with the application of more rigorous planning tools like the Demand and Value Assessment Methodology and the recognition that operational excellence comes from a continuing focus on value for money, on results obtained from effort and on measurement of value.

Agencies also provided responses to each of the recommendations. The relevant responses appear immediately following each recommendation, in the body of the report.