Thursday, February 17, 2005

Delivering value online

The Australian National Audit Office has been checking up on how the folks down under are doing with their e-government initiatives. As in all audit reports, there is a brief discussion of the positives before it rapidly gets down to uncovering the negatives. The key points, though, are relevant, I suspect, to any and every country with an e-government programme. First, on website cost comparison:

21. While agencies were able to provide estimates of the recurrent costs of their websites, they used different methods to calculate these costs and included a range of different items. Agencies had not conducted activity based costing of their websites. This made it difficult to compare the costs of websites against each other. The major item in most agencies’ recurrent costs of their websites was salaries for the staff responsible for managing the website. IT cost information was limited, and, where such costs were provided, most were relatively small.

22. Websites in agencies at similar stages of Internet service delivery displayed wide variations in costs. However, it was not apparent whether these differences were related to the stage of website development and/or the size of the agency, or to other factors not identified. Further, there was insufficient comparable data to determine whether cost differences were related to degrees of website efficiency and effectiveness.

23. Only one agency had conducted a cost-benefit analysis to determine whether the Internet was the most effective form of delivery for their online service. No agency had calculated an expected return on investment for providing the service. Despite having information on both costs and benefits, and having outlined this as one of the principles to be used in determining whether a particular service should be provided online, other agencies did not include a cost-benefit analysis in their business cases.

And then on monitoring success:

26. Three agencies had developed performance indicators for their online services. This meant that half of the agencies had not identified how the success of the program would be measured, such as by meeting estimated targets or achieving reduced costs. As well, while agencies included information on various e-government activities related to a number of their programs in their annual reports, few had reported externally on any specific performance indicators for their websites or online services.

27. ANAO considered that some agencies would have difficulty in determining appropriate performance indicators for their websites, because some of the websites’ objectives or aims were very general or not clearly specified. ANAO noted, however, that agencies were already collecting much of the information required to develop adequate indicators to assess performance.

28. Despite including evaluation plans in their business cases, most agencies had not evaluated their website redevelopments or new online services, although most planned to. Further, agencies did not generally have an integrated monitoring and evaluation policy for their Internet service delivery.

And they recommend:

34. ANAO suggests that to improve their management of e-government, and their measurement of the efficiency and effectiveness of Internet service delivery, agencies:

  • establish coherent arrangements for management of their websites to further their more efficient use;
  • develop internal policies and guidelines for the Internet and encourage agency staff to use them;
  • quantify the benefits and costs of their websites;
  • consider using AGIMO’s Demand and Value Assessment Methodology to assess websites and online service delivery;
  • identify the audience for their website and online services, and consult potential users about their needs;
  • assess demand for the delivery of services via the Internet, and specify targets for achievements against objectives; and
  • compare the performance of their websites with that of other agencies or sites, to assist in assessing whether the website is efficient and effective.
All of the bold text is mine. The report includes the comments from the various agencies audited, and all agree with the findings and the recommendations. I think that's the first time I've ever seen that in a public audit report. Someone once said to me that the NAO (the UK equivalent of the folks who did this report) know what their report will say within two weeks of starting their work, but that negotiating the wording takes a further 18 months, which is why reports are often published so long after the event being reviewed. The work on this audit was carried out from February to May 2004 and it appears to have been published on Feb 10th 2005. Maybe it's the same in Australia?

1 comment:

  1. Anonymous, 9:09 pm

    Interesting as OZeGov is, let's not forget that the life expectancy of the average Aborigine is about 20 years less than that of white Aussies.

    Depending on how you cut the stats, death rates for an age group can be four to seven times the equivalent white rates. How can this be when their XML is so thorough?

    It's always fun to hear Australia strutting as a lead tech nation, whilst they operate dark age policies against their own indigenous population.

    Would be good to see eGov health sites reflect what is really happening, Australia needs an Abo Mandela more than a website.