Make them count, by Keith Davis

17 February 2006

When the National Audit Office investigated Whitehall's efficiency savings it found that they weren't all they seemed. Some were aspirational, some weren't efficient and others couldn't be proved. Keith Davis advises how to identify genuine gains

How do you capture improvements in the quality of teaching created by giving teachers more time to prepare for lessons? How do you put a value on the increased security that comes from paying benefits directly into people's bank accounts? These are the sorts of challenges government departments have been facing as they aim to secure £21.5bn of annual efficiency gains by March 2008.

This week, the National Audit Office has published the first in a series of reports for Parliament on the efficiency programme. As part of our work we have examined how all the major departments are planning to measure their efficiencies. We carried out detailed reviews of 20 projects, which between them account for £6bn of the £21.5bn targeted gains.

We saw many examples of good progress, though it is too early to tell whether the programme overall will succeed. We found that, in many respects, departments are managing their efficiency programmes well. There is generally a high degree of senior management commitment to the programme and there has been good progress in recruiting high-calibre project management professionals.

Our report points out that this is a high-risk programme. It is very dependent on a small number of departments and individual projects. Fifty of the 300 projects are forecast to deliver four-fifths of the programme, and at least 15% of the targeted gains depend on information and communications technology. It wouldn't be fair to say that all the eggs are in just the one basket, but they're certainly not divided between very many.

One of our conclusions is that some caution needs to be applied when assessing the magnitude of efficiency gains achieved. At this point, we think that they should be regarded as provisional.

This is for a number of reasons. In some sectors, there are significant delays in obtaining data. In some cases this means that further efficiencies have yet to come through for the period already reported on; in others, it means that we don't yet have any information on the impact reported efficiency gains might have had on service quality. In other words, inadequacies in the data mean that, for some efficiency projects, we cannot be sure that service quality has not suffered as a result.

In addition, our review of projects has shown that there are important limitations in many of the measurement methodologies being used. Some projects do not have a baseline against which changes to inputs and outputs can be compared. This means that we cannot be confident about what has been achieved because we do not have a full picture of the starting position. In other cases, departments have not taken account of the additional costs incurred as part of the efficiency project. But these costs need to be deducted before any efficiency gains are reported.
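
To take a purely illustrative example: if a back-office project cuts annual running costs by £10m but incurs £3m a year in new IT and redundancy costs, the reportable efficiency gain is £7m, not £10m. And even that £7m can only be claimed with confidence if the £10m reduction is measured against a reliable baseline of what was being spent, and delivered, before the project began.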

For efficiency gains to be fully credible, public bodies need to demonstrate that:

  • baselines are in place that represent the situation before the initiative began
  • methodologies capture inputs (including any additional costs incurred as a result of the initiative) and outputs, in terms of both quantity and quality
  • data assurance is based on clear audit trails and independent validation

There is no doubt that government departments face a real challenge in trying to get this right. Deficiencies in their management information systems mean that their ability to track changes in output quantity and quality generally lags behind the needs of a programme that has to demonstrate that it is delivering efficiencies, not just spending cuts. This is important because a reduction in service quality is always going to be the biggest risk with a programme of this kind.

We would suggest that, for any given project, if a department is unable to satisfy the principles of good practice in efficiency measurement, it would be wise to look to an alternative activity to count towards its efficiency target. This does not mean that the project should be abandoned: in most cases, its contribution to an efficiency target is only one of many intended benefits.

Our report notes that there was a difficult transition from Sir Peter Gershon's review into the efficiency programme, with almost all the key individuals moving on. While naturally it took the Office of Government Commerce's efficiency team a little time to recover from this, we found that it has made major recent improvements in its monitoring of progress and in the support that it is providing to departments. During the course of our investigation, we have seen the team becoming much more effective in its work with departments on measurement issues.

More generally, there is still more to be done to ensure that all staff are motivated to achieve efficiency gains. For many, especially those at the end of long delivery chains, the efficiency programme can be perceived as more of an economy drive than something genuinely aiming to improve efficiency. If local bodies are to implement initiatives quickly, priorities need to be communicated clearly and funds outside existing budgets might need to be made available.

A further dimension to our report has been a search for good practice. From a number of sources, we have identified case examples of successful efficiency initiatives from public, private and voluntary sectors, within the UK and from overseas. From each example, we have extracted the main lessons for public bodies trying to improve efficiency.

The focus in government is very much on achieving the £21.5bn figure. Much less attention has been given to the programme's other objective of embedding efficiency into the public sector culture. We think that experience elsewhere shows that there is potential to go a lot further than the targets set for the current programme, if deeper and more systematic changes are pursued.

For this to happen, the public sector needs to improve in six main areas:

  • Strategic leadership from the centre of government
  • Staff professionalism and expertise
  • Quality and timeliness of data on efficiency and productivity
  • Integration of efficiency into day-to-day systems and thinking
  • Use of efficiency comparisons between organisations
  • Collaboration between public sector organisations

Part of this would involve public sector bodies having a better understanding of the drivers of cost and value in their organisations. They need to review on a regular basis how well the different aspects of their organisation are contributing to overall efficiency. To help with this, the NAO has been developing guidance that assesses an organisation's approach to achieving efficiency and identifies improvement opportunities. We will be launching this in the spring.

Measuring efficiency improvements is not an easy task, and no one should underestimate how hard it can be. But the value for citizens and public sector managers alike of having a clear and comprehensive picture of what is happening to inputs, outputs and quality will make the effort worthwhile.

Keith Davis is director of the National Audit Office's Efficiency Centre
