One for good measure, by David Griffiths

6 Apr 06
The latest guidance issued to councils on 'efficiency gains' is clearer and acknowledges the importance of quality. But there are still gaps, not least how all these savings should be measured. David Griffiths reports

Productivity and efficiency are back in the news with new data from the Office for National Statistics plus the National Audit Office's progress report on the Gershon programme. The Office of the Deputy Prime Minister followed these up on February 28 with updated guidance for local government, Measuring and reporting efficiency gains – a guide to completing annual efficiency statements.

The new guidance is markedly more coherent than the succession of documents issued in the run-up to the first annual efficiency statement submissions last April. But it also leaves many questions outstanding about the validity of the current approach to efficiency measurement and its links to the rest of the performance management framework for councils.

A significant and welcome shift is the increased emphasis on service quality and outcomes. The Gershon report itself effectively recognised all three of the traditional 'Es' within its definition of efficiency – economy in acquiring inputs, technical efficiency in using them to produce outputs, and effectiveness in producing the best mix of outputs for social benefit. Sir Peter Gershon himself expressed this last form of efficiency as: 'Changing the balance between different outputs aimed at delivering a similar overall objective in a way which achieves a greater overall output for the same inputs ('allocative efficiency').'
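
In formal terms – a sketch, using notation that appears nowhere in the guidance itself – if a service turns total input x into outputs q_1, …, q_n valued at social weights w_i, overall efficiency can be written as

\[
E \;=\; \frac{\sum_i w_i q_i}{x},
\]

and an allocative gain is any rebalancing of the q_i, holding x constant, that raises the weighted sum – as distinct from economy (acquiring x more cheaply) or technical efficiency (producing more of a given q_i per unit of x).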

This is clearly a key dimension if services are being reprovisioned to meet changing needs. The ODPM's Efficiency technical note of January 2005 failed to carry it over from the Gershon report, focusing instead solely on economy and technical efficiency. Now, although the ETN stands, the new guidance clearly acknowledges the issue. It states: 'It could be possible to say that where a council restructures service provision to provide a more efficient service to the same client group in a way that maintains overall output and service quality while costs reduce, then a legitimate efficiency gain has been generated.

'For example, if a council decides to close a day centre but recycles the money released… by increasing the uptake (through additional support) of direct payments, then an efficiency gain may be achieved. Some people immediately affected by the closure may experience (possibly only perceived) a drop in service but a greater number of people in the community might be served better, thus improving the overall service and benefit to the client group in question. Any cash reduction or service quality improvements will be an achieved efficiency gain.'

In the real local government world, of continual adjustment in the balance of service provision within tight budget constraints, this is a welcome development.

The mention of quality is also significant. To be fair, improved quality was always seen as an acceptable efficiency payoff – but how to measure it? It turns out that the ODPM has commissioned extensive work on this from the Institute of Local Government Studies (Inlogov) at the University of Birmingham. Further guidance is promised by early May to support preparation of the 'backward look' AES submissions in June.

Also welcome is a more realistic approach to inflation. In calculating real efficiencies, the applicable inflation rate can be an important consideration, for example when a contract price is frozen in cash terms. The ETN required local government to apply the gross domestic product deflator, with very limited exceptions. These have now been expanded to cover adult social care, highways, home-to-school transport and both capital and revenue aspects of social housing. Sector-specific indices are now specified for all of these.
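
A worked example shows why the choice of index matters (figures purely illustrative): suppose a contract price in one of these sectors is frozen in cash terms at £1m for a year in which the GDP deflator runs at 2.5% but the relevant sector-specific index runs at 4%. Then

\[
\text{real gain}_{\text{GDP}} = £1\text{m} \times 0.025 = £25{,}000, \qquad
\text{real gain}_{\text{sector}} = £1\text{m} \times 0.04 = £40{,}000.
\]

Under the ETN's blanket rule the council could claim only the smaller figure; the sector-specific indices allow the larger, more realistic one.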

There is also a much fuller discussion of capital efficiencies – although one that still raises major questions.

A final plus point is not part of the guidance, but very much part of the mood music that has accompanied its launch. Subject of course to future ministerial decisions, the strong and unsurprising hint from the ODPM is that we can expect a second three years of something very like the Gershon regime – from 2008 to 2011 – as part of next year's Comprehensive Spending Review. This means it will make more sense than before to take a long-term, strategic view of sources of efficiency over the next five years.

But this depends on getting the regime right for those next three years, and therefore on resolving some of the continuing issues of concern.

The biggest problem is this. The ODPM has substantially refined its technical advice on measurement and will no doubt continue to do so. Indeed, with the Inlogov work, the suite of 'measurement toolkits' – guidance notes – for each service sector, and the continuing deliberations of the 30-strong Measurement Task Force, this is something of a cottage industry. But no amount of refinement can get round the fact that items that are inherently very different are being added into the figures in the annual efficiency statements and in the national totals that ministers have announced.

After all, these headline numbers derive from adding together:

  • Cashable revenue savings, where the council spends less on something while maintaining the volume and quality of output (or possibly outcome).
  • Unit cost reductions, where expenditure may rise but output rises faster. (For example, the Department for Environment, Food and Rural Affairs allows cashable efficiencies to be claimed for waste management, where rising waste volumes help to spread fixed costs within a growing budget – see the sketch after this list. But is this either a financial or an environmental efficiency?)
  • Non-cashable gains, where some physical or human asset is worked harder to yield a higher output in the short term.
  • Capital efficiencies, which, because identical projects are rarely repeated, are often to be assessed against 'counterfactual baselines'. Worked examples in the new guidance include bid costs reduced from a 'previously projected' amount (projected when, and with what rigour?), and pre-contract cost reductions 'based on professional judgement'.
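
The waste management case in the second bullet turns on simple cost arithmetic. As a stylised sketch (not Defra's actual method): with fixed costs F, a variable cost c per tonne and volume V tonnes, the unit cost is

\[
u(V) \;=\; \frac{F + cV}{V} \;=\; \frac{F}{V} + c,
\]

which falls automatically as V rises, even though total expenditure F + cV is growing. The falling unit cost reflects scale, not management action – hence the doubt about whether it is a genuine efficiency.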

And affecting all these categories are three further underlying – and unresolved – issues. The first is: are we looking for technical or social efficiency? For example, procurement is a vital part of the Gershon programme, and of course it is possible to obtain inputs to the public sector more cheaply. But if this is solely through the application of bargaining power, without any technical improvement on the supply side, the public sector's gain is the supplier's loss and there is no net gain to society as a whole. The same would apply to 'productive time' gains achieved solely by intensifying labour input, as distinct from genuine productivity gains through, for example, mobile working.
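
The procurement point can be put arithmetically, as a stylised sketch: if bargaining power alone cuts the price of q units from p to p', with no change in the supplier's real resource costs, then

\[
\underbrace{(p - p')\,q}_{\text{public sector's gain}} \;=\; \underbrace{(p - p')\,q}_{\text{supplier's loss}},
\]

and the net gain to society is zero. Only a genuine supply-side improvement – a lower real resource cost per unit – creates efficiency in the social sense.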

The second issue is aggregation. Local government has an aggregate target of £6.45bn a year in efficiencies by the end of the three-year period. Is this the sum of each authority's (and school's) efficiencies, or the net national position? Similarly, are each authority's reported efficiencies the sum of efficiencies at departmental, or at cost centre, level? The total will depend on the level of aggregation, unless there are no offsetting 'disefficiencies' anywhere in the system. For example, workforce remodelling can boost efficiency by reducing unit labour costs for some tasks. But elsewhere labour costs may rise, perhaps through implementation of single status and equal pay agreements.

Should these costs be netted off? Or do higher wages for low-paid women represent a proper 'release of resources to the front line'? As Colin Talbot noted in Public Finance (February 24–March 2), it is possible for all efficiency projects to succeed and yet for the public sector to be less efficient at the end than it was at the start.
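
A simple illustration of the netting problem (figures hypothetical): suppose workforce remodelling saves £500,000 across an authority's schools in a year in which an equal pay settlement adds £300,000 to labour costs elsewhere in the same authority. Then

\[
\text{gross reported gain} = £500{,}000, \qquad
\text{net position} = £500{,}000 - £300{,}000 = £200{,}000.
\]

Summing only the 'positives' across every cost centre, authority and school yields the first figure; the public sector as a whole experiences something closer to the second.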

Then there's sustainability. To qualify, efficiencies must be maintained over the three-year horizon. (One-off savings in a year are of value only if the authority cannot otherwise reach the 2.5% annual target; they cannot count against the cumulative 7.5% target.) But how can this be known? For example, spending on empty housing might improve due to cyclical, and therefore reversible, circumstances as well as through management action.
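
The arithmetic of the targets makes the distinction concrete (baseline hypothetical): on a £100m budget, the 2.5% annual target requires £2.5m of new, sustained gains each year, so that by year three

\[
\text{cumulative ongoing gains} \;=\; 3 \times 2.5\% \times £100\text{m} \;=\; £7.5\text{m per annum}.
\]

A one-off £2.5m saving can plug a gap in a single year's 2.5%, but because it does not recur it contributes nothing to the cumulative 7.5% that must still be being delivered in year three.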

All of this means that there is a very real danger that we will get better and better – locally and centrally – at refining and adding up numbers which, at the end of the day, don't mean very much at all.

So what to do? There seem to be three broad options. One can take Professor Tony Travers' view that this is a symbolic process, in which the naked emperor and the admiring crowd just have to play out their parts. But this seems too cynical. After all, efficiency and productivity do matter. Most of what local government does is in the public sector for a reason – because markets for the services in question are non-existent or subject to serious failures. Those services therefore face little competitive pressure and, because they are labour-intensive, have an inherent tendency to rising relative cost as real earnings grow – while the public resists paying higher taxes. So the efficiency agenda will not go away, and we need to develop a more sustainable approach, particularly for the 'second three years'.

A second option is to continue to refine the present measurement regime. This surely means tackling some of the confusions detailed above. It would mean more focus on management than on financial accounting; attention to 'negatives' as well as 'positives'; and a more prescriptive regime, to ensure accurate capture of true technical and allocative efficiencies within a closed accounting system.

This seems to be where the National Audit Office is headed. Its recent report, Progress in improving government efficiency, clearly accepts the official view that the current AES arrangements for local government are a 'light touch' regime. Its concerns seem to have driven another aspect of the new ODPM guidance – an increased emphasis on the review of each authority's process by the Audit Commission and, by exception, by the ODPM itself. The NAO argues for greater emphasis on cost baselines, audit trails and data assurance, all of which would point to greater prescriptiveness in the measurement guidance.

Thirdly, the government could take a 'big picture' approach. As Talbot put it: 'The only way of knowing if overall efficiency is growing or declining in a public service is to measure it overall.' What matters here is that the total balance between local government costs and outputs or outcomes continues to improve. Total costs are not hard to measure; outputs are harder, and outcomes harder still. But measuring the change in outputs or outcomes through time is perhaps not quite so challenging – and the Office for National Statistics' programme of work on measuring government output, in the wake of last year's Atkinson report, has the potential to help.
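
In index number terms – a sketch of the kind of measure Talbot's argument implies, not a formula from the ONS programme – with total cost C_t and an aggregate output or outcome index O_t, overall efficiency and its growth are

\[
E_t \;=\; \frac{O_t}{C_t}, \qquad
\Delta \ln E_t \;=\; \Delta \ln O_t \;-\; \Delta \ln C_t,
\]

so the whole measurement burden falls on the output index O_t – precisely the problem that the Atkinson report and the ONS work on government output are addressing.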

Essentially we are in the realm here of an overall Comprehensive Performance Assessment-type judgement, and many of the detailed measurement issues rehearsed above simply melt away. Instead of refining these, the government – and the taxpayer – would rely for continued efficiency improvement on the incentives provided through the pincer effect of public demand for services, tight money and electoral politics. In this model, the financial focus is on the volume of expenditure relative to what it achieves, rather than on cost accounting at the micro level.

I believe this would be the best approach. It goes to the heart of what matters, while avoiding death by measurement bureaucracy. It could be integrated into the new performance information framework recently put out to consultation by the ODPM, with its emphasis on the 'efficient, effective authority'. But it does, of course, depend on a more consistently trustful relationship between central and local government.

If this is a step too far for ministers, perhaps councils could be placed under two efficiency reporting requirements. One would be to continue to demonstrate economy, efficiency, effectiveness and therefore value for money at the aggregate level, through the CPA mechanism or its successor. The other would be to list in an 'annual efficiency statement', or more simply within the Best Value performance plan or annual report and accounts, a select list of areas where deliberate management action has resulted in genuine (and auditable) cost improvements.

The present AES arrangements seem to fall between these two stools. They are supported by increasingly detailed, but unavoidably flawed, guidance on the measurement of cost improvements. But they also demand a spurious aggregation of hard-to-compare items, which focuses attention on 'adding up' rather than cost improvement and leaves the regime open to criticism.

All-round performance, and the incentives to improve it, are what matters. Efficiency is too important to be left to the efficiency experts.

David Griffiths is head of corporate support and efficiency programme manager at Kirklees Metropolitan Council. The views expressed are his own
