The blame game, by Arthur Midwinter

26 Oct 06
It's one thing to demand that local authorities prove they have achieved desired outcomes, but quite another to do it. Arthur Midwinter argues for a more realistic approach to performance measurement in Scotland

Performance measurement with an emphasis on outcomes is a key element of New Labour's rhetoric on public services and Audit Scotland's approach to Best Value in local government. But councils keep falling short of the mark. So is this their problem – or is the system letting them down?

Under the statutory guidance setting out what constitutes the 'proper arrangements for Best Value', Scottish local authorities are required to set aims, objectives and targets for all their activities. Performance against them must be assessed and monitored within an integrated framework of community, corporate and service plans, budgets and performance reports.

One of Audit Scotland's central tenets is that a robust performance management system is crucial for councils to achieve continuous improvement in service delivery.

However, a recurring criticism of Scottish councils is that they lack such systems. In particular, Audit Scotland's inspectors have repeatedly drawn attention to failures to link budgets to outcomes; an over-reliance on statutory performance indicators (SPIs); and a lack of systems to measure outcomes.

The problem lies not in the competence of Scottish councils – it is the lack of realism in the system itself.

First, the rhetoric of budgeting for outcomes far exceeds the capacity of government to implement such models successfully. Outcome indicators have been recognised as problematic across the globe because of the inherent difficulties of linking budgets to outcomes. Inevitably, a whole host of external factors come into play that influence the result of any budget decision.

Several councils have been criticised by Audit Scotland for their failure to link budgets better with plans, and to redistribute funding accordingly, rather than following historic trends. In practice, though, budgetary systems are incremental, in the sense of focusing on marginal changes to baselines, where reallocation is feasible.

Rather than pursuing the hierarchical, overly bureaucratic model advocated by Audit Scotland, councils would be better served developing a strategic approach to guide resource allocation, by setting out a small number of clear corporate priorities.

Secondly, the over-reliance on performance indicators is understandable, because of the emphasis in Best Value on benchmarking with comparable authorities. However, some better indicators of performance can be found in national statistics publications, and councils could make much more use of this information.

The criticism that councils lack outcome measures is difficult to take seriously given the limited state of the art globally, and the same lack of outcome measures in the SPIs. Indeed, most of these are simple efficiency measures of collection rates or processing times, when what Best Value requires is better output measures, so that service performance can be properly assessed.

In the current set of indicators, 18 measure outputs, seven measure outcomes and four measure costs. Indeed, the Accounts Commission has recognised that SPIs provide a less-than-rounded picture of council performance.

Research in Wales, conducted by the Centre for Local and Regional Government Research at Cardiff University, supports this view. The research team there identified a lack of data and expertise, and problems with defining both outputs and outcomes.

The auditors use SPIs to assess performance through a performance index. This measures the ratio of improvement to decline, based on indicators that record improvements or reductions in quality greater than 5%. Between 2001/02 and 2003/04, for example, it gave a performance index of 1.04 for Scotland as a whole.

The index is derived arithmetically, simply by dividing the number of indicators reporting a greater than 5% improvement by the number reporting a greater than 5% decline in performance. For example, 429 divided by 411 gives an index of 1.04.
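The calculation described above can be sketched in a few lines; the figures are those quoted for 2001/02 to 2003/04, and the interpretation of the index as a simple ratio follows the description in the text.

```python
# Audit Scotland's performance index, as described above: the ratio of
# indicators improving by more than 5% to those declining by more than 5%.
improved = 429   # indicators showing a > 5% improvement (2001/02-2003/04)
declined = 411   # indicators showing a > 5% decline

index = improved / declined
print(round(index, 2))  # 1.04
```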

But this is just a partial measure that excludes most SPIs. What's more, there are ten indicators that already record scores of over 90%. This makes it hard for an authority to deliver a 5% improvement, and mathematically impossible for authorities that already score above 95%.
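The ceiling effect can be made precise, assuming (as the text implies) that "a 5% improvement" means a 5% relative increase in a percentage score: once a score exceeds 100/1.05, roughly 95.2%, a further 5% improvement would take it past 100% and is therefore impossible to record.

```python
# Ceiling effect for percentage-scored indicators: a 5% relative
# improvement is only recordable if it would not exceed 100%.
def can_improve_5_percent(score: float) -> bool:
    """Return True if a 5% relative improvement stays within 100%."""
    return score * 1.05 <= 100.0

print(can_improve_5_percent(90.0))  # True: 94.5% is attainable
print(can_improve_5_percent(96.0))  # False: 100.8% is impossible
```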

Consequently, there are a significant number of indicators on which some or most councils will have no capacity for demonstrating improvement. These include processing times for: social work enquiry reports; invoice payments; building warrants; supervision orders; and business advice requests.

The same problem applies to indicators that measure the length of time taken to attend to problems and complaints, such as noise, pest control and street and traffic light repairs. It is also an issue in relation to collection rates for council tax; non-domestic rate income; council house rent arrears; and rent loss through voids.

This approach understates the scale of improvement in performance among Scottish local authorities. Nor is it consistent with the standards they are expected to meet, since there is no statutory requirement for improvement above 5%.

Reworking the data to include all indicators showing improvement or decline increases the performance index from 1.04 to 1.50 – a figure that suggests a much more positive picture.

So it is clear that current practice in assessing performance for Best Value audit has considerable scope for improvement.

First, there is a strong case for using fewer, but more relevant, SPIs, which provide information on the comparative costs of service delivery along with comparative data on service standards.

Secondly, there is a need to spell out the evidence base that underpins judgements on a local authority's performance, as too often this is simply asserted, not demonstrated.

Thirdly, there is a need to reconsider the assessment of the structures and processes associated with governance, since Audit Scotland has not provided any coherent body of evidence that this is an essential precondition of continuous improvement.

Indeed, most of the councils criticised for having inadequate arrangements for delivering Best Value actually perform well in service delivery on the auditor's own data. Shetland Islands Council, which was strongly criticised last year, is a good example in this respect.

Finally, Audit Scotland needs to consider a more selective approach as public spending tightens and funding becomes less generous. It is inevitable that in such a climate not all services will be improved. Indeed, some will face cuts if they are deemed to be low priorities.

In short, the current approach needs greater analytical rigour, and a more realistic audit model. Otherwise, there are likely to be serious challenges to audit findings that are open to contestable interpretations.

Arthur Midwinter is a visiting professor at the Institute of Public Sector Accounting Research at the University of Edinburgh
