Departmental business plans: new wine in old bottles, by Colin Talbot

8 Nov 10
The coalition government today announced departmental business plans across Whitehall. These are supposed to be 'revolutionary', but the revolution is more spin than substance

The coalition government announced today a series of ‘Departmental Business Plans’, following up on the Spending Review announcements last month. These plans are supposed to be ‘revolutionary’ in several ways. But this ‘revolution’ is more spin than substance, with much of what they are doing simply following in the footsteps of previous governments – Conservative as well as Labour.

Chief Secretary to the Treasury Danny Alexander has said about the new plans: ‘All departments in previous governments have had plans for what they are going to do. The difference is that we’re making our plans public, so that the public can see how we intend to go about our business, what we intend to do in any department, any area that people are interested in.’

This is, of course, nonsense. John Major’s Conservative government started publishing business plans for government departments back in the early 1990s. By 1996 they had also started publishing an ‘Output and Performance Analysis’ (OPA) statement for each department, showing how well they were doing on important areas of delivery. The New Labour government from 1997 evolved OPAs into PSAs – public service agreements – which did a similar job.

So on the key issue of ‘transparency’ for performance, what the government is doing is nothing new. Some of the data it is publishing on ‘inputs’ (costs) is new, at least in the level of detail, but it is as yet unclear how useful it will be. Without proper ‘activity-based costing’, which by and large Whitehall can’t do, simply publishing more detailed lists of spending will be fairly useless to anyone.

A brief glance at the Home Office plan confirms this – the financial data is less detailed than that already published in departmental annual reports and accounts.

The other key claim is that the new system has dumped the much-hated ‘targets’ of New Labour. This claim is partly true, but only partly, and it is potentially misleading.

Where it is true is that the target-based PSAs have gone, and with them some of the cascaded targets for local government and other local services. Where it is not true is that all targets have been scrapped – waiting times for A&E, for example, have been changed but not scrapped.

Where it is misleading is the idea that the ‘burden’ of providing performance information has also been scrapped. In fact, far from being scrapped, it seems to be increasing in many areas. According to my sources within the NHS, for example, the demand for various types of performance information has started to increase – on A&E waits, for instance, the Department of Health is now demanding far more detailed information than before.

The new business plans also introduce a whole new category of ‘targets’ in the form of reform ‘milestones’. The Home Office plan, for example, details 63 milestone targets to be hit by 2015. Milestones are a particular sub-set of targets, ones that have to be hit by a certain time and usually relate to structural and systems changes. But they are targets nonetheless, and there are hundreds of them across government in this new system.

Of course most of the ‘milestone’ type data is of very limited interest to the ordinary citizen (although not of no interest). What matters most to most people is the actual services that affect their daily lives.

Here the government claims that the data it will publish will enable people to take charge, make choices and force change. There is a big assumption here that this is what people want – research in the British Social Attitudes survey suggested that choice in public services is far less popular than politicians think. But leaving that aside, what is different about the new approach?

‘Data which will help people to make informed choices’, as the Home Office plan calls this sort of information, turns out to be not very much different from the sort of data that was published under Public Service Agreements. Instead of ‘outcome’ indicators under New Labour we now have ‘impact’ indicators under the coalition – if you can figure out the difference, I’m willing to offer a prize. On a first, quick, look through the plans there appear to be roughly the same number of such indicators as there were ‘targets’ in the old New Labour scheme.

The only substantial difference is that the coalition’s impact indicators don’t have targets attached to them. The government says this is because it is up to citizens to make their own judgements. A cynic might respond that it also has the added advantage that ministers cannot be held to account for missing their own targets – remember Estelle Morris resigning as Education Secretary precisely because of such a failure on exam results? Perhaps ministers in the new government have learnt something. When they talk about replacing ‘bureaucratic accountability’ with ‘democratic accountability’, they clearly don’t include their accountability to the electorate in the latter.

There are important areas where the coalition is cutting performance reporting – especially the compilation of comparative data of how well your council, or local services, is doing compared to others. The Comprehensive Performance Assessments and their successor Comprehensive Area Assessments, developed by the Audit Commission, have gone, soon to be followed by the commission itself. Again, cynics might conclude that some forms of performance information might allow voters to make rather too well informed decisions.

Does anyone remember the last government’s annual reports? Probably not, because they are long dead. New Labour boldly decided to publish a government annual report – starting for 1997-98. They managed to produce four of these glossy publications, which were sold (in very small numbers) via WH Smith, before the reports died of embarrassment.

The reporting in the annual reports was relentlessly one-sided – successes were highlighted, failures side-lined or ignored. No-one was independently auditing the data being used to produce the reports, so it was impossible to tell if it was accurate. Subsequent studies of PSA data by the National Audit Office showed that a great deal of it wasn’t.

It will be interesting to see how much of this new wave of ‘transparency’ survives contact with reality. Will the government’s ‘transparency’ website – the modern equivalent of the annual reports – go the same way? We shall see.

Colin Talbot is professor of public policy and management at Manchester Business School. This post first appeared on Whitehall Watch
