01 October 2004
Although the Home Office's latest police force performance figures caused widespread concern about the time officers spend behind their desks, they also contained some positive news about crime clear-up rates.
Or so it seemed. The problem now is that a leading independent policing expert is sceptical about the methods used to produce these figures. Worse still, the same critic says that there is little evidence to support ministerial claims that the public value police performance measurement as a democratic tool.
When police minister Hazel Blears published Police performance monitoring 2003/04 on September 22, an assessment of England and Wales' 43 forces, she was positively gloating. The figures indicated that police have cut crime, improved their investigation of offences, reduced sickness absence and promoted a greater sense of safety among the public.
Interestingly, many of the crimes that forces have tackled effectively are not connected with the Home Office's current obsession – antisocial behaviour.
Instead, it is traditional problems that seem to have been curbed. Burglaries have been reduced by 8.1%, from 19.8 per 1,000 households in 2002/03 to 18.2 last year. Robberies are also down, by 6.3%, from 2.1 per 1,000 households to 1.9. Car crime, meanwhile, has fallen by 8.9% over the year.
The number of offences 'brought to justice' – where offenders have been cautioned or convicted – has risen by 3% nationally, and the sickness absence in forces is down by 9%.
All this during a period when, Blears reluctantly revealed, police spent an average of just 53% of their time crime-fighting and 10% of it preparing cases for court. The remainder went on administrative and other duties.
'These [figures] demonstrate a remarkable year of achievement for policing,' claimed Chris Fox, president of the Association of Chief Police Officers.
Blears added that the 'overall picture of policing performance that emerges is encouraging'. All of which could lead us to believe the government's claims that the police have turned the tide in the battle against crime.
Not so, says Tony Travers, director of the Greater London Group at the London School of Economics, and one of the country's foremost policing academics.
He points out that overall detection rates are down by 1.7% nationally, with the worst culprit, London's Metropolitan Police Force, solving just one in every eight crimes.
But Travers' biggest gripe is with how the figures are compiled and presented to the public.
His criticism is threefold. First, he says the government has regularly changed the way certain crimes are measured so that the year-on-year comparisons highlighted by the PPM exercise are devalued. For instance, the Home Office's adoption of the new National Crime Reporting Standards in 2002 led to a 10% rise in recorded crime, according to researchers at the University of Essex.
Further regular changes, to police authority boundaries for example, have rendered other statistics 'little more than useless', Travers claims.
The Home Office, which has 'long been one of the government departments most opposed to targets and measurements', has moved the goalposts to stave off criticism, Travers contends.
Secondly, he believes that the Home Office has steered clear of obvious changes that could be made to the PPM regime that would make the data 'genuinely useful'.
Travers claims that a system of comparative police reporting used in New York, CompStat, has been ignored by the Home Office, despite being commended in a joint Cabinet Office/Treasury report this summer.
CompStat is a more accurate reflection of crime rates and trends, US experts say. It also allows the public to analyse police performance in areas as small as individual precincts. Yet UK police forces oppose its introduction here.
Finally, Travers also rejects Blears' claim that the PPM exercise 'plays a vital role in informing the public about how their force is performing'. He argues that the way the figures are illustrated – using complex 'spidergrams' and 'peer families' for comparative purposes – prevents any functional use by those with only a passing interest in policing.
Interestingly, in spite of the government's attempts to muddy the waters of comparison, it may yet be possible to produce a crude national league table for police forces on the back of the data used in the PPM.
The 'peer' system uses what horse racing fans would recognise as a handicap scheme to equalise local and regional variations. While some data may be slightly skewed, therefore, it is possible to tot up how each authority scores against the six criteria, and list them in a table.
Is it mere coincidence that in this crude list (see table below), all seven of the police authorities identified as in need of the Police Standards Unit's help over the past year – including Avon and Somerset, Greater Manchester and Northamptonshire – fall into the bottom third?
[Table: top five and bottom five police forces by overall score. The table is truncated in the source; the only surviving entry, under the bottom five, is Avon and Somerset with an overall score of 542.]