Safety in numbers

12 Jun 09
The new statistics authority bared its teeth when the government misused crime data last year. But there’s a way to go before public trust in official figures is restored
By Jill Leyland

30 January 2009

In December, a minor political storm was sparked when the chair of the UK Statistics Authority criticised the government for publishing figures on teenagers hospitalised as a result of knife crime. In a public letter to Number 10 permanent secretary Jeremy Heywood, Sir Michael Scholar expressed concern that Downing Street officials caused the Home Office to quote the unchecked data in a press release.

The premature publication of these data was against the advice of official statisticians, and in breach of the National Statistics code of practice. Criticism of the government, which was forced to apologise in Parliament, came not just from the media and opposition parties but also from two senior Labour MPs, Tony Wright and Keith Vaz, respectively chairs of the public administration and Home Office select committees.

In January, the authority followed up with a detailed analysis of why the press notice, and its accompanying ‘factsheet’ on the Tackling Knives Action Programme, infringed the code of practice.

This incident was of more importance than the temporary political embarrassment caused to the government. It indicated that the new authority, established on April 1 last year under the Statistics and Registration Service Act 2007, had set its face against the misleading use of statistics – ‘spin’ – by the government. It also showed that the authority was able, at least on this occasion, to muster sufficient clout, with media and parliamentary support, to impose its view.

The Royal Statistical Society sees this as a notable start but we believe that more progress is needed.

Official statistics are important. Every day, decisions are made – by central, devolved or local government, by commercial companies or by individuals – that are based on, or influenced by, some set of official data. You do not have to dig very far into almost any area of our national life before you find some piece of information linked to an official figure. Statistics are not the most high-profile element of government work but, like the foundations of any building, their quality and trustworthiness are crucial. Like Caesar’s wife, they should be above suspicion.

Yet public confidence in official data is low. The latest survey by the Office for National Statistics, published in March 2008, showed that only about a third of respondents trusted government statistics, while just 20% believed that statistics were produced without political interference and 16% that ‘the government uses figures honestly when talking about its policies’. In 2007, a Eurobarometer survey found that trust in statistics in the UK was lower than in any other of the 27 European Union member countries.

This low level of trust in official statistics poisons and distorts public debate and undermines decision-taking. It also does not reflect fairly the intrinsic quality of UK statistics or the calibre of those producing them. Rather, many of the ills that affect UK statistics can be found in the decision-making structures that surround them and in the way they are used by government.

The Statistics and Registration Service Act aimed to make the production of official statistics more independent of government. It established the authority, which is independent of ministerial control and reports directly to the parliaments in Westminster and Scotland and to the Welsh and Northern Irish assemblies. It has direct responsibility for the Office for National Statistics and oversight of official statistics as a whole. The authority has the objective of ‘promoting and safeguarding the production and publication of official statistics that serve the public good’ and is obliged to produce a code of practice against which all national statistics have to be assessed.

The Royal Statistical Society, the UK’s only professional and learned society devoted to the interests of statistics and statisticians, has long argued for reform of the official statistical system. It has taken an active role in discussions around the various changes introduced by the current government.

Along with the (now abolished) Statistics Commission and other bodies such as the Statistics User Forum, the RSS lobbied hard for changes to the government’s original proposals for the Bill, with a fair degree of success. Of course, the Act is not perfect. ‘Independence’ from political oversight is partial since the ONS produces only certain data. The UK has long had, and retains, a decentralised statistical system in which many datasets are produced by departments and thus remain, at least nominally, under ministerial control. These include such high-profile figures as those on crime, health and education. Some statistics are legally under the control of the devolved administrations.

The RSS believes many things need to be done to unlock the potential benefits of the 2007 Act. It listed its concerns in a letter to Sir Michael Scholar and some of these are already being tackled. Four elements highlighted as crucial are: better planning; better communication; more attention to the needs of users, particularly outside central government; and the proper separation of policy comment and statistical information.

Official statistics need planning and should evolve as society and the economy evolve. This is an area where the UK statistical system has failed in the past and still sometimes fails: witness the slowness to develop statistics reflecting the growing importance of the service sector, and the well-aired problems with migration statistics. The authority is already assessing planning procedures.

Communication and presentation of statistics have also been weak points of the UK statistical system. Symptoms have included the much-criticised ONS website (now being revamped), turgid and unhelpful press notices, and at times a failure to communicate adequately with users.

Producers of official statistics need to become more outward-looking and to engage with society in general. The needs of central government are generally well catered for, but more attention needs to be paid to other users such as local authorities, demographers and economists. The authority’s code of practice emphasises this and obliges statisticians to assess and document user needs. We await the practical results.

Finally, we return to where this article began: the separation of policy comment and statistics and, more broadly, the proper use of statistics by government. All evidence points to politicians’ misleading use of statistics as one of the crucial factors behind the low levels of trust in UK official data. We believe it is crucial to address this.

‘Spin’ might be a part of the rough and tumble of political life but it has no place in the general work of the civil service. Documents for public information, such as the Tackling Knives Action Programme factsheet, should be what they imply – a balanced assessment of the facts.

The RSS has, therefore, welcomed Cabinet secretary Sir Gus O’Donnell’s commitment to learn the lessons of the December events, particularly in recognising that the code of practice ‘covers all officials and advisors who use and quote official statistics’, and that it is ‘essential that statisticians are involved at an early stage in the production of any publications that contain official statistics’.

Government can be sure that many bodies, the RSS among them, will be watching developments with more than active interest.
