The recent rail franchising debacle has delivered a new body-blow to trust in Whitehall statistics. It now appears that Department for Transport officials got their modelling badly wrong, notably in relation to assumptions about inflation. The more complex government becomes and the less officials can understand the numbers they rely on, the greater the chance of serious mishap. The government might yet face a multimillion-pound compensation claim because of this numerical disaster.
Ahead of the chancellor’s Autumn Statement, we are bound to see a further focus on measures of public borrowing, growth forecasts and inflation assumptions. The basis of government policy, market responses and public sentiment can be swayed by ‘better’ or ‘worse’ sets of numbers. Yet the very omnipresence of statistics and public data makes it difficult to observe problems and weaknesses. Moreover, there can be an Alice in Wonderland quality to debates about improving existing numbers. Statistics are not always what they seem.
For example, the Office for National Statistics has recently launched a consultation about the future of the Retail Prices Index and the Consumer Prices Index. Even though one of these measures refers to ‘retail prices’ and the other to prices faced by ‘consumers’, they are not the same. The RPI is generally somewhat higher than CPI, because the two measures involve different baskets of goods/services and are also calculated differently. The CPI is broadly consistent with measures used internationally.
In the 1980s, Margaret Thatcher’s government invented the Tax and Prices Index to show that inflation including tax cuts was falling (at least until it didn’t), while RPIX has been used for many years to provide a figure for the RPI excluding mortgage interest payments. Another measure, RPIY, strips out both mortgage interest payments and a number of indirect taxes. Beyond all of these is the GDP deflator, which is used to calculate ‘real-terms’ public spending and is supposed to measure inflation across the whole economy.
How inflation is measured matters to us all. In April 2011, the government switched from using the RPI to the CPI for the indexation of benefits, tax credits and public sector pensions. Index-linked government bonds continue to be adjusted in line with movements in the RPI, as do large numbers of private sector pension schemes. For students in England and Wales, the interest rate on their student loans depends on the RPI. The RPI is also often used in pay bargaining and for price regulation, notably for certain privatised utilities and also for train fares. If the ONS decides to reform the RPI, it will materially affect millions of incomes and all household spending.
It is easy to see how The Economist has resorted, semi-seriously, to using the ‘Big Mac Index’ to compare purchasing power in different countries. Attempting to assess the compound impact of changes in inflation and exchange rates from country to country is very difficult. The cost of an internationally quality-controlled item such as a Big Mac or Mars Bar gives a clearer sense of what is going on.
There is no true measure of inflation. Oligarchs concern themselves with the rising costs of Mayfair apartments and yachts, while pensioners worry about milk, fuel and bread prices. Owner-occupiers see the cost of living differently to those who rent. Although we have a single UK measure of RPI and CPI, it is improbable that prices are currently increasing as fast in Milton Keynes as in the Western Isles or Newcastle. Inflation measures are constructs.
Many other statistics suffer in similar ways: they are methodologically complex artefacts, not absolute truth. Governments have fiddled with crime and unemployment figures until it is hard to know what precisely they are telling us. Population figures also provide good examples of hard-to-interpret numbers. There are big differences between official 2011 Census figures and the mid-year estimates previously available. London’s population turned out to be 400,000 higher in 2011 than the 2010 mid-year estimates suggested. Such numbers matter: they are used to calculate grants to localities for local government, health and other services.
But the vagaries of inflation, crime or unemployment indicators pale into insignificance alongside some of the adjustments made to time series of statistics produced by the government, which are also, in many cases, official statistics. Take the apparently simple question of whether expenditure by central and, separately, local government is rising or falling. At a time when the chancellor has made huge efforts to reduce public spending as part of his deficit reduction policy, it is surely important to be precise about the path of Whitehall and council spending.
Looking at Public Expenditure Statistical Analyses 2012, there are tables showing ‘Central government own expenditure’ and ‘Total local government expenditure’. The former rose from £506.2bn in 2010/11 to £513.8bn in 2011/12, an increase of 1.5%. Local government expenditure reduced from £175.1bn in 2010/11 to £174.2bn in 2011/12, a fall of 0.5%. So, although spending by the centre moved up by a small amount and by councils down a fraction, there was not much difference between the two.
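Those headline changes can be checked directly from the PESA figures quoted above. A minimal sketch (figures in £bn as given in the text; the helper function is illustrative, not part of any official methodology):

```python
# Year-on-year percentage changes from the PESA 2012 figures quoted above (£bn).
def pct_change(old: float, new: float) -> float:
    """Percentage change from the earlier year to the later year."""
    return (new - old) / old * 100

central_2010_11, central_2011_12 = 506.2, 513.8  # central government own expenditure
local_2010_11, local_2011_12 = 175.1, 174.2      # total local government expenditure

print(f"Central: {pct_change(central_2010_11, central_2011_12):+.1f}%")  # about +1.5%
print(f"Local:   {pct_change(local_2010_11, local_2011_12):+.1f}%")      # about -0.5%
```

The arithmetic reproduces the 1.5% rise and 0.5% fall cited from the PESA tables.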
Yet local authority employment is falling significantly faster than central government’s. Since the coalition took office in the second quarter of 2010, the number of council workers has fallen by more than 300,000 while the number of central government employees has dropped by about 90,000, from a similar base. The gap implied by these employment figures appears far wider than the spending figures suggest. Perhaps the official spending figures are masking true changes.
Analysis of the local government spending figures shows a jump in ‘local authority self-financed expenditure’ (LASFE) from £25.7bn in 2010/11 to £37.6bn in 2011/12 – an increase of £11.9bn in a year. The mystery of the big jump in self-financed expenditure is not explained in the Treasury’s PESA document. But there is a helpful table in the Office for Budget Responsibility’s March 2012 Economic and fiscal outlook showing that locally financed capital expenditure (a component of the LASFE) has been the subject of an adjustment because of a transfer resulting from the reform of the Housing Revenue Account.
This shifted about £8bn of spending from central to local government between 2010/11 and 2011/12. Without this ‘adjustment’, local government expenditure in the PESA document would have been closer to £166bn than the £174bn shown. Council spending would have fallen by more than 5% rather than by the 0.5% shown. Central government spending would have been correspondingly higher, up about 3% in 2011/12.
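Reversing that £8bn transfer makes the effect plain. A sketch of the adjustment, using the approximate figures cited above (all in £bn; the £8bn is the rounded transfer described in the text):

```python
# Stripping the ~£8bn Housing Revenue Account transfer back out (all figures in £bn).
hra_transfer = 8.0

local_2010_11, local_2011_12_published = 175.1, 174.2
central_2010_11, central_2011_12_published = 506.2, 513.8

# Undo the shift: the £8bn counted as local spending is moved back to central.
local_2011_12_adjusted = local_2011_12_published - hra_transfer      # ~166.2
central_2011_12_adjusted = central_2011_12_published + hra_transfer  # ~521.8

local_fall = (local_2010_11 - local_2011_12_adjusted) / local_2010_11 * 100
central_rise = (central_2011_12_adjusted - central_2010_11) / central_2010_11 * 100

print(f"Adjusted local spending:   £{local_2011_12_adjusted:.1f}bn (down {local_fall:.1f}%)")
print(f"Adjusted central spending: £{central_2011_12_adjusted:.1f}bn (up {central_rise:.1f}%)")
```

On these rounded figures, council spending falls by just over 5% and central spending rises by about 3%, as the text states.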
This analysis might seem desperately arcane. But it shows how the failure of the government’s PESA document to provide a footnote explaining the £8bn accounting change has rendered a proper analysis of local (or central) government spending virtually impossible. In fairness to the Treasury, in a separate section on adjustments to the national accounts it is noted that the HRA reform ‘represents a receipt by central government of net capital grants from local authorities that implement the reform of council house financing announced in the Spending Review. This net receipt is completely offset by a net payment included within capital LASFE, so this has no impact on the overall public finances’. This is an obscure way of saying that £8bn of spending has, by an accounting adjustment, been shifted from central to local government.
It is possible there have been other adjustments that move spending the opposite way: from local to central government. The Department for Communities and Local Government publishes annual local authority revenue statistics. Its budget estimates for 2012/13 show overall spending down by 3.1% in cash, with some services, notably transport and planning, falling significantly more. Education is shown as dropping by 8.6% year on year.
However, there are two problems with this latter figure. First, councils have little or no control over the bulk of schools’ spending, as ministers determine the amount of the Dedicated Schools Grant paid to local authorities. Second, as academies are created, their spending shifts across from local to central government. In this case, a reform is taking place that overstates the reduction to council spending while adding to central government budgets. Any observations about the pressure on budgets based on these spending figures will be either wrong or, at best, misleading. Next year, spending on public health will transfer from central to local government, further confusing the picture.
Beyond the relentless movement of spending from central to local government and, separately, accounting adjustments, there are reclassifications brought about by ONS decisions about the blurred border between the public and private sectors. The most recent public sector employment figures appear to show a big drop in the central government workforce. In fact, the whole of the reduction was because 196,000 further education employees had been transferred from the public to the private sector. In fairness, the ONS included text to explain this change. But in future quarters there will be little more than a footnote to denote the change. A slight change in the governance of colleges, schools or, indeed, hospitals could lead to further reclassifications.
Next year, the transfer of the Royal Mail pension fund to central government will cut apparent spending by £28bn. Misleading conclusions will be drawn.
Government cannot stand still forever. But the restlessness that affects Britain’s public administration means that these kinds of accounting and classification changes will make it almost impossible to be certain whether a trend is a trend. Departments have regularly been reconfigured, councils restructured, services transferred from one part of government to another, assets reclassified and employees moved between the public and the private sectors. Time series in official publications must be analysed with such caution that it will often be safer not to compare any one year’s figures with another.
In the end, it is very difficult to know whether local government spending is falling by 0.5% or 5%, although the impact on the ground will be easy enough to see. The number of public sector employees can fall and the private sector grow because of a statistical definition.
The closer we approach a time when even people who can find their way round official statistics realise they cannot be sure what any particular number means, the worse it will be for democracy. If we cannot be sure whether an asset or employee is in the public or private sector, we have a problem. If we cannot measure how cuts in public spending are affecting sub-sectors of the economy, people will resort to anecdote and hunch.
Looking ahead, the Office for National Statistics and the UK Statistics Authority must take the lead in ensuring that official data are produced in ways that allow comparisons over time, regardless of the configuration of government. The ONS website, which often makes it hard to find things, needs to be improved. There should also be more ‘overlapping’ tables of time series that make it possible to look across radical changes in classification, organisation and the delivery of services.
We need the ONS to represent the public interest in the production of trustworthy, consistent figures about the operation of government. No amount of official ‘open data’ policy will overcome the problem of inconsistent or ill-presented statistics. Statistics must not be allowed to give way to damned lies.
Tony Travers is the director of the Greater London Group at the London School of Economics