Big-data analytics: the power of prediction

27 Jan 16

The ability to anticipate demand will improve planning and financial efficiency – and collecting and analysing data will enable the public sector to look ahead


Hospitals around the country are well accustomed to huge annual rises in patient numbers as winter demand hits accident and emergency departments. But Wrightington, Wigan and Leigh NHS Foundation Trust (WWL) had to rethink service planning after unprecedented A&E demand during a sunny July 2014, which saw ambulances queuing outside the hospital. The trust now employs computer analysis to help predict and prepare for peaks in demand.

As public sector organisations grapple with ever-tighter savings targets, analysis of a broad range of historical data – big data analytics – offers an opportunity to pre-empt service requirements and so help the public sector manage demand more effectively and target scarce resources better. However, working with data to gain insight and save money is not without its challenges.

At WWL, a partnership with business support provider NHS Shared Business Services – a 50:50 joint venture between the Department of Health and technology firm Sopra Steria – resulted in a project that uses an analysis of historical data and complex algorithms to predict the most likely scenarios. In September, the partners launched HealthIntell, a suite of data reporting tools for A&E, procurement and finance.

The suite includes an application designed to help hospitals better cope with A&E pressures and meet waiting time targets. HealthIntell presents real-time data on attendances at A&E departments to doctors and other decision makers. It can predict demand on a daily and hourly basis, and allows trusts to use their own data to identify peaks and troughs – for example, the likely rise in attendances due to bad weather or major sporting events – to help deploy the right people with the right expertise at the right time.
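The daily and hourly prediction described here can be illustrated with a very simple baseline: averaging historical attendances by weekday and hour. This is a minimal sketch, not HealthIntell's actual algorithm; the function and data names are invented for illustration, and a real system would also model weather, events and trends.

```python
from collections import defaultdict
from statistics import mean

def forecast_hourly_demand(history, weekday, hour):
    """Baseline forecast: mean historical attendance for a (weekday, hour)
    slot. `history` is a list of (weekday, hour, attendances) tuples.
    Returns None if the slot has never been observed."""
    buckets = defaultdict(list)
    for wd, hr, count in history:
        buckets[(wd, hr)].append(count)
    slot = buckets.get((weekday, hour))
    return mean(slot) if slot else None

# Toy history: three Mondays (weekday 0) at 10:00 saw 12, 14 and 13 arrivals
history = [(0, 10, 12), (0, 10, 14), (0, 10, 13), (1, 10, 9)]
print(forecast_hourly_demand(history, 0, 10))
```

Even a per-slot average like this makes the "peaks and troughs" in the text visible; commercial tools layer regression or machine-learning models on top of the same idea.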

The trust sees around 280 patients in A&E every day, of whom 60 or so will be admitted and need a bed. An accurate picture of what is happening is crucial to planning to meet that demand. Information is presented throughout the trust’s hospitals, including on a 70-inch touchscreen installed in A&E, which helps to ensure enough staff are on hand and that sufficient beds are available for patients likely to be admitted.

“We never went into this thinking we needed to build an app,” Mark Singleton, head of business intelligence at WWL, told Public Finance. “Now it’s helping us strike the balance between a high-quality service and a financially viable and sustainable one. To make efficiency savings, you need to be looking forwards rather than looking at yesterday’s data.”

While Singleton remains tight-lipped about the cost of the project, use of HealthIntell at WWL has resulted in an 18.75% reduction in the average length of stay in A&E from 160 minutes in April 2014 to 130 minutes in February 2015 – well within the four-hour target regularly breached across the NHS last winter.
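The headline figure is straightforward arithmetic, worth checking: a fall from 160 to 130 minutes is a 30-minute saving on a 160-minute baseline.

```python
before, after = 160, 130  # average A&E length of stay, in minutes
reduction = (before - after) / before  # 30 / 160
print(f"{reduction:.2%}")  # 18.75%
```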

The use of predictive analytics is not the only way in which the app is influencing performance at WWL, says David Morris, managing director of NHS SBS. Greater team working at the coalface to discuss the data on the HealthIntell dashboard and work out action plans has undoubtedly paid off; it also represents a significant cultural shift, one not to be underestimated by organisations looking to emulate WWL’s success.

“There’s more to this than the app itself – it’s also how you use that information and how you, as a service, can improve as a result,” Singleton says. Morris adds that the visual nature of dashboards is a powerful aid to understanding: “Things will go wrong at times but this lets them understand why, so they can target recovery in the right way.”

Morris is broadly optimistic about the opportunities that big data presents for forecasting across the public sector, though he admits it has not been an easy sell, because big data is generally not well understood. However, interest is growing in the role that technology-enabled crystal-ball gazing could play in reducing costs across all spheres of the public sector, as the demand to do more for less continues to bite.

In his interim review of operational productivity across the NHS, published last June, Labour peer Lord Carter said that up to £5bn a year could be saved by 2020, citing smarter procurement of hospital supplies and better management of staff rosters as areas of focus.

Rikke Duus, a senior teaching fellow at University College London’s School of Management, agrees strongly that an evidence-based approach to providing services, using data that is already available, is key to efficiency gains. Although the use of big data across the public sector is trailing well behind that in the private sector, pressure is mounting for it to catch up. Consumers’ experiences with private sector organisations – in particular the growing personalisation of services – are raising expectations about the public services people receive.

Transparency, openness and integration can benefit consumers, Duus says. “It’s about reinventing the business model to cut costs and improve efficiency. We have to use data to predict and prevent. The public-sector mindset is getting there and the huge repositories of data held across the public sector offer a great starting point, but often they don’t know how to get into it and skills are an issue,” Duus says.

Burgeoning demand for analytics expertise in retail, banking and finance has created a severe skills shortage that is allowing big-data professionals to command an average salary of £55,000 – 31% higher than the average IT position, according to a report published in November 2014 by the Tech Partnership employers’ network and business analytics company SAS. More than three quarters of posts were considered “fairly” or “very” difficult to fill, and the situation is unlikely to have eased in the interim.

Professor Robert Fildes, director of the Lancaster Centre for Forecasting, part of Lancaster University Management School, warns that public sector organisations are at a distinct disadvantage when it comes to competing for such sought-after skills.

The centre has worked on a number of public sector forecasting projects, including a Department of Health initiative to predict pay drift for its non-medical workforce and a scheme commissioned by NHS Blackpool to forecast patient activity.

“The other constraint is data,” Fildes observes. “People talk about data as if it is a uniform value. But the Department of Health doesn’t have any real data on the demand for, say, hip operations. They only have data on the operations they’ve done. The data required for analysis isn’t good enough,” he says.

Duus says it is vital that organisations collect both the structured and unstructured data they need for effective decision making: “It’s very important to have clear objectives and [decide] what data you need to achieve that,” she notes. Combining data sets from a variety of disparate sources is not the only difficulty in getting this right, however. The security of data, data protection issues and countering widespread public cynicism about how data is used are issues that need to be addressed head on.

“For these sorts of project to work, there’s got to be a fit between the tools and the jobs that people do,” Fildes warns. “The key is engaging people but the public sector is so far removed from having the resources. A couple of software licences won’t get you anywhere.” Resources aside, he believes that success depends on projects looking to address operational problems rather than being IT driven.

Despite the challenges, projects are reaping rewards across a variety of public sector organisations. Since 2008, the London Fire Brigade (LFB) has been using software from SAS to prioritise the allocation of fire prevention resources, even pinpointing specific households most at risk of fire. The software brings together around 60 data inputs including demographic information, geographical locations, historical data, land use and deprivation levels to create lifestyle profiles for London households.
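A profiling model of this kind can, in principle, be reduced to a weighted combination of household attributes, one score per household. The sketch below is a deliberately simplified stand-in for the LFB's 60-input model; the attribute names and weights are hypothetical, not the brigade's.

```python
def fire_risk_score(household, weights):
    """Weighted sum of household attributes -- a toy analogue of a
    profiling model that combines many data inputs into one risk score.
    Attributes without a weight contribute nothing."""
    return sum(weights.get(key, 0.0) * value for key, value in household.items())

# Hypothetical inputs: deprivation level, past incidents, smoking in the home
weights = {"deprivation_index": 0.5, "past_incidents": 0.3, "smoker": 0.2}
household = {"deprivation_index": 0.8, "past_incidents": 1, "smoker": 1}
score = fire_risk_score(household, weights)
```

Ranking households by such a score is what lets the brigade target its limited home visits at the addresses most likely to benefit.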

Deaths caused by fire in the capital fell by almost 50% between 2010 and 2015, according to the LFB. It attributes much of the reduction to better targeting of around 90,000 home visits the brigade carries out each year, to advise on fire safety.

LFB analysis in 2013 showed that 31% of London households fall into a “young, educated” category and, while less likely to suffer a fire than average, don’t respond to home visits. A social media approach was instead used to influence this group.

The Department for Work and Pensions is also using SAS software to run simulations and test ‘what if’ scenarios, helping project everything from pensioner income distributions to benefit expenditure. HM Revenue & Customs is using predictive analytics to model pension demand.

Locally, Hackney Council has worked with software and services firm Xantura to pilot a risk profiling model in children’s services to help identify children most at risk of maltreatment and to target interventions more intelligently.

Hackney head of finance Hamza Yusuf says cost avoidance is the most complex part of any business case or financial model. “To quantify the savings, we must demonstrate that, but for the action taken, the avoided events would have been very likely to occur, and the avoided events need to be monetised by the finance department. Not all high-risk referrals will result in a very costly care placement being avoided, so both prudence and realism are required,” he wrote in Public Finance in September.
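Yusuf's point about prudence can be made concrete as an expected-value calculation: monetise only the fraction of high-risk referrals judged very likely to have led to a costly placement without intervention. The function and figures below are illustrative assumptions, not Hackney's actual model.

```python
def expected_saving(high_risk_referrals, p_placement_avoided, placement_cost):
    """Prudent cost-avoidance estimate: the number of high-risk referrals,
    discounted by the probability that a costly care placement would
    otherwise have occurred, multiplied by the cost of that placement."""
    return high_risk_referrals * p_placement_avoided * placement_cost

# e.g. 10 referrals, 40% judged likely to have led to a £150,000 placement
saving = expected_saving(10, 0.4, 150_000)
print(f"£{saving:,.0f}")  # £600,000
```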

Experts insist that good communication, employee engagement and senior-level sponsorship are critical to the success of any big-data project.

“You need buy-in to maximise use of the data,” Morris says. And, regardless of the thrust of your project, aiming to produce quick wins that demonstrate the value of the approach can be vital to keep more ambitious undertakings on track. “It has to be run as a managed project with a start and an end time, with project champions on the ground,” Morris advises.

Staff churn means that ongoing training is important, not only to ensure that dashboards are interpreted correctly but also to drive home the value that data analysis can bring to the organisation. “You need to think about how you can quantify the benefits of any project,” Fildes adds.

“People think that big data is all about ‘big’,” says Richard Neale, EMEA marketing director at business intelligence company Birst. “The reality is, it’s all about the little decisions that are made every day and making them more informed and more efficient. Big data is the key to unlocking that.”

When it comes to predictive analytics, local government has so far done little more than dip a toe or two into the waters of big data. That seems bound to change as more councils take the plunge and learn from the experience.

Devon County Council started a project last year to collect and analyse data with the intention of reducing traffic congestion and air pollution in Exeter. The Engaged Smart Transport project was set up by the council with a consortium of technology companies and the University of Exeter, led by IT services provider NTT Data.

“For us, this is very much about understanding the relationship between weather, where people will travel and how, so we can optimise the system,” explains Devon councillor Andrew Leadbetter. “Predictive analysis is very much where we want to be going because, if you can have a certain level of understanding about the future, it means you can get a bigger bang for your buck from your infrastructure. It’s a holy grail. We’re at the beginning of that journey.”