To predict the next crisis, governments need to go big on big data

The UK has relied on a hodgepodge of information scientists to fight Covid-19, but it's time to think on a grander scale

The 2013 book Big Data was a New York Times bestseller, with a manifesto predicting that the ability of society to harness information in new ways would prove transformative.

In the midst of the Covid-19 pandemic, its opening pages read like an exercise in naivety. The book discusses how Google search data in 2009 predicted the coming wave of H1N1 flu far more effectively than the systems of the US Centers for Disease Control and Prevention. Those with long memories may recall a contemporary article in the scientific journal Nature backing up the Google claim.

Big Data's writers thought that the next global outbreak would encounter powerful data tools to "predict and prevent its spread".

We now know it did not work out that way. What went wrong?

First, it can be argued that the events of the past 15 or so months were not a failure of big data itself.

More likely, the major factor has been the failure of some states to work out how to adapt to these tools, as much as any failure to meet the pandemic threat itself.

The failure to get a grip means that governments did not forecast developing threats and were incapable of harnessing information to guide their response.

Britain, for example, has an alphabet soup of bodies (variously titled with acronyms like Sage, Nervtag and SPI-M) that gave advice to the government or created the forecasts it relied on to respond to the pandemic. With the death toll above 100,000, the relationship between the government and its scientists has failed fundamental tests.

The system-wide faults stem from a casual approach to planning and projection that can be seen across the whole of government.

Britain's government has relied on advisory committees and data modellers for its Covid-19 strategy, but couldn't predict the pandemic ahead of time. EPA

George Robertson, the former secretary-general of Nato, observed last week that the lessons of a 2016 exercise on coping with a pandemic hitting the UK were not embedded in government systems ahead of the Covid-19 crisis.

When Boris Johnson’s former chief adviser Dominic Cummings released his personal email address last year and called for “super forecasters” to join his team, the waters were muddied yet further. Mr Cummings essentially wanted eccentric visionaries to set up radical initiatives for the country’s future. Yet because these forecasters also rely on analytical tools and statistical methodology, their recommendations are supposed to be bullet-proof.

As the pandemic hit, a new layer of advisers was pitched into the spotlight. The epidemiologists coalesced around projections of an uncontrolled spread of Covid-19.

The model used by Professor Neil Ferguson, who led the Sage work, has been described as an “angel hair pasta bowl” of an algorithm. No outside expert has replicated his numbers using his system.

Yet the Sage college of experts often hands down its conclusions without much challenge. For example, a scientist last week pointed out that combining two 90 per cent probabilities in a vaccine rollout lowers the overall figure to 81 per cent protection. His calculation was presented as a dire warning, yet herd immunity is widely seen as sitting at about 70 per cent with the current Covid-19 variants.
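
To spell out the arithmetic behind that warning: assuming the two 90 per cent figures (say, vaccine uptake and vaccine efficacy) are independent, the combined probability is 0.9 × 0.9 = 0.81, or 81 per cent overall protection.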

Casual and inconsistent processes that provide vital information are not limited to health care. The UK’s national broadcaster ditched the country’s meteorological office as the source of its weather forecasts, instead granting a contract to a Dutch-based firm. Complaints about the inaccuracy of the predictions have soared, and a national institution has been deprived of resources.

The government itself prefers the work of small-scale units of behavioural scientists who formulate “nudge policies”. These teams come up with incentives to change habits and attitudes, for example by drawing traffic lines on roads in new ways to curb speeding and steer drivers’ decisions.

Standing up a forecast or having a vision is one thing, but the demands of the times exist on a different plane. Placing a forecasting operation at the heart of policymaking would transform government.

This means not just tapping ad hoc academic groupings, or dotting government departments with chief advisers from the professions, or having in-house expert panels. And it certainly does not mean relying on occasional outreach to super forecasters to provide uncommon ideas.

Move away from nudging or, at least, subordinate the behavioural scientists. Recognise that epidemiologists are, like economists, so reliant on assumptions that their work can give guidance but not definitive answers.

Big Data makes a fundamental point. The advent of information at scale, as well as tools like supercomputers, the limitless cloud, search, curation and data-driven diagnosis, is a turning point. It takes us away from causality in analysis and decision-making.

Why something came about is less important than what it means for future actions. From the authors’ perspective, causal mechanisms are self-congratulatory and illusory.

The world under big data is shifting from causation to correlation.

The imperative for governments is to become a machine that handles, manages and processes the data, and for this function to be placed under direct senior leadership that masters its application.

The summit of G7 nations on Friday agreed to promote a warning system for the next pathogen through a network of pandemic surveillance centres. That announcement was an acknowledgement that scaling up forecasting is the name of the game in the aftermath of the Covid-19 pandemic.

Damien McElroy is the London bureau chief at The National