Facts Are Facts — Aren’t They?
There can be few more critical issues for the country right now than the protection of those on the front line of Covid-19. We stood on our doorsteps and applauded them; we require our Government to do right by them.
It looked as though it would. Last year, we were told that one of the five criteria to be met before the first lockdown was eased was that the supply of tests and PPE would meet future demand. Quite how this would be achieved was, however, left opaque by Ministers, who preferred to talk about the number of new contracts with PPE suppliers (although not about those awarded to unqualified companies without tender) and how many billions of items had been supplied. This of course didn’t tell us whether any items had actually arrived, whether they were of the required quality to be useful, whether the volumes were sufficient or how long they would last. When fact checkers used Freedom of Information requests to ask about PPE levels at all NHS Trusts, they found only data too thin to properly answer the question. Then real-life evidence emerged of front-line workers left high and dry without protection. Clearly, if we were looking for reassurance that key workers were protected, we wouldn’t find it.
Our Information Age
It seemed odd that such fundamental information could be missing. After all, we live in a time of rich data, information and intelligence that promises answers. It’s everywhere. Our lives are built on the insights of data scientists. The world crisis through which we are all living produces a deluge of numbers coming at us in what Professor David Spiegelhalter has described as “number theatre” (or what many of us may know as “next slide please-ism”). It’s often easy to miss their point — that this is about people and lives — and the PPE issue brought that home. What we lacked wasn’t complex or clever — it was basic.
In the era of fake news and cynicism, the PPE story is a real-world example of the importance of fact checking. What many cited as a case of good old-fashioned spin may have turned out to be a rather more pragmatic problem of data simply not existing; or maybe that fact was spun.
This story is timely. It comes from a recent publication by Full Fact — “Fighting A Pandemic Needs Good Information” — covering its work in 2020. As a state-of-the-nation report on truth and facts it should be a pretty good barometer. The organisation’s claimed independence is of course itself critical. It has been attacked by the left (which accused it of bias because one of its founders was Tory donor Michael Samuel, although others included Labour and Lib Dem peers) and by the right (in November 2020 The Daily Mail attacked its funding by a George Soros foundation and its work with Facebook and Google), which strongly suggests that it’s doing a decent job. It’s unsurprisingly busy too. In 2020 it published almost 600 fact checks and articles about claims made by politicians and public figures across a range of ‘old and new’ media.
The picture of ‘facts’ that emerges reflects broadly three themes — availability, interpretation and presentation of data. Spin and lies can be used in all three but may also be used in none. The fake news picture isn’t always straightforward.
From Numbers to Headlines
This range of data themes is illustrated by a couple of examples which are both quite different to the PPE story.
Firstly, in the early days of the pandemic the media reported a Government figure of daily deaths that came from NHS England and related only to deaths in hospital after a positive test. At the same time, the Office for National Statistics (ONS) provided another measure of daily deaths — this one based on the registration of deaths anywhere in England and Wales in which Covid-19 was mentioned on the death certificate (whether or not this was the underlying cause of death). The existence and reporting of the two figures appeared to show a contradiction or, at the very least, confusion; it may have been taken as evidence of both.
Secondly, conspiracy theories that started in niche online media quickly spread to traditional media, so that 5G or Bill Gates accusations migrated to the grown-ups’ table, where they were joined by the denigration of public health advice. Investigations of these quickly became a significant part of Full Fact’s work.
These two cases collided when headlines appeared claiming that flu was killing six times more people than coronavirus. It was a timely and dangerous example of casting doubt on the risk of Covid-19, or even suggesting denial of the whole thing. The numbers were based on an ONS report counting death certificates that mentioned flu and/or pneumonia but did not cite them as the specific cause of death. Where Covid-19 was mentioned on a death certificate, in over 90% of cases it was accepted as being more likely to be the underlying cause of death (by comparison, the proportion for flu or pneumonia was less than 30%). The headline was likely created — and even more likely circulated — by wilful misrepresentation. The data behind it, though, was neither made up nor fake.
Information Driving Action
These healthcare examples reflect the fact that the sector is obviously front and centre right now. The situation is shining a light on the fault line between the NHS and social care — the former state-controlled and the latter a hotchpotch of mostly private companies (often offshore-based conglomerates). This fault line matters very much. Full Fact describes a “black hole in the UK’s information on social care, one that the government was already aware of” (a 2018/19 Office for Statistics Regulation review made the point). Our Government simply knew too little about the care homes in which people were living or the care being provided to others in their own homes. Given the need to free up hospital beds and the risk levels of the vulnerable in these care environments, this made considered decisions nigh on impossible just when they were essential. If a full picture had been known and action had been taken based on it then, the report says, “more effective and quicker responses” could have “reduced the high rates of infection and the number of deaths in care homes.” Once again, this is not about stats — it’s about lives.
Who Do You Believe?
One of the most pernicious dangers of fake news (or the perception of information as fake) is that if both sides are doing it then we end up trusting no one. There are no good guys and no bad guys; no responsible Governments versus dangerous conspiracists.
When Governments spin, therefore, it can fuel the fire; it’s easier to argue that the powers that be are lying to us on one thing when they’ve been proven to do so on another.
When it comes to the interpretation and presentation of data, there has been much spin — and suspicion of it — from Government.
Take Boris Johnson’s claims at June’s PMQs on test turnaround times. At the time, nothing was published that could be checked. A month later, when stats were made available, Full Fact found one claim to be inaccurate and still no data available to assess another.
Or the reporting of care home staff testing. In May 2020, Johnson claimed that 125,000 care home staff had been tested. Although there was no publicly available data to verify this at the time, within two months data showed that the figure related to the number of tests rather than the number of people tested. Since people often receive more than one test, the number of people tested was certainly considerably fewer than the stated 125,000.
Or in the setting and measurement of targets. In reporting against its commitment to carry out 100,000 tests per day, Matt Hancock announced that testing figures had hit 122,347; however, this figure counted tests at the point they were sent out to people or to satellite centres — not when they were completed. When questioned about this, Hancock accepted the different measurements and the potential debates about their accuracy. He made it clear, however, that in these days of crisis such things happen. When building diagnostics capability during a pandemic, he said, “worrying about a letter from the stats authority that might come through in a few weeks’ time is not top of the in-tray.” You can see his point. Which of us would prioritise measurement over action; who would want to be in his position?
But proper measurement is important, and its absence creates suspicion of spin, which damages trust. Suspicion is fostered when an independent fact checker is unable to validate claims because no public data is available. When that same organisation sees “a systematic positive exaggeration of test performance by officials”, as well as some targets seeming to have been “designed in retrospect to ensure that they were hit”, it becomes a dark cloud.
In some of the cases above we can at least recognise that people were trying to measure impact in order to deliver better. When spin is utilised for no reason other than flag waving, it is much more unsettling. In March 2020, Grant Shapps said the UK’s testing rate was higher than that of any other country apart from China and Italy. The only data available at that time, however, covered just 38 countries (and the UK ranked fifth, not third). A day later, after an update increased the number of countries measured to 63, the UK fell to seventh. When population size was considered, the UK ranked 27th of the 63. We may wave this aside as political grandstanding, but surely it seeks to paper over cracks that affect us all.
Sometimes the spread of information is accelerated by an alliance between Government and media, precipitated by the need for speed and audience share. Full Fact cites “occasions where relevant legislation was only available less than an hour before coming into force.” This was made worse by the Government’s “repeated leaks to the media, which saw important information splashed across front pages and then debated on the radio before it was properly communicated to the public or parliament.” We all see this — news organisations picking through the detail and implications of what hasn’t even yet been said. As a result, many of us are less inclined to tune in when it is.
Words matter. They are often the cause of problems more than the numbers that they represent. A good example is the phrase ‘following the science’, which we have consistently heard from Government as a reassurance of Ministers’ impartiality and objectivity. Early on, most of this science was not shared and no minutes from the SAGE meetings were published. Since we all know that policy is a political rather than a scientific decision — and that the science shifts by the day — the phrase has very little meaning.
If any of these problems cause the public to switch off from important messages that impact behaviour and the whole economy, then we have a problem. Between late March and mid-May 2020, Ipsos MORI tracked 18–75-year-olds’ views on the clarity of Government communication about Covid-19 rules. The percentage defining it as ‘very or fairly clear’ dropped from 90% to 56%. By August, the percentage that trusted general Government information about Covid-19 was just 44% — four months earlier it had been 67%.
This clearly shows that, whether by cock-up or conspiracy, data matters. If that data is bad and information based on it is wrong, then it needs to be recognised and corrected. If and how this happens appears to vary greatly by source. Full Fact acknowledges that newspapers responded fairly quickly to its requests — even if to decline them — although any publication of apologies or retractions was rather less speedy or visible (no change there). By contrast, the BBC’s online corrections process was anything but quick — in two cases it took over three months to get a response. Worst of all was Government. Departmental responses were “too often slow, unclear or inaccurate” and some answers were contradictory. Full Fact made 20 requests for corrections or clarifications from Ministers and received no full response to any of them (it also made 16 requests of shadow ministers and other MPs, of which 8 were fully resolved).
Fixing It
Information can be missing, interpreted wrongly, presented badly or just plain spun. There is much to fix.
It is clear that those fixes will involve a lot of people and processes. The report highlights “increased cross-border collaboration and expanded monitoring processes amongst fact checkers alongside initiatives by social media platforms and Government.” The cynic may suggest that an alliance between these parties is unlikely — content of dubious quality can be a very good eyeball monetiser in Silicon Valley, and the powers that be are well represented in the examples above — but it is essential.
The report emphasises that this is critical not only for the current pandemic and its impact but also for other areas of society that, though affected right now, were serious problems before the pandemic and will be after it. These include reviews of homelessness, rough sleeping and poverty measurements to reflect the “data gaps and inconsistencies” that the current situation highlights.
Full Fact makes nine recommendations. They range from the immediate (a “government-led programme to identify data gaps in areas of significant societal importance and work to fill them”) to the longer term (a “horizon-scanning function that anticipates the major societal questions the UK will face in the next five years, and the data and insights necessary to provide answers to those questions”).
Fact checkers are going to be increasingly necessary and important because we clearly live in an age of disinformation. Much of our protection though has to come from ourselves. We inevitably become cynical and it is easy — maybe necessary as an act of self-protection — to wave aside so much. We might even see it as an amusing sign of the times. However, when it comes from apparently trusted media, world leaders, experts and governing officials on a subject that means everything to everyone, we mustn’t.
Facts remain facts.