The keyboard secret army

The paranoid war on “disinformation” at the heart of the Government’s pandemic response

In late March 2020 a brigadier from the British Army’s 77th Brigade arrived in Whitehall to find a Cabinet Office in disarray. The country was in the early days of lockdown. Polling suggested public support, but who knew what might happen in the weeks and months ahead?

Nothing like lockdown had been tried before and Westminster, incurably online as usual, was paranoid. What might start trending from the trolls on Twitter and Facebook? The brigadier, who had cut his teeth fighting social media disinformation campaigns by hostile foreign actors, reassured the ministers and civil servants that he was there to help.

Within a few weeks he and his team, a motley crew of British army reservists, former translators and “cyber experts”, were sitting in a safe house somewhere in Berkshire monitoring Twitter and other social media sites for any dodgy takes that might undermine the Government’s pandemic response. 

Over the next two years, they and other government disinformation and media monitoring units would report on everyone from pub landlords to Conservative MPs, as well as on Peter Hitchens’s tweets. One unit, tasked with identifying harmful narratives, even flagged up Professor Carl Heneghan, a man who would later advise the Chancellor, Rishi Sunak.

How we arrived at this farcical episode is the story of the rise and fall of the fight against “online disinformation”. It’s a tale of incompetence and paranoia that saw the concept abused, stretched and confused by a vast network of fact-checkers, social media reporters and “content moderators”, from Whitehall to the BBC via Silicon Valley.

What exactly is “disinformation”? The word itself had smuggled its way through the 20th century, with origins in 1920s Soviet propaganda. There it was understood to be “false information with the intention to deceive public opinion”. Other distinctions have since been made. How readily can disinformation (spread deliberately) be separated from “misinformation” (spread accidentally)? Then there’s the casual way the word is thrown around regardless of who is accused. There is (surely?) a difference between Brenda in Bolton tweeting conspiracies about vaccines and an Oxford epidemiologist criticising the government’s pandemic policy. 

The word’s recent history provides some clue as to how tackling disinformation came to be abused during the pandemic. In 2019, the American news network, NPR, declared “disinformation” to be its word of the year and warned it was a “sign of things to come”. In the decade that saw Brexit and President Trump, “disinformation” came to signal a wider paranoia about how populism had harnessed social media. 

For those appalled by these political developments, “disinformation”, alongside the trend away from established news outlets, offered a reassuring means to explain away the populist revolt against globalised liberalism. 

Increasingly, a schism emerged among liberals (and many conservatives) between those willing to engage with these criticisms in the real world, and those eager to dismiss Trump and Brexit as the product of bad posting. The rationale even had a dystopian tech angle: algorithmically amplified disinformation on social media, including that spread by the Kremlin, had played a decisive role in the revolt of the public against the elites.

In 2017, Moscow’s bots were even the subject of a US congressional hearing. Russian tweets, Senator Mark Warner warned, were a “threat to democratic institutions”.

The idea was influential in Britain, too. Orwell Prize-winner Carole Cadwalladr and many other prominent journalists argued that Vote Leave’s success was “the biggest electoral fraud in Britain for 100 years”, and that it had come about by targeting people with dodgy information on Facebook. “Carole Cadwalladr: the dismantler of disinformation” was one strapline for her many podcast appearances.

No one doubted inaccurate news online was a problem, but all this paranoia, increasingly crystallised around the simple concept of “online disinformation”, went a step further. Unchecked posting online and the algorithms amplifying it, the argument went, were threatening to tear apart the very foundations of western democracy.

Out of this conspiracy reasoning, the disinformation reporter was born. CNN and NBC pioneered the position, with the BBC following suit. Marianna Spring was the Corporation’s first specialist disinformation correspondent, appointed on the eve of lockdown to “fight back against information that claimed to come from a reputable news organisation”.

She seemed like a shrewd acquisition. A young, up-and-coming journalist who had been spotted by Emily Maitlis, she appeared to be exactly the sort of person the BBC needed to turbo-charge itself into the digital age and help the public get up to speed with all the bad information that would undoubtedly spread at a time of crisis. 

In the pandemic, the fighters of disinformation would face their greatest test yet. Unsurprisingly, they failed. Not only did they and the gatekeepers designated to tackle online disinformation get some of their much-heralded “fact-checks” completely wrong, they also became ardent apologists for government policy, helping to further inflame precisely the conspiracy theories they were meant to tackle.

We now know that the flawed logic of the disinformation-tacklers made its way not only into Whitehall but to the content moderators of Facebook and Twitter, where scientists, journalists and public health experts ended up being wrongly accused of dis/misinformation. To call it Orwellian, or an attack on free speech, fails to appreciate how utterly banal, pointless and avoidable it all was.

To fully understand how it went wrong, it is necessary to grasp how the BBC’s Disinformation Unit, led by Spring, exemplified the flaws in the approach. “Totally out of their depth” was how one unhappy senior BBC journalist described them to me: “a de facto mouthpiece for the government”.

Shortly after I wrote for The Critic online about this internal BBC discomfort towards the Unit’s work during the pandemic, its former editor, Mike Wendling, got in touch. The accusation by the senior journalist was “baffling”, he protested, since his unit was only interested in tackling “falsifiable rubbish”. Was there an example where it had gone beyond its remit and, as my article argued, ended up unduly promoting government policy?

There was. In one of its articles, the BBC disinformation team claimed to have been fact-checking an NHS doctor who had challenged a proposed vaccine mandate for healthcare staff. In reality, the journalists and the doctor were engaged in the same exercise: prioritising the findings of various studies to make their point.

The problem with the BBC article was that it weighed in with the false authority of “disinformation” — the implication being that the doctor had been debunked, and thus consigned to the dustbin of nonsense. Funnily enough, the doctor’s initial argument regarding the long-term efficacy of the Covid-19 vaccine versus natural immunity has since gained significant traction within the medical establishment. 

Thanks to the work of Big Brother Watch, we now know that this amateur approach to labelling “disinformation” was not just taking place at the BBC, but also amongst Whitehall’s secretive mis/disinformation units. 

Behind the reports and dossiers solemnly circulated around Whitehall, there was no advanced methodology for determining disinformation. This is hardly surprising. The very nature of the pandemic was confusing, not least in relation to some of the unprecedented policies being enacted. From vaccine efficacy to the impact of lockdowns, it was often nigh on impossible to apply the binary notion of “disinformation” at a time when a deluge of data, scientific studies and evidence was emerging from all over the world.

Where transparency and honesty about what was or wasn’t known might have prevailed, the opposite happened. Whitehall paranoia dictated the absurd idea that dissenting opinion, even that of reputable experts, threatened to undermine public health policy.

“There is no bright-line category called misinformation,” said the writer Scott Alexander in a recent essay on fake news and the media. Instead, what earns the label is usually an absence of appropriate context, or the prioritising of one piece of information over another: something that misleads rather than outright lies. Of course, those targeted by the disinformation experts were very much guilty of this. Over the course of the pandemic, however, so too was the government narrative they often found themselves defending.

In one instance, Facebook “fact-checked” an article on the website UnHerd for pointing out that it was still too early to rule out the possibility that the Covid virus had originated from a laboratory. Soon after, however, the White House itself expressed “deep concerns” over the WHO’s initial exoneration of Wuhan’s laboratories, lending credence to the lab leak theory. Moving in line with the White House, Facebook duly backtracked.

Even the British Medical Journal fell foul of the Facebook fact-checkers for publishing a peer-reviewed piece documenting an investigation into clinical trial practices at Ventavia, a contractor that assisted Pfizer in its Covid-19 vaccine trials.

Professor Carl Heneghan drew the attention of the dis/misinformation units in Whitehall for pointing out that the government’s “Rule of Six” had little evidential basis. 

So, too, did the Tory MP David Davis, for critiquing the government’s proposed “Covid passport”. A speech Davis gave to the Conservative Party conference criticising the policy was removed from YouTube. Yet his point regarding the “false reassurance” of vaccine passports had also been made by Professor Robert West, who sat on the government’s SAGE advisory committee.

Similarly, in America, the so-called Twitter Files have unearthed extensive micromanagement and “shadow-banning” of opinion, particularly against lockdown’s most prominent critic, Stanford University’s Professor Jay Bhattacharya. The records betray the same disinformation paranoia in Silicon Valley during the pandemic.

An evolving context and the inevitable progression of scientific consensus aren’t of much interest to tacklers of disinformation, in part because all too often their job is to tell the supposedly gullible public what they simply must not believe. To do otherwise runs the risk of undermining the epistemological certainty demanded by their role. 

As such, they too often punch down rather than up, forced to focus on the loony fringes of a much wider dissent (an NHS doctor being one of their more misguided targets). Marianna Spring now has a well-documented reputation for prowling anti-lockdown groups on Facebook and other platforms, waiting to snare someone who believes something appropriately barmy.

I’ve witnessed this curious approach to journalism myself, having been sent the content of a bizarre conversation between one of Spring’s colleagues and a member of the public who had apparently been sharing vaccine conspiracies with his 500 Twitter followers. Even the reporter lost interest in pursuing this conspiracy theory after finding out the man had nothing particularly bonkers to say.

In obsessing over bad posting online, the BBC’s Disinformation Unit starts to resemble its own conspiracy theory, defined above all by its narrow-mindedness. It is going out of its way to service this approach at a time when public trust in the Corporation’s ability to provide an impartial account of events, as with many other legacy news outlets, is at a record low following the pandemic.

We should abolish the very concept of the disinformation reporter. Indeed, “disinformation” was so abused over the course of the pandemic that we should stop using the word entirely. Proposing this risks being met with appalled confusion. “You want disinformation to run riot? What about Andrew Bridgen, the anti-vaxxers, the WEF conspiracy theorists?”

Such a reaction betrays the curious mindset of those who defend the cult of disinformation, notably the idea that “normal journalism” has never possessed the power to persuade people they are wrong. Yes, there are a lot of people who need winning over in the aftermath of the pandemic, but is there anything a disinformation reporter can do that a good journalist with a normal title — such as “reporter” — cannot do better? 

If we are to move beyond the failed disinformation paradigm, it is worth forcing its most ardent advocates to acknowledge the inadequate worldview the concept often conveys, however much the fight against “online disinformation” in the age of Brexit and Trump offers a comforting cause to champion.

Elon Musk’s Twitter is now firmly in Marianna Spring’s sights, apparently complicit in a foiled far-right plot in Germany as well as the storming of the Brazilian Congress. The enduring appeal of this style of reporting seems to be its ability to command a frustrated sense of morality in a world gone astray: there are “trolls” online spreading “disinformation”.

Sometimes they’re Russian. Sometimes they’re far-right. Sometimes they’re just misguided, lonely and lost people, who are also magically, massively influential on the online innocents.

As a means of lamenting the ever-declining influence of “trusted news” such as the BBC, this works very nicely. As a means of seeking to understand our present moment, it is self-satisfied, misleading and conspiratorial.
