The age of misinformation: Can social media sites actually stamp out false content?

Both the coronavirus pandemic and the coming US presidential election have prompted the platforms to act

Facebook and Twitter logos are seen on a shop window in Malaga, Spain, June 4, 2018. REUTERS/Jon Nazca

Social media has allowed us to create our own truths. If we don’t like a piece of information, we can find something contrasting that we prefer, or make one up ourselves. We treasure the opinions we like and dismiss out of hand the ones we don’t. Parachuting into this landscape are individuals and groups who deliberately add layers of confusion to advance their own interests. The result is a soup of misinformation that is said to actively damage societies and communities, and which social media platforms seem unable to control.

In the past few weeks, Facebook, Twitter, YouTube and even TikTok have implemented a range of measures to try to stem the flow of falsehood and point people towards truth. Also this month, Twitter, Google and Facebook chief executives Jack Dorsey, Sundar Pichai and Mark Zuckerberg are expected to testify at a US Senate hearing on tech companies’ control of hate speech and misinformation on their platforms. But with their business models appearing to thrive on the pitched battles that misinformation engenders, there are question marks over their commitment to accuracy, and over whether the battle is even winnable.

“It’s tempting to say that misinformation is good for these platforms,” says Claire Wardle, head of strategy at First Draft, a non-profit organisation working against information disorder. “But from a public relations perspective, the amount of questionable content online is really not a great look for them. And they do actually employ a tonne of people to work in these darker areas of the internet.”

Yet, the content proliferates. A US congressional hearing in July noted that one false video promoting hydroxychloroquine as a cure for Covid-19 was removed by Facebook, but in the five hours it took them to do so, it was watched 20 million times. The sheer size of the platforms – in number of users and geographical spread – makes policing them in a timely fashion impossible. There is also intense debate over what actually constitutes misinformation. "A lot of it is legal speech," says Wardle. "In a country like the US, not taking it down is, many believe, one of the things that makes America great. So part of the problem is definition."

Articles and videos denying climate change may fly in the face of science, but if they qualify as opinion they’re perfectly entitled to appear, and many of them will garner massive audiences.


Facebook promised last month to establish an information hub to counter climate change denial, but Wardle says such approaches fail to acknowledge our relationship to information, which is emotional rather than rational. “It’s all tied to identity and performance,” she says.

“There’s a ritual nature to the kind of content we share and why. People want to feel good – particularly at the moment – so they seek out information that reinforces what they already believe. And we don’t know how to deal with this. We just don’t have the tools in our arsenal.”

The coronavirus pandemic and the approaching US presidential election have both prompted the platforms to act. Last month, Google removed auto-complete search queries that might affect voting intentions, and began to penalise websites and advertisers participating in misinformation campaigns.

Twitter has started applying fact-check labels to tweets containing false or incendiary information. Last week, YouTube promised to delete any claims about coronavirus vaccines that contradict health authorities, and Facebook will soon introduce a ban on new political adverts in the US as the presidential race draws to a climax. But the effect of all this will be hard to measure. “We don’t know the unintended consequences of these interventions,” says Wardle. “It may even drive down trust in information flows. And because there are no independent researchers embedded in these companies, they’re effectively marking their own homework.”

According to a study published in the journal Psychological Science, one way to improve the quality of information on social media is simply to prompt people to think about accuracy. Researchers found that introducing friction into the sharing of information – effectively creating a "hang on" moment – immediately makes us more adept at distinguishing truth from falsehood.

This strategy is now being used by both Twitter and Facebook to slow the speed at which information travels. The number of people to whom users can forward messages using Facebook Messenger was reduced last month from 150 to only five. A new policy from Twitter, which began today, prompts anyone retweeting a post to add their own comment. This joins another recent change that asks anyone tweeting a link to read the content of that link first. It’s a notable shift towards putting quality before engagement.

Eagle-eyed code-watchers have also noted a feature coming to Twitter called Birdwatch, which allows users to comment on and flag tweets that they deem inaccurate – effectively crowdsourcing the moderation process to help Twitter’s algorithm restrict the spread of falsehood. But Wardle worries that these policies are being directed at individual atoms of content, rather than dealing with far more complex, long-term issues.

“We’re swimming in really polluted waters,” she says. “We should be talking about how our brains make us susceptible to this stuff, and how information is dividing our communities and societies. I don’t see anybody taking the problem seriously enough. Fact checks and media-literacy programmes are not going to get us out of this hole. I think historians are going to look back at this time and say wow, they really sleepwalked into those civil wars.”