Europe warns social media firms over disinformation fight as job cuts loom

Nothing wrong with companies making money but EU officials stress a safety-first approach at London event

A woman checks her phone next to a screen showing social media icons. EPA

As social media companies tighten their belts and respond to shrinking profits with job cuts, officials on the front line of the information war have warned against moves that could leave the platforms more vulnerable to manipulation and deception.

European Union representatives are seeking to remind social media companies that they have signed up to a code of conduct on moderation and usage, and that the bloc is tightening these regulations through the new Digital Services Act.

In parallel, British officials are hoping a new Online Safety Bill will protect account holders and curb abuses.

“We have created rules in the European Union and they apply to everybody,” Lutz Gullner, a strategic communication specialist with the European External Action Service, told a forum organised by the Aspen Institute UK on Tuesday.

“Whoever owns Twitter is, of course, an interesting issue but the rules remain the same — the system has been put in place and they need to play on this basis.”

New owner Elon Musk initiated the dismissal of an estimated 3,700 employees after discovering that Twitter was losing $4 million a day, leaving the serial entrepreneur with “no choice” but to restructure.

Rival Meta, which owns Facebook and Instagram, is also planning redundancies that are expected to affect thousands of staff, with an announcement expected to be made on Wednesday.

With Meta alone believed to employ more than 30,000 people worldwide in content moderation on its sites, activists and officials are concerned over the potential disproportionate impact of the downturn on this area.

The war in Ukraine has heightened the importance of countering online information threats at state level, while governments also rely on measures taken by the biggest firms.

“Genuine voices in our societies that are fans of [Russian President Vladimir] Putin or those that believe there should be a ceasefire [in Ukraine] as soon as possible are not the problem,” said Mr Gullner.

“If these voices are amplified by manipulative means — by false identities or by technical ways of expanding their reach — that is a problem and in the end, that is undermining freedom of speech.

“This is not just narrative against narrative, it's not just a communications challenge,” he said. “We have lots of different policy areas to bring together.”

The conflict in Ukraine has been waged globally: polls show strong support across Europe for the country but in other parts of the world, the picture is mixed.

The global food crunch caused by Ukraine's grain exports grinding to a virtual halt has led to many in developing nations accepting Russia-promoted stories that the hardship they are experiencing is the West's fault.

“Europe has, by and large, become more resilient to disinformation,” said Andy Pryce, head of counter-disinformation at the UK's Foreign, Commonwealth and Development Office.

“Perhaps the rest of the world is not. When we look at Africa and Latin America, places with less experience of disinformation are more susceptible.”

It is also worth pointing out that publicly available information used as open-source intelligence by governments played a role in counteracting Russian claims when the war began in February.

“This information or data that's been gathered over time really helps us understand what the Russian state is doing and really undermines its information operations,” Mr Pryce said.

Krisztina Stump, head of the media convergence and social media unit at the European Commission, likened the regulation of platforms to the car safety rules that apply to car makers.

“There is nothing wrong with a company that is growing and making money but at the same time, this comes with responsibility,” she said.

“Our analogy is if you are building cars, you have to install a safety bar.

“If your activities cause systemic risks to society such as disseminating disinformation, you have to put certain measures in place, like measures against manipulative behaviour or safe design of user services.”

For some experts, exerting pressure on social media companies to moderate online behaviour and reduce risk is fighting a losing battle when the threat is so pervasive.

Carl Miller of the Centre for the Analysis of Social Media at the Demos think tank sees new dangers constantly, including a lucrative dark web of private businesses.

“For all the power of platforms, they are not able to bring pressure and risk against the people who are doing this in any kind of meaningful way,” he told the Aspen event.

Updated: November 08, 2022, 4:33 PM