Social media giants should fund a UK watchdog, says report

If created, the watchdog would have data collection powers and the ability to fine transgressors.


The UK should form an online information watchdog paid for by social media companies to stop the spread of disinformation online, a report from the London School of Economics (LSE) has recommended.

The MPs, academics and experts in the LSE’s Truth, Trust & Technology Commission made a series of short, medium and long-term recommendations for combating the spread of disinformation online, including creating the Independent Platform Agency (IPA) and improving the nation’s digital literacy.

The 44-page report said that despite British news providers making strides to verify their work, they can no longer control how their news, as opposed to misinformation and falsehoods, is shared on US-based platforms.

“It can no longer be asserted that the information services provided by such technology companies are merely conduits for information. They are curators of information through the design of their platforms and through the operation of content moderation systems,” the report reads.

“These systems have significant influence over what users can post (length of texts, images, comments), the duration of posted information, and to whom content is pushed using algorithms that target, downgrade or influence how people interact with information. These companies are making decisions about what content is allowed, what prominence it is given, and what revenue goes to content producers such as news organisations.”

Similar to the way the press is regulated in the UK, the new body would be paid for by the social media companies, but would be fully independent. The IPA would report to, but not be answerable to, the British government and have powers to fine platforms for not handing over relevant data.

The body would observe how news and information is disseminated online, requiring social media platforms to hand over data showing where, how and to whom news is being shared and compiling an annual report on disinformation. It would also make policy recommendations to the Government.

The report cited the five “giant evils” of the information crisis - confusion, cynicism, fragmentation, irresponsibility and apathy - as a threat to “individual decision-making, national security and democratic government”.

The LSE said the Government needed to make a concerted effort to combat these evils by raising the digital media literacy of the population, including programmes in schools and heavy investment in educating adults.

The UK National Literacy Trust found that only 2 per cent of primary and secondary school-age children have the critical literacy skills they need to tell whether a news story is real or fake, signalling a pressing need for education in this area.

A spokesperson for Twitter told The National the company would not be commenting on the report and its recommendations.

The recommendations are a departure from current thinking on halting the spread of fake news and abuse. In September, UK broadcast regulator Ofcom published ideas for how it could use its knowledge and expertise to regulate the online world.