A new British regulator needs tough powers to independently audit social media companies to uncover what steps they have taken to halt the spread of misinformation and hate, according to a leading investigator of online extremism.
Imran Ahmed, the founder of the Centre for Countering Digital Hate, told British MPs that social media companies had failed three major tests: preventing the spread of political disinformation in the US, tackling anti-vaxxers, and stopping death threats and abuse directed at black sportsmen.
The UK's proposed Online Safety Bill, described as the most ambitious legislation of its kind anywhere in the world, aims to regulate companies such as Facebook and Twitter and to fine them for breaching their duty of care to users.
But Mr Ahmed told MPs scrutinising the legislation that the regulator would not have enough powers under the proposals to probe the actions and policies of companies such as Facebook, which had failed to adequately respond to earlier episodes of abuse.
“Social media companies are incapable of regulating themselves,” Mr Ahmed said. “Whenever they are given the chance, they put profit before people.
“There are right now people gurgling for breath in ICUs [intensive care units] in the UK and the rest of the world who are telling their doctors: ‘I saw this on Facebook. I believed it to be true. I thought the vaccine would harm me.’
“Those people will not survive the lies and misinformation that have been fed to them.”
Mr Ahmed’s comments came after US President Joe Biden said in July that companies like Facebook were “killing people” because of their failure to stem anti-vaccine conspiracies.
The company criticised the comments, saying that more people had seen pro-vaccine messages on its platform than anywhere else in the world.
“President Biden’s goal was for 70 per cent of Americans to be vaccinated by July 4. Facebook is not the reason this goal was missed,” a senior official wrote at the time.
Mr Ahmed said the UK parliament was going up against the “most powerful impressive lobbying machine … in history” assembled by the tech companies.
“The question is whether or not the regulator will be able to wield those powers with sufficient confidence and effectiveness to give us the results socially that we want,” he said.
Britain’s communications regulator, Ofcom, will be given the role under the proposed legislation. But questions remain about how effective it can be in tackling multibillion-dollar companies headquartered outside the UK.
Mr Ahmed said extremism spread so effectively on social media sites because they brought together large numbers of people who were fed misinformation over time. “Facebook, for example, is the 800-pound gorilla in the radicalisation market,” he said.
MPs were also hearing evidence on Thursday from witnesses including Rio Ferdinand, the former Manchester United and England footballer.
His brother, Anton, who also played professionally, told another committee of MPs on Wednesday that it was a “disgrace” that abuse reported by himself and others had been dismissed by social media companies.
“Are they waiting for a high-profile footballer to kill themselves, or a member of their family to commit suicide? Is that what they're waiting for? Because if they're waiting for that, it's too late.
“This comes down to if they really want to make change? So far, their words are that they want to, but their actions are different.”
Tara Hopkins, director of public policy at Instagram, said on Wednesday that 95 per cent of hateful content on the platform was removed proactively.
But she apologised after MPs pointed out the continued existence of racist abuse on the platform targeting England footballers, after the side lost the final of the Euro 2020 championships in the summer.