France and New Zealand will host a joint summit next month aimed at tackling promotion of terror content on social media after Facebook admitted footage of the Christchurch mosques attack could still be available on its site.
Global leaders and technology company executives have been called to attend the meeting in Paris on May 15, which will be co-chaired by Jacinda Ardern and Emmanuel Macron.
Ms Ardern said the aim was for politicians and tech CEOs to agree to a pledge, called the "Christchurch Call", to eliminate extremist content online, such as the live stream of the March 15 gun attack.
The New Zealand prime minister said the Christchurch massacre in which 50 people died “saw social media used in an unprecedented way as a tool to promote an act of terrorism and hate”.
“We’re calling on the leaders of tech companies to join with us and help achieve our goal of eliminating violent extremism online at the Christchurch Summit in Paris,” she added in a statement.
Facebook, YouTube and Twitter were criticised by British politicians on Wednesday for failing to remove far-right extremist content from their platforms.
Giving evidence to a committee of MPs in London on hate crime and its violent consequences, Facebook’s public policy director said the platform’s artificial intelligence systems had struggled to identify the footage used in the Christchurch attack as harmful because it had been shot with a head-mounted camera.
“This was a first-person shooter video, one where we have someone using a GoPro helmet with a camera focused from their perspective of shooting,” Neil Potts said.
He said that the particular footage was “a type of video we had not seen before”.
Another executive at the hearing pushed back against the political pressure on the social media firms to police content published on the networks.
"We have no interest in having violent extremist groups on our platform but we can't ban our way out of the problem," said Twitter's head of public policy Katy Minshall.
The Christchurch footage was circulated, duplicated and shared all over the world. When questioned by British committee chair Yvette Cooper on whether footage of the attack was still on Facebook, Mr Potts said it was “possible”.
Representatives from Twitter and YouTube, also at the hearing, received stern rebukes from MP Stephen Doughty.
Mr Doughty said he had found links to neo-Nazi content created by banned groups on Facebook, Twitter and YouTube “within minutes” just the evening before the meeting.
He said all three firms were not doing their jobs.
Ms Cooper suggested that in future more governments might impose social media blackouts in the aftermath of a terror attack, as Sri Lanka did, because social media companies cannot be trusted to remove offensive material.
Marco Pancini, director of public policy at YouTube, said: "We need to respect this decision. But voices from civil society are raising concerns about the ability to understand what is happening and to communicate if social media is blocked."
Of the three companies, the committee chair was particularly critical of YouTube. She said she had been recommended “increasingly extreme” content when she searched on the platform.
"The logic is based on user behaviour," Mr Pancini said. "I'm aware of the challenges this raises when it comes to political speech. I'm not here to defend this type of content."
In her closing remarks, Ms Cooper said all three firms were “continuing to pursue and to promote radicalisation that in the end has huge, damaging consequences to families’ lives and to communities right across the country”.
“Particularly for YouTube, I am appalled that the answers that you have given us are no better than the answers that your predecessors gave us in every previous evidence session. From your organisation in particular very little has changed.”