Facebook has been accused of actively connecting ISIS extremists to each other through its ‘suggested friends’ feature.
A new study, the findings of which will be published later this month by the Counter Extremism Project, reveals how the social media giant helped introduce thousands of terrorists around the globe, enabling them to grow new networks and even recruit new members.
The report, seen by The Telegraph newspaper, will add pressure on Facebook to do more to combat terror activity on its platform. The world's biggest social network is already facing the wrath of politicians over its failure to remove terrorist material posted online.
The Counter Extremism Project is a non-profit NGO which aims to combat the online reach of extremist groups such as ISIS.
Its researchers examined the Facebook activity of a thousand ISIS supporters in 96 countries, and found that they were routinely introduced to each other via the ‘suggested friends’ feature.
The feature is designed to connect people who have common interests, using a range of algorithms. Facebook is able to do this because it already amasses a huge amount of personal data about its users, which is used to direct advertisements.
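Facebook's actual ranking is proprietary and far more complex, but the general idea the report describes can be sketched as a simple heuristic: score candidate profiles by shared interests and mutual connections, then surface the highest-scoring ones. The function and weighting below are hypothetical illustrations, not Facebook's real system.

```python
# Hypothetical sketch of a "people you may know" heuristic.
# Facebook's real algorithms are proprietary; this toy version only
# illustrates the similarity-based matching described in the article.

def suggest_friends(user, candidates, top_n=3):
    """Rank candidate profiles by overlap with the user's interests and friends."""
    def score(candidate):
        shared_interests = len(user["interests"] & candidate["interests"])
        mutual_friends = len(user["friends"] & candidate["friends"])
        # Arbitrary assumed weighting: mutual friends count double.
        return 2 * mutual_friends + shared_interests

    ranked = sorted(candidates, key=score, reverse=True)
    return [c["name"] for c in ranked[:top_n]]
```

The risk the researchers highlight is that such matching is content-neutral: if the overlapping "interests" happen to be extremist pages and the mutual connections are other supporters, the same mechanism surfaces extremists to one another.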
However, the tool also enables terrorists to reach out to sympathisers, the report warns.
One of the researchers on the report, Robert Postings, said: “Facebook, in their desire to connect as many people as possible, have inadvertently created a system which helps connect extremists and terrorists.”
Mr Postings experienced this for himself when he clicked on several news stories about an Islamist uprising in the Philippines. Within just a few hours, he was being bombarded with friend suggestions for extremists based in that region.
Once a connection has been made, Facebook’s failure to tackle terrorist material on its site means extremists can quickly radicalise vulnerable targets.
An example given by the researchers relates to an Indonesian ISIS supporter who sent a friend request to a non-Muslim American user in March 2017. The American started off by saying he was not religious but had an interest in Islam. The Indonesian user spent the next few months sending him increasingly radical content, all of which was “liked” by the American.
Mr Postings said: “Over a period of six months the [US based user] went from having no clear religion to becoming a radicalised Muslim supporting ISIS.”
The research also showed how Facebook is failing to clamp down on terrorist content on its platform. Out of the thousand ISIS-sympathising profiles identified by researchers, fewer than 50 per cent had been suspended by the social media group after six months. And even when offending posts are removed, their authors are often allowed to continue using Facebook.
“Removing profiles that disseminate ISIS propaganda, calls for attacks and otherwise support the group is important,” Mr Postings said. “The fact that the majority of pro-ISIS profiles in this database have gone unremoved by Facebook is exceptionally concerning.”
In other cases, pro-ISIS users who were suspended have been able to reactivate their accounts after complaining to moderators.
One British terror suspect, accused of having posted ISIS propaganda on the site, managed to get his Facebook account reinstated nine times after issuing complaints.
“This project has laid bare Facebook's inability or unwillingness to efficiently address extremist content on their site,” said another researcher, Gregory Waters.
“The failure to effectively police its platform has allowed Facebook to become a place where extensive ISIS-supporting networks exist, propaganda is disseminated, people are radicalised and new supporters are recruited.”
He added: “The fact that Facebook’s own recommended friends algorithm is directly facilitating the spread of this terrorist group on its site is beyond unacceptable.”
A spokesman for Facebook told The Telegraph that there is “no place” for extremists on the platform.
“We work aggressively to ensure that we do not have terrorists or terror groups using the site, and we also remove any content that praises or supports terrorism,” he said.
“Our approach is working – 99 per cent of ISIS and Al Qaeda-related content we remove is found by our automated systems. But there is no easy technical fix to fight online extremism.
“We have and will continue to invest millions of pounds in both people and technology to identify and remove terrorist content.”