Facebook has come under fire after a Twitter user noticed that the video recommendation on the platform mistook black men as 'primates'. AFP

Facebook mistakenly labels black men as 'primates' in video recommendation feature


Facebook on Friday said it disabled its topic recommendation feature after it mistook black men for "primates" in a video on the social network.

A Facebook representative called it a "clearly unacceptable error" and said the recommendation software involved was taken offline.

"We apologise to anyone who may have seen these offensive recommendations," Facebook said in response to an AFP inquiry.

"We disabled the entire topic recommendation feature as soon as we realised this was happening so we could investigate the cause and prevent this from happening again."

Facial recognition software has been blasted by civil rights advocates who point out problems with accuracy, particularly when it comes to people who are not white.

Facebook users in recent days who watched a British tabloid video featuring black men were shown an auto-generated prompt asking if they would like to "keep seeing videos about Primates," according to The New York Times.

The June 2020 video in question, posted by the Daily Mail, is titled "White man calls cops on black men at marina."

While humans are among the many species in the primate family, the video had nothing to do with monkeys, chimpanzees or gorillas.

A screen capture of the recommendation was shared on Twitter by former Facebook content design manager Darci Groves.

"This 'keep seeing' prompt is unacceptable," Groves tweeted, aiming the message at former colleagues at Facebook.

"This is egregious."

Back in June, Google told Reuters that it was developing an alternative to the industry-standard method for classifying skin tones, which a growing chorus of technology researchers and dermatologists said was inadequate for assessing whether products are biased against people of colour.

"We are working on alternative, more inclusive, measures that could be useful in the development of our products, and will collaborate with scientific and medical experts, as well as groups working with communities of colour," the company said, declining to offer details on the effort.

– Additional reporting by Reuters

Updated: September 04, 2021, 6:59 AM