Discrimination is not just the domain of humans. Now artificial intelligence is showing gender bias too

In a world where AI tools are being developed to help everyone from crimefighters to recruiters, it is imperative to ensure they are built without discrimination based on gender, race or colour

Georgia Toffolo poses in front of a giant AI-powered Rubik's Cube at London's South Bank to celebrate the launch of the Huawei Mate 20 Pro, billed as the world's first dual-AI powered smartphone, on October 25, 2018. Tom Nicholson / Huawei via Getty Images

“For most of history,” wrote Virginia Woolf, “Anonymous was a woman.” For centuries, women have been ignored by history or simply written out of it. The problem with anonymity is that no one knows who you are; your talents cannot be recognised; nor can you receive the credit that is your due. As a result, you cannot progress in life or your career – that is, if you have one. You are, to all intents and purposes, a nobody.

Look at the women who made contributions in fields such as science. It is only relatively recently that attempts have been made to redress the balance by honouring those whom the history books have omitted or erased from our story of collective human achievement.

When it came to authors, female writers in the West often wrote anonymously, or under noms de plume that were either masculine or obscured their gender, to get published and be widely read in male circles.

These biases against women are not entirely a thing of the past. They persist in every sphere, whether in representation, pay, access or leadership. Next week at the Abu Dhabi Diplomacy Conference, for example, experts will gather to discuss why so few women worldwide take up diplomatic posts and represent their countries overseas.

Pinpointing the deep-seated prejudice that influences people’s opinions is difficult. People might not even be aware that they carry a bias based on gender or ethnicity. And suggesting to those making choices that they are guilty of bias is a minefield in itself, as few people want to be accused of discrimination. Add to that the fact that those who rise to the top are unlikely to accept that they might have done so because the system is weighted in their favour, both structurally and individually. Everyone wants to believe they have got to where they are on merit, but the huge discrepancies demonstrate that is not always the case.

We need systems in place that can eliminate these biases. Making CVs anonymous is increasingly common practice; studies show it can make a difference, both in who is selected for interview and who is offered a job.


The great shining light on the horizon should be the use of artificial intelligence. Machine learning is seen as a panacea for many of our problems of in-built discrimination. A number of organisations have been actively developing AI that can help with recruitment by sifting through job applications and CVs to select the best candidates.

But things are not quite as rosy as they might seem. Amazon announced last month that it was scrapping an AI tool it had been developing for recruitment purposes because it had been rejecting women wholesale. The system gave candidates scores, but a year into its development the company realised it was not rating them in a gender-neutral way. Because its algorithms had been trained on CVs submitted over a 10-year period, most of which came from men, it penalised applications that contained the word “women”.

In a world where AI tools are being developed to help those involved in fighting crime and establishing law and order, it is even more imperative to ensure they are built without discriminating on the basis of gender, race or colour. Recognition of diversity and equality must be at the forefront of any new developments.

In the US, facial recognition software is being used to find criminals. Research has found that it can be accurate – but primarily for white men, for whom accuracy is as high as 99 per cent. When it comes to women – and black women in particular – error rates can climb to 35 per cent, meaning there is a very real threat of wrongful arrest, prosecution and punishment based on faulty data.

There is a lesson to be learned from the erasure of so many people from our collective past, as well as the discrimination that is still being perpetuated today – by both humans and machines. We need new ways to redress the balance and new mechanisms to do so too.

We are standing on the cusp of an exciting new future. It cannot be tainted by the very worst of human nature. If machines are learning those traits from us, we need to tackle our own prejudices first and foremost, no matter how painful that might be.

Shelina Janmohamed is the author of Love in a Headscarf and Generation M: Young Muslims Changing the World