Shelina Janmohamed is an author and a culture columnist for The National
June 23, 2023
Discussions about how artificial intelligence is going to radically transform our lives, and possibly even take over from human beings, are everywhere. If you’re on social media, your feed might look like mine: a never-ending stream of threads explaining “how to get ahead with AI” or proclaiming that “I couldn't use AI until I discovered these 10 prompts and now I'm going to share them with you”. The latter are dangled like teasers, but since there are so many of them, and they are so alike, I’m beginning to wonder whether they, too, are all generated by AI.
But in all the excitement and the sense of awe around AI, we need to remember something extremely important, and keep it centre stage in any discussion about this: AI uses “knowledge” and data that currently exist. And when it comes to existing resources, they all come with pre-existing biases.
AI is seen as somehow neutral, freed from human error and bias. But not only is this not true, it can actually make the problem of identifying and eliminating biases that already exist in the world even harder. The flaws in biased data become magnified and then further entrenched in any onward analysis, products and outputs.
Take a simple, almost playful example. In a recent experiment by the Lenny Henry Centre for Media Diversity, ChatGPT and BingGPT were asked: “Who are the 20 most important actors in the 20th century?” It seems egregious that not a single person on the list was from Bollywood, or indeed from anywhere outside Hollywood. I tried the experiment, too, and I had to specifically prompt the tool to include Bollywood. Which means I had to already know what I was looking for. If I don’t, then that knowledge starts to disappear from wider discourse, and because of the generative nature of AI it might well eventually be eradicated.
The reverence and presupposition of objectivity afforded to AI make further investigation into its answers on such topics unlikely. And even worse, biases are buried so deep inside the tech that it will be difficult to understand how they got there.
That means work to identify and overturn the decades, perhaps centuries, of embedded biases like racism and sexism could easily disappear in a swamp of digital and “real-world” information.
The racism of AI in facial recognition is already documented: black women's faces are not recognised as accurately as white men's, and where the technology is used, this has led to higher rates of false convictions of black women in the US, for example.
Attempts to harness the technology for good – by those who are not alert to its risks – have multiple ways to create harm. Levi’s recently announced it would use AI-generated clothing models to diversify the online shopping experience. But why not just use more diverse real models, who are badly under-represented? Was this just another form of erasure, wondered Levi’s consumers? And might this lead to “digital blackface”, where white models could be airbrushed to resemble people of colour?
There are more subtle ways that biases are built into the “norms” feeding AI. If you asked ChatGPT “what is a good city to visit?”, how would it decide what “good” is? What are the pre-existing sources that determine “good”? Are these biased depending on who wrote them? Given that western reviewers, critics and “tastemakers” are typically not from minority groups, the biases may once again rear their heads.
All of this is particularly troubling when it comes to news generation. AI is an exciting tool for the news and media industry, and is already being used to aggregate news and produce copy. But it needs human oversight, and readers need to know whether something has been AI-generated or human-written – and if the former, whether it has been fact-checked by a real person. There’s already enough fake news in the world. Given the risks of inaccurate or unrepresentative material we’ve just discussed, ensuring accuracy and trust in news information is a must.
The dystopia is depressing and dangerous – replicating and amplifying the nasty biases that already exist, and minimising and potentially eliminating counterviews. The idea of “automated discrimination” is not something to relish.
Still, we must be optimistic. AI brings us to the cusp of a brave new world, but it’s one where we must be alert to the risks. To avoid dystopia and reach a world where technology actually supports our goals of greater equality and the elimination of bias, human beings must remember that we are still in charge.