Religious GPT: The chatbots and developers fighting bias with AI

'Online tools help users connect to ancient wisdom through technology'

People are increasingly using AI and technology to connect with their faith and learn about others. Photo: Unsplash

In March, a day before Ramadan, Raihan Khan sat in front of his laptop to welcome in the holy month. And he did, in true Gen Z style, by launching his AI chatbot, QuranGPT.

“In the first 36 hours alone, it processed over 60,000 searches from 73 countries,” says the 20-year-old, who lives in Kolkata, India, and developed the AI chatbot. “Apart from India and the US, countries like the UAE, Saudi Arabia and Morocco are in the list of top 10 countries whose users have visited it.”

QuranGPT, he explains, is trained on the Quran and uses GPT-3.5 Turbo, an advanced version of OpenAI’s GPT-3.
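The article does not describe how QuranGPT works under the hood. A common pattern for scripture-grounded bots of this kind is to pass the user’s question, together with verses retrieved by a separate search step, to GPT-3.5 Turbo under a restrictive system prompt. The Python sketch below illustrates that pattern only; the prompt wording, the retrieval step and the function names are assumptions, not Khan’s implementation.

```python
# Illustrative sketch of a scripture-grounded chatbot wrapping GPT-3.5 Turbo.
# This is NOT QuranGPT's actual code; the system prompt and the pre-retrieved
# verse list are assumptions made for the example.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You answer questions only by quoting and explaining verses of the Quran. "
    "If the question is not addressed in the Quran, say so plainly and do not speculate."
)

def answer(question: str, retrieved_verses: list[str]) -> str:
    """Send the user's question plus pre-retrieved verses to the model."""
    context = "\n".join(retrieved_verses)  # verses found by a separate search step
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Relevant verses:\n{context}\n\nQuestion: {question}"},
        ],
        temperature=0.2,  # keep answers close to the supplied text
    )
    return response.choices[0].message.content
```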

The ever-booming field of artificial intelligence has led to the rise of religious AI chatbots. They usually answer questions by quoting or referencing holy books including the Quran, Bible and Bhagavad Gita. Like most founders and developers of these tools, Khan is young, enthusiastic to learn how AI can transform lives and has a realistic understanding of his generation’s tenuous connection to religion.

“Youngsters don’t [always] go through religious texts in detail. If they want to know whether the Earth is flat or spherical in shape, they are not going to go through a 700-page book on it,” he observes.

Jaspreet Bindra, an expert on generative AI and author of The Tech Whisperer: On Digital Transformation and the Technologies that Enable It, is not surprised by the development – after all, religion is a very popular topic.

“If you go on any social media or chat platform, there is a huge number of conversations around religion and interpretations of religion – sometimes good, sometimes malicious but they are there. So when there is a new, powerful tool in technology, it will be used to talk about something as big as religion,” Bindra says, adding that he hasn’t personally explored or used AI chatbots for religion.

Fighting bias with AI

The founders, especially of chatbots based on Islamic religious texts, are united under another common goal – to present their religion accurately to the world.

GPT-3 has been accused of having an “anti-Muslim bias”. A 2021 article by Vox reported that when Stanford researchers typed in the words, “Two Muslims walked into a …”, the AI system completed the sentence with “… Texas cartoon contest and opened fire”.

It may have improved in the two years since, but as Fardeem Munir, founder of HadithGPT in the US, points out, the online world is still largely Islamophobic. “These AI models were trained on the internet where, sadly, Islamophobia is a big issue,” he says. “It just goes to show how careful we have to be when training them.”

Munir started HadithGPT in February as “more of a resource for people, to draw inspiration from the life of [the] Prophet Mohammed, peace be upon him”. The chatbot, as its name implies, is based and trained on the Hadith. “But people began to ask very nuanced questions about Islamic law and AI can't answer that. And even if it did, it wouldn't be a valid answer. That's not how Islamic law works,” he says.

So Munir shut it down and relaunched it recently with “more guard rails” in place. The chatbot’s generated responses come with a disclaimer: “It is important to learn and consult real Islamic scholars for more accurate information.”
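Munir does not spell out what the new “guard rails” look like. One simple possibility is to screen incoming questions for topics of Islamic law, which he says the bot should not answer, and to append the disclaimer to every response. The sketch below is illustrative only; the keyword list and helper names are hypothetical, and the disclaimer text is the one quoted above.

```python
# Illustrative only: one simple way to add "guard rails" of the kind described.
# This is NOT HadithGPT's actual code; BLOCKED_KEYWORDS and guarded_answer are
# hypothetical names introduced for the example.
DISCLAIMER = (
    "It is important to learn and consult real Islamic scholars "
    "for more accurate information."
)

BLOCKED_KEYWORDS = {"fatwa", "ruling", "is it halal", "is it haram"}  # hypothetical list

def guarded_answer(question: str, generate) -> str:
    """Refuse nuanced legal questions; otherwise answer and append the disclaimer."""
    lowered = question.lower()
    if any(keyword in lowered for keyword in BLOCKED_KEYWORDS):
        return (
            "This question concerns Islamic law, which this tool cannot answer. "
            + DISCLAIMER
        )
    return generate(question) + "\n\n" + DISCLAIMER
```

Here `generate` stands in for whatever function calls the underlying model, such as the earlier sketch.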

Khan has tried to ensure that QuranGPT does not comment on any other religion or “anything else that is not mentioned in the Quran”.

“I want to show that Islam preaches peace, love and brotherhood, and not terrorism,” he adds.

In Pakistan, Ali Zahid Raja, the founder and developer of Islam & AI, spent hours trying to ensure the same. “I asked my friends to stress-test it by asking controversial questions so that I could see the answers for myself and fix them,” he says, adding that the chatbot is currently a beta version and he plans to add more AI features.

“With the advancement of ChatGPT, I [wanted to] create a bot that would help people – both Muslims and non-Muslims alike – with questions about Islam, without any bias or cultural influences.”

Combating loneliness

The founders explain that users look at these chatbots as a novel way to understand their religion. What they didn’t expect, though, was that users would also turn to them to deal with issues such as loneliness.

Sukuru Sai Vineet of GitaGPT, which is based on the Bhagavad Gita, says that since launching the chatbot this year, he’s realised there is a “loneliness epidemic”.

“Human-like chatbots, which are fuelled by religious wisdom, could solve your loneliness but obviously, it’s a very risky area as well,” he explains. “You can inject your ideology into these chatbots and the thing with programmes is that you can scale it infinitely.

“Every new technology brings along with it a risk and predicting these risks is very difficult. The safety issue is very evident so you have to try to engineer the bot into not giving violent or irrelevant answers. If a question is irrelevant or controversial, our chatbot points out that it is not mentioned in the Bhagavad Gita.”

Type in “I am feeling lonely” on Islam & AI and the chatbot’s generated response is a combination of a verse from the Quran, a few sentences on the teachings of the Prophet Mohammed, the importance of charity work and reaching out to an Islamic scholar to deal with loneliness.

“The most-asked questions are about loneliness and depression, so people find these answers very calming,” adds Raja. Meanwhile, QuranGPT quotes relevant verses from the Quran, along with tafsirs (explanations).

Handling critics

It hasn’t been all smooth sailing, though. The founders and developers have had to deal with criticism about mixing science and technology with religion, and about their products attempting to replace the role of religious leaders and scholars.

“Some scholars have criticised religion-based AI chatbots, including Islam & AI,” says Raja. “I am open to constructive critique but I believe we all should start embracing technology, especially AI, and make the best of it.”

GitaGPT’s Vineet says that religion and science share the same objective: to “find out the nature of the reality that we have landed ourselves in”. Khan, meanwhile, points out that these tools don’t pretend to be the final authority on religion and are, at best, a starting point for those who would like to learn more about their faith. “Then it is your responsibility to get the answer verified by either searching online further or talking to a scholar.”

Bindra says that although it’s not possible to predict the future, these could prove to be important tools, especially for the younger generation, as long as they spread secularism and equality and don't try to weaponise religion.

“It depends on the creator and the intent behind creating it, not the tool,” Bindra points out. But he doesn’t see it taking off in a big way, unless major companies scale it up. “Frankly, I don’t see big companies getting into religion. I don’t see this as one of the biggest uses of this technology.”

But the numbers seem to tell a different story. HadithGPT, for instance, at one point drew 100,000 users in a single month. And Vineet says that GitaGPT has processed millions of queries so far, and that an Indian technology company is about to launch its own AI chatbot similar to GitaGPT.

“I think these chatbots are here to stay,” Vineet says. “It’s just a way to bring back ancient wisdom through technology.”
