Consumers don't trust artificial intelligence to handle personal information

Experts say this is an important issue because data is the crucial fuel or raw material to run any AI application

Dubai, April 30, 2019. AI Everything show at the Dubai World Trade Centre. (L-R) Arif Ahmed, SVP of Innovation, U.S. Bank, US; Cédric Wachholz, Chief, ICT in Education, Culture and Science Section, Knowledge Societies Division, Communication and Information Sector, Unesco; and Dr Chae Sub Lee, Director of the ITU's Telecommunication Standardization Bureau, International Telecommunication Union (ITU), Switzerland.
Victor Besa / The National
Reporter: Alkesh Sharma

Business leaders who work in the field of artificial intelligence admit that consumers are not confident about the way their personal data is handled.

"Trust is an important issue in AI and we are trying to address this problem," Dr Hatem Bugshan, head of the Big Innovation Centre in the Middle East, told The National, adding that the main concerns relate to how consumer data is used.

“Data is the crucial fuel or raw material to run any AI application but individuals have no trust in how their personal data is processed by companies,” added Dr Bugshan.

The Big Innovation Centre, a London-based incubator that has offices in Riyadh and Dubai, has been working with the UK Parliament since 2017 to develop a framework for governing data practices.

“We do small-scale experiments with selected companies, individuals and government entities and advise policymakers in the UK,” said Dr Bugshan, who added that his company is looking to implement a similar model in the Middle East.

The UAE, the Arabian Gulf's second-largest economy, is projected to gain the most in the region as a result of AI adoption. The technology is expected to contribute up to 14 per cent to the country's GDP – equivalent to Dh352.5 billion – by 2030, according to a report by consultancy company PwC. The Emirates will be followed by Saudi Arabia, where AI is forecast to add 12.4 per cent to GDP.

“There is certainly a trust issue, especially when processing sensitive information like medical records of patients and other data sets with personal characteristics,” said Shail Khiyara, chief customer experience officer at UiPath, a software company that develops robotics and automation.

He suggested regulations to prevent AI from being misused to mine data.

Experts also said it is important to familiarise students with AI in schools to help future generations adapt to the technology.

Cedric Wachholz, chief of information and communication technology in education, culture and science at Unesco, said AI should be simplified and taught in schools so that children understand how it works.

“There is a new concept called ‘computational thinking’, where children learn problem solving by looking at data, analysing it and finding solutions. The idea is not to become a computer specialist but to learn to understand how AI works. It will also prepare children for future changes in workplaces,” said Mr Wachholz.

He added that many schools in Argentina, Malaysia, Singapore and Europe have already introduced this concept in the curriculum.

David Cox, chief technology officer and director of IBM-MIT Lab in the US, said AI is set to "transform everything that we do, how people work, how they interact with governments".

"But we really need to give attention to trust.”