“It can be very harmful if deployed wrongly and we don’t have all the answers there yet – and the technology is moving fast … So does that keep me up at night? Absolutely,” the tech giant’s Chief Executive Officer (CEO) said.
In an interview with CBS’s “60 Minutes”, Mr Pichai voiced concerns that society needed to adapt to the use of AI technology.
He warned that AI is set to “impact every product across every company”, adding that “knowledge workers” such as writers, accountants, and software engineers are likely to be among the most affected.
“For example, you could be a radiologist, if you think about five to 10 years from now, you’re going to have an AI collaborator with you,” he said.
“You come in the morning, let’s say you have a hundred things to go through, it may say, ‘these are the most serious cases you need to look at first’,” Mr Pichai added.
The tech giant boss cautioned about the ease with which fake media reports can be generated using AI.
On a “societal scale”, he said, such messages that can be easily made with AI “can cause a lot of harm”.
However, he suggested that rather than abandoning AI, its use could be regulated with more laws that “align with human values including morality”.
“This is why I think the development of this needs to include not just engineers but social scientists, ethicists, philosophers, and so on,” he said.
In line with Mr Pichai’s statements, Google also recently released a 20-page document listing its “recommendations for regulating AI”.
The Silicon Valley giant also launched its new AI chatbot “Bard” in February, in an apparent bid to compete with the now-famous AI system ChatGPT.
Bard is an “experimental conversational AI service” that can be used to simplify complex topics, such as “explaining new discoveries from Nasa’s James Webb Space Telescope to a 9-year-old”, Google notes on its website.
Asked in the interview whether he thinks Bard is safe, Mr Pichai said: “The way we have launched it today, as an experiment in a limited way, I think [it is]. But we all have to be responsible in each step along the way.”