Published: 15:39, February 11, 2023 | Updated: 15:39, February 11, 2023
Report: Google cautions against 'hallucinating' chatbots
By Reuters

Text from the ChatGPT page of the OpenAI website is shown in this photo, in New York, Thursday, Feb. 2, 2023. (PHOTO / AP)

BERLIN – The boss of Google's search engine warned against the pitfalls of artificial intelligence in chatbots in a newspaper interview published on Saturday, as Google parent company Alphabet battles to compete with blockbuster app ChatGPT.

"This kind of artificial intelligence we're talking about right now can sometimes lead to something we call hallucination," Prabhakar Raghavan, senior vice president at Google and head of Google Search, told Germany's Welt am Sonntag newspaper.

"This then expresses itself in such a way that a machine provides a convincing but completely made-up answer," Raghavan said in comments published in German. One of the fundamental tasks, he added, was keeping this to a minimum.

Google has been on the back foot since November, when OpenAI, a startup Microsoft is backing with around $10 billion, introduced ChatGPT, which has since wowed users with its strikingly human-like responses to queries.

Alphabet Inc introduced Bard, its own chatbot, earlier this week, but the software shared inaccurate information in a promotional video in a gaffe that cost the company $100 billion in market value on Wednesday.

The logo for OpenAI, the maker of ChatGPT, appears on a mobile phone, in New York, Jan 31, 2023. (PHOTO / AP)

Alphabet, which is still conducting user testing on Bard, has not yet indicated when the app could go public.

"We obviously feel the urgency, but we also feel the great responsibility," Raghavan said. "We certainly don't want to mislead the public."