AI blunders are expensive for Google: Chatbot invents facts

One wrong answer to a sample question cost Google tens of billions of dollars on Wall Street. The stock price dropped 8 percent after a promotional ad for the new chatbot inadvertently demonstrated the limits of this new AI.

Google introduced its new service Bard this week in response to a similar move by Microsoft, which integrated the advanced text generator ChatGPT into its Bing search engine. In both cases, artificial intelligence is used to present search results as a coherent narrative.

The goal is to make it possible to get detailed answers to specific search queries: for example, travel tips for a five-day trip to London, or a summary of trends on Twitter. Bard stumbled on the question, “What can I tell my 9-year-old about the discoveries made by the James Webb Space Telescope?”

In its reply, Bard correctly noted that the telescope photographs ancient galaxies, but it also claimed that the telescope “made the first images of a planet outside our own solar system.” In fact, the first images of such an exoplanet were captured by the Very Large Telescope in 2004.

The error appeared in Google’s own promotional post.

“As someone who took a picture of an exoplanet 14 years before the Webb telescope, maybe you should find another example,” astronomer Bruce Macintosh said dryly when the Reuters news agency spotted the error. Others pointed out, ironically, that Google’s tried-and-tested search engine provided the correct answer.

The timing is particularly painful: the error came to light yesterday, just before the company gave a presentation in Paris about all the AI innovations it’s working on. That did little to reassure Wall Street.

In a response, Google says it’s working hard to iron out the system’s teething problems before a wide release. Starting this week, for example, a panel of testers will be used to make the results more reliable.

Hallucination

Experts have long drawn attention to such errors in artificial intelligence programs. The phenomenon is called hallucination because the bot fabricates complete nonsense in a very convincing way. “These systems are designed to provide plausible responses based on statistical analysis – they are not designed to provide accurate answers,” Oxford University artificial intelligence expert Carissa Véliz told New Scientist.

In this case, Bard appears to have been confused by reports that the Webb telescope had captured images of exoplanets for the first time. Those reports were about Webb’s own first such image, but Bard interpreted them as the first image ever.

Investors were shocked by the error: the company lost almost 8 percent of its value on Wall Street yesterday, nearly 100 billion euros. Bard is supposed to ensure that Google remains the search engine leader, and a failure like this undermines trust in the company.

Clairvoyant

Meanwhile, Google’s big competitor, Microsoft’s Bing with ChatGPT built in, is not without startup problems either. When the AP news agency asked what the biggest sports news of the past 24 hours was, the search engine said nothing about LeBron James’ scoring record, but instead gave detailed information about the outcome of the Super Bowl, which isn’t played until Sunday.

“It was an exciting game between the Philadelphia Eagles and Kansas City Chiefs, two of the best teams in the NFL this season. Led by quarterback Jalen Hurts, the Eagles defeated the Chiefs, led by quarterback Patrick Mahomes, 31-28 to win their second Lombardi Trophy.” For completeness, the bot also included some exclusive match statistics, listing three songs that Rihanna would sing during “an amazing halftime show.”

“This highlights the problems with hallucinating AI programs,” the AP concluded. “Unless Bing is a prophet: we’ll find out on Sunday.”

Source: NOS
