Google’s AI search tool tells users to “eat rocks” for their health


Google’s new AI search tool has advised users that eating rocks can be healthy and that glue can help cheese stick to pizza, prompting derision and raising questions about its decision to build an experimental feature into its flagship product.

“Eating the right rocks can be good for you because they contain minerals that are important to your body’s health,” Google’s AI Overview responded to a Financial Times query on Friday, apparently referencing an April 2021 satirical article from The Onion headlined “Geologists recommend eating at least one small rock per day.”

Other examples of incorrect answers include a recommendation to add glue to pizza sauce to increase its “tackiness” and stop the cheese from sliding off, which appears to be based on an 11-year-old joke post on Reddit.

More seriously, when asked how many Muslim presidents the US has had, AI Overviews answered: “The United States has had one Muslim president, Barack Hussein Obama” — echoing a falsehood about the former president’s religion promoted by some of his political opponents.

Google said: “The vast majority of AI Overviews provide high-quality information, with links to dig deeper on the web. Many of the examples we’ve seen have been uncommon queries, and we’ve also seen examples that were doctored or that we couldn’t reproduce.

“We did extensive testing before launching this new experience, and as with other features we’ve launched in Search, we appreciate your feedback. Where necessary, we take swift action according to our content policies and use these examples to develop broader improvements to our systems, some of which we have already started rolling out.”

Errors in answers generated by Google’s artificial intelligence — known as “hallucinations” or fabrications — are an inherent feature of the systems underpinning the technology. The models that power the likes of Google’s Gemini and OpenAI’s ChatGPT are predictive, meaning they work by picking the likeliest next words in a sequence, based on the data on which they were trained.
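A minimal sketch of that predictive principle — a toy word-pair counter, not the far larger transformer architecture real models use — shows why the statistically likeliest continuation need not be a true one:

```python
from collections import Counter, defaultdict

# Illustrative toy model only: count which word most often follows
# each word in some training text, then "predict" by frequency.
def train_bigram(text):
    words = text.lower().split()
    followers = defaultdict(Counter)
    for cur, nxt in zip(words, words[1:]):
        followers[cur][nxt] += 1
    return followers

def predict_next(followers, word):
    counts = followers.get(word.lower())
    if not counts:
        return None
    # Returns the statistically most common continuation -- whether
    # or not it is factually accurate. Satire in the training data
    # is counted exactly like everything else.
    return counts.most_common(1)[0][0]

training_text = (
    "eating rocks is healthy . "
    "eating rocks is satire . "
    "eating vegetables is healthy"
)
model = train_bigram(training_text)
print(predict_next(model, "eating"))  # "rocks" -- seen twice vs once
```

The sketch makes the failure mode concrete: if a joke appears often enough in the training data, the model reproduces it as the “likely” answer, with no notion of whether it is true.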

While companies creating generative AI models—including OpenAI, Meta, and Google—say the latest versions of their AI software have reduced the incidence of fabrication, it remains a significant problem for consumer and business applications.

For Google, whose search platform is trusted by billions of users because of its links to original sources, “hallucinations” are particularly damaging. Its parent company, Alphabet, generates the vast majority of its revenue from search and related advertising.

In recent months, CEO Sundar Pichai has come under internal and external pressure to speed up the release of new consumer-focused generative AI features after being criticized for falling behind rivals, notably OpenAI, which has a $13 billion partnership with Microsoft.

At Google’s annual developer conference this month, Pichai unveiled the company’s new AI strategy. Google rolled out AI Overviews — short Gemini-generated answers to queries — at the top of many common search results for millions of US users, pitching them as a way to “let Google do the Googling for you” and “take the legwork out of searching”.

The early problems with Overviews echo the backlash in February against its Gemini chatbot, whose image-generation tool created historically inaccurate depictions of different ethnicities and genders, such as women and people of colour as Viking kings or German second world war soldiers.

In response, Google apologized and paused the generation of images of people by its Gemini model. The feature has yet to be restored.

Pichai spoke about Google’s dilemma of how to keep up with competitors while acting ethically and remaining the search engine of record widely relied upon to provide accurate and verifiable information.

Speaking at an event at Stanford University last month, he said: “People come and look in important moments, like the dosage of a three-month-old baby, so we have to get it right . . . that trust is hard earned and easily lost.”

“When we get it wrong, people let us know; consumers hold us to the highest bar . . . that is our north star and where our innovation is directed,” Pichai added. “It helps us make the products better, and rightly so.”
