Medical Market Report

Adding One Word To Searches Makes Google’s AI Spout Pure, Unfiltered Nonsense

April 28, 2025 by Deborah Bloomfield

While the tech bros of the world declare the singularity – the moment when artificial intelligence (AI) surpasses human intelligence – imminent, various AI systems are still struggling with tasks humans can perform with ease.

For instance, image generators struggle with hands, teeth, or a glass of wine filled to the brim, while large language model (LLM) chatbots are easily vexed by problems most 8-year-old humans can solve. On top of this, they remain prone to “hallucinations”: serving up plausible-sounding falsehoods rather than true information.

Despite these problems, search providers and other tech giants have been eager to implement AI across their products. In the latest of a long line of issues, Internet users have discovered that Google’s AI Overview will spout plausible-sounding nonsense if you add the word “meaning” to the end of a search for a made-up phrase.


Other people tried the same technique, with similar results.

“The phrase ‘rainbow trout in a chocolate volcano’ is a metaphorical way of describing a situation or a person’s state, often used to highlight a surprising or unexpected combination,” Google told one X user. “It implies a juxtaposition of seemingly contrasting elements: the freshness of rainbow trout with the sweetness and richness of a chocolate volcano.”

“‘Stick it to the diddly time’ is a slang expression meaning to resist authority or a system, or to refuse to conform to expectations,” it told a Redditor. “It’s a playful and defiant way of saying you’re not going to put up with something, or you’re going to do things your own way. The phrase ‘diddly time’ itself is a nonsensical phrase that adds to the playful, rebellious tone.”


While people may have already begun outsourcing their critical thinking to AI, or relying on it for information, these chatbots are not really doing any fact-checking. What they do is put words in a pleasing order, based on their training data. They are more “spicy autocomplete” than Skynet or Optimus Prime.
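The “spicy autocomplete” idea can be made concrete with a deliberately crude sketch: a bigram model that picks each next word purely from which words followed the current one in its (tiny, hypothetical) training text. Real LLMs are vastly more sophisticated, but the core point carries over: the model continues a sequence fluently with no notion of whether the result is true.

```python
import random
from collections import defaultdict

# Toy training text echoing the templated "X is a ... way of ..." answers
# Google's AI gave for made-up phrases. Entirely illustrative.
corpus = (
    "the phrase is a metaphorical way of describing a situation "
    "the phrase is a playful way of saying you will not conform "
    "the phrase is a slang expression meaning to resist authority"
).split()

# Record which words follow each word in the training text.
following = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    following[current].append(nxt)

def complete(start, length=10):
    """Generate a fluent-looking continuation -- with zero fact-checking."""
    words = [start]
    for _ in range(length):
        options = following.get(words[-1])
        if not options:  # dead end: no word ever followed this one
            break
        words.append(random.choice(options))
    return " ".join(words)

# Produces confident, grammatical output regardless of whether the
# "phrase" being explained actually exists.
print(complete("the"))
```

Every word the sketch emits is statistically plausible given what came before it, which is exactly why the output reads like a real definition even when the input phrase is gibberish.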

When they cannot come up with a truthful answer – one essentially stitched together from the human-written text in their training data – they are prone to “hallucination” in their attempt to please their human users. Or, in simple terms, they will sometimes talk crap at you rather than provide no answer at all.

That’s not ideal for a service like Google, whose whole schtick is providing information to people seeking it. However, the issue appears to have been temporarily patched, with AI Overviews no longer appearing when you type an uncommon or made-up phrase followed by the word “meaning”.


Source Link: Adding One Word To Searches Makes Google's AI Spout Pure, Unfiltered Nonsense

Filed Under: News
