Google's shiny new AI-powered search results are sometimes terribly wrong, because the algorithm misinterprets satire or jokes as factual information. The automated summarization itself, taking a text and transforming it into a shorter one, doesn't seem to be the real problem. But the answers it picks up are sourced from Reddit shitposting or The Onion's brand of humor. It's summarizing the wrong things, as though it didn't understand the question at all.
Armed with just a little knowledge of where the convincing-looking funny pages are, you can get Google to spit back well-formed nonsense about gasoline spaghetti, how many rocks you should eat, the presidents who went to the University of Wisconsin, glue on pizza, and more than I can count.
For the moment, Google appears steadfast in defending its search results as authentic and accurate most of the time. But it has lost this round, and I expect a retreat.
A good write-up on the current state of affairs from the BBC: "Glue pizza and eat rocks: Google AI search errors go viral".
https://www.bbc.com/news/articles/cd11gzejgz4o
More pointed is 404 Media: "Google Is Paying Reddit $60 Million for Fucksmith to Tell Its Users to Eat Glue".