Google’s AI Overviews, designed to give quick answers to search queries, reportedly spits out “hallucinations” — bogus information presented as fact — and undercuts publishers by pulling users away from traditional links.

The Big Tech giant — which landed in hot water last year after releasing a “woke” AI tool that generated images of female Popes and black Vikings — has drawn criticism for providing false and sometimes dangerous advice in its summaries, according to The Times of London.

In one case, AI Overviews advised adding glue to pizza sauce to help cheese stick better, the outlet reported.

In another, it described a fake phrase — “You can’t lick a badger twice” — as a legitimate idiom.

The hallucinations, as computer scientists call them, are compounded by the AI tool diminishing traffic to the publishers whose content it summarizes.