By Sandeep Budki
It all started with a bold deception. In May 2023, two New York lawyers filed a court brief that had been drafted using ChatGPT. The submission appeared impeccable, complete with citations and quotes from judges. But the court discovered that none of the referenced cases actually existed. The AI had fabricated them, a textbook example of hallucination: AI generating false information.
That courtroom embarrassment exposed the central paradox of AI. Generative models can write with confidence but cannot always separate fact from fiction. “By nature, large language models are designed to respond to everything, and that’s precisely why they hallucinate,” said Ankush Sabharwal, founder and CEO of CoRover.ai. “Hallucinations is not an