It was tempting, for a while, to treat AI models like ChatGPT as all-knowing oracles for every crisis in our lives. Got a weird rash? Ask ChatGPT. Need to draft a will? Ask ChatGPT. But that era is officially over. Citing massive liability risks, Big Tech is slamming on the brakes.

As of 29 October, ChatGPT's rules have reportedly changed: it will no longer give specific medical, legal, or financial advice. As reported by NEXTA, the bot is now officially an 'educational tool', not a 'consultant.' The reason? As NEXTA notes, 'regulations and liability fears squeezed it — Big Tech doesn't want lawsuits on its plate.'

Now, instead of providing direct advice, the model will 'only explain principles, outline general mechanisms and tell you to talk to a doctor, lawyer or financial professional.'
