The latest AI models powering ChatGPT just learned to be friendlier, improving the experience for people who use chatbots responsibly.
• It could be a problem for those who don't or can't.
Why it matters: As chatbots become more human-like in their behavior, they could increase the risk of unhealthy attachments, or a kind of trust that goes beyond what the products are built to handle.
The big picture: OpenAI says its latest update makes ChatGPT sound warmer, more conversational, and more emotionally aware.
• That could be dangerous, though, for people who are isolated or vulnerable.
• Last month OpenAI estimated that around 0.07% of its users show signs of psychosis or mania in a given week, while 0.15% send messages indicating potentially heightened emotional attachment to ChatGPT.

Axios
