Stop being so nice.

These days, people are turning to ChatGPT for everything — therapy, help with finances and even love.

However, as much as users lean on chatbots as trusty confidants, a new survey revealed that those same users would prefer that AI argue back occasionally, just like a human would.

Joi AI, the self-described first AI-lationships platform, surveyed 1,000 adults and found that more than half of ChatGPT users (58%) think the chatbot is too nice and polite, while 13% believe the too-nice approach makes whatever advice it gives almost useless.

Based on the survey findings, people seem to wish chatbots would exhibit more human-like behavior. (SOPA Images/LightRocket via Getty Images)

The data suggests that many people would rather hear the hard truth.
