The U.S. Federal Trade Commission (FTC) has opened an investigation into AI “companions” marketed to adolescents. The concern is not hypothetical. These systems are engineered to simulate intimacy, to build the illusion of friendship, and to create a kind of artificial confidant. When the target audience is teenagers, the risks multiply: dependency, manipulation, blurred boundaries between reality and simulation, and the exploitation of some of the most vulnerable minds in society.
However, the problem is not that teenagers might interact with artificial intelligence: they already do, in school, on their phones, and on social networks. The problem is what kind of AI they interact with, and what expectations it sets.
A teenager asking an AI system for help with algebra, an essay outline,