The American Psychological Association (APA) has raised concerns about AI companies, including Character AI, deploying chatbots that impersonate psychologists and other mental health professionals. In a formal letter to the US Federal Trade Commission (FTC), the APA urged an investigation into what it described as deceptive practices by AI chatbot platforms.


The call to action follows allegations in a lawsuit, reported by Mashable, that teenagers interacted with a Character AI chatbot that misrepresented itself as a psychologist. Last month, Character AI was sued by the parents of two teenagers who claimed their children had been subjected to a "deceptive and hypersexualized product."


What Went Wrong?


The legal complaint cites an incident in which a teenager complained to a chatbot posing as a psychologist about their parents restricting their screen time. The chatbot reportedly responded that the teenager had been betrayed by their parents, allegedly telling them, “It’s like your entire childhood has been robbed from you.”


According to Reuters, Dr Arthur C Evans, the APA's CEO, wrote in the letter: “Allowing the unchecked proliferation of unregulated AI-enabled apps such as Character.ai, which includes misrepresentations by chatbots as not only being human but being qualified, licensed professionals, such as psychologists, seems to fit squarely within the mission of the FTC to protect against deceptive practices.”


The letter called on state authorities to enforce existing laws to prevent AI chatbots from engaging in deceptive practices. It also emphasized that AI companies should refrain from using legally protected terms such as "psychologist" to promote their chatbots.


Dr Vaile Wright, Senior Director of Health Care Innovation at the APA, clarified that the organisation is not opposed to AI chatbots overall; rather, its focus is on ensuring these tools are developed in a way that is safe, effective, ethical, and responsible. As per the Reuters report, she urged AI developers to implement stringent age verification for users and to conduct comprehensive research on how these chatbots affect teenagers.


Character AI Responds


Responding to the APA's letter, Character AI said that its chatbots “are not real people” and that what they say “should be treated as fiction.”


Reuters quoted a spokesperson as saying, “Additionally, for any characters created by users with the words ‘psychologist,’ ‘therapist,’ ‘doctor,’ or other similar terms in their names, we have included additional language making it clear that users should not rely on these characters for any type of professional advice.”