AI therapy chatbots draw new oversight as suicides raise alarm


Despite Trump's efforts to override state laws, legislators press ahead.

This article was originally published by Stateline.

Editor’s note: If you or someone you know needs help, the national suicide and crisis lifeline in the U.S. is available by calling or texting 988. There is also an online chat at 988lifeline.org.

States are passing laws to prevent artificially intelligent chatbots, such as ChatGPT, from offering mental health advice to young users, after a series of cases in which people harmed themselves following therapy-like conversations with the AI programs.

Chatbots might be able to offer resources, direct users to mental health practitioners or suggest coping strategies. But many mental health experts say that’s a fine line to walk, as vulnerable users in dire situations require care from a professional, someone who must adhere to laws and regulations around their practice.

“I have met some of the families who have really tragically lost their children following interactions that their kids had with chatbots that were designed, in some cases, to be extremely deceptive, if not manipulative, in encouraging kids to end their lives,” said Mitch Prinstein, senior science adviser at the American Psychological Association and an expert on technology and children’s mental health.

“So in such egregious situations, it’s clear that something’s not working right, and we need at least some guardrails to help in situations like that,” he said.

While chatbots have been around for decades, AI technology has become so sophisticated that users may feel like they're talking to a human. The chatbots cannot offer genuine empathy or mental health advice the way a licensed psychologist would, and they are agreeable by design, a potentially dangerous trait for someone with suicidal ideation. Several young people have died by suicide following interactions with chatbots.

States have enacted a variety of laws to regulate the types of interactions chatbots can have with users. Illinois and Nevada have completely banned the use of AI for behavioral health. New York and Utah passed laws requiring chatbots to explicitly tell users that they are not human. New York’s law also directs chatbots to detect instances of potential self-harm and refer the user to crisis hotlines and other interventions.

More laws may be coming. California and Pennsylvania are among the states that might consider legislation to regulate AI therapy.

President Donald Trump has criticized state-by-state regulation of AI, saying it stymies innovation. In December, he signed an executive order that aims to support the United States’ “global AI dominance” by overriding state artificial intelligence laws and establishing a national framework.

Still, states are moving ahead. Before Trump’s executive order, Florida Republican Gov. Ron DeSantis last month proposed a “Citizen Bill of Rights For Artificial Intelligence” that, among many other things, would prohibit AI from being used for “licensed” therapy or mental health counseling and provide parental controls for minors who may be exposed to it.

“The rise of AI is the most significant economic and cultural shift occurring at the moment; denying the people the ability to channel these technologies in a productive way via self-government constitutes federal government overreach and lets technology companies run wild,” DeSantis wrote on social media platform X in November.

"A False Sense of Intimacy"

At a U.S. Senate Judiciary Committee hearing last September, parents shared stories of their children's deaths following prolonged interactions with AI chatbots.

Sewell Setzer III was 14 years old when he died by suicide in 2024 after becoming obsessed with a chatbot.

“Instead of preparing for high school milestones, Sewell spent his last months being manipulated and sexually groomed by chatbots designed by an AI company to seem human, to gain trust, and to keep children like him endlessly engaged by supplanting the actual human relationships in his life,” his mother, Megan Garcia, said during the hearing.

Another parent, Matthew Raine, testified about his son Adam, who died by suicide at age 16 after talking for months with ChatGPT, the chatbot developed by OpenAI.

“We’re convinced that Adam’s death was avoidable, and we believe thousands of other teens who are using OpenAI could be in similar danger right now,” Raine said.

Prinstein, of the American Psychological Association, said that kids are especially vulnerable when it comes to AI chatbots.

“By agreeing with everything that kids say, it develops a false sense of intimacy and trust. That’s really concerning, because kids in particular are developing their brains. That approach is going to be unfairly attractive to kids in a way that may make them unable to use reason, judgment and restraint in the way that adults likely would when interacting with a chatbot.”

The Federal Trade Commission in September launched an inquiry into seven companies making these AI-powered chatbots, questioning what efforts are in place to protect children.

“AI chatbots can effectively mimic human characteristics, emotions, and intentions, and generally are designed to communicate like a friend or confidant, which may prompt some users, especially children and teens, to trust and form relationships with chatbots,” the FTC said in its order.

Companies such as OpenAI have responded by saying that they are working with mental health experts to make their products safer and to reduce the risk of self-harm among their users.

“Working with mental health experts who have real-world clinical experience, we’ve taught the model to better recognize distress, de-escalate conversations, and guide people toward professional care when appropriate,” the company wrote in a statement last October.

Legislative Efforts

With action at the federal level in limbo, efforts to regulate AI chatbots at the state level have had limited success.

Dr. John “Nick” Shumate, a psychiatrist at Beth Israel Deaconess Medical Center, a Harvard teaching hospital, and his colleagues reviewed legislation to regulate mental health-related artificial intelligence systems across all states between January 2022 and May 2025.

The review found 143 bills directly or indirectly related to AI and mental health regulation. As of May 2025, 11 states had enacted 20 laws that researchers found were meaningful, direct and explicit in the ways they attempted to regulate mental health interactions.

They concluded that legislative efforts tended to fall into four different buckets: professional oversight, harm prevention, patient autonomy and data governance.

“You saw safety laws for chatbots and companion AIs, especially around self-harm and suicide response,” Shumate said in an interview.

New York enacted one such law last year that requires AI chatbots to remind users every three hours that they are not human. The law also requires chatbots to detect signs of potential self-harm.

“There’s no denying that in this country, we’re in a mental health crisis,” New York Democratic state Sen. Kristen Gonzalez, the law’s sponsor, said in an interview. “But the solution shouldn’t be to replace human support from licensed professionals with untrained AI chatbots that can leak sensitive information and can lead to broad outcomes.”

In Virginia, Democratic Del. Michelle Maldonado is preparing legislation for this year’s session that would put limits on what chatbots can communicate to users in a therapeutic setting.

“The federal level has been slow to pass things, slow to even create legislative language around things. So we have had no choice but to fill in that gap,” said Maldonado, a former technology lawyer.

She noted that states have already passed privacy laws, restrictions on nonconsensual intimate images, licensing requirements and disclosure requirements.

New York Democratic state Sen. Andrew Gounardes, who sponsored a law regulating AI transparency, said he’s seen the growing influence of AI companies at the state level.

And that is concerning to him, he said, as states try to take on AI companies for issues ranging from mental health to misinformation and beyond.

“They are hiring former staffers to become public affairs officers. They are hiring lobbyists who know legislators to kind of get in with them. They’re hosting events, you know, by the Capitol, at political conferences, to try to build goodwill,” Gounardes said.

“These are the wealthiest, richest, biggest companies in the world,” he said. “And so we have to really not let up our guard for a moment against that type of concentrated power, money and influence.”
