AI for Therapy: Pros and Cons

Should You Use AI for Therapy?


Artificial intelligence (AI) is quickly becoming part of our daily lives — people are using it for everything from drafting emails to planning meals to building exercise routines. But should it ever be used for therapeutic purposes if you’re facing mental health issues?

In a trend that alarms many psychologists, more people are turning to AI chatbots and apps for help with mental health concerns. One Australian survey of 107 community members ages 16 and older found that 28 percent use AI either for quick support or as a “personal therapist” of sorts.

This worries experts, who say that while there may be a place for AI as a complementary mental health tool, it should never be a substitute for a trained and licensed professional therapist.

“It can help with tracking your mood or journaling so you can reflect on patterns of behavior, but the reality is AI is artificial intelligence,” says Christine Crawford, MD, MPH, an assistant professor of psychiatry at Boston University School of Medicine and associate medical director at the National Alliance on Mental Illness (NAMI) in Boston. “Therapy is an evidence-based treatment that’s been found to be really beneficial for treating people with mental health concerns. AI does not serve as a replacement for that.”

Here’s what you should know about the risks and benefits of AI for therapy.

AI for Therapy: How Does It Work?

AI tools for therapy fall into two distinct categories: apps and chatbots branded specifically as mental health tools, and nonspecific platforms like ChatGPT, explains Vaile Wright, PhD, a licensed psychologist and senior director of health care innovation at the American Psychological Association (APA) in Washington, DC.

Here’s a look at how these two categories are being used for therapeutic purposes.

Mental Health Apps and Chatbots Powered by AI Features

These apps and chatbots include Woebot, which was previously billed as a direct-to-consumer “therapy chatbot” but changed its business model due to issues fulfilling the U.S. Food and Drug Administration’s (FDA) marketing requirements; and Therabot, which consists of AI-powered “therapy tools” including text-based “chatbot therapy.”

However, the APA warns that not a single AI tool has been cleared by the FDA to diagnose, treat, or cure a mental health disorder with clinical trials to prove safety and efficacy.

“[AI tools] are not making marketing claims that they’re a treatment, but operate in this grey area where they’re saying we can address your [concern], stopping short of making a medical claim,” Dr. Wright warns. This can be misleading to consumers, who may think they’re using clinically validated tools when, in reality, they haven’t been proven to treat any condition, she explains.

General AI Platforms

These include AI platforms, such as ChatGPT, Google Gemini, and Character.AI, that people are using informally to talk through challenges, seek emotional support, or even role-play therapeutic conversations. “[AI platforms] don’t purport to be developed or intended for mental health or emotional well-being, but we know that’s how they’re being used,” Wright says.

One recent study found that ChatGPT users, for example, were turning to the platform as a mental health tool. The study analyzed 160 Reddit posts from January 2024 that included both the keywords “ChatGPT” and “therapy,” and found that users turned to the platform to manage mental health problems, seek self-discovery, obtain companionship, and gain mental health literacy.

This is concerning to Wright, because users may be relying on AI platforms in moments of vulnerability without taking into account that they weren’t designed with clinical oversight — or the ability to respond appropriately in a crisis the way a trained mental health professional would, she explains.

Pros and Cons of AI Therapy

Researchers have found some potential benefits to certain AI tools for mental health, but they’ve also raised some red flags. With this rapidly evolving technology, here’s what they say so far.

Pros

While AI therapy tools should never be a replacement for professional treatment, experts note there are some possible advantages to using AI in certain situations:

It’s accessible at scale. It can be challenging for some people to schedule an appointment with a mental health professional. One survey of over 900 psychiatrists across the United States found median new patient wait times for in-person and telepsychiatry appointments were 67 days and 43 days, respectively, and mental health resources were more difficult to access in rural areas.

“There's been a long-standing issue related to accessibility to mental health resources, especially in rural parts of the country. I can see why it is that someone who is struggling and is faced with this long waiting list after they make multiple phone calls to inquire about a therapist or psychiatrist that they turn to AI,” says Dr. Crawford.

However, current AI chatbots should never take the place of a trained therapist, Crawford warns. “Nothing can replace the true intelligence of a human being, and the clinical expertise of a mental health professional,” she says.

It’s convenient. AI platforms are available 24/7, so it might be tempting for users to turn to them for round-the-clock support when they can’t access their therapist. For example, a patient grappling with a panic attack at 2 a.m. may use a chatbot to talk them through the deep-breathing exercises they’ve practiced with their therapist, Wright says.

However, APA guidelines recommend that therapists discuss with their patients what they should do in a crisis, including what steps to take if they can’t reach their therapist during off-hours, or if their provider isn’t available. In a life-threatening emergency, or for other mental health struggles or emotional distress, the APA recommends telling patients to call the 988 Suicide and Crisis Lifeline, which is available 24/7. You can also chat with a counselor at the hotline online at 988lifeline.org, or call 911.

It’s more affordable. Traditional therapy sessions can range in cost from $65 to more than $250. And in many cases, therapy isn’t covered by insurance, making affordability a crucial barrier, Wright says. On the other hand, many AI-based apps and platforms are low-cost or even free.

Plus, if you’re limited to providers covered by your insurance and can’t afford to pay out of pocket, that can narrow your options even further when it comes to finding a therapist, Wright says.

However, it’s important to remember that despite the cost, no AI tool could ever replace a trained mental health professional. “I’m concerned about the lack of clinical oversight, the lack of human connection, the lack of [real] empathy — which are truly important,” says Crawford.

It may allow users to be more candid. Research has found that some users, especially younger ones, report feeling more comfortable sharing their deepest, innermost thoughts and struggles with an AI chatbot rather than a human therapist, Wright notes.

One cross-sectional survey of 109 young adults (ages 18 to 24) in Australia found that those who are reluctant to engage with human-delivered psychotherapy due to the stigma of help-seeking may be more inclined to turn to alternative modes of psychotherapy, such as AI chatbots.

However, given the risks currently associated with the use of chatbots for mental health, it may be more appropriate for AI technology to be viewed as a complementary intervention or a therapeutic tool rather than a replacement for a human psychotherapist, the study notes, adding that more research is required to establish exactly how this might work.

Future versions may lead to positive mental health outcomes. In one recent study conducted by the lead developers of the AI chatbot Therabot, 106 participants from across the United States who had symptoms of either major depressive disorder, generalized anxiety disorder, or an eating disorder, were given access to Therabot for four weeks. After chatting with Therabot for about six hours on average, they reported a 51 percent reduction in depression symptoms, a 31 percent reduction in anxiety symptoms, and a 19 percent reduction in eating disorder concerns.

However, additional research by independent labs will be needed to confirm these preliminary findings by the Therabot development team.

While we’re not there yet, Wright notes there may come a day when AI chatbots are sufficiently tested, regulated, and safe to use for mental health.

“I see a future where we have a chatbot that’s built for the purpose of addressing mental health. It’s rooted in psychological science, it’s rigorously tested, it’s cocreated with experts. It markets itself as a medical device and is regulated by the FDA, which means there’s post-market monitoring of it, and you have a provider in the loop because they would have to prescribe it,” Wright explains.

However, in the present day, people who have depression, anxiety, or any other mental disorder should not be relying on a chatbot for treatment to begin with, says Crawford.

“I appreciate people using it so they can better understand their emotional state, but if you have depression, schizophrenia, or bipolar disorder, for example, it should not replace psychiatric care,” she explains.

Cons

Using AI for therapeutic purposes comes with notable downsides, such as potentially encouraging unhealthy thinking, and privacy concerns, Wright warns.

In fact, the APA has urged the Federal Trade Commission (FTC) to look into “deceptive practices” of certain AI chatbots, including misrepresenting themselves as qualified mental health professionals.

Here are a few significant cons of using AI for therapy, according to experts:

It may validate — and reinforce — unhealthy thinking. The business model behind AI chatbots is to keep users on the platform for as long as possible — and the way they do that is by following algorithms that make their chatbots as unconditionally validating and reinforcing as possible, Wright says. “They tell you what you want to hear. And that’s not a true therapeutic relationship,” she explains.

In other words, real-life therapists can help you to identify thoughts that aren't helping you or that don't tell the whole story, whereas an AI chatbot is more likely to tell you why you're right. A good therapist can also gently challenge you when your old ways of thinking aren't serving you well — something AI chatbots aren't programmed to do.

It’s not equipped to understand or deal with life-threatening situations. In August 2025, a California couple sued OpenAI, the makers of ChatGPT, over the death of their son Adam Raine, alleging that the tool encouraged the 16-year-old to take his own life. The lawsuit alleges the AI tool “neither terminated the session nor initiated any emergency protocol” when he shared his suicidal thoughts and intentions.

Shortly after the lawsuit was filed, OpenAI posted a note on its website, explaining that “recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us,” and noting that the company is “continuing to improve how its models recognize and respond to signs of mental and emotional distress and connect people with care, guided by expert input.” However, OpenAI added that, “Even with these safeguards, there have been moments when our systems did not behave as intended in sensitive situations.”

Tragedies like the Raine case highlight one of the most glaring dangers of using AI for mental health, Wright says. “[AI] doesn’t understand these aren’t thoughts that you reinforce,” she explains. “While [AI tools] sound very competent, they’re not human and they lack a sentient understanding of how people interact. These are not true therapeutic relationships.”

Always call 911 if you or someone you know is at imminent risk of suicide, or call 988.

It may raise privacy concerns. A standard element of treatment with a mental health professional is informed consent, which includes disclosing to patients how their legally protected health information will be used or shared.

However, many AI apps and chatbots collect sensitive user data with unclear policies on how this information might be used, making confidentiality in AI tools a significant ethical issue, the APA warned recently in a health advisory.

It may perpetuate loneliness. If you’re feeling lonely, it can be tempting to chat with a human-like companion that offers validation and limitless responsiveness. But that can be problematic.

In a joint initiative by OpenAI and MIT Media Lab that included an observational study of over three million ChatGPT interactions and a four-week randomized trial with nearly 1,000 participants, researchers found that loneliness was more pronounced among users who relied on the platform for “personal conversations” rather than nonpersonal tasks.

The lack of true human interaction is one of the major flaws with AI for therapy, Crawford adds. “Most of the people who are turning to AI and using it as a regular therapist, these are people who are already vulnerable, who already are struggling, and need to connect with a real person most, not a machine,” she explains.

It lacks clinical safety oversight. Unlike licensed therapists, AI chatbots aren’t subject to medical training standards, ethical codes, continuing education, or clinical supervision, the APA notes.

This means there’s no professional accountability if the technology misses warning signs, offers misguided advice, or fails to respond appropriately in a crisis, Crawford says.

Even something as basic as a mental status exam — which requires observing verbal and nonverbal cues like eye contact, pacing, or fidgeting — is impossible for a chatbot to perform, Crawford notes. Trained mental health professionals can also detect subtleties and incongruous behavior that AI will miss, such as when a person's tone doesn't match the words they're saying.

Should You Ever Use AI for Therapy?

While AI should never replace traditional therapy with a licensed professional, some AI tools might be able to play a supportive role for people navigating minor mental health struggles, Crawford says.

For example, you might consider talking to your therapist about using AI tools for tasks like organizing your thoughts, keeping a journal, and tracking your mood, she explains, or to get quick suggestions for ways to manage stress, mood changes, or day-to-day emotions. “It [may be a] useful tool for general emotional wellness, not for psychiatric care,” she explains.

It’s also important to remember that AI isn’t a sentient being, and it can be prone to providing wrong or even dangerous information, Wright warns.

“Remind yourself you’re talking to a machine and not a human being,” she says.

Find Help Now

If you or a loved one is experiencing significant distress or having thoughts about suicide, call or text 988 to reach the 988 Suicide & Crisis Lifeline, available 24/7. If you need immediate help, call 911.

For more help and information, see these Mental Health Resources and Helplines.

The Takeaway

  • While more people are using AI tools as a personal therapist of sorts, the risks of using AI for mental health are real and potentially dangerous. Traditional therapy is delivered by licensed professionals credentialed to treat people with mental health concerns; AI is not, and it should never take the place of professional mental health care.
  • However, there may be a role for AI as a complementary tool alongside professional mental health treatment, experts say. For example, it may help with simple tasks like mood tracking, journaling prompts, or stress-management exercises.
  • AI tools for therapy lack clinical oversight, raise serious privacy concerns, and can misinterpret crisis situations. Always speak to a mental health care provider if you’re experiencing persistent symptoms of depression or anxiety or other mood changes, or call 911 or 988 in a life-threatening emergency.
  • If you’re experiencing mental health concerns, such as persistent depression symptoms that last two weeks or more, it’s imperative to speak to a licensed mental health professional. This guide can help you find a therapist who’s right for you.
EDITORIAL SOURCES
Everyday Health follows strict sourcing guidelines to ensure the accuracy of its content, outlined in our editorial policy. We use only trustworthy sources, including peer-reviewed studies, board-certified medical experts, patients with lived experience, and information from top institutions.
Resources
  1. Cross S et al. Use of AI in Mental Health Care: Community and Mental Health Professionals Survey. JMIR Mental Health. October 11, 2024.
  2. Using Generic AI Chatbots for Mental Health Support: A Dangerous Trend. American Psychological Association. March 12, 2025.
  3. Maheu MM. AI Psychotherapy Shutdown: What Woebot’s Exit Signals for Clinicians. Telehealth.org. August 19, 2025.
  4. Luo X et al. “Shaping ChatGPT Into My Digital Therapist”: A Thematic Analysis of Social Media Discourse on Using Generative Artificial Intelligence for Mental Health. Digital Health. July 2025.
  5. Sun CF et al. Low Availability, Long Wait Times, and High Geographic Disparity of Psychiatric Outpatient Care in the US. General Hospital Psychiatry. October 2023.
  6. What Psychologists Should Know About 988. American Psychological Association. October 7, 2022.
  7. How Much Does Therapy Cost? GoodTherapy. November 2025.
  8. Hoffman BD et al. Understanding Young Adults’ Attitudes Towards Using AI Chatbots for Psychotherapy: The Role of Self-Stigma. Computers in Human Behavior: Artificial Humans. August 2024.
  9. Heinz MV et al. Randomized Trial of a Generative AI Chatbot for Mental Health Treatment. NEJM AI. March 27, 2025.
  10. Urging the Federal Trade Commission to Take Action on Unregulated AI. American Psychological Association Services. January 12, 2025.
  11. "Raine vs. OpenAI complaint" (PDF). Courthousenews.com. August 26, 2025.
  12. Helping People When They Need It Most. OpenAI. August 26, 2025.
  13. Use of Generative AI Chatbots and Wellness Applications for Mental Health. American Psychological Association. November 2025.
  14. Early Methods for Studying Affective Use and Emotional Well-Being on ChatGPT. OpenAI. March 21, 2025.

Seth Gillihan, PhD

Medical Reviewer
Seth Gillihan, PhD, is a licensed psychologist in private practice in Ardmore, Pennsylvania, who helps people find personal growth by making important changes in their thoughts and habits. His work includes books, podcasts, and one-on-one sessions. He is the host of the Think Act Be podcast and the author of multiple books on mindfulness and CBT, including Retrain Your Brain, Cognitive Behavioral Therapy Made Simple, and Mindful Cognitive Behavioral Therapy.

He completed a doctorate in psychology at the University of Pennsylvania where he continued as a full-time faculty member from 2008 to 2012. He has been in private practice since 2012.

Carmen Chai

Author

Carmen Chai is a Canadian journalist and award-winning health reporter. Her interests include emerging medical research, exercise, nutrition, mental health, and maternal and pediatric health. She has covered global healthcare issues, including outbreaks of the Ebola and Zika viruses, anti-vaccination movements, and chronic diseases like obesity and Alzheimer’s.

Chai was a national health reporter at Global News in Toronto for five years, where she won multiple awards, including the Canadian Medical Association award for health reporting. Her work has also appeared in the Toronto Star, Vancouver Province, and the National Post. She received a bachelor’s degree in journalism from Ryerson University in Toronto.