Should You Trust an AI Therapist? The Benefits and Risks of Virtual Mental Health Tools

Understanding your mental health care options is more important than ever. AI therapy chatbots are growing in popularity, but are they right for you? This guide explains the benefits, risks, and when you should choose a human therapist.

What Is AI Therapy?

AI therapy tools use artificial intelligence and natural language processing (NLP) to simulate conversations with users. These platforms are based on established psychological techniques like Cognitive Behavioral Therapy (CBT), Dialectical Behavior Therapy (DBT), and mindfulness.

When you chat with an AI therapist, it analyzes your messages, detects emotional patterns, and provides coping strategies tailored to your responses. Many apps track moods over time, offer guided exercises for stress and anxiety, and give practical mental health tips. Some combine AI chat with access to human therapists, creating a hybrid support model.
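
To make that loop concrete, here is a deliberately simplified sketch in Python of the pattern described above: scan a message for an emotional cue, log the mood, and return a matching coping exercise. Everything in it (the keyword list, the canned suggestions, the respond function) is hypothetical and for illustration only; real apps rely on trained NLP models and clinically reviewed content, not a keyword lookup.

from datetime import date

# Hypothetical mapping of detected moods to coping suggestions.
COPING_SUGGESTIONS = {
    "anxious": "Try a 4-7-8 breathing exercise: inhale for 4, hold for 7, exhale for 8.",
    "sad": "A small behavioral-activation step, like a short walk, can help.",
    "stressed": "Try 5-4-3-2-1 grounding: name 5 things you can see, 4 you can touch, and so on.",
}

mood_log = []  # (date, mood) pairs -- the "track moods over time" feature

def respond(message):
    """Detect an emotional keyword, log the mood, and suggest a coping strategy."""
    text = message.lower()
    for mood, suggestion in COPING_SUGGESTIONS.items():
        if mood in text:
            mood_log.append((date.today(), mood))
            return suggestion
    return "Thanks for sharing. Can you tell me more about how you're feeling?"

print(respond("I've been feeling really anxious about work lately."))

Even this toy version hints at why the real task is hard: a keyword match cannot tell "I'm anxious" apart from "I'm not anxious anymore," which is exactly the kind of gap that trained models, and the human oversight discussed below, are meant to close.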

What AI therapy is NOT: AI apps are not a replacement for licensed professionals. They cannot diagnose, prescribe medications, or handle crises. Most AI chatbots are considered “wellness tools” rather than medical devices, which means they face limited regulation. Out of roughly 20,000 mental health apps, only five have FDA approval.

The Mental Health Crisis and AI’s Role

Mental health challenges affect more people than most realize. Globally, over 1 billion people live with a mental health condition. In the U.S., 18.2% of adults experienced anxiety symptoms and 21.4% reported depression symptoms in 2022, with rates highest among young adults aged 18–29. 

Therapist shortages worsen the problem. As of August 2024, 122 million Americans live in areas with too few mental health professionals, particularly in rural regions. Estimates suggest the U.S. needs 238,000–1.8 million more therapists to meet demand.

AI adoption is helping bridge this gap. The global mental health chatbot market reached $1.37 billion in 2024, offering 24/7 support, affordability, and accessibility for people who might otherwise go without care.

Benefits of AI Mental Health Tools

1. 24/7 Accessibility

AI chatbots are available around the clock, whether late at night, during a lunch break, or on the weekend. This flexibility is especially valuable for people with busy schedules or mobility challenges, and for those living in areas with few therapists.

2. Cost-Effective Support

Traditional therapy often costs $100–$300 per session. AI therapy apps usually cost $10–$40/month or offer free versions, making support far more affordable.
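
A rough back-of-the-envelope comparison using midpoints of the ranges above: weekly sessions at $150 come to about $7,800 per year, while a $25/month subscription totals $300 per year. The price gap is real, though the two are not equivalent in depth of care.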

3. Reduced Stigma

Many users feel more comfortable sharing with an AI chatbot than a human therapist. Platforms like Wysa have facilitated over 1 billion AI conversations worldwide, with 91% of users finding them helpful.

4. Evidence-Based Results for Mild to Moderate Symptoms

Research shows AI therapy can help reduce depression, anxiety, and stress. For example, in Dartmouth's randomized controlled trial, users of the AI chatbot Therabot saw a 51% reduction in depression symptoms and a 31% reduction in anxiety symptoms, results comparable to traditional therapy.

5. Scalability

AI tools can support millions at once. Wysa has helped 6 million people across 105 countries and conducted over 2 million CBT sessions, expanding access globally.

Risks and Limitations

1. Lack of Genuine Human Connection

AI can mimic empathy but cannot feel it. The therapeutic relationship, with its warmth, compassion, and understanding, is critical to effective therapy.

2. Cannot Handle Crises

AI chatbots often fail crisis simulations. They cannot reliably detect suicide risk or intervene in an emergency. Only human therapists can provide immediate support and escalate care when needed.

3. Privacy Concerns

Not all apps are covered by HIPAA. Conversations may be stored or exposed in data breaches, putting sensitive personal information at risk.

4. Risk of Harmful Advice

AI can “hallucinate” incorrect information. Some therapy chatbots have given harmful advice, including unsafe dietary recommendations and inappropriate responses to suicidal ideation.

5. Potential for Increased Isolation

Frequent use of AI therapy chatbots may foster emotional dependence and reduce users’ motivation to seek real-world social support. While these tools can temporarily ease feelings of loneliness, overreliance on them may ultimately reinforce social isolation in some users.

6. Lack of Regulation

Most apps operate without clinical validation. A few states (Illinois, Nevada, Utah) have enacted laws requiring licensed oversight and privacy protections, but federal regulation remains limited.

Who Should (and Shouldn’t) Use AI Therapy

AI Therapy May Be Helpful If You:

  • Experience mild to moderate stress, anxiety, or low mood

  • Want support between therapy sessions

  • Need help building coping skills or tracking mood

  • Face financial or geographic barriers

  • Prefer anonymous support

  • Want daily mental health maintenance

Choose a Human Therapist If You:

  • Experience severe depression, anxiety, or serious mental health conditions

  • Have thoughts of self-harm or suicide

  • Deal with trauma, abuse, or PTSD

  • Have complex or co-occurring conditions

  • Need diagnosis, medication management, or insurance coverage

  • Are a minor without adult supervision

Special Considerations:

  • AI therapy apps should only be used by minors under parental or professional supervision to ensure safety and appropriate guidance.

  • Individuals with serious mental illnesses (schizophrenia, bipolar disorder, active substance use) should seek licensed professional care.

Popular AI Therapy Apps

Wysa – Clinically validated, optional human therapist access, FDA Breakthrough Device Designation, 6+ million users.

Woebot – Uses CBT, DBT, and interpersonal therapy frameworks; research-backed.

Therabot – Developed at Dartmouth; a randomized controlled trial showed significant reductions in depression and anxiety.

Note: General-purpose AI platforms like ChatGPT, Google Gemini, or Microsoft Copilot are not validated mental health tools. While they can feel conversationally supportive, they lack clinical testing and oversight, and unsupervised use may exacerbate distress or confusion in vulnerable users.

Making the Right Choice

AI therapy can support mild to moderate symptoms, but it is not a replacement for professional care. The most effective approach may combine AI tools for daily support with a licensed therapist for deeper work.

Questions to Ask Before Using an AI App:

  • Is it clinically validated?

  • Is my data protected?

  • Does it explain limitations and when to seek human help?

  • Can I access human support if needed?

  • Are there safety protocols for crises?

When to Seek Immediate Human Help

If experiencing suicidal thoughts, self-harm urges, or a mental health crisis:

  • Call 988 Suicide and Crisis Lifeline (24/7)

  • Text HELLO to 741741 (Crisis Text Line)

  • Go to the nearest emergency room

  • Call 911

  • Contact a licensed mental health professional

References

Aidx.ai. (2025). AI therapy vs. traditional therapy: Personalization features. Retrieved from https://aidx.ai/p/ai-therapy-vs-traditional-therapy-personalization-features/

BetterUp. (2025, January 20). What is AI therapy? Pros and cons for mental health care. Retrieved from https://www.betterup.com/blog/ai-therapy

ClearHQ. (2025, October 7). Regulation of AI therapy apps. Retrieved from https://www.clearhq.org/news/regulation-of-ai-therapy-apps-10-8-25

CNN. (2024, December 18). AI chatbots are becoming popular for therapy. Here's what mental health experts say about them. Retrieved from https://www.cnn.com/2024/12/18/health/chatbot-ai-therapy-risks-wellness

Dartmouth College. (2025, March 26). First therapy chatbot trial yields mental health benefits. Retrieved from https://home.dartmouth.edu/news/2025/03/first-therapy-chatbot-trial-yields-mental-health-benefits

Datasumi. (2023, September 11). Limitations of AI in crisis management. Retrieved from https://www.datasumi.com/limitations-of-ai-in-crisis-management

Firth, J., Torous, J., Nicholas, J., Carney, R., Pratap, A., Rosenbaum, S., & Sarris, J. (2017). The efficacy of smartphone-based mental health interventions for depressive symptoms: A meta-analysis of randomized controlled trials. World Psychiatry, 16(3), 287–298.

Gardner Law. (2025, September 24). AI mental health tools face mounting regulatory and legal pressure. Retrieved from https://gardner.law/news/legal-and-regulatory-pressure-on-ai-mental-health-tools

Health Resources and Services Administration. (2024, November). State of the behavioral health workforce, 2024. Retrieved from https://bhw.hrsa.gov/sites/default/files/bureau-health-workforce/state-of-the-behavioral-health-workforce-report-2024.pdf

JMIR mHealth and uHealth. (2020, May 28). Mobile apps for mental health issues: Meta-review of meta-analyses. Retrieved from https://mhealth.jmir.org/2020/5/e17458/

JMIR mHealth and uHealth. (2022, November 6). Effectiveness of mental health apps for distress during COVID-19. Retrieved from https://mhealth.jmir.org/2022/11/e41689

Katie Couric Media. (2025, July 20). AI therapy in 2025: Benefits, risks, apps. Retrieved from https://katiecouric.com/health/mental-health/artificial-intelligence-ai-therapy-benefits-risks-privacy/

Loyola Law Review. (2024, August 4). Artificial intelligence and health privacy. Retrieved from https://loynolawreview.org/theforum/artificial-intelligence-and-health-privacy1442024

Mindful AI Health. (2025, February 26). How much does AI therapy cost? Retrieved from https://mindfulaihealth.com/how-much-does-ai-therapy-cost/

Mymeditate Mate. (2025, October 6). 8 best AI mental health apps for 2025. Retrieved from https://mymeditatemate.com/blogs/wellness-tech/best-ai-mental-health-apps

Agency for Healthcare Research and Quality. (2022, May 19). Evaluation of mental health mobile applications. Retrieved from https://www.ncbi.nlm.nih.gov/books/NBK580942/

American Counseling Association. (2025, August 27). A closer look at the mental health provider shortage. Retrieved from https://www.counseling.org/publications/counseling-today-magazine/article-archive/article/legacy/a-closer-look-at-the-mental-health-provider-shortage

Nature. (2025, September 17). Can AI chatbots trigger psychosis? What the science says. Nature. Retrieved from https://www.nature.com/articles/d41586-025-03020-9

Precedence Research. (2024, June 7). Chatbots for mental health and therapy market size. Retrieved from https://www.precedenceresearch.com/chatbots-for-mental-health-and-therapy-market

PubMed Central. (2022, August 24). Potential and pitfalls of mobile mental health apps in traditional treatment. Retrieved from https://pmc.ncbi.nlm.nih.gov/articles/PMC9505389/

PubMed Central. (2022, April 11). Evaluating user feedback for an AI-enabled cognitive behavioral therapy chatbot. Retrieved from https://pmc.ncbi.nlm.nih.gov/articles/PMC9044157/

PubMed Central. (2023, November 7). Your robot therapist is not your therapist: Understanding AI-powered mental health chatbots. Retrieved from https://pmc.ncbi.nlm.nih.gov/articles/PMC10663264/

PubMed Central. (2024, July 31). Do AI chatbots incite harmful behaviors in mental health contexts? Retrieved from https://pmc.ncbi.nlm.nih.gov/articles/PMC11738096/

Psychology Today. (2025, October 11). Therapy using AI chatbots is not just risky, it's dangerous. Retrieved from https://www.psychologytoday.com/us/blog/some-assembly-required/202510/therapy-using-ai-chatbots-is-not-just-risky-its-dangerous

Sprypt. (2025, October 13). HIPAA compliance AI in 2025: Critical security requirements. Retrieved from https://www.sprypt.com/blog/hipaa-compliance-ai-in-2025-critical-security-requirements

Stanford HAI. (2025, October 6). Exploring the dangers of AI in mental health care. Retrieved from https://hai.stanford.edu/news/exploring-the-dangers-of-ai-in-mental-health-care

Tebra. (2025, February 23). Healthcare AI and HIPAA privacy concerns: Everything you need to know. Retrieved from https://www.tebra.com/theintake/practice-operations/legal-and-compliance/privacy-concerns-with-ai-in-healthcare

Terlizzi, E. P., & Zablotsky, B. (2024, November 7). Symptoms of anxiety and depression among adults: United States, 2019 and 2022. National Health Statistics Reports, 213. Centers for Disease Control and Prevention.

The HIPAA Journal. (2024, September 28). HIPAA, healthcare data, and artificial intelligence. Retrieved from https://www.hipaajournal.com/hipaa-healthcare-data-and-artificial-intelligence/

The HIPAA Journal. (2024, October 6). When AI technology and HIPAA collide. Retrieved from https://www.hipaajournal.com/when-ai-technology-and-hipaa-collide/

Therapist St. Pete. (2025, August 27). Human therapist vs AI. Retrieved from https://www.therapystpete.com/post/human-therapist-vs-ai

Time Magazine. (2025, June 11). The risks of kids getting AI therapy from a chatbot. Retrieved from https://time.com/7291048/ai-chatbot-therapy-kids/

Uptowell. (2025, August 4). From Wysa to Woebot: Are AI mental health apps the future of therapy? Retrieved from https://uptowell.com/wysa-woebot-ai-mental-health-apps/

Wikipedia. (2025, June 28). Chatbot psychosis. Retrieved from https://en.wikipedia.org/wiki/Chatbot_psychosis

World Health Organization. (2025, September 1). WHO releases new reports highlighting urgent gaps in mental health. Retrieved from https://www.who.int/news/item/02-09-2025-who-releases-new-reports-and-estimates-highlighting-urgent-gaps-in-mental-health

Wysa. (2024). Clinical evidence & research. Retrieved from https://www.wysa.com/clinical-evidence

Clinically Reviewed By:

Dr. Akash Kumar, MD
