AI and Mental Health
AI tools like ChatGPT and mental health apps like Wysa are being used for mood tracking, coping tools, cognitive behavioral therapy exercises, and 24/7 chat support (Gamble, 2020; Olawade et al., 2024). They can be helpful for reflection and skill practice, but it's important to remember that they are not licensed therapists. Using them in the wrong situations can be problematic, and even dangerous.
What AI Can Do Well
- Provide basic forms of psychoeducation.
- Offer coping tool ideas.
- Help track moods and patterns.
- Support therapy between sessions.
(El-Mashharawi et al., 2024; Olawade et al., 2024)
What AI Can't Do, and What the Research Says
- AI often validates everything you say, even harmful thoughts.
- AI may reinforce patterns of anxiety, rumination, or delusional thinking.
- Algorithms are built on data that often do not represent diverse identities and lived experiences.
- AI can create a false sense of emotional connection.
- AI may give inconsistent or incorrect advice.
(American Psychiatric Association, n.d.; Olawade et al., 2024; Timmons et al., 2023; Wykes, 2025)
A Note About Privacy
Is mental health data private when shared online?
Many AI apps collect sensitive personal data.
These apps:
- Are often not HIPAA-compliant.
- May share or sell anonymized data.
- Have complex privacy policies written with business interests, not people, in mind.
Before sharing your personal information, check privacy settings, read data use policies, and avoid sharing identifying information (American Psychiatric Association, n.d.; Gamble, 2020; Olawade et al., 2024).
Watch for Unhealthy AI Attachment
Are you replacing people with a bot?
AI is designed for engagement; companies spend a lot of money to make sure you keep chatting. In excess, this can harm your sense of self and your real-life relationships.
Some warning signs of over-reliance include:
- Preferring the chatbot over real conversations
- Hiding how much you use AI from others
- Using chatbots for constant reassurance
- Feeling a sense of emotional dependence on AI
AI can support your mental health goals in some ways, but it shouldn't replace human connection (American Psychiatric Association, n.d.; Wykes, 2025).
AI is agreeable. Therapists are trained mental health professionals who can offer empathetic support that gently challenges you and improves your overall well-being.
Smart Use Guidelines
- Use AI for journaling prompts or coping skill ideas.
- Share your AI use patterns with your therapist.
- Remember that AI cannot diagnose or treat disorders.
- Never rely on AI during a crisis. Instead, call or text 988.
Real Humans. Real Care.
The University Counseling Center provides free, confidential, and professional mental health support to GVSU students. Schedule an appointment to connect with a mental health clinician who cares.
Related media: "Therapist vs. Artificial Intelligence" and "We Investigated AI Psychosis. What We Found Will Shock You."
References & Further Reading
American Psychiatric Association. (n.d.). APA health advisory on the use of generative AI chatbots and wellness applications for mental health.
El-Mashharawi, H. Q., Alshawwa, I. A., Salman, F. M., Al-Qumboz, M. N. A., Abu-Nasser, B. S., & Abu-Naser, S. S. (2024). AI in mental health: Innovations, applications, and ethical considerations. International Journal of Academic Engineering Research, 8(10), 53–58.
Gamble, A. (2020). Artificial intelligence and mobile apps for mental healthcare: A social informatics perspective. Aslib Journal of Information Management, 72(4), 509–523. https://doi.org/10.1108/AJIM-11-2019-0316
Olawade, D. B., Wada, O. Z., Odetayo, A., David-Olawade, A. C., Asaolu, F., & Eberhardt, J. (2024). Enhancing mental health with artificial intelligence: Current trends and future prospects. Journal of Medicine, Surgery, and Public Health, 3, 1–10. https://doi.org/10.1016/j.glmedi.2024.100099
Timmons, A. C., Duong, J. B., Fiallo, N. S., Lee, T., Vo, H. P. Q., Ahle, A. W., Comer, J. S., Brewer, L. C., Frazier, S. L., & Chaspari, T. (2023). A call to action on assessing and mitigating bias in artificial intelligence applications for mental health. Perspectives on Psychological Science, 18(5), 1062–1096. https://doi.org/10.1177/17456916221134490
Wykes, T. (2025). Is AI-supported therapy the answer to the growth of mental health problems or snake oil? Journal of Mental Health, 1–4. https://doi.org/10.1080/09638237.2025.2595614