AI and Mental Health
AI tools like ChatGPT and mental health apps like Wysa are being used for mood tracking, coping strategies, cognitive behavioral therapy exercises, and 24/7 chat support. They can be helpful for reflection and skill practice, but it's important to remember that they are not licensed therapists. Used in the wrong situations, they can be problematic, even dangerous.
What AI Can Do Well
- Provide basic psychoeducation.
- Suggest coping strategies and tools.
- Help track moods and patterns.
- Support therapy between sessions.
What AI Can't Do, and What the Research Says
- AI tends to validate whatever you say, even harmful thoughts.
- AI may reinforce patterns of anxiety, rumination, or delusional thinking.
- Algorithms are trained on data that often fail to represent diverse identities and lived experiences.
- It can create a false sense of emotional connection.
- It may give inconsistent or incorrect advice.
A Note About Privacy
Is mental health data private when shared online?
Many AI apps collect sensitive personal data.
These apps:
- Are typically not HIPAA-compliant.
- May share or sell anonymized data.
- Have complex privacy policies written with business interests, not people, in mind.
Before sharing your personal information, check privacy settings, read data use policies, and avoid sharing identifying information.
AI is agreeable. Therapists are trained mental health professionals who provide empathetic support that gently challenges you and improves your overall well-being.
Smart Use Guidelines
Use AI for journaling prompts or coping skill ideas.
Share your AI use patterns with your therapist.
Remember that AI cannot diagnose or treat disorders.
Never rely on AI during a crisis. Instead, call or text 988.
Real Humans. Real Care.
The University Counseling Center provides free, confidential, and professional mental health support to GVSU students. Schedule an appointment to connect with a mental health clinician who cares.