Artificial Intelligence (AI) is a driving force in our digital age. As AI becomes more integrated into healthcare, so does its promise to enhance the efficiency of mental health (MH) treatment and improve patient outcomes. While AI offers numerous potential benefits in the mental health field, it also comes with its share of controversies.
Overview of Artificial Intelligence (AI) in Mental Health
AI in mental health (MH) has boomed over the last few years. In 2023, Epic, one of the largest and most utilized electronic health record companies, and Nuance Communications, a Microsoft company, announced plans to partner to implement AI in Epic’s medical record system. This collaboration hopes to support clinical documentation, administrative needs, and even billing of visits to insurance companies.1 Throughout this article, we will explore AI in mental health and the intricacies, benefits, and pitfalls it may entail.
How AI is Involved in Your Mental Health Journey
AI has emerged as a potentially helpful addition to therapists’ day-to-day practices. Many websites and applications with integrated AI technology are now available to therapists. These sites and apps hope to revolutionize how therapists complete their documentation and may even become a mainstay for tracking patients’ progress over time.
Additionally, AI has surged in self-help and self-diagnosis. AI chatbots developed for these purposes offer users access to reportedly evidence-based therapeutic interventions and treatments. However, studies on their effectiveness remain few and far between.3 These are just a few of the ways you may encounter AI in mental health in your daily life.
Your Mental Health Searches Online
Most of us, at one point in time, have “Googled” symptoms we are experiencing, looking for an answer to “What is happening to me?” Over the past year, Google has blended AI technology into its search results. We can now see “AI Overviews” (AIOs) at the top of many of our search results. At first glance, AIOs for mental health searches may help give a broad understanding of a topic along with other elements to consider.
However, these AIOs raise the question: is this the type of information people should be receiving regarding mental health? A fellow ChoosingTherapy.com author, Matthew Kerr, researched what these Google AIOs mean for mental health. He found that over 38% of mental health (MH) keyword searches returned an AIO.
To put that into perspective, more than a third of mental health-related searches on Google are being answered by AI-generated responses rather than by human-written articles. This is significant because it shows how heavily people already rely on AI-driven sources for sensitive mental health information, perhaps without even realizing it. While many of the top sources for these AIOs are reputable sites like NIH.gov, ChoosingTherapy.com, and WebMD, the statistic raises concerns about the quality and depth of the information being provided.
Kerr’s research indicates that, so far, no explicitly harmful information has been found within these AIOs. However, the lack of human oversight and the potential for AI to miss nuances, especially in complex and individualized areas like mental health, highlight the importance of ongoing research to assess whether these AI-generated responses are truly beneficial, or whether they fall short of what people need when seeking mental health support.
Articles & Websites
We need to know where the information we are reading comes from. Ethical concerns related to AI-written content have surged, especially in education. As AI has grown, so has technology to detect AI-generated writing and ensure writing standards are met.4
While you can utilize tools to detect AI, there are also key features to look for when reviewing the articles that come up from your mental health search in a web browser. For example, seek out the editorial policies of the websites and articles you are reading. When there is no clear indication that an article was medically reviewed or written by a licensed professional, this can be a “red flag,” and you should take steps to verify the information through additional sources.
Mental Health Assessments
As with Google searches, many of us turn to online assessments or quizzes to clarify our experiences. At this time, there is limited research on how AI plays a role in MH quizzes and online MH self-assessments. One study by Weisenburger et al. (2024) recruited adults via social media to complete a brief AI bot-administered interview and a depression self-report in the app Aiberry. Aiberry utilizes large language models (LLMs), which analyze text, audio, and video cues during the bot-administered interview. These cues support the AI in understanding an individual at a more personal level, rather than simply generating the same generic information for everyone. The study found that 90% of the AI’s predictions agreed with either the consumer self-report or clinical expert opinion. These results are promising; however, more studies and critical analysis are needed before AI models can be trusted to score assessments accurately.13
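For readers curious about the mechanics, the sketch below shows, in Python, how a multimodal screener might combine per-modality cue scores into a single result. The weights and cutoff are invented for illustration and do not reflect Aiberry’s actual model.

```python
# Illustrative sketch only: the weights and threshold below are invented
# for demonstration and do not reflect Aiberry's actual model.

def screen_for_depression(text_score: float, audio_score: float, video_score: float) -> str:
    """Combine per-modality cue scores (each 0.0-1.0) into a screening result."""
    weights = {"text": 0.5, "audio": 0.3, "video": 0.2}  # assumed weighting
    combined = (weights["text"] * text_score
                + weights["audio"] * audio_score
                + weights["video"] * video_score)
    # An assumed cutoff; real tools calibrate against validated clinical scales.
    return "elevated risk - recommend full assessment" if combined >= 0.6 else "low risk"

print(screen_for_depression(text_score=0.7, audio_score=0.6, video_score=0.5))
```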
It is understandable why consumers turn to these online features: they are easily accessible and offer answers in moments. Still, AI appears to be in the early stages of learning how best to complete mental health assessments. In a true MH assessment with a professional therapist, there is a more comprehensive overview and understanding of all the factors that matter when supporting someone struggling with MH symptoms. A professional therapist’s ability to separate the subjective from the objective is invaluable. The same cannot necessarily be guaranteed, at least at this time, with AI-generated material.
Mental Health Apps
Mental health applications are among the most popular applications today. Some, like Insight Timer and Waking Up, offer meditation and mindfulness skills, while others, like the Bearable app, provide mood tracking and help you challenge and improve your negative thoughts. AI is being integrated into many applications to promote a more person-centered approach for users. As discussed above, LLMs are at the heart of AI’s ability to adapt to the person outside the screen.13 While most apps offer material in a library format, AI may be able to deliver the most beneficial aspects of a mental health app to the consumer based on gathered data. As we will discuss, AI’s ability to personalize to the consumer is one of its most appealing aspects.
Smartphones & Digital Wearables
The next horizon of AI in mental health is digital phenotyping. Digital phenotyping, or personal sensing, brings together data from a variety of sources, including a person’s past and current medical records, social media interactions, and wearable devices (such as an Apple Watch), to detect, at least in theory, changes in symptoms.2
Digital phenotyping may look like this: a person’s medical records show that disengagement from usually enjoyed activities is a red flag for a symptom increase. Then their wearable device indicates they have become more sedentary and are spending more time on social media. AI may interpret this as an increase in MH symptomatology and, in theory, alert the person’s therapist or psychiatrist. The hope is that the mental health provider can use this additional information to stay well-informed of their patient’s day-to-day experiences and to support treatment planning.
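A rule like the one just described could, in principle, be expressed in code. Below is a minimal Python sketch; the field names and thresholds are invented for illustration, and real digital phenotyping systems use far richer statistical models.

```python
# Illustrative sketch: field names and thresholds are invented for demonstration.

def flag_symptom_change(daily_steps: int, baseline_steps: int,
                        social_media_minutes: int, baseline_minutes: int) -> bool:
    """Flag a possible symptom increase when activity drops and social media
    use rises relative to the person's own baseline."""
    more_sedentary = daily_steps < 0.5 * baseline_steps          # assumed cutoff
    more_online = social_media_minutes > 1.5 * baseline_minutes  # assumed cutoff
    return more_sedentary and more_online

if flag_symptom_change(daily_steps=2000, baseline_steps=7000,
                       social_media_minutes=240, baseline_minutes=90):
    print("Possible symptom increase - notify the care team for review.")
```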
AI Therapy & Chatbots
As previously discussed, AI chatbots for mental health purposes are booming. You’ll find AI chatbots in services like Youper and Wysa, among others. Yet the reliability and validity of these chatbots are unclear. A few studies have found MH chatbots to be supportive for those living in rural areas, those who work odd hours when therapists are generally unavailable, and veterans and adolescents who struggle with the stigma of receiving MH care. The chatbots offer mindfulness skills and education about MH, and some even provide a referral to a therapist in the real world.6
Overall, AI chatbots for MH are constructed to offer support and resources, and they are built to replicate human interaction. The chatbot receives information from the consumer and tailors its response to that individual’s needs, so responses are not simply generic. This may be beneficial in the interim for a person receiving therapeutic support; however, long-term benefits remain to be seen.
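To illustrate the general shape of this loop, here is a minimal Python sketch: read the user’s message, pass it with the conversation history to a language model, and return a tailored reply. The generate_reply function is a stand-in for a real LLM call, and nothing here reflects how any specific product, such as Youper or Wysa, is actually implemented.

```python
# Illustrative sketch: generate_reply is a stand-in for a real LLM call,
# and nothing here reflects how any specific chatbot product works.

def generate_reply(history: list[dict], user_message: str) -> str:
    """Placeholder for a large language model call that would tailor its
    response to the conversation so far."""
    return f"It sounds like you're dealing with a lot. Tell me more about: {user_message!r}"

def chat() -> None:
    history: list[dict] = []
    while True:
        user_message = input("You: ")
        if user_message.lower() in {"quit", "exit"}:
            break
        reply = generate_reply(history, user_message)
        # Keeping the running history is what lets responses stay
        # personalized rather than generic.
        history.append({"user": user_message, "bot": reply})
        print("Bot:", reply)

if __name__ == "__main__":
    chat()
```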
Another area for consideration is risk factors in MH. Many people experience suicidal ideation, self-harm, and other life-altering concerns. Elyoseph & Levkovich (2023) found that ChatGPT, developed by OpenAI, rated patients’ risk of suicidality lower than mental health professionals did across various conditions. These results suggest that ChatGPT may provide an inaccurate assessment that underestimates actual suicide risk compared to trained mental health professionals.8
Online Therapy Platforms
Current research on AI’s integration into online therapy platforms is minimal. One research team overseas7 developed a platform that applies AI through Natural Language Processing (NLP) and Automatic Speech Recognition (ASR). The hope is that AI will enhance the therapist’s understanding of the client’s needs so that interventions will be more appropriate.
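As a rough illustration of how such a pipeline fits together, the Python sketch below chains speech recognition into text analysis. Both transcribe_audio and analyze_sentiment are stubs standing in for real ASR and NLP components; this is not the researchers’ actual platform.

```python
# Illustrative sketch: transcribe_audio and analyze_sentiment are stubs
# standing in for real ASR and NLP models; not the researchers' platform.

def transcribe_audio(audio_path: str) -> str:
    """Stand-in for an Automatic Speech Recognition (ASR) model."""
    return "I've been feeling more anxious this week, especially at work."

def analyze_sentiment(transcript: str) -> dict:
    """Stand-in for an NLP model that scores the emotional tone of a transcript."""
    negative_words = {"anxious", "sad", "hopeless", "worried"}
    hits = sum(word.strip(".,") in negative_words for word in transcript.lower().split())
    return {"negative_cues": hits, "transcript": transcript}

def summarize_session(audio_path: str) -> dict:
    """Pipeline: speech -> text -> sentiment cues for the therapist to review."""
    transcript = transcribe_audio(audio_path)
    return analyze_sentiment(transcript)

print(summarize_session("session_recording.wav"))
```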
More research on AI in mental health and online therapy platforms is needed to address the complexities of human language and its varied nuances. Unfortunately, humans are not free from racism and bias, and because technology is a human creation, neither is it. Research shows algorithms and AI systems can produce responses fueled by racism and prejudice. More should be done to mitigate bias before implementation.
How Therapists Use AI
Therapists have been inundated with AI applications and websites that market themselves as making the therapist’s life easier. These tools are built to record therapy sessions on a secure, HIPAA-compliant platform and then translate each session into the therapist’s progress note and/or related documentation, such as a treatment plan. Some platforms instead prompt therapists to give a brief description of the session, and AI will generate a progress note from that information.
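As a rough sketch of that second approach, a platform might wrap the therapist’s brief description in a prompt template before sending it to a language model. In the Python sketch below, call_llm is a placeholder for a real model API, and the template wording is an assumption, not any vendor’s actual product.

```python
# Illustrative sketch: call_llm is a placeholder for a real language model
# API, and the prompt template is an assumption, not any vendor's product.

NOTE_PROMPT = """You are assisting a licensed therapist. Draft a concise
progress note from the session summary below. Do not invent clinical details.

Session summary: {summary}
"""

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM API call."""
    return "Client discussed work stress; practiced grounding skills; will continue CBT."

def draft_progress_note(summary: str) -> str:
    draft = call_llm(NOTE_PROMPT.format(summary=summary))
    # The therapist must still review and edit the draft before it
    # enters the medical record.
    return draft

print(draft_progress_note("45-min session focused on work stress and coping skills."))
```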
Lifting the burden of documentation from a therapist is a huge benefit for consumers. Therapists can focus on their work with clients without the constant worry of paperwork. However, there are considerations to be made around consent for AI’s use in your therapeutic care and ensuring the accuracy of what AI generates in your medical record.
While AI has grown by leaps and bounds, its ability to accurately diagnose mental health disorders remains murky at best.9 AI cannot offer empathy or assess the nuances of a person’s way of being in real time. Thus, at a certain stage, a line may need to be drawn around the extent to which AI is integrated into mental health care.
Billing Software & EHRs
As previously discussed, Epic Systems, one of the largest EHRs, is looking to offer AI soon. This has the potential to revolutionize the billing and administrative processes of healthcare. It has been suggested that AI could make billing and coding more accurate, which would support the healthcare system in receiving more timely reimbursements.11
The main concern is the rate at which AI is being adopted in EHRs. Time and care should go into this process to ensure two key safeguards are addressed:10 privacy and informed consent. People should be able to trust that their health information is protected and should have the choice of whether or not AI is a part of their care.
Insurance Companies
With AI having the potential to improve billing and coding, the rate at which insurance companies deny claims may drastically decrease. Another benefit AI may provide is fraud detection. With its ability to assess large amounts of data at once, AI may be the best option for detecting inconsistencies and suspicious activity.12 Improved compliance could mean lower costs for everyone, from consumers to providers.
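As a simple illustration of the idea, the Python sketch below flags claim amounts that sit far outside a provider’s usual range. The data and threshold are invented, and real fraud detection systems use far more sophisticated models than this.

```python
# Illustrative sketch: the claim amounts and z-score threshold are invented;
# real fraud detection uses far more sophisticated models.
from statistics import mean, stdev

def flag_outlier_claims(amounts: list[float], threshold: float = 2.0) -> list[float]:
    """Flag claims more than `threshold` standard deviations from the mean,
    a crude stand-in for AI-based anomaly detection."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) / sigma > threshold]

claims = [120.0, 135.0, 128.0, 140.0, 122.0, 131.0, 950.0]  # made-up data
print(flag_outlier_claims(claims))  # -> [950.0]
```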
Benefits of AI in Mental Health
As the demand for accessible and efficient mental health support grows, AI offers promising solutions to enhance affordability, accessibility, and overall care.
Here are some key benefits AI brings to mental health:
- Affordability: AI is flexible, which may make it more affordable. AI can be accessed at times that do not interfere with a person’s work, take them away from responsibilities, or require them to travel.
- Accessibility: With flexibility comes accessibility. Certain areas of the United States are more isolated, and certain careers do not allow a person to receive support during a therapist’s general hours of operation. People in these areas and circumstances can turn to AI for immediate support.
- Efficiency: AI can make our lives “easier” by potentially improving accuracy across healthcare. Faster reimbursement for healthcare professionals and a lighter burden of complex claims for insurance companies also make a good argument for using AI.
- Privacy and ease of opening up: Many online therapy platforms already offer texting with a therapist. While AI does not have a real-time person on the other end, it provides an increased level of anonymity. Opening up to AI may feel easier at first than opening up to a real-life therapist. For example, experiences tied to shame and guilt may be easier to discuss first on an AI platform, helping a person work out how to discuss them further with a therapist in real time.
- Support for therapists: Therapists are often overwhelmed by the high expectations around their documentation. At the same time, documentation is vital to demonstrating the quality of care patients receive. Using AI to assist with documentation allows therapists to focus more on their patients without the added stress of paperwork.
Concerns About AI in Mental Health
The primary concerns about AI are privacy, informed consent, and equity. For AI to be equitable, it must be free from bias and prejudice, which our technology has thus far struggled to achieve. For people to trust AI, there may need to be reassurances and evidence of the safety measures in place to keep information private. For the use of AI to be ethical, practitioners, EHRs, and insurance companies must offer informed consent and may even need to consider options for consumers who do not wish for AI to be a part of their care.
Another significant concern with AI in mental health is the lack of empathy. In therapeutic settings, empathy is crucial for building trust and rapport between patients and their therapists. Human therapists can offer emotional support, validate feelings, and intuitively respond to the nuances of a patient’s emotions, skills that AI currently cannot replicate. While AI can process data and offer insights, it lacks the emotional intelligence needed to foster genuine human connection. Without that connection, patients may not feel fully understood or supported and may miss out on the full spectrum of benefits that come from working with a real therapist who can provide personalized empathy and understanding.
Questions to Ask Your Provider About AI
The following questions may help to guide a conversation about AI between you and your mental health provider. As AI is emerging, some providers may not have any real connection to AI in their work at this time. While AI is not a universal aspect of mental health care, consumers should know how to advocate for themselves.
Here’s a list of questions to ask your healthcare providers about AI:
- Are you using AI in your work at this time?
- How is AI being utilized in your work at this time?
- What are the security measures in place to protect my privacy?
- Can I revoke consent for AI’s use in my care?
- Who is primarily responsible for the security of my data?
- Are you double-checking the AI output to ensure accuracy?
- How will AI support my care?
- How does AI support you in providing your care?
Things to Research Before Using Mental Health AI Tools
As we have discussed, informed consent around AI is ethically imperative. Equally important are our efforts to understand how AI mental health tools access and use our data. Unfortunately, not all AI tools are created equal.
Here’s what to research before using mental health AI tools:
- Verify the legitimacy of the developer
- Find the official website and/or source of the AI tool
- Be wary of AI tools that request extensive personal information or credit card details
- Read fellow consumers’ reviews
- Pay attention to permission requests within the app
Choosing Therapy strives to provide our readers with mental health content that is accurate and actionable. We have high standards for what can be cited within our articles. Acceptable sources include government agencies, universities and colleges, scholarly journals, industry and professional associations, and other high-integrity sources of mental health journalism. Learn more by reviewing our full editorial policy.
1. Nuance Communications, Inc. (2023, June 27). Nuance and Epic Expand Ambient Documentation Integration Across the Clinical Experience with DAX Express for Epic. https://news.nuance.com/2023-06-27-Nuance-and-Epic-Expand-Ambient-Documentation-Integration-Across-the-Clinical-Experience-with-DAX-Express-for-Epic
2. Oudin, A., Maatoug, R., Bourla, A., Ferreri, F., Bonnot, O., Millet, B., Schoeller, F., Mouchabac, S., & Adrien, V. (2023). Digital Phenotyping: Data-Driven Psychiatry to Redefine Mental Health. Journal of Medical Internet Research, 25, e44502. https://doi.org/10.2196/44502
3. Haque, M. D. R., & Rubya, S. (2023). An Overview of Chatbot-Based Mobile Mental Health Apps: Insights From App Description and User Reviews. JMIR mHealth and uHealth, 11, e44838. https://doi.org/10.2196/44838
4. Johnson, A. (2023, June 7). New Tool Can Tell If Something Is AI-Written With 99% Accuracy. Forbes. https://www.forbes.com/sites/ariannajohnson/2023/06/07/new-tool-can-tell-if-something-is-ai-written-with-99-accuracy/
5. Manchanda, P. (2023, April 28). ChatGPT, AI, & Creating a Safe Digital Future. Technically Spiritual. https://www.technicallyspiritual.com/blog/chatgpt-ai-safe-digital-future
6. Haque, M. D. R., & Rubya, S. (2023). An Overview of Chatbot-Based Mobile Mental Health Apps: Insights From App Description and User Reviews. JMIR mHealth and uHealth, 11, e44838. https://doi.org/10.2196/44838
7. Jelassi, M., Matteli, K., Ben Khalfallah, H., & Demongeot, J. (2024). Enhancing Mental Health Support through Artificial Intelligence: Advances in Speech and Text Analysis within Online Therapy Platforms. Preprints, 2024021585. https://doi.org/10.20944/preprints202402.1585.v1
8. Elyoseph, Z., & Levkovich, I. (2023). Beyond human expertise: The promise and limitations of ChatGPT in suicide risk assessment. Frontiers in Psychiatry, 14, 1213141. https://doi.org/10.3389/fpsyt.2023.1213141
9. Andrew, J., Rudra, M., Eunice, J., & Belfin, R. V. (2023). Artificial intelligence in adolescents mental health disorder diagnosis, prognosis, and treatment. Frontiers in Public Health, 11, 1110088. https://doi.org/10.3389/fpubh.2023.1110088
10. Alanazi, A. (2023). Clinicians’ Views on Using Artificial Intelligence in Healthcare: Opportunities, Challenges, and Beyond. Cureus, 15(9), e45255. https://doi.org/10.7759/cureus.45255
11. Polo, M. (2023, December 18). Navigating the Risks: Responsible Use of AI in Medical Billing. NCDS. https://www.ncdsinc.com/navigating-the-risks-responsible-use-of-ai-in-medical-billing/
12. Ho, C. W. L., Ali, J., & Caals, K. (2020). Ensuring trustworthy use of artificial intelligence and big data analytics in health insurance. Bulletin of the World Health Organization, 98(4), 263–269. https://doi.org/10.2471/BLT.19.234732
13. Weisenburger, R. L., Mullarkey, M. C., Labrada, J., Labrousse, D., Yang, M. Y., Huff MacPherson, A., Hsu, K. J., Ugail, H., Shumake, J., & Beevers, C. G. (2024). Conversational assessment using artificial intelligence is as clinically useful as depression scales and preferred by users. Journal of Affective Disorders, 351, 489–498. https://doi.org/10.1016/j.jad.2024.01.212