
To the Editor: Since the rise of technology in the 21st century, modalities such as social media and artificial intelligence (AI) have continued to grow. Particularly during the COVID-19 pandemic that began in 2020, online social media platforms such as Reddit became supportive communities.1 In late 2022, ChatGPT, a large language model, was released. From performing well on academic board examinations to answering patient questions empathetically, ChatGPT’s full potential has yet to be discovered.2

In 2023, the intersection of Reddit and ChatGPT was observed in communities such as the subreddit r/NarcissisticAbuse, whose “About” statement reads as follows: “This is a safe place for people who suffered or are currently suffering from narcissistic abuse to seek support, learn, vent, discuss, document their abuse, and come together in their path towards healing.”

In a November 2023 post entitled “ChatGPT can help” on the r/NarcissisticAbuse subreddit, a user shared that they copied and pasted communication from their narcissistic ex into ChatGPT with the following message: “Can you draft me a response to this email from my narcissistic ex?” The user also shared that they asked ChatGPT to analyze the communication from their ex for emotionally abusive tactics and found the responses they received enormously validating. The post gained significant traction, with many users commenting that they had saved ongoing chats with ChatGPT for affirmation and perspective. One comment even described the responses ChatGPT produced when asked to reply as a narcissist would.

Studies have shown that explanations of abusive partners as narcissists help women process their trauma and heal faster.3 Identifying an abusive partner as a narcissist allows women to work with their counselors to find practical strategies for specific situations.3 Many victims of narcissistic abuse report victim blaming and a lack of accountability from their partners.3 Thus, it is refreshing to see an AI modality such as ChatGPT used to validate victims of narcissistic abuse and provide methods of recognizing emotional abuse.

Research has demonstrated that integrative conversational AI chatbots can serve as cost-effective and accessible therapeutic methods.4 While not a replacement for a trained therapist, AI may provide support by listening to and reassuring distressed individuals. Prior research has shown that ChatGPT may alleviate stress and supply encouragement to worried parents of children with atopic dermatitis.5

Analysis of the r/NarcissisticAbuse subreddit suggests that AI chatbots such as ChatGPT may be beneficial for victims of abuse. Given the popularity of the post described above, counselors may want to suggest ChatGPT as an immediate, short-term source of validation and emotional support for victims of narcissistic abuse. Nevertheless, persons in acute distress should preferably reach out to a crisis hotline or seek inpatient mental health services. Potential pitfalls of AI include racial bias, such as ChatGPT’s strikingly different pain management recommendations (opioids vs aspirin) depending on the patient’s race.6 Other shortcomings of chatbots include a lack of information verification, which can induce stress in users by providing an incorrect diagnosis or even dangerous information.6 Clinicians may mitigate some of these challenges by remaining aware of AI-based racial biases and informing their patients about them. Clinicians may also foster open conversations with patients about AI, particularly about where patients acquire information and whether they participate in online social support groups. The interplay of online social media support groups and interactive chatbots may represent a new realm of mental health support, but providers must approach it with caution to ensure optimal patient education, patient safety, and ethical practice. Future studies are needed to elucidate the most beneficial adaptation of social media and AI for mental health aid.

Article Information

Published Online: June 4, 2024. https://doi.org/10.4088/PCC.23lr03698
© 2024 Physicians Postgraduate Press, Inc.
Prim Care Companion CNS Disord 2024;26(3):23lr03698
To Cite: Ahuja K, DeSena G, Spiegel D. The intersection of ChatGPT and Reddit: a new avenue for support of abuse survivors? Prim Care Companion CNS Disord. 2024;26(3):23lr03698.
Author Affiliations: Eastern Virginia Medical School, Norfolk, Virginia (Ahuja, Spiegel); University of Florida College of Medicine, Gainesville, Florida (DeSena).
Corresponding Author: Kripa Ahuja, MS, Eastern Virginia Medical School, 825 Fairfax Ave, Norfolk, VA 23507 ([email protected]).
Relevant Financial Relationships: None.
Funding Sources: None.

References

  1. McAuliffe C, Slemon A, Goodyear T, et al. Connectedness in the time of COVID-19: Reddit as a source of support for coping with suicidal thinking. SSM Qual Res Health. 2022;2:100062.
  2. Ahuja K, DeSena G, Lio P. Continuing medical education in dermatology: the possible use of artificial intelligence. Clin Dermatol. 2024;42(1):79–81.
  3. Marsden S, Humphreys C, Hegarty K. Why does he do it? What explanations resonate during counseling for women in understanding their partner’s abuse? J Interpers Violence. 2022;37(13–14):NP10758–NP10781.
  4. Fulmer R, Joerin A, Gentile B, et al. Using psychological artificial intelligence (Tess) to relieve symptoms of depression and anxiety: randomized controlled trial. JMIR Ment Health. 2018;5(4):e64.
  5. Ahuja K, DeSena G, Laageide L, et al. From eczema to anxiety: how artificial intelligence shapes parental perspectives. Pediatr Dermatol. 2023;40(5):964–965.
  6. Suran M, Hswen Y. How to navigate the pitfalls of AI hype in health care. JAMA. 2024;331(4):273–276.