Viewpoint
Abstract
Recent breakthroughs in artificial intelligence (AI) language models have elevated the vision of using conversational AI to support mental health care, with a growing body of literature indicating varying degrees of efficacy. In this paper, we ask when, in therapy, it will be easier to replace humans and, conversely, in what instances human connection will still be more valued. We suggest that empathy lies at the heart of the answer to this question. First, we define different aspects of empathy and outline the potential empathic capabilities of humans versus AI. Next, we consider what determines when these aspects are needed most in therapy, both from the perspective of therapeutic methodology and from the perspective of patient objectives. Ultimately, our goal is to prompt further investigation and dialogue, urging both practitioners and scholars engaged in AI-mediated therapy to keep these questions and considerations in mind when investigating AI implementation in mental health.
JMIR Ment Health 2024;11:e56529. doi: 10.2196/56529
Introduction
The prospect of using machine learning algorithms for automated health care responses and counseling has long been considered [ ]. Such algorithms could have vast benefits, ranging from increased accessibility and affordability of mental health services, reduced waiting times, and personalized treatment options to the potential to reach underserved populations and combat the escalating loneliness epidemic [ ]. Recent breakthroughs in artificial intelligence (AI) language models have elevated this vision, as evidenced by a growing body of literature indicating varying degrees of efficacy. For instance, studies demonstrate that digital chatbots are proficient in delivering psychoeducation and improving treatment adherence over short durations [ ]. Additionally, AI-driven chatbots have been effectively used to impart strategies derived from positive psychology and cognitive behavioral techniques to mitigate stress and enhance subjective well-being [ ]. AI chatbots can also provide preliminary support in the absence of a therapist by prompting self-reflective questioning and facilitating emotion regulation in challenging scenarios [ , ]. Machine learning can also be used to detect symptom changes in new ways [ ]. In the realm of medicine, a recent study revealed that responses from ChatGPT received higher ratings for the quality of medical advice than those from physicians. Moreover, these responses were perceived as significantly more empathic than those from physicians [ ].

While the potential for AI chatbots to take over certain elements of the therapeutic process exists, there are compelling reasons to believe that they cannot completely substitute for the human element. This raises a critical question: under what circumstances will human therapists remain indispensable, and conversely, when could they feasibly be replaced by AI models? We suggest that part of the answer may reside in an exploration of the role of empathy in the therapeutic process. In the following paper, we address the multifaceted nature of empathy, including its cognitive, emotional, and motivational aspects. We claim that in those cases where emotional or motivational empathy is needed, humans will be harder to replace. We then consider what determines when these aspects are needed most in therapy—whether certain therapeutic approaches, particular patient goals, or specific points within the therapeutic timeline. Our inquiry explores these considerations from both the perspective of therapeutic methods and patient objectives. Ultimately, our goal is to prompt further investigation and dialogue, urging both practitioners and scholars engaged in AI-mediated therapy to consider these issues through the lens of empathy and human connection.
Empathy
A comprehensive definition of empathy recognizes 3 dimensions of empathic engagement: cognitive empathy, or mentalizing, which pertains to the recognition and understanding of the emotional states of others; emotional empathy, or affective sharing, which involves resonating with others’ emotional experiences while maintaining self-other differentiation; and motivational empathy, often termed empathic concern or compassion, which encompasses feelings of concern for another’s welfare and a readiness to act to enhance their well-being [
].

Current advances in natural language processing and facial recognition technologies have positioned AI-based algorithms at the forefront of discerning emotional states [
, ], with projections suggesting that they may reach or surpass human capability in the near future. Therefore, in the most basic sense of cognitive empathy as recognition of the other’s emotional state, AI algorithms will probably do quite well.

Nonetheless, AI, at least in its current form, does not exhibit the latter 2 empathic capacities. AI does not partake in emotional experiences—it shares in neither joy nor sorrow. Therefore, regardless of how eloquently it crafts a response to seem like it shares an emotional experience, this response will be untruthful, as it does not share any experience. While such responses may still have some benefits, they will probably not be experienced by the listener in the same way [
, ].

Moreover, conversational AI does not have the capacity to manifest genuine care and concern. Human expressions of empathic care signal a willingness to bear an emotional burden and expend limited cognitive-emotional resources on the interaction. Empathy, being potentially taxing, is selectively directed, often preferentially toward close relations and in-group members rather than those more distant [
]. In this way, such expressions signify the recipient’s importance and closeness to the empathizer. Indeed, studies show that, stripped of context and motivation, individuals often tend to avoid empathy [ , ]. Thus, whether in therapy or in social or professional realms, authentic expressions of empathy are significant to the recipient because they reflect a conscious commitment of time, thought, and emotional labor from the empathizer. Though these resources are inherently scarce for humans, they are unlimited for a conversational AI model. Its response is essentially cost-free, and it would react with comparable enthusiasm to anyone else. As a result, the conversational AI’s empathy fails to convey authentic care or indicate that the recipient holds any unique importance [ ].

Empathy, and specifically its emotional and motivational components, has been consistently linked to positive outcomes in treatment. The extent of empathy expressed by the therapist and perceived by the patient has a substantial correlation with the success of the treatment [
]. Rogers [ ] even describes the therapeutic process as a mutual participation in an emotional exchange, which is then accurately interpreted and reflected upon with the patient to facilitate understanding of their experiences. As Rogers articulated, comprehending the patient’s emotions (cognitive empathy) is imperative for endorsing and designing goals and interventions that confront these emotions. This process is underpinned by a commitment to assist and support the patient (motivational empathy), both of which stem from participating in the patient’s emotional journey (affective empathy).

Upon considering the importance of empathy for the therapeutic process and outcomes, as well as the limitations of AI discussed earlier, several questions arise. First, given the limitations, for which aspects of therapy could AI completely replace human therapists? Second, in aspects of therapy where human empathy is essential, how can AI algorithms assist therapists? For example, could they aid therapists in being more accurate or more committed to their patients (perhaps by enhancing therapist understanding and empathy and potentially reducing burnout)?
While we do not claim to offer clear-cut answers, this paper explores these questions through dual lenses: the perspective of the therapeutic approach and the perspective of the patient's specific motivations and needs.
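To make concrete the narrow sense in which current AI already performs the recognition component of cognitive empathy well, the sketch below shows what automated emotion recognition from a patient's text might look like. It is a minimal illustration, not part of the original argument: the library (Hugging Face Transformers), the specific model name, and its label set are assumptions, and any off-the-shelf emotion classifier could be substituted.

```python
# Minimal, illustrative sketch of text-based emotion recognition ("cognitive empathy"
# in the narrow sense of recognizing another's emotional state). The model name below
# is an assumption made for illustration; any off-the-shelf emotion classifier would do.
from transformers import pipeline

# Load a publicly available emotion classifier (assumed model choice, not an endorsement).
classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
    top_k=None,  # return scores for every emotion label, not just the top one
)

def recognize_emotions(utterance: str) -> dict:
    """Return a label -> probability mapping for a single utterance."""
    result = classifier(utterance)
    # Depending on the library version, a single input may come back as a flat list of
    # label dicts or nested inside an outer list; handle both shapes.
    scores = result[0] if isinstance(result[0], list) else result
    return {item["label"]: round(item["score"], 3) for item in scores}

if __name__ == "__main__":
    example = "I haven't slept all week, and I feel like no one would notice if I disappeared."
    print(recognize_emotions(example))
    # e.g., {'sadness': 0.87, 'fear': 0.06, ...} -- recognition only, at no cost to the
    # system and with no shared experience, which is precisely the distinction drawn above.
```

A system of this kind can label an utterance as sad or fearful with useful accuracy, but, as argued above, labeling an emotion is not the same as sharing it or caring about the person expressing it.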
Perspectives in Psychotherapy
Psychotherapy encompasses a diverse spectrum of approaches. A major debate in the field of psychotherapy concerns the mechanism of change. To state it simplistically, one extreme viewpoint contends that the therapeutic relationship is the main mechanism [
], whereas the other extreme argues that therapeutic techniques or procedures are the main mechanism [ ]. At times, these 2 stances are reflected in therapeutic approaches, such that psychodynamic (ie, Neo-Freudian) approaches tend to emphasize the therapeutic relationship, whereas cognitive behavioral approaches tend to emphasize technique. In reality, most psychotherapies attempt to integrate some combination of the 2 and allow them to build on one another; there are techniques used to form the relationship, and relationships facilitate the use of techniques. Furthermore, a given act can be seen as both relationship-building and the administration of a technique. Essentially, the therapeutic process often demands that the therapist engage in a comparative analysis of emotional experiences with the patient, thus exercising some form of affective empathy, though it is possible that these affective components of empathy are more crucial in some therapeutic interventions than in others.

Despite the variance in therapeutic orientations, there is broad agreement that one of the fundamental transtheoretical elements critical for a successful outcome is the treatment’s working alliance [
]. Conventionally, the alliance is measured across 3 domains: agreement on therapeutic goals, agreement on the therapeutic activities needed to achieve these goals, and the warmth and genuineness of the connection. The working alliance is a significant predictor of treatment outcomes across all forms of psychotherapy [ , ]. Motivational empathy is intimately connected to a fruitful working alliance, particularly the aspect concerning warmth and care [ ]. Such results were reported in early studies of cognitive behavioral therapy, where warmth was a predictor of symptom improvement [ ], though subsequent findings have been equivocal [ ].

One of the provocative findings regarding the alliance of client and therapist is that it appears to be similarly related to reported outcomes in face-to-face psychotherapy and in internet-based interventions (IBIs), which include asynchronous communications [
] with minimal therapist contact. Within the field of IBIs, the presence of therapist support is predictive of more symptom improvement, less dropout, and greater adherence in comparison with unguided IBIs [ , ] (though data are not conclusive [ ]). Furthermore, the relationship with the internet-based program has been found to be predictive of symptom reduction, whereas the relationship with the therapist was predictive of adherence and dropout in a therapist-guided IBI [ ]. These differences raise the possibility that although patients can potentially form a relationship with a digital interface, such a relationship differs in its benefits from a relationship with a therapist and may not create the same profound alliance or depth of conversation [ ]. In other words, conversational AI and digital programs can be helpful in psychoeducation and in administering practical techniques that alleviate symptoms, but they are unlikely to build a motivational relationship in the way a human therapist and client do. Moreover, it has been claimed that even if conversational AI were used only for practical purposes alongside a human therapist, it may alter and interact with the human dynamic in therapy, especially the therapeutic alliance [ ], which may in turn change how one experiences empathy throughout the therapeutic process.

A Patient-Needs Perspective
The previous section examined the importance of empathy in therapy from the perspective of therapeutic treatment approaches, which can be thought of as lying on a continuum, in terms of their theoretical mechanisms, from skill acquisition (eg, cognitive behavioral therapy) to relationship-based (eg, interpersonal psychoanalysis). An alternate perspective considers a patient’s specific motivations and needs for seeking treatment, regardless of the therapist’s theoretical orientation. We contend that patients enter therapy with a variety of needs, motivations, and expectations, which can be conceptualized along 2 orthogonal continua: (1) a desire to acquire practical tools and (2) a desire for human connection and empathy. The relative emphasis on each dimension varies among patients according to their theory of change (ie, what they need to decrease distress or improve quality of life) and many other factors (such as their history of successful or unsuccessful treatments, stigma, and culture). In addition to this individual variability, we contend that individuals' theories and emphases can shift as a result of their cumulative experiences over the course of therapy. The more a patient seeks to acquire strategies to cope, the more conversational AI might be able to facilitate this process by providing psychoeducation, exercises, and the like. Conversely, the greater the patient’s need for human connection and empathy—be it for affirmation, a confidant for their thoughts and feelings, or simply the sense that someone cares—the less capable conversational AI might be of fulfilling these requirements (
). We note that although we see these as 2 dimensions, we would not expect many individuals who seek treatment to be low on both, as there would then be little motivation to seek treatment (aside from appeasing others).

Adding to this complexity, patients' self-awareness and accuracy regarding their own needs are often imperfect (eg, they may believe they need practical tools when, in reality, they require empathic care more, or vice versa). It is conceivable that a patient might need coping skills for personal growth; yet the most effective means of acquiring these tools could be through engagement with a compassionate therapist. Such a process may take time and require working on the therapeutic relationship, and needs may change during this period. We believe that when thinking about how to use AI systems in mental health contexts, we must not always treat them as a “quick fix” for a specific problem, as in some cases a long-term human connection may be required to provide more thorough help.
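As a purely illustrative aid (not a validated instrument or a clinical triage rule), the toy sketch below encodes the 2 dimensions described above, a desire for practical tools and a desire for human connection, and mirrors the reasoning that the stronger the need for human connection, the less a conversational agent alone is likely to suffice. All field names, scales, and thresholds are assumptions made only for illustration.

```python
# Toy encoding of the 2 patient-needs dimensions discussed above. Purely illustrative:
# the field names, 0-1 scales, and thresholds are assumptions, not a clinical instrument.
from dataclasses import dataclass

@dataclass
class PatientNeedsProfile:
    practical_tools: float    # 0.0 (low) to 1.0 (high): desire for skills and psychoeducation
    human_connection: float   # 0.0 (low) to 1.0 (high): desire for empathy and human connection

def suggested_emphasis(profile: PatientNeedsProfile) -> str:
    """Map the 2 continua onto a coarse emphasis, mirroring the reasoning above:
    the greater the need for human connection, the less a conversational agent
    alone is likely to suffice."""
    if profile.human_connection >= 0.5:
        return "human-led therapy, with AI at most as an adjunct for exercises"
    if profile.practical_tools >= 0.5:
        return "AI-assisted skills training, with human oversight"
    # Low on both dimensions: as noted above, such individuals rarely seek treatment at all.
    return "clarify the motivation for seeking treatment first"

# Example: a patient who mainly wants coping strategies.
print(suggested_emphasis(PatientNeedsProfile(practical_tools=0.8, human_connection=0.2)))
```

Even as a sketch, it makes the caveat of this section visible: the scores themselves are the hard part, since patients are often unsure of their own needs, and those needs can shift over the course of therapy.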
Moreover, even in therapeutic interactions that are not complete courses of psychotherapy but rather consist of short, concrete interventions, such as crisis helplines, people may still need a human connection. Indeed, research shows that the main reason people call crisis helplines is to have someone to talk to [
]. While conversational AI systems can, at times, give the feeling of “being heard,” their responses have still been reported to have less value than those written by humans [ ]. One could assume that repeat callers [ ] to such helplines seek human connection and will not find communicating only with a bot sufficient. A bot-only approach may lead to deterioration in their condition, a risk that should be minimized, especially when a person is going through a crisis. Additionally, other users may develop an overdependence on the AI tool, and interventions should be planned in a manner that mitigates the risk of long-term dependency. It is possible that AI-assisted communication, including with a therapist, could help address both of these risks.

We offer this population as an example, but these considerations pose open-ended questions that warrant exploration across the burgeoning domain of AI-mediated therapy, where the interplay between human touch and technological aid is continually redefined.
Patient Perceptions of AI Bots
In the growing literature examining individuals’ perceptions of AI bots, different factors have been shown to influence the levels of trust and bonding created with AI [
- ]. Much of this literature can be viewed along the same 2 axes: how helpful and capable the bot is in providing appropriate tools and results, and how empathic it is (in terms of understanding the user’s needs). One study showed that a therapeutic bot was rated at levels similar to face-to-face therapists on dimensions of the alliance [ ]. In another study, responses generated by ChatGPT were rated as more authentic, professional, and practical [ ]; however, participants were blind to the fact that the responses were generated by conversational AI. When participants are aware of an AI system’s involvement, or simply believe one is involved, responses can seem less authentic and less trustworthy and can raise negative emotions [ - ]. Such findings require further research to determine whether responses can truly be experienced as empathic when one knows their artificial origin, and whether such experiences differ in their relationship to treatment outcomes according to patients’ theory of change.

Conclusions
The advent of advanced AI technologies offers substantial benefits and potential enhancements to therapeutic practices, as well as greater accessibility for a wider population. Nevertheless, certain junctures within the therapeutic process may be particularly sensitive to the need for human rapport. We suggest that those points, which may be whole treatments or specific sessions, are times when empathy is especially needed. Although conversational AI can adeptly simulate empathic interactions, sometimes creating the impression of empathy surpassing human capability [
], the essence of seeking empathy transcends the mere reception of an ideal empathic response. It encompasses a longing for the genuine care and emotional engagement of the individual offering support. The optimal path forward may lie in designing applications that facilitate therapist-AI partnerships, wherein AI systems could augment various facets of therapy—from initial intake and evaluation to, in certain instances, complete treatment modalities—while also consciously addressing the need for authentic human empathy, compassion, and care, when relevant for treatment success. However, most of our proposal is theoretical, and ultimately, we raise an empirical question that should be evaluated in future studies. We also encourage industry professionals developing AI applications for mental health and those conducting research within this domain to remain mindful of these considerations.

Conflicts of Interest
None declared.
References
- Weizenbaum J. ELIZA—a computer program for the study of natural language communication between man and machine. Commun ACM. 1966;9(1):36-45. [CrossRef]
- Sullivan Y, Nyawa S, Fosso WS. Combating loneliness with artificial intelligence: an AI-based emotional support model. 2023. Presented at: Proceedings of the 56th Hawaii International Conference on System Sciences; January 3, 2023; Hawaii. URL: https://hdl.handle.net/10125/103173 [CrossRef]
- Vaidyam AN, Wisniewski H, Halamka JD, Kashavan MS, Torous JB. Chatbots and conversational agents in mental health: a review of the psychiatric landscape. Can J Psychiatry. 2019;64(7):456-464. [FREE Full text] [CrossRef] [Medline]
- Ly KH, Ly AM, Andersson G. A fully automated conversational agent for promoting mental well-being: a pilot RCT using mixed methods. Internet Interv. 2017;10:39-46. [FREE Full text] [CrossRef] [Medline]
- Inkster B, Sarda S, Subramanian V. An empathy-driven, conversational artificial intelligence agent (Wysa) for digital mental well-being: real-world data evaluation mixed-methods study. JMIR Mhealth Uhealth. 2018;6(11):e12106. [CrossRef] [Medline]
- Inkster B, Kadaba M, Subramanian V. Understanding the impact of an AI-enabled conversational agent mobile app on users' mental health and wellbeing with a self-reported maternal event: a mixed method real-world data mHealth study. Front Glob Womens Health. 2023;4:1084302. [FREE Full text] [CrossRef] [Medline]
- Funk B, Sadeh-Sharvit S, Fitzsimmons-Craft EE, Trockel MT, Monterubio GE, Goel NJ, et al. A framework for applying natural language processing in digital health interventions. J Med Internet Res. 2020;22(2):e13855. [FREE Full text] [CrossRef] [Medline]
- Ayers JW, Poliak A, Dredze M, Leas EC, Zhu Z, Kelley JB, et al. Comparing physician and artificial intelligence chatbot responses to patient questions posted to a public social media forum. JAMA Intern Med. 2023;183(6):589-596. [CrossRef] [Medline]
- Zaki J, Ochsner KN. The neuroscience of empathy: progress, pitfalls and promise. Nat Neurosci. 2012;15(5):675-680. [CrossRef] [Medline]
- Saxena A, Khanna A, Gupta D. Emotion recognition and detection methods: a comprehensive survey. J Artif Intell Syst. 2020;2(1):53-79. [CrossRef]
- Shukla A. Utilizing AI and machine learning for human emotional analysis through speech-to-text engine data conversion. J Artif Int Cloud Comp. 2022:1-4. [CrossRef]
- Chung LL, Kang J. "I’m Hurt Too": the effect of a chatbot's reciprocal self-disclosures on users’ painful experiences. Arch Des Res. 2023;36(4):67-85. [CrossRef]
- Yin Y, Jia N, Wakslak CJ. AI can help people feel heard, but an AI label diminishes this impact. Proc Natl Acad Sci U S A. 2024;121(14):e2319112121. [CrossRef] [Medline]
- Fuchs T. Empathy, group identity, and the mechanisms of exclusion: an investigation into the limits of empathy. Topoi. 2017;38(1):239-250. [CrossRef]
- Cameron CD, Hutcherson CA, Ferguson AM, Scheffer JA, Hadjiandreou E, Inzlicht M. Empathy is hard work: people choose to avoid empathy because of its cognitive costs. J Exp Psychol Gen. Jun 2019;148(6):962-976. [CrossRef] [Medline]
- Ferguson AM, Cameron CD, Inzlicht M. Motivational effects on empathic choices. J Exp Soc Psychol. 2020;90:104010. [CrossRef]
- Perry A. AI will never convey the essence of human empathy. Nat Hum Behav. 2023;7(11):1808-1809. [CrossRef] [Medline]
- Elliott R, Bohart AC, Watson JC, Murphy D. Therapist empathy and client outcome: an updated meta-analysis. Psychotherapy (Chic). Dec 2018;55(4):399-410. [FREE Full text] [CrossRef] [Medline]
- Rogers CR. A Way of Being. Boston. Houghton Mifflin; 1980.
- Wampold B, Imel ZE. The Great Psychotherapy Debate: The Evidence for What Makes Psychotherapy Work. New York. Routledge; 2015.
- DeRubeis RJ, Feeley M. Determinants of change in cognitive therapy for depression. Cogn Ther Res. Oct 1990;14(5):469-482. [CrossRef]
- Bordin ES. The generalizability of the psychoanalytic concept of the working alliance. Psychother Theory Res Pract. 1979;16(3):252-260. [CrossRef]
- Del Re AC, Flückiger C, Horvath AO, Wampold BE. Examining therapist effects in the alliance-outcome relationship: a multilevel meta-analysis. J Consult Clin Psychol. 2021;89(5):371-378. [CrossRef] [Medline]
- Flückiger C, Del Re AC, Wampold BE, Horvath AO. The alliance in adult psychotherapy: a meta-analytic synthesis. Psychotherapy (Chic). Dec 2018;55(4):316-340. [FREE Full text] [CrossRef] [Medline]
- Nienhuis JB, Owen J, Valentine JC, Winkeljohn Black S, Halford TC, Parazak SE, et al. Therapeutic alliance, empathy, and genuineness in individual adult psychotherapy: a meta-analytic review. Psychother Res. 2018;28(4):593-605. [CrossRef] [Medline]
- Persons JB, Burns DD. Mechanisms of action of cognitive therapy: the relative contributions of technical and interpersonal interventions. Cogn Ther Res. 1985;9(5):539-551. [FREE Full text] [CrossRef]
- Koelen J, Vonk A, Klein A, de Koning L, Vonk P, de Vet S, et al. Man vs. machine: a meta-analysis on the added value of human support in text-based internet treatments ("e-therapy") for mental disorders. Clin Psychol Rev. 2022;96:102179. [FREE Full text] [CrossRef] [Medline]
- Musiat P, Johnson C, Atkinson M, Wilksch S, Wade T. Impact of guidance on intervention adherence in computerised interventions for mental health problems: a meta-analysis. Psychol Med. 2022;52(2):229-240. [CrossRef] [Medline]
- Werntz A, Amado S, Jasman M, Ervin A, Rhodes JE. Providing human support for the use of digital mental health interventions: systematic meta-review. J Med Internet Res. 2023;25:e42864. [FREE Full text] [CrossRef] [Medline]
- Zalaznik D, Strauss AY, Halaj A, Barzilay S, Fradkin I, Katz BA, et al. Patient alliance with the program predicts treatment outcomes whereas alliance with the therapist predicts adherence in internet-based therapy for panic disorder. Psychother Res. 2021;31(8):1022-1035. [CrossRef] [Medline]
- Sedlakova J, Trachsel M. Conversational artificial intelligence in psychotherapy: a new therapeutic tool or agent? Am J Bioeth. 2023;23(5):4-13. [FREE Full text] [CrossRef] [Medline]
- Asman O, Tal A, Barilan YM. Conversational artificial intelligence-patient alliance Turing test and the search for authenticity. Am J Bioeth. 2023;23(5):62-64. [CrossRef] [Medline]
- Middleton A, Woodward A, Gunn J, Bassilios B, Pirkis J. How do frequent users of crisis helplines differ from other users regarding their reasons for calling? Results from a survey with callers to Lifeline, Australia's national crisis helpline service. Health Soc Care Community. 2017;25(3):1041-1049. [CrossRef] [Medline]
- Mishara BL, Côté LP, Dargis L. Systematic review of research and interventions with frequent callers to suicide prevention helplines and crisis centers. Crisis. 2023;44(2):154-167. [FREE Full text] [CrossRef] [Medline]
- Glikson E, Woolley AW. Human trust in artificial intelligence: review of empirical research. Acad Manag Ann. 2020;14(2):627-660. [CrossRef]
- Gould DJ, Dowsey MM, Glanville-Hearst M, Spelman T, Bailey JA, Choong PFM, et al. Patients' views on AI for risk prediction in shared decision-making for knee replacement surgery: qualitative interview study. J Med Internet Res. 2023;25:e43632. [FREE Full text] [CrossRef] [Medline]
- Lucas G, Rizzo A, Gratch J, Scherer S, Stratou G, Boberg J, et al. Reporting mental health symptoms: breaking down barriers to care with virtual human interviewers. Front Robot AI. Oct 12, 2017;4:51. [FREE Full text] [CrossRef]
- Minerva F, Giubilini A. Is AI the future of mental healthcare? Topoi. 2023;42(3):809-817. [CrossRef] [Medline]
- Beatty C, Malik T, Meheli S, Sinha C. Evaluating the therapeutic alliance with a free-text CBT conversational agent (Wysa): a mixed-methods study. Front Digit Health. 2022;4:847991. [FREE Full text] [CrossRef] [Medline]
- Lopes E, Jain G, Carlbring P, Pareek S. Talking mental health: a battle of wits between humans and AI. J Technol Behav Sci. Nov 08, 2023:1-11. [CrossRef]
- Hohenstein J, Kizilcec RF, DiFranzo D, Aghajari Z, Mieczkowski H, Levy K, et al. Artificial intelligence in communication impacts language and social relationships. Sci Rep. 2023;13(1):5487. [FREE Full text] [CrossRef] [Medline]
- Promberger M, Baron J. Do patients trust computers? Behav Decis Making. 2006;19(5):455-468. [CrossRef]
- Morris R. We provided mental health support to about 4,000 people—using GPT-3. Here's what happened. X (formerly Twitter). URL: https://twitter.com/RobertRMorris/status/1611450197707464706 [accessed 2023-09-14]
Abbreviations
AI: artificial intelligence
IBI: internet-based intervention
Edited by O Asman; submitted 18.01.24; peer-reviewed by D Meyer; comments to author 08.04.24; revised version received 18.04.24; accepted 23.04.24; published 11.06.24.
Copyright©Matan Rubin, Hadar Arnon, Jonathan D Huppert, Anat Perry. Originally published in JMIR Mental Health (https://mental.jmir.org), 11.06.2024.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on https://mental.jmir.org/, as well as this copyright and license information must be included.