JMIR Mental Health

Internet interventions, technologies, and digital innovations for mental health and behavior change.

JMIR Mental Health is the official journal of the Society of Digital Psychiatry

Editor-in-Chief:

John Torous, MD, MBI, Harvard Medical School, USA


Impact Factor 4.8 CiteScore 10.8

JMIR Mental Health (JMH, ISSN 2368-7959; Journal Impact Factor™ 4.8, Journal Citation Reports™ from Clarivate, 2024) is a premier, open-access, peer-reviewed journal indexed in PubMed Central and PubMed, MEDLINE, Scopus, Sherpa/Romeo, DOAJ, EBSCO/EBSCO Essentials, ESCI, PsycINFO, CABI, and SCIE.

JMIR Mental Health has a unique focus on digital health and Internet/mobile interventions, technologies, and electronic innovations (software and hardware) for mental health, addictions, online counseling, and behavior change. This includes formative evaluation and system descriptions, theoretical papers, review papers, viewpoint/vision papers, and rigorous evaluations related to digital psychiatry, e-mental health, and clinical informatics in psychiatry/psychology.

JMIR Mental Health received a CiteScore of 10.8, placing it in the 92nd percentile (#43 of 567) as a Q1 journal in the field of Psychiatry and Mental Health.

Recent Articles

Theme Issue 2023: Responsible Design, Integration, and Use of Generative AI in Mental Health

Knowledge has become more open and accessible to a large audience with the "democratization of information" facilitated by technology. This paper provides a socio-historical perspective for the Theme Issue "Responsible Design, Integration, and Use of Generative AI in Mental Health". It evaluates ethical considerations in utilizing generative artificial intelligence (GenAI) for the democratization of mental health knowledge and practice. It explores the historical context of democratizing information, transitioning from restricted access to widespread availability due to the internet, open-source movements, and, most recently, GenAI technologies such as large language models (LLMs). The paper highlights why GenAI technologies represent a new phase in the democratization movement, offering unparalleled access to highly advanced technology as well as information. In the realm of mental health, this requires delicate and nuanced ethical deliberation. Including GenAI in mental health may allow, among other things, improved accessibility to mental health care, personalized responses, and conceptual flexibility, and could facilitate a flattening of traditional hierarchies between health care providers and patients. At the same time, it also entails significant risks and challenges that must be carefully addressed. To navigate these complexities, the paper proposes a strategic questionnaire for assessing AI-based mental health applications. This tool evaluates both the benefits and the risks, emphasizing the need for a balanced and ethical approach to GenAI integration in mental health. The paper calls for a cautious yet positive approach to GenAI in mental health, advocating for the active engagement of mental health professionals in guiding GenAI development. It emphasizes the importance of ensuring that GenAI advancements are not only technologically sound but also ethically grounded and patient-centered.

Keywords: Ethics, Generative Artificial Intelligence, Mental Health

Knowledge and Attitudes of Mental Health Professionals towards Digital Interventions

Artificial intelligence (AI) has been increasingly recognised as a potential solution to address mental health service challenges by automating tasks and providing new forms of support.

Methods and New Tools in Mental Health Research

Novel technologies, such as ecological momentary assessment (EMA) and wearable biosensor wristwatches, are increasingly being utilized to assess outcomes and mechanisms of change in psychological treatments. However, there is still a dearth of information on the feasibility and acceptability of these technologies and whether they can be reliably used to measure variables of interest.

Innovations in Mental Health Systems

Although digital therapeutics (DTx) have proliferated, there is little real-world research on the characteristics of providers recommending DTx, their recommendation behaviors, or the characteristics of patients receiving recommendations in the clinical setting. Objective: To characterize the clinical and demographic characteristics of patients receiving DTx recommendations and to describe provider characteristics and behaviors regarding DTx.

Reviews in Digital Mental Health

Digital mental health is a rapidly growing field with an increasing evidence base, owing to its potential scalability and its impact on access to mental health care. Further, within underfunded service systems, leveraging personal technologies to deliver or support specialized services has garnered attention as a feasible and cost-effective means of improving access. The relevance of digital health has also grown as technology ownership among individuals with schizophrenia has increased and is now comparable to that of the general population. However, less digital health research has been conducted in groups with schizophrenia spectrum disorders than in other mental health conditions, and overall feasibility, efficacy, and clinical integration remain largely unknown.

Reviews in Digital Mental Health

Depression affects 5% of adults and is a major cause of disability worldwide. Digital psychotherapies offer an accessible means of addressing this issue. This systematic review examines a spectrum of digital psychotherapies for depression, considering both their effectiveness and user perspectives.

Depression and Mood Disorders; Suicide Prevention

To provide optimal care on a suicide prevention helpline, it is important to know what contributes to positive or negative effects on help-seekers. Helplines can often be contacted through text-based chat services, which produce large amounts of text data suitable for large-scale analysis.

Smoking Cessation

Motivational interviewing (MI) is a therapeutic technique that has been successful in helping smokers reduce smoking, but its accessibility is limited by the high cost and low availability of clinicians. To address this, the MIBot project has sought to develop a chatbot that emulates an MI session with a client, with the specific goal of moving an ambivalent smoker toward quitting. One key element of an MI conversation is reflective listening, in which a therapist expresses their understanding of what the client has said by uttering a reflection that encourages the client to continue their thought process. Complex reflections link the client's responses to relevant ideas and facts to enhance this contemplation. Backward-looking complex reflections (BLCRs) link the client's most recent response to a relevant selection of the client's previous statements. Our current chatbot can generate complex reflections, but not BLCRs, using large language models (LLMs) such as GPT-2, which allow the generation of unique, human-like messages customized to client responses. Recent advances in these models, such as the introduction of GPT-4, provide a novel way to generate complex text by feeding the models instructions and conversational history directly, making this a promising approach for generating BLCRs.
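
As a rough sketch of what "feeding the models instructions and conversational history directly" can look like in practice, the following Python snippet assembles an MI-style instruction and a short hypothetical client exchange and asks a chat-completion model for a backward-looking reflection. The model name, prompt wording, and example dialogue are illustrative assumptions, not the MIBot implementation.

    from openai import OpenAI

    client = OpenAI()  # assumes an OPENAI_API_KEY is available in the environment

    # Illustrative instruction: ask for a reflection that links the client's
    # latest utterance back to something they said earlier in the session.
    SYSTEM_PROMPT = (
        "You are a motivational interviewing counselor speaking with a smoker. "
        "Reply with one backward-looking complex reflection: connect the client's "
        "most recent statement to a relevant earlier statement of theirs."
    )

    # Hypothetical conversation history (prior turns of the session).
    history = [
        {"role": "user", "content": "I mostly smoke when work gets stressful."},
        {"role": "assistant", "content": "Work pressure is a big trigger for you."},
        {"role": "user", "content": "Lately I've been coughing a lot in the mornings."},
    ]

    response = client.chat.completions.create(
        model="gpt-4",  # illustrative model choice
        messages=[{"role": "system", "content": SYSTEM_PROMPT}] + history,
    )

    print(response.choices[0].message.content)

The design point suggested by the abstract is that the instruction and the prior conversational turns are passed to the model directly at inference time, in contrast to the earlier GPT-2-based generation pipeline.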

Affective Computing

Empathy is a driving force in our connection to others, our mental well-being, and our resilience to challenges. With the rise of generative artificial intelligence (AI) systems, mental health chatbots, and AI social support companions, it is important to understand how empathy unfolds toward stories from human versus AI narrators and what role transparency plays in user emotions.

Mobile Health in Psychiatry

Internet-based cognitive-behavioral therapy (iCBT) shows promise in the prevention of depression. However, the specific iCBT components that contribute to its effectiveness remain unclear.

Theme Issue 2023: Responsible Design, Integration, and Use of Generative AI in Mental Health

This article contends that the responsible artificial intelligence (AI) approach, the dominant ethics approach underpinning most regulatory and ethical guidance, falls short because it overlooks the impact of AI on human relationships. Focusing only on responsible AI principles reinforces a narrow concept of accountability and responsibility among companies developing AI. This article proposes that applying the ethics of care approach to AI regulation can offer a more comprehensive regulatory and ethical framework that addresses AI's impact on human relationships. This dual approach is essential for the effective regulation of AI in the domain of mental health care. The article delves into the emergence of the new "therapeutic" area facilitated by AI-based bots, which operate without a therapist. It highlights the difficulties involved, chiefly the absence of a defined duty of care toward users, and shows how implementing the ethics of care can establish clear responsibilities for developers. It also sheds light on the potential for emotional manipulation and the risks involved. In conclusion, the article proposes a series of considerations grounded in the ethics of care for the developmental process of AI-powered therapeutic tools.

Depression and Mood Disorders; Suicide Prevention

National suicide prevention strategies are general population-based approaches to preventing suicide by promoting help-seeking behaviours and implementing interventions. Crisis helplines are one of the suicide prevention resources available to the public, where individuals experiencing a crisis can talk to a trained volunteer. Samaritans UK operates on a national scale, with a number of branches located within each of the UK's four countries or regions.
