Published on 26.01.2022 in Vol 9, No 1 (2022): January

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/32430.
A Novel Peer-to-Peer Coaching Program to Support Digital Mental Health: Design and Implementation

Viewpoint

1Department of Psychology, University of California, Los Angeles, Los Angeles, CA, United States

2Semel Institute for Neuroscience and Human Behavior, University of California, Los Angeles, Los Angeles, CA, United States

3Department of Psychiatry and Biobehavioral Sciences, University of California, Los Angeles, Los Angeles, CA, United States

*these authors contributed equally

Corresponding Author:

Benjamin M Rosenberg, MA, CPhil

Department of Psychology

University of California, Los Angeles

1285 Franz Hall

Los Angeles, CA, 95030

United States

Phone: 1 4083068603

Email: benrosenberg@g.ucla.edu


Many individuals in need of mental health services do not currently receive care. Scalable programs are needed to reduce the burden of mental illness among those without access to existing providers. Digital interventions present an avenue for increasing the reach of mental health services. These interventions often rely on paraprofessionals, or coaches, to support the treatment. Although existing programs hold immense promise, providers must ensure that treatments are delivered with high fidelity and adherence to the treatment model. In this paper, we first highlight the tension between the scalability and fidelity of mental health services. We then describe the design and implementation of a peer-to-peer coach training program to support a digital mental health intervention for undergraduate students within a university setting. We specifically note strategies for emphasizing fidelity within our scalable framework, including principles of learning theory and competency-based supervision. Finally, we discuss future applications of this work, including the potential adaptability of our model for use within other contexts.

JMIR Ment Health 2022;9(1):e32430

doi:10.2196/32430


Background

Mental illness is a pressing and growing global public health crisis with enormous societal costs [1]. Between 1990 and 2017, the number of cases of depression worldwide grew from 172 to 258 million [2]. Unfortunately, the majority of people in need of treatment do not receive care, due to a multitude of factors that reduce availability and accessibility of mental health services [3]. For instance, worldwide, shortages in trained professionals and resources allocated for mental health care limit access to treatment [4]. Although evidence-based treatments (EBTs) exist for mental health disorders, there is a major lag in translation of these treatments from laboratories to the real world [5]. Projections indicate that significant shortages of mental health practitioners will continue throughout the next decade, underscoring the need for innovative and scalable solutions to deliver EBTs [6,7].

One widely studied scalable approach, used most prominently in low-resource contexts, is for paraprofessionals to provide or support the delivery of scalable mental health services [8,9]. In this paper, we use the term “paraprofessionals” to refer to nonspecialists without formal mental health credentials who are trained to provide or support low-intensity mental health services in community settings. Under this umbrella, we include individuals who have been described using a variety of terms, such as “coaches,” “lay providers,” “community health workers,” and “peer specialists” [10-12]. Although paraprofessional support models represent a clear pathway to increasing access to care, little is known about the training, quality of care delivery, and sustainability of these models.

Digital mental health innovations via phone, computers, and other electronic devices offer another pathway for increasing access to care [13]. Digital mental health interventions hold particular promise for individuals who face obstacles to traditional, face-to-face mental health services, such as stigma, financial difficulties, time constraints, and location of services [14]. Although user uptake, engagement, and dropout have been problematic for digital mental health interventions [15], especially in routine clinical care settings [16], these problems can be addressed via human support [17-19].

Accordingly, mental health care models that combine paraprofessional workforces and digital mental health innovations have unique potential to expand the reach of and engagement with high-quality EBTs. One key consideration in efforts to design and implement paraprofessional-supported digital mental health interventions involves balancing scalability, to maximize intervention reach, with fidelity, to optimize quality and standards of treatment delivery. Scalability can be defined as “the capacity of an intervention to be applied in a way that reaches a large number of people” [6]. Fidelity encompasses both adherence (ie, Was the intervention delivered as intended?) and competence (ie, How skillfully was the intervention delivered?) [20] to ensure that patients receive efficacious treatment that leads to improved mental health outcomes [21].

Study Aim

The purpose of this paper is to demonstrate 1 way of designing a coaching program that maintains a focus on the fidelity and delivery of high-quality EBTs, while preserving key strengths of paraprofessional models of care, including scalability. Our program was developed to support the delivery of a digital mental health intervention [22] on college campuses, where rates of mental health problems are rapidly growing [23]. Given the current state of the literature, we first describe gaps in our knowledge about the fidelity of treatment delivery within existing paraprofessional programs, such as peer-to-peer support programs. Next, we highlight how pairing digital mental health innovations with paraprofessional support can increase the fidelity and scalability of mental health treatment. Third, we describe our approach to the design and implementation of a peer-to-peer training program, emphasizing potential avenues for optimizing learning processes to enhance the fidelity of treatment delivery.


Scalability and Fidelity

Paraprofessional models have gained widespread attention and support as scalable models of mental health service delivery with great potential to address unmet needs for care [8,24]. Evidence suggests that mental health interventions can be feasibly, acceptably, and effectively delivered by paraprofessionals in low-resource settings [13]. Paraprofessional training programs have the added benefit of increasing the clinical workforce, as these individuals often move on to receive advanced training in the clinical field after serving as paraprofessionals [25].

Fidelity-monitoring practices have the capacity to increase therapist accountability in service of promoting treatment adherence and competence [26]. Indeed, greater therapist competence has been associated with superior treatment outcomes [27]. However, numerous challenges with fidelity monitoring have been identified in the context of paraprofessional service delivery [8,28], such that existing paraprofessional care programs have focused primarily on scalability needs, with less attention given to fidelity of service delivery [29]. Given pressing demands to rapidly reach millions of underserved individuals in need, even paraprofessional interventions that are supported by research and contain evidence-based strategies often lack consistent fidelity-monitoring and quality assurance procedures. For instance, only 38% of studies in a review of community health worker–delivered interventions described procedures for fidelity monitoring, and among those that did report a monitoring procedure, the review noted significant variability in levels, methods, and assessment tools for fidelity measurement [8].

The financial and human resources needed to support fidelity monitoring in real-world contexts are often not available, limiting the external validity of many fidelity-monitoring strategies typically used in clinical trials [30]. Even when fidelity and quality assurance checks are integrated into training and supervision within paraprofessional models, sustained fidelity monitoring is often restricted by limited supervision and insufficient resources for continued quality assurance [28,30]. Paraprofessional programs delivered with less fidelity monitoring are thought to yield lower intervention efficacy [27] and may discourage participants from future engagement in treatment. Randomized controlled trials have shown that with adequate training and ongoing supervision, paraprofessionals can deliver interventions with levels of fidelity similar to those of mental health professionals [31,32]. However, less is known about how to design and implement high-fidelity training programs in more scalable contexts. Qualitative research suggests that lay health workers involved in mental health service delivery express a desire for more robust supervision, yet training and supervision best practices have not been established to date [33]. The limited research describing training and supervision procedures in paraprofessional delivery paradigms underscores the need for innovative solutions that sustain the potential for scalability while also ensuring the fidelity of intervention delivery.

Pairing Technological Innovation With Paraprofessional Support to Enhance Fidelity and Scalability

Digital therapies hold significant promise for addressing problems with fidelity and bridging gaps in care access within wide-scale implementation efforts [27,30]. In particular, these approaches offer 1 way to support treatment delivery, paraprofessional training, and supervision, while minimizing human error or therapist drift, a common phenomenon in manualized treatment protocols [34]. Although humans often play a smaller role within digital therapy models relative to traditional face-to-face therapy, human support or coaching has been shown to augment the efficacy of digital interventions [35]. This is particularly important, given the many challenges and barriers associated with implementation of digital therapies, including limited engagement, poor rates of retention, lack of personalization, and significant cognitive load [15,36]. The involvement of human support increases intervention flexibility and acceptability by calibrating the fit between digital tools and users’ lived experiences, thereby boosting user engagement and retention [18,37]. Lattie et al [38] provide recommendations for the development of text-based coaching protocols (eg, [39]) to support digital mental health interventions and ensure high-fidelity treatment delivery. Thus, pairing paraprofessional coach support with digital therapies has several notable advantages that attend to the need for scalable innovations, while simultaneously emphasizing fidelity.

Peer-to-Peer Support

One consideration in designing paraprofessional models is who should be trained to provide, or support the delivery of, mental health interventions. A prominent model focuses on training of peer-to-peer specialists, or peer coaches [40]. Peer coaching models have been used to provide services or support to individuals with whom coaches share communities, identities, or lived experiences, with the goal of enhancing accessibility, engagement, and scalability of interventions [41]. In doing so, these models have the potential to overcome obstacles to care, such as lack of trust, stigma, and cultural and linguistic barriers (although the significance of peers’ own lived experiences is yet to be determined). One common example is peer recovery and support for individuals with substance use disorders [42], where a peer’s own experience and personal knowledge is harnessed to support individuals in starting and maintaining the recovery process [43-45]. Key legislation is paving the way to expand peer specialist programs to address a variety of population mental health needs, such as the 2020 California Senate Bill SB-803: Mental Health Services: Peer Support Specialist Certification.

Yet, a major barrier to broader implementation of peer support is the mixed empirical support for these models [46-49]. There is some evidence to suggest more positive effects from formal, structured peer support (eg, [50-53]) than informal support (eg, online chat forums) [54,55]. Nonetheless, the findings are inconsistent even within structured peer support interventions (eg, [56]). Methodological inconsistencies may partly explain the disparate findings [42,56], and 1 major example is training and quality assurance. Standardized procedures for peer training, certification, and fidelity monitoring are not well described in the literature [47,56]. Well-defined and replicable methods for training and quality assurance procedures are sorely needed.


Overview

In 2015, the University of California, Los Angeles (UCLA) launched a campus-wide research initiative, the Depression Grand Challenge (DGC), with the goal of cutting the burden of depression in half by 2050. The DGC comprises a number of studies that seek to uncover mechanisms underlying depression and to develop novel treatments and innovative approaches to treatment implementation. To begin tackling this problem at UCLA, the DGC launched the Screening and Treatment for Anxiety and Depression (STAND) program for UCLA students in fall 2017 (Figure 1). The STAND program provides all UCLA students with free mental health screening and tiered care, including digital cognitive-behavioral therapy (CBT) with certified peer coach support for students experiencing mild-to-moderate symptoms of depression and mild-to-severe symptoms of anxiety, as well as in-person psychotherapy and pharmacotherapy for students experiencing severe symptoms of depression. Students who enroll in the digital CBT arm are offered coaching from certified peers, provided via 30-minute weekly coaching sessions in which they review and troubleshoot the application of module content and skills.

STAND Digital Therapy is a modular program that combines interventions for depression, sleep, panic/agoraphobia, social anxiety, worry (generalized anxiety disorder), and trauma (posttraumatic stress disorder), drawing upon existing evidence-based programs [57-66]. There are 13 available packages that cover all principal disorders and critical patterns of comorbidity (eg, depression + sleep, trauma + depression) and comprise 6-8 modules, depending on the number of disorders targeted. Individuals are assessed at baseline on an adaptive battery of disorder-specific, self-report questionnaires that guide the package selection process [22]. The personalized packages are built to maximize engagement and interactivity, with a strong focus on diversity and inclusion. The modules are transdiagnostic and skill focused, involving psychoeducation, in-session exercises, and between-session practice of techniques, including behavioral activation, cognitive restructuring, self-compassion, and exposure (eg, in vivo, interoceptive, imaginal).
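
As an illustration of how a baseline battery could feed into package selection, the following is a minimal Python sketch under stated assumptions: the domain names, cutoff values, and selection rule shown here are hypothetical and do not represent the actual STAND instruments or algorithm [22].

```python
# Hypothetical sketch: mapping baseline questionnaire scores to a modular
# treatment package. Domain names, cutoffs, and the selection rule are
# illustrative only; they do not reflect the actual STAND battery or algorithm.

ILLUSTRATIVE_CUTOFFS = {
    "depression": 10,
    "sleep": 8,
    "panic": 8,
    "social_anxiety": 8,
    "worry": 10,
    "trauma": 33,
}

def select_package(scores):
    """Return the elevated problem domains, which together define a
    personalized package (eg, ['depression', 'sleep'])."""
    elevated = [domain for domain, cutoff in ILLUSTRATIVE_CUTOFFS.items()
                if scores.get(domain, 0) >= cutoff]
    # If no domain is elevated, default to a single-domain package.
    return elevated or ["depression"]

if __name__ == "__main__":
    print(select_package({"depression": 14, "sleep": 9, "panic": 2}))
    # -> ['depression', 'sleep']
```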

Fitting within this model, the initial development of our coach training program specifically targets UCLA undergraduate students as both coaches and recipients of the intervention, consistent with the peer support models described before. Enrollment as a coach trainee does not rely on any prerequisite coursework, history of service provision, or experience of personal mental health concerns or psychotherapy. Training and supervision of coaches are provided by graduate students in the clinical psychology doctoral program at UCLA for all stages of coach training. Graduate supervisors attend group supervision-of-supervision with a licensed clinical psychologist (author EGG).

Figure 1. Navigating scalability and fidelity in mental health coaching programs. STAND: Screening and Treatment for Anxiety and Depression; UCLA: University of California, Los Angeles.

Program Description

In our program, coach training occurs in weekly sessions, wherein trainees review digital CBT content, engage in didactic instruction of coaching materials, and complete role-play exercises focusing on basic interpersonal process skills. Coaches move through 4 primary phases of training: (1) beginner, (2) intermediate, (3) advanced, and (4) certified. Weekly training consists of a 2-hour training session as well as 2 hours of assignments completed between training sessions. Each level of training is completed over 1 academic quarter (10 weeks), at which point trainees are advanced to the subsequent level of training based on supervisor evaluations.

Beginner-Level Training

The goals of the beginner phase of training are to (1) introduce coaches to digital CBT content and increase knowledge of the intervention and (2) provide early practice with interpersonal process skills to initiate the process of translating declarative knowledge into procedural skill during coaching delivery. In service of these aims, beginner-level trainees enroll as users of the digital CBT and advance through the digital CBT content themselves, completing homework exercises associated with the program and reading foundational material on cornerstone CBT topics between didactic training sessions. In addition, beginner-level trainees are introduced to 6 core interpersonal process skills that are routinely assessed to monitor coaching effectiveness throughout the coach training program: (1) authenticity, (2) nonverbal skills, (3) open-ended questioning, (4) reflecting emotions, (5) content summaries, and (6) collaborative inquiry [67-69]. These process skills, in addition to sustained knowledge of the digital CBT content, provide the foundation for advancement throughout the coach training program.

Beginner-level trainees participate in (1) didactics regarding digital CBT content and interpersonal process skills, (2) discussions regarding other cornerstone topics (eg, mindfulness, cultural humility, trauma-informed care, ethics), and (3) role-play exercises to begin practicing application of the 6 core interpersonal process skills. Beginner trainees also attend sessions with advanced trainees, in which they serve as mock or practice participants for advanced trainees who are coaching full mock sessions (described in detail in the Advanced-Level Training section). Role-play exercises are recorded or observed live by supervisors, who provide oral and written feedback, as well as numerical ratings on each interpersonal process skill (eg, scale from 1 to 10 with behavioral anchors; see Multimedia Appendix 1). These evaluations provide benchmarks for certification and highlight areas of growth as trainees progress toward certification throughout the program.
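
For readers implementing a similar rating workflow, the sketch below shows one way the per-skill ratings and written feedback could be recorded and summarized. The 1-10 scale and the 6 skills follow the program description, whereas the benchmark used to flag growth areas is a hypothetical placeholder (the actual behavioral anchors appear in Multimedia Appendix 1).

```python
# Illustrative record of a supervisor's role-play evaluation. The benchmark
# used in growth_areas() is hypothetical, not a program-defined threshold.

from dataclasses import dataclass, field
from statistics import mean

CORE_SKILLS = [
    "authenticity",
    "nonverbal_skills",
    "open_ended_questioning",
    "reflecting_emotions",
    "content_summaries",
    "collaborative_inquiry",
]

@dataclass
class RolePlayEvaluation:
    trainee: str
    ratings: dict = field(default_factory=dict)  # skill name -> 1-10 rating
    written_feedback: str = ""

    def mean_rating(self):
        return mean(self.ratings.get(skill, 0) for skill in CORE_SKILLS)

    def growth_areas(self, benchmark=7):
        # Flag skills rated below a (hypothetical) benchmark for further practice.
        return [s for s in CORE_SKILLS if self.ratings.get(s, 0) < benchmark]

if __name__ == "__main__":
    ev = RolePlayEvaluation(
        trainee="Trainee A",
        ratings={"authenticity": 8, "nonverbal_skills": 6, "open_ended_questioning": 7,
                 "reflecting_emotions": 9, "content_summaries": 5, "collaborative_inquiry": 7},
        written_feedback="Strong emotional reflections; work on summarizing content.",
    )
    print(round(ev.mean_rating(), 1), ev.growth_areas())
    # -> 7.0 ['nonverbal_skills', 'content_summaries']
```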

Intermediate-Level Training

As trainees progress into the intermediate stage of the program, the primary goals are to provide trainees with intensive practice in (1) translating knowledge into coaching delivery and (2) applying interpersonal process skills to support engagement with digital CBT content. During these sessions, trainees participate in (1) brief digital CBT module content review, (2) intensive role-play exercises applying core process skills, and (3) an introduction to protocols for managing advanced clinical issues (eg, suicidality, homicidality, abuse).

To continue supporting trainee development of interpersonal process skills and digital CBT content knowledge, trainees are continually rated on their process skills throughout intensive role-plays. Each week, supervisors review trainees’ intensive role-play segments and provide trainees with written feedback and numerical ratings on core interpersonal process skills. In addition, group supervision sessions incorporate oral feedback from supervisors and peer coaches, including in vivo corrective feedback during role-play exercises.

Advanced-Level Training

Once trainees reach the advanced stage, the main goal is for trainees to achieve certification to serve as coaches for participants. This is accomplished by demonstrating (1) competency across all 6 core interpersonal process skills and (2) continued knowledge of digital CBT content. Advanced trainees conduct practice coaching sessions (ie, full 30 minutes) with beginner trainees as mock participants. In addition to these practice sessions, advanced trainees attend a weekly supervision group consisting of intensive role-play exercises, with role-play targets focused on digital CBT content, interpersonal process skills, and management of advanced clinical issues (eg, suicidality, homicidality, abuse, sexual assault, self-disclosure).

To support advanced coaches in progressing toward certification, advanced-level trainees receive written and numerical ratings on their full 30-minute practice coaching sessions. These ratings are used to certify trainees on competency across all process skills. Next, trainees achieve certification on digital CBT content by passing quizzes, which ensures knowledge of the intervention and promotes continued fidelity to the treatment model.
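
A minimal sketch of how the two certification criteria could be combined into a single check follows; the skill benchmark and quiz pass mark are illustrative assumptions, not program values.

```python
# Hypothetical certification check: every core process skill must meet a
# competency benchmark and every digital CBT content quiz must be passed.
# The benchmark (7/10) and pass mark (80%) are illustrative placeholders.

CORE_SKILLS = ["authenticity", "nonverbal_skills", "open_ended_questioning",
               "reflecting_emotions", "content_summaries", "collaborative_inquiry"]

def is_certified(skill_ratings, quiz_scores, skill_benchmark=7, quiz_pass_mark=0.8):
    skills_met = all(skill_ratings.get(s, 0) >= skill_benchmark for s in CORE_SKILLS)
    quizzes_met = bool(quiz_scores) and all(v >= quiz_pass_mark for v in quiz_scores.values())
    return skills_met and quizzes_met
```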

Coach Certification

Following successful advancement through the prior 3 stages of the program, trainees are certified to support the digital CBT with continued supervision. Certified trainees who are engaged in coaching continue to attend weekly supervision groups in which they discuss coaching sessions with their supervisor and peers. To ensure continued fidelity to coaching standards, supervisors review video recordings of each coaching session and rate the coaches’ application of process skills according to the behavioral rating scale described before. Video review further enables supervisors to use didactics and role-play exercises in response to common challenges or to address drift from the coaching protocol. Certified trainees additionally provide feedback to the supervision team to inform potential future iterations of the coaching program.

Strategies for Monitoring and Enhancing Fidelity

Learning Theory

Increased attention to trainee learning processes within mental health provider training and supervision procedures has the potential to increase fidelity to EBTs [70]. One way to enhance paraprofessional mental health service delivery, therefore, is to design training programs that leverage insights from learning theory and specific pedagogical strategies (see Table 1 for examples) shown to improve knowledge building, skill acquisition, and long-term retention across domains such as learning a new language, mathematics, and sports [71-73]. Although these strategies may reduce performance in the short term (ie, during initial acquisition of skills or knowledge), research has consistently shown superior long-term retention and retrieval of learning [72,74].

Table 1. Pedagogical strategies and examples.
Principle: Varying context of learning
Definition: Incorporating contextual variability (eg, physical location, types of teaching strategies) into teaching and learning
Example: Compared with individuals who repeatedly study in 1 setting, individuals who study in a variety of physical settings have been shown to perform better on subsequent examinations in a new setting [75].

Principle: Spaced instruction
Definition: Spacing out instruction of a single topic over a period, as opposed to solely providing instruction about a topic in 1 learning event
Example: Although cramming for an exam may be a useful strategy for performing well in the short term (eg, on a quiz), spacing the presentation of materials over a longer period has been shown to support performance in the long term (eg, on a final examination).

Principle: Interleaved instruction
Definition: Interleaving instruction of different topics within a common learning event (eg, covering multiple concepts within a single class)
Example: Interleaving questions that assess knowledge of multiple concepts (eg, geometric equations for angles and lines intermixed) has been shown to improve student learning compared with blocking of concepts (eg, equations for angles, then lines) [76].

Principle: Retrieval practices/examinations
Definition: Formal assessment of knowledge (eg, tests, assessments, exams)
Example: Individuals who make incorrect guesses have been shown to benefit from these early mistakes during learning compared with individuals who are provided with the correct answers from the beginning of training [77].
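
To make the blocking-versus-interleaving distinction concrete, the short sketch below generates both orderings for 2 illustrative training domains; the topic names are placeholders rather than the program's actual curriculum.

```python
# Minimal sketch contrasting blocked vs interleaved ordering of training
# topics, per Table 1. Topic names are illustrative placeholders.
from itertools import chain, zip_longest

CBT_CONTENT = ["behavioral_activation", "cognitive_restructuring", "exposure"]
PROCESS_SKILLS = ["open_ended_questioning", "reflecting_emotions", "content_summaries"]

def blocked_schedule():
    # One domain covered in full, then the other (blocking).
    return CBT_CONTENT + PROCESS_SKILLS

def interleaved_schedule():
    # Alternate between domains within a single training session (interleaving).
    pairs = zip_longest(CBT_CONTENT, PROCESS_SKILLS)
    return [item for item in chain.from_iterable(pairs) if item is not None]
```
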
Learning Theory: Applied

From the outset of coach training, we have applied core principles of learning theory to guide the instruction of digital CBT content and process skills. For example, variability of learning contexts is applied through (1) independent trainee review of digital CBT content (outside of sessions), (2) didactic training (during sessions), (3) role-play exercises (conducted in small groups), and (4) participation in mock sessions (observed by the entire supervision group). Likewise, applying the principle of spaced instruction, digital CBT content and interpersonal process skills are introduced and revisited at multiple timepoints within and across training levels. Interleaved instruction is similarly used to promote initial learning of digital CBT content and process skills simultaneously (eg, a single training session alternates between CBT and process skill content, and likewise combines the 2 domains, rather than blocking 1 instruction topic at a time). Furthermore, retrieval practices assess digital CBT knowledge throughout all stages of trainee development to support long-term retention of learning (eg, during the advanced stage of coach training, the process of obtaining certification involves trainees repeatedly completing mock coaching sessions with corrective feedback).

Following certification, ongoing fidelity-monitoring practices include (1) completion of a self-evaluation coaching checklist following all coaching sessions, (2) discussion of coach adherence to the digital CBT module during supervision, and (3) continued completion of mock coaching sessions during supervision with peer-to-peer and supervisor feedback.
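
One possible way to log these 3 post-certification monitoring activities for each coaching session is sketched below; the field and checklist names are hypothetical and are not the program's actual self-evaluation instrument.

```python
# Illustrative per-session fidelity record combining the 3 monitoring
# activities described above. Field names are hypothetical placeholders.
from dataclasses import dataclass, field

@dataclass
class FidelityRecord:
    coach: str
    session_date: str
    self_checklist: dict = field(default_factory=dict)  # checklist item -> completed (bool)
    module_adherence_discussed: bool = False             # reviewed in supervision
    mock_session_completed: bool = False                 # ongoing practice with feedback

    def checklist_completion_rate(self):
        if not self.self_checklist:
            return 0.0
        return sum(self.self_checklist.values()) / len(self.self_checklist)
```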

Competency-Based Supervision

Following the acquisition of new knowledge and skills, competency-based supervision techniques can provide trainees with a pathway for transforming declarative knowledge into procedural knowledge [78-81]. Prior studies suggest that competency-based supervision can enhance the acquisition of CBT knowledge and skills [82]. Accordingly, the present coach training program integrates experiential learning and competency-based supervision strategies to support sustained fidelity to the treatment. For example, our program uses supervision practices that integrate a variety of experiential learning techniques (eg, skill modeling, role-plays, and corrective feedback), which have been shown to increase provider fidelity to EBTs [70]. Likewise, the program continuously assesses and monitors trainee development with clearly articulated, behaviorally anchored feedback [81].


Principal Findings

In this paper, we outlined 1 example of a scalable peer-to-peer mental health paraprofessional training and supervision program. Although many models of paraprofessional support have been described and tested previously, high demand and minimal resources have often corresponded with a reduced focus on fidelity monitoring and quality assurance [8]. Lack of standardized methods for paraprofessional training and supervision may have contributed to the disparate empirical support for paraprofessional, and specifically peer paraprofessional, models. Here we described a standardized and replicable model of training and supervision suitable for evaluation.

Strengths

We believe this model has several notable strengths. Of note, our program focuses explicitly on fidelity, while also attending to the need for scalable care. As illustrated, the focus on fidelity is integrated into the program in 2 primary ways: digital technology as the primary agent for CBT content delivery [83] and continuous, standardized procedures for fidelity monitoring of coaches who support digital CBT provision. In addition, our training and supervision program is grounded in key findings from the learning theory literature, aligned with data suggesting that optimized learning can serve as a pathway to higher fidelity of treatment delivery [70,78]. The integration of learning theory as a mechanism for enhancing fidelity is aligned with existing lay health worker training frameworks that focus on augmenting initial one-off training with on-the-job direct supervision, coaching, and feedback systems [28]. We believe that paraprofessional models anchored in learning theory principles have the greatest potential to improve quality of care.

Another strength is that our program is designed to be malleable and can be adapted in various ways based on implementation context factors. Along with fidelity, program flexibility is well established as a key ingredient in successful implementation of interventions in numerous settings [84,85]. Implementation science frameworks have frequently cited the importance of balancing both fidelity and flexibility in delivery of EBTs, and this concept has also been established as essential in lay health worker models [28]. Our program was designed with flexibility within fidelity as a key guiding principle. It contains both core components, defined within the Consolidated Framework for Implementation Research (CFIR) as the “essential and indispensable elements” of the program, and the adaptable periphery, defined as the aspects of the program that can be modified and varied from site to site [86,87]. Included in our program’s core components are (1) anchoring in principles of learning theory described before, (2) training on 6 core clinical process skills, and (3) training on digital CBT content. The adaptable periphery, however, depends on the structures, systems, and contexts involved with program implementation. In the process of designing adaptations, community stakeholder partnership and input are essential [88]. Although many adaptation frameworks have focused on adaptations to the intervention itself, stakeholders can also help identify adaptations to the implementation context.

In our program, we have identified several components of the adaptable periphery that have been tailored for various implementation contexts, with community partnership. For instance, although this paper describes implementation at 1 university, we are currently piloting coach training and supervision for the launch of STAND digital CBT in numerous other types of community settings, including local community colleges and health care systems. In partnership with community stakeholders, 1 example of a component in the adaptable periphery that we have modified to meet the needs of a new implementation site is the length of training time, which has been shortened to accommodate local resources. This has been accomplished by combining components of beginner and intermediate levels of training and including additional review and feedback of recorded role-plays outside of sessions to accelerate learning and growth. In another example of adaptation, we have worked with various sites to situate and design our coaching risk protocols (eg, suicide risk, abuse) within the contexts of existing resources, infrastructure, and referrals. Another example of adaptation has been to integrate specific training on trauma-informed care strategies to support implementation of this program in communities with higher trauma prevalence rates. Cultural considerations are also essential, particularly in planning implementation of coach training programs in diverse settings such as ours. Working in partnership with community stakeholders to co-design cultural adaptations can lead to improved program acceptability and community engagement. Although we have made and discussed modifications within the adaptable periphery based on the unique implementation and contextual factors within various environments, the same guiding principles described in this paper serve as the foundational core components across settings.

A final strength of our program is that it is intended not only to train students to serve as coaches to their peers but also to provide critical CBT skills to trainees themselves. Many coaches in our program anecdotally report that their experience throughout training has taught them invaluable interpersonal and cognitive-behavioral skills. In the broader literature, paraprofessionals describe feeling that their training experiences were associated with personal development and growth and increases in knowledge, self-confidence, and skill use [33]. However, formal measurement of the mental health benefits conferred to coaches by our program is still needed.

Limitations

Several key limitations of our program should also be noted. First, because this program is situated within the scope of a large research initiative, ongoing funding has been available to sustain coach training and supervision. Beyond the realm of research, efforts to provide continuous funding for paraprofessional support programs in routine care settings are critical. In the initial iteration of our program, coaches have served as volunteers, engaged in all program elements as an additional responsibility outside of their other obligations. Data suggest that among volunteer staff supporting digital interventions, administrative issues, such as time constraints, may contribute to barriers to training completion and attrition [89]. Additional funding that encompasses financial payment or other incentives for peer coaches may represent 1 solution to address this obstacle. One model that is currently being tested as a component of our program’s adaptable periphery is paying coaches as university employees. Alternative methods of expanding and sustaining funding and resources are worthy of exploration.

Second, although we maintain a focus on fidelity in our program, the primary objective of our peer-to-peer program is to serve as a scalable model of care in real practice settings. Thus, given the resource constraints of real-world implementation contexts, we have designed our fidelity-monitoring procedures to minimize supervisor and trainee burden. However, in doing so, we recognize limitations in our capacity to optimally monitor fidelity, and acknowledge that fidelity is not monitored to the same degree in our program compared to standard clinical trials (eg, [90]).

Third, to maximize scalability of the program, coaching is provided virtually using videoconferencing. Prior research has raised the possibility that compared with self-administered or fully automated options, digital mental health interventions may be most effective for adolescents and young adults when they incorporate in-person elements [91]. However, the extent to which virtual interactions with a human coach provide a similar degree of benefit is unknown. Additional research may clarify the effectiveness of fully remote coaching and guide potential adaptations to this program.

Last, our program was initially designed for use in a specific setting (ie, a peer-to-peer program supporting college students). Additional efforts and reliance on existing implementation science and human-centered design frameworks, such as the CFIR, are needed to determine how this program and similar ones may be adapted and augmented for use in other types of settings and with new populations. A number of conceptual frameworks to adapt interventions in new contexts have been proposed, and these can be used to guide adaptation of paraprofessional support programs for new settings (eg, [92]).

Conclusion and Future Directions

We conclude by considering future directions for this work within the paraprofessional field at large. First, to meet rising rates of mental illness worldwide, expansion of paraprofessional mental health programs into new settings is critically needed. Second, funding for these programs must also encompass sufficient resources to support quality assurance in training, supervision, and treatment delivery [93], as has been the case throughout the development of the coach training program presented here. However, fidelity assurance strategies must be integrated with careful awareness of their scalability, enabling paraprofessional programs to continue expanding in reach. Third, adaptations should be designed in collaboration with community stakeholders to reduce drift from EBT protocols, while also addressing the implementation factors that drive adaptation needs [92]. Lastly, research protocols (eg, [94]) should be developed to enable empirical testing of our model, along with potential model adaptations, to determine effectiveness and inform modifications to future iterations of the coach training program.

Acknowledgments

BMR and TK were responsible for conceptualization and writing of this paper. ZDC developed the digital intervention used by this program and provided crucial edits to the paper. EGG created the training program described in this paper, conducted supervision-of-supervision, and provided crucial edits to the paper. MGC oversaw the creation and implementation of this program and provided crucial edits to the paper.

This work would not have been possible without the immense contributions of the following individuals, who were central to the development, implementation, and supervision of the coaching program described in this paper: Amanda Loerinc, PhD; Allyson Pimentel, EdD; Bita Mesri, PhD; Blanche Wright, MA, CPhil; Brittany Drake, MA, CPhil; Dana Saifan, MA, CPhil; Jennifer Gamarra, PhD; Julia Hammett, PhD; Julia Yarrington, MA; Meghan Vinograd, PhD; Meredith Boyd, MA, CPhil; Sophie Arkin, MA, CPhil; and Stassja Sichko, MA.

Conflicts of Interest

ZDC received consultancy fees from Joyable for his work on cognitive-behavioral therapy during 2016-2017.

Multimedia Appendix 1

Rating form to evaluate interpersonal process skills.

PDF File (Adobe PDF File), 101 KB

  1. Vigo D, Thornicroft G, Atun R. Estimating the true global burden of mental illness. Lancet Psychiatry 2016 Feb;3(2):171-178 [FREE Full text] [CrossRef]
  2. Liu Q, He H, Yang J, Feng X, Zhao F, Lyu J. Changes in the global burden of depression from 1990 to 2017: findings from the Global Burden of Disease study. J Psychiatr Res 2020 Jul;126:134-140 [FREE Full text] [CrossRef] [Medline]
  3. Betancourt T, Chambers DA. Optimizing an era of global mental health implementation science. JAMA Psychiatry 2016 Feb;73(2):99-100 [FREE Full text] [CrossRef] [Medline]
  4. Butryn T, Bryant L, Marchionni C, Sholevar F. The shortage of psychiatrists and other mental health providers: causes, current state, and potential solutions. Int J Acad Med 2017;3(1):5. [CrossRef]
  5. Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med 2011 Dec;104(12):510-520 [FREE Full text] [CrossRef] [Medline]
  6. Kazdin AE. Annual research review: expanding mental health services through novel models of intervention delivery. J Child Psychol Psychiatry 2019 Apr;60(4):455-472 [FREE Full text] [CrossRef] [Medline]
  7. Olfson M. Building the mental health workforce capacity needed to treat adults with serious mental illnesses. Health Aff (Millwood) 2016 Jun 01;35(6):983-990 [FREE Full text] [CrossRef] [Medline]
  8. Barnett ML, Gonzalez A, Miranda J, Chavira DA, Lau AS. Mobilizing community health workers to address mental health disparities for underserved populations: a systematic review. Adm Policy Ment Health 2018 Mar;45(2):195-211 [FREE Full text] [CrossRef] [Medline]
  9. Singla DR, Kohrt BA, Murray LK, Anand A, Chorpita BF, Patel V. Psychological treatments for the world: lessons from low- and middle-income countries. Annu Rev Clin Psychol 2017 May 08;13:149-181 [FREE Full text] [CrossRef] [Medline]
  10. Lewin S, Dick J, Pond P, Zwarenstein M, Aja GN, van Wyk BE, et al. Lay health workers in primary and community health care. Cochrane Database Syst Rev 2005 Jan 25(1):CD004015. [CrossRef] [Medline]
  11. Chinman M, McInnes DK, Eisen S, Ellison M, Farkas M, Armstrong M, et al. Establishing a research agenda for understanding the role and impact of mental health peer specialists. Psychiatr Serv 2017 Sep 01;68(9):955-957 [FREE Full text] [CrossRef] [Medline]
  12. Rosenthal EL, Brownstein JN, Rush CH, Hirsch GR, Willaert AM, Scott JR, et al. Community health workers: part of the solution. Health Aff (Millwood) 2010 Jul;29(7):1338-1342 [FREE Full text] [CrossRef] [Medline]
  13. Naslund JA, Aschbrenner KA, Araya R, Marsch LA, Unützer J, Patel V, et al. Digital technology for treating and preventing mental disorders in low-income and middle-income countries: a narrative review of the literature. Lancet Psychiatry 2017 Jun;4(6):486-500 [FREE Full text] [CrossRef]
  14. Schueller SM, Hunter JF, Figueroa C, Aguilera A. Use of digital mental health for marginalized and underserved populations. Curr Treat Options Psych 2019 Jul 5;6(3):243-255 [FREE Full text] [CrossRef]
  15. Torous J, Nicholas J, Larsen ME, Firth J, Christensen H. Clinical review of user engagement with mental health smartphone apps: evidence, theory and improvements. Evid Based Ment Health 2018 Aug;21(3):116-119 [FREE Full text] [CrossRef] [Medline]
  16. Gilbody S, Littlewood E, Hewitt C, Brierley G, Tharmanathan P, Araya R, REEACT Team. Computerised cognitive behaviour therapy (cCBT) as treatment for depression in primary care (REEACT trial): large scale pragmatic randomised controlled trial. BMJ 2015 Nov 11;351:h5627 [FREE Full text] [CrossRef] [Medline]
  17. Benton SA, Heesacker M, Snowden SJ, Lee G. Therapist-assisted, online (TAO) intervention for anxiety in college students: TAO outperformed treatment as usual. Prof Psychol: Res Pract 2016 Oct;47(5):363-371 [FREE Full text] [CrossRef]
  18. Schueller SM, Tomasino KN, Mohr DC. Integrating human support into behavioral intervention technologies: the efficiency model of support. Clin Psychol: Sci Pract 2017 Mar;24(1):27-45 [FREE Full text] [CrossRef]
  19. Conley CS, Durlak JA, Shapiro JB, Kirsch AC, Zahniser E. A meta-analysis of the impact of universal and indicated preventive technology-delivered interventions for higher education students. Prev Sci 2016 Aug;17(6):659-678 [FREE Full text] [CrossRef] [Medline]
  20. Cross WF, West JC. Examining implementer fidelity: conceptualising and measuring adherence and competence. J Child Serv 2011 Mar 18;6(1):18-33 [FREE Full text] [CrossRef] [Medline]
  21. Schoenwald SK, Sheidow AJ, Letourneau EJ. Toward effective quality assurance in evidence-based practice: links between expert consultation, therapist fidelity, and child outcomes. J Clin Child Adolesc Psychol 2004 Feb;33(1):94-104 [FREE Full text] [CrossRef]
  22. Cohen ZD, Craske MG. The development and pilot implementation of a modular, transdiagnostic, personalized digital therapy during a global pandemic. 2021 Presented at: European Association of Behavioral and Cognitive Therapies; 2021; Belfast, Northern Ireland.
  23. Duffy ME, Twenge JM, Joiner TE. Trends in mood and anxiety symptoms and suicide-related outcomes among U.S. undergraduates, 2007-2018: evidence from two national surveys. J Adolesc Health 2019 Nov;65(5):590-598 [FREE Full text] [CrossRef] [Medline]
  24. Padmanathan P, De Silva MJ. The acceptability and feasibility of task-sharing for mental healthcare in low and middle income countries: a systematic review. Soc Sci Med 2013 Nov;97:82-86 [FREE Full text] [CrossRef] [Medline]
  25. Bellerose M, Awoonor-Williams K, Alva S, Magalona S, Sacks E. 'Let me move to another level': career advancement desires and opportunities for community health nurses in Ghana. Glob Health Promot 2021 Jul 16:17579759211027426 [FREE Full text] [CrossRef] [Medline]
  26. Schoenwald SK, Garland AF, Chapman JE, Frazier SL, Sheidow AJ, Southam-Gerow MA. Toward the effective and efficient measurement of implementation fidelity. Adm Policy Ment Health 2011 Jan;38(1):32-43 [FREE Full text] [CrossRef] [Medline]
  27. Brown LA, Craske MG, Glenn DE, Stein MB, Sullivan G, Sherbourne C, et al. CBT competence in novice therapists improves anxiety outcomes. Depress Anxiety 2013 Feb;30(2):97-115 [FREE Full text] [CrossRef] [Medline]
  28. Murray LK, Dorsey S, Bolton P, Jordans MJ, Rahman A, Bass J, et al. Building capacity in mental health interventions in low resource countries: an apprenticeship model for training local providers. Int J Ment Health Syst 2011 Nov 18;5(1):30-12 [FREE Full text] [CrossRef] [Medline]
  29. van Ginneken N, Tharyan P, Lewin S, Rao GN, Meera SM, Pian J, et al. Non-specialist health worker interventions for the care of mental, neurological and substance-abuse disorders in low- and middle-income countries. Cochrane Database Syst Rev 2013 Nov 19(11):CD009149. [CrossRef] [Medline]
  30. Kemp CG, Petersen I, Bhana A, Rao D. Supervision of task-shared mental health care in low-resource settings: a commentary on programmatic experience. Glob Health Sci Pract 2019 Jun 27;7(2):150-159 [FREE Full text] [CrossRef]
  31. Montgomery EC, Kunik ME, Wilson N, Stanley MA, Weiss B. Can paraprofessionals deliver cognitive-behavioral therapy to treat anxiety and depressive symptoms? Bull Menninger Clin 2010;74(1):45-62 [FREE Full text] [CrossRef] [Medline]
  32. Diebold A, Ciolino JD, Johnson JK, Yeh C, Gollan JK, Tandon SD. Comparing fidelity outcomes of paraprofessional and professional delivery of a perinatal depression preventive intervention. Adm Policy Ment Health 2020 Jul;47(4):597-605 [FREE Full text] [CrossRef] [Medline]
  33. Shahmalak U, Blakemore A, Waheed MW, Waheed W. The experiences of lay health workers trained in task-shifting psychological interventions: a qualitative systematic review. Int J Ment Health Syst 2019;13:64-15 [FREE Full text] [CrossRef] [Medline]
  34. Waller G, Turner H. Therapist drift redux: Why well-meaning clinicians fail to deliver evidence-based therapy, and how to get back on track. Behav Res Ther 2016 Feb;77:129-137 [FREE Full text] [CrossRef] [Medline]
  35. Karyotaki E, Efthimiou O, Miguel C, Bermpohl FMG, Furukawa TA, Cuijpers P, Individual Patient Data Meta-Analyses for Depression (IPDMA-DE) Collaboration, et al. Internet-based cognitive behavioral therapy for depression: a systematic review and individual patient data network meta-analysis. JAMA Psychiatry 2021 Apr 01;78(4):361-371 [FREE Full text] [CrossRef] [Medline]
  36. Scholten H, Granic I. Use of the principles of design thinking to address limitations of digital mental health interventions for youth: viewpoint. J Med Internet Res 2019 Jan 14;21(1):e11528 [FREE Full text] [CrossRef] [Medline]
  37. Mohr DC, Burns MN, Schueller SM, Clarke G, Klinkman M. Behavioral intervention technologies: evidence review and recommendations for future research in mental health. Gen Hosp Psychiatry 2013;35(4):332-338 [FREE Full text] [CrossRef] [Medline]
  38. Lattie EG, Graham AK, Hadjistavropoulos HD, Dear BF, Titov N, Mohr DC. Guidance on defining the scope and development of text-based coaching protocols for digital mental health interventions. Digit Health 2019;5:2055207619896145 [FREE Full text] [CrossRef] [Medline]
  39. Mohr D, Duffecy J, Ho J, Kwasny M, Cai X, Burns MN, et al. A randomized controlled trial evaluating a manualized TeleCoaching protocol for improving adherence to a web-based intervention for the treatment of depression. PLoS One 2013;8(8):e70086 [FREE Full text] [CrossRef] [Medline]
  40. Myrick K, Del Vecchio P. Peer support services in the behavioral healthcare workforce: state of the field. Psychiatr Rehabil J 2016 Sep;39(3):197-203 [FREE Full text] [CrossRef] [Medline]
  41. Gagne CA, Finch WL, Myrick KJ, Davis LM. Peer workers in the behavioral and integrated health workforce: opportunities and future directions. Am J Prev Med 2018 Jun;54(6 Suppl 3):S258-S266 [FREE Full text] [CrossRef] [Medline]
  42. Bassuk EL, Hanson J, Greene RN, Richard M, Laudet A. Peer-delivered recovery support services for addictions in the United States: a systematic review. J Subst Abuse Treat 2016 Apr;63:1-9 [FREE Full text] [CrossRef] [Medline]
  43. Watson E. The mechanisms underpinning peer support: a literature review. J Ment Health 2019 Dec;28(6):677-688 [FREE Full text] [CrossRef] [Medline]
  44. Gillard S, Foster R, Gibson S, Goldsmith L, Marks J, White S. Describing a principles-based approach to developing and evaluating peer worker roles as peer support moves into mainstream mental health services. MHSI 2017 Jun 12;21(3):133-143 [FREE Full text] [CrossRef]
  45. Basset T, Faulkner A, Repper J, Stamou E. Lived Experience Leading the Way: Peer Support in Mental Health. London, UK: Together for Mental Wellbeing; 2010.
  46. Silver J, Nemec PB. The role of the peer specialists: unanswered questions. Psychiatr Rehabil J 2016 Sep;39(3):289-291 [FREE Full text] [CrossRef] [Medline]
  47. Lloyd-Evans B, Mayo-Wilson E, Harrison B, Istead H, Brown E, Pilling S, et al. A systematic review and meta-analysis of randomised controlled trials of peer support for people with severe mental illness. BMC Psychiatry 2014 Feb 14;14(1):1-12 [FREE Full text] [CrossRef]
  48. Fortuna KL, Naslund JA, LaCroix JM, Bianco CL, Brooks JM, Zisman-Ilani Y, et al. Digital peer support mental health interventions for people with a lived experience of a serious mental illness: systematic review. JMIR Ment Health 2020 Apr 03;7(4):e16460 [FREE Full text] [CrossRef] [Medline]
  49. Ali K, Farrer L, Gulliver A, Griffiths KM. Online peer-to-peer support for young people with mental health problems: a systematic review. JMIR Ment Health 2015;2(2):e19 [FREE Full text] [CrossRef] [Medline]
  50. van der Zanden R, Kramer J, Gerrits R, Cuijpers P. Effectiveness of an online group course for depression in adolescents and young adults: a randomized trial. J Med Internet Res 2012 Jun 07;14(3):e86 [FREE Full text] [CrossRef] [Medline]
  51. Day V, McGrath PJ, Wojtowicz M. Internet-based guided self-help for university students with anxiety, depression and stress: a randomized controlled clinical trial. Behav Res Ther 2013 Jul;51(7):344-351 [FREE Full text] [CrossRef] [Medline]
  52. Klatt C, Berg CJ, Thomas JL, Ehlinger E, Ahluwalia JS, An LC. The role of peer e-mail support as part of a college smoking-cessation website. Am J Prev Med 2008 Dec;35(6 Suppl):S471-S478 [FREE Full text] [CrossRef] [Medline]
  53. Conley C, Hundert CG, Charles JL, Huguenel BM, Al-khouja M, Qin S, et al. Honest, open, proud–college: effectiveness of a peer-led small-group intervention for reducing the stigma of mental illness. Stigma Health 2020 May;5(2):168-178 [FREE Full text] [CrossRef]
  54. Freeman E, Barker C, Pistrang N. Outcome of an online mutual support group for college students with psychological problems. Cyberpsychol Behav 2008 Oct;11(5):591-593 [FREE Full text] [CrossRef] [Medline]
  55. Horgan A, McCarthy G, Sweeney J. An evaluation of an online peer support forum for university students with depressive symptoms. Arch Psychiatr Nurs 2013 Apr;27(2):84-89 [FREE Full text] [CrossRef] [Medline]
  56. Eddie D, Hoffman L, Vilsaint C, Abry A, Bergman B, Hoeppner B, et al. Lived experience in new models of care for substance use disorder: a systematic review of peer recovery support services and recovery coaching. Front Psychol 2019;10:1052 [FREE Full text] [CrossRef] [Medline]
  57. Craske MG, Rose RD, Lang A, Welch SS, Campbell-Sills L, Sullivan G, et al. Computer-assisted delivery of cognitive behavioral therapy for anxiety disorders in primary-care settings. Depress Anxiety 2009;26(3):235-242 [FREE Full text] [CrossRef] [Medline]
  58. Craske MG, Stein MB, Sullivan G, Sherbourne C, Bystritsky A, Rose RD, et al. Disorder-specific impact of coordinated anxiety learning and management treatment for anxiety disorders in primary care. Arch Gen Psychiatry 2011 Apr;68(4):378-388 [FREE Full text] [CrossRef] [Medline]
  59. Craske MG, Meuret AE, Ritz T, Treanor M, Dour HJ. Treatment for anhedonia: a neuroscience driven approach. Depress Anxiety 2016 Oct;33(10):927-938 [FREE Full text] [CrossRef] [Medline]
  60. Craske MG, Meuret AE, Ritz T, Treanor M, Dour HJ, Rosenfield D. Positive affect treatment for depression and anxiety: a randomized clinical trial for a core feature of anhedonia. J Consult Clin Psychol 2019 May;87(5):457-471 [FREE Full text] [CrossRef] [Medline]
  61. Roy-Byrne P, Craske MG, Sullivan G, Rose RD, Edlund MJ, Lang AJ, et al. Delivery of evidence-based treatment for multiple anxiety disorders in primary care: a randomized controlled trial. JAMA 2010 May 19;303(19):1921-1928 [FREE Full text] [CrossRef] [Medline]
  62. Watkins ER, Mullan E, Wingrove J, Rimes K, Steiner H, Bathurst N, et al. Rumination-focused cognitive-behavioural therapy for residual depression: phase II randomised controlled trial. Br J Psychiatry 2011 Oct;199(4):317-322 [FREE Full text] [CrossRef] [Medline]
  63. Watkins E, Newbold A, Tester-Jones M, Javaid M, Cadman J, Collins LM, et al. Implementing multifactorial psychotherapy research in online virtual environments (IMPROVE-2): study protocol for a phase III trial of the MOST randomized component selection method for internet cognitive-behavioural therapy for depression. BMC Psychiatry 2016 Oct 06;16(1):345 [FREE Full text] [CrossRef] [Medline]
  64. Harvey AG. A transdiagnostic intervention for youth sleep and circadian problems. Cogn Behav Pract 2016 Aug;23(3):341-355 [FREE Full text] [CrossRef]
  65. Harvey AG, Hein K, Dolsen MR, Dong L, Rabe-Hesketh S, Gumport NB, et al. Modifying the impact of eveningness chronotype ("night-owls") in youth: a randomized controlled trial. J Am Acad Child Adolesc Psychiatry 2018 Oct;57(10):742-754 [FREE Full text] [CrossRef] [Medline]
  66. Harvey AG, Dong L, Hein K, Yu SH, Martinez AJ, Gumport NB, et al. A randomized controlled trial of the Transdiagnostic Intervention for Sleep and Circadian Dysfunction (TranS-C) to improve serious mental illness outcomes in a community setting. J Consult Clin Psychol 2021 Jun;89(6):537-550 [FREE Full text] [CrossRef] [Medline]
  67. Hettema J, Steele J, Miller WR. Motivational interviewing. Annu Rev Clin Psychol 2005;1:91-111 [FREE Full text] [CrossRef] [Medline]
  68. Rollnick S, Miller WR. What is motivational interviewing? Behav Cogn Psychother 2009 Jun 16;23(4):325-334 [FREE Full text] [CrossRef]
  69. Robertson K. Active listening: more than just paying attention. Aust Fam Physician 2005 Dec;34(12):1053-1055 [FREE Full text] [Medline]
  70. Bearman SK, Schneiderman RL, Zoloth E. Building an evidence base for effective supervision practices: an analogue experiment of supervision to increase EBT fidelity. Adm Policy Ment Health 2017 Mar;44(2):293-307 [FREE Full text] [CrossRef] [Medline]
  71. Bjork RA. Memory and metamemory considerations in the training of human beings. In: Metacognition: Knowing about Knowing. Cambridge, MA: MIT Press; 1994.
  72. Bjork EL, Bjork RA. Making things hard on yourself, but in a good way: Creating desirable difficulties to enhance learning. In: Psychology and the Real World: Essays Illustrating Fundamental Contributions to Society. New York, NY: Worth Publishers; 2011:56-64.
  73. Schmidt RA, Bjork RA. New conceptualizations of practice: common principles in three paradigms suggest new concepts for training. Psychol Sci 2017 Apr 25;3(4):207-218 [FREE Full text] [CrossRef]
  74. Soderstrom NC, Bjork RA. Learning versus performance: an integrative review. Perspect Psychol Sci 2015 Mar;10(2):176-199 [FREE Full text] [CrossRef] [Medline]
  75. Smith SM. A comparison of two techniques for reducing context-dependent forgetting. Mem Cognit 1984 Sep;12(5):477-482 [FREE Full text] [CrossRef] [Medline]
  76. Rohrer D, Dedrick RF, Burgess K. The benefit of interleaved mathematics practice is not limited to superficially similar kinds of problems. Psychon Bull Rev 2014 Oct;21(5):1323-1330 [FREE Full text] [CrossRef] [Medline]
  77. Kornell N, Hays MJ, Bjork RA. Unsuccessful retrieval attempts enhance subsequent learning. J Exp Psychol Learn Mem Cogn 2009 Jul;35(4):989-998 [FREE Full text] [CrossRef] [Medline]
  78. Bennett-Levy J, McManus F, Westling BE, Fennell M. Acquiring and refining CBT skills and competencies: which training methods are perceived to be most effective? Behav Cogn Psychother 2009 Aug 25;37(5):571-583 [FREE Full text] [CrossRef]
  79. Kolb DA. Experience as the Source of Learning and Development. Hoboken, NJ: Prentice Hall; 1984.
  80. Milne D, Aylott H, Fitzpatrick H, Ellis MV. How does clinical supervision work? Using a “best evidence synthesis” approach to construct a basic model of supervision. WCSU 2008 Nov 21;27(2):170-190 [FREE Full text] [CrossRef]
  81. Falender CA. Clinical supervision in a competency-based era. S Afr J Psychol 2014 Jan 07;44(1):6-17 [FREE Full text] [CrossRef]
  82. Bennett-Levy J. Therapist skills: a cognitive model of their acquisition and refinement. Behav Cogn Psychother 2005 Oct 20;34(1):57-78 [FREE Full text] [CrossRef]
  83. Enock PM, McNally RJ. How mobile apps and other web-based interventions can transform psychological treatment and the treatment development cycle. Behav Ther 2013;36(3):56-66.
  84. Kendall PC, Beidas RS. Smoothing the trail for dissemination of evidence-based practices for youth: flexibility within fidelity. Prof Psychol: Res Pract 2007;38(1):13-20 [FREE Full text] [CrossRef]
  85. Kendall PC, Frank HE. Implementing evidence-based treatment protocols: flexibility within fidelity. Clin Psychol (New York) 2018 Dec;25(4):e12271 [FREE Full text] [CrossRef] [Medline]
  86. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009 Aug 07;4(1):50-15 [FREE Full text] [CrossRef] [Medline]
  87. Kirk MA, Kelley C, Yankey N, Birken SA, Abadie B, Damschroder L. A systematic review of the use of the Consolidated Framework for Implementation Research. Implement Sci 2016 May 17;11(1):72-13 [FREE Full text] [CrossRef] [Medline]
  88. Wiltsey Stirman S, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci 2019 Jun 06;14(1):58-10 [FREE Full text] [CrossRef] [Medline]
  89. O'Dea B, King C, Subotic-Kerry M, Achilles MR, Cockayne N, Christensen H. Smooth sailing: a pilot study of an online, school-based, mental health service for depression and anxiety. Front Psychiatry 2019;10:574 [FREE Full text] [CrossRef] [Medline]
  90. Wiltsey Stirman S, Gutner CA, Crits-Christoph P, Edmunds J, Evans AC, Beidas RS. Relationships between clinician-level attributes and fidelity-consistent and fidelity-inconsistent modifications to an evidence-based psychotherapy. Implement Sci 2015 Aug 13;10(1):115-110 [FREE Full text] [CrossRef] [Medline]
  91. Lehtimaki S, Martic J, Wahl B, Foster KT, Schwalbe N. Evidence on digital mental health interventions for adolescents and young people: systematic overview. JMIR Ment Health 2021 Apr 29;8(4):e25847 [FREE Full text] [CrossRef] [Medline]
  92. Allen JD, Linnan LA, Emmons KM, Brownson R, Colditz G, Proctor E. Fidelity and its relationship to implementation effectiveness, adaptation, and dissemination. In: Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford, UK: Oxford University Press; 2012:281-304.
  93. Borrelli B. The assessment, monitoring, and enhancement of treatment fidelity in public health clinical trials. J Public Health Dent 2011;71(s1):S52-S63 [FREE Full text] [CrossRef] [Medline]
  94. Dohnt HC, Dowling MJ, Davenport TA, Lee G, Cross SP, Scott EM, et al. Supporting clinicians to use technology to deliver highly personalized and measurement-based mental health care to young people: protocol for an evaluation study. JMIR Res Protoc 2021 Jun 14;10(6):e24697 [FREE Full text] [CrossRef] [Medline]


Abbreviations

CBT: cognitive-behavioral therapy
CFIR: Consolidated Framework for Implementation Research
DGC: Depression Grand Challenge
EBT: evidence-based treatment
STAND: Screening and Treatment for Anxiety and Depression
UCLA: University of California, Los Angeles


Edited by J Torous; submitted 02.08.21; peer-reviewed by D Frank, R Pine, L Balcombe; comments to author 27.09.21; revised version received 21.11.21; accepted 22.11.21; published 26.01.22

Copyright

©Benjamin M Rosenberg, Tamar Kodish, Zachary D Cohen, Elizabeth Gong-Guy, Michelle G Craske. Originally published in JMIR Mental Health (https://mental.jmir.org), 26.01.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on https://mental.jmir.org/, as well as this copyright and license information must be included.