
Published on 25.06.20 in Vol 7, No 6 (2020): June

Preprints (earlier versions) of this paper are available at http://preprints.jmir.org/preprint/15942, first published Aug 20, 2019.


    Review

    Surveying the Role of Analytics in Evaluating Digital Mental Health Interventions for Transition-Aged Youth: Scoping Review

    1Office of Education, Centre for Addiction and Mental Health, Toronto, ON, Canada

    2Information Management Group, Centre for Addiction and Mental Health, Toronto, ON, Canada

    3Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, ON, Canada

    4Education, Technology and Innovation, University Health Network, Toronto, ON, Canada

    5Department of Psychiatry, Faculty of Medicine, University of Toronto, Toronto, ON, Canada

    *these authors contributed equally

    Corresponding Author:

    David Wiljer, PhD

    Education, Technology and Innovation

    University Health Network

    3-RFE-441

    190 Elizabeth Street

    Toronto, ON, M5G 2C4

    Canada

    Phone: 1 4163404800 ext 6322

    Email: David.Wiljer@uhn.ca


    ABSTRACT

    Background: Consumer-facing digital health interventions provide a promising avenue to bridge gaps in mental health care delivery. To evaluate these interventions, understanding how the target population uses a solution is critical to the overall validity and reliability of the evaluation. As a result, usage data (analytics) can provide a proxy for evaluating engagement with a solution. However, there is a paucity of guidance on how usage data or analytics should be used to assess and evaluate digital mental health interventions.

    Objective: This review aimed to examine how usage data are collected and analyzed in evaluations of mental health mobile apps for transition-aged youth (15-29 years).

    Methods: A scoping review was conducted using the Arksey and O’Malley framework. A systematic search was conducted on 5 journal databases using keywords related to usage and engagement, mental health apps, and evaluation. A total of 1784 papers from 2008 to 2019 were identified and screened to ensure that they included analytics and evaluated a mental health app for transition-aged youth. After full-text screening, 49 papers were included in the analysis.

    Results: Of the 49 papers included in the analysis, 40 unique digital mental health innovations were evaluated, and about 80% (39/49) of the papers were published in the past 6 years. About 80% involved a randomized controlled trial and evaluated apps with information delivery features. The construct that analytics was ascribed to varied across studies, with the top 3 being engagement, adherence, and acceptability. There was also a significant spread in the number of metrics collected per study: 35% (17/49) of the papers collected only 1 metric, whereas 29% (14/49) collected 4 or more analytic metrics. The number of modules completed, the session duration, and the number of log-ins were the most common usage metrics collected.

    Conclusions: This review of current literature identified significant variability and heterogeneity in using analytics to evaluate digital mental health interventions for transition-aged youth. The large proportion of publications from the last 6 years suggests that user analytics is increasingly being integrated into the evaluation of these apps. Numerous gaps related to selecting appropriate and relevant metrics and defining successful or high levels of engagement have been identified for future exploration. Although long-term use or adoption is an important precursor to realizing the expected benefits of an app, few studies have examined this issue. Researchers would benefit from clarification and guidance on how to measure and analyze app usage in terms of evaluating digital mental health interventions for transition-aged youth. Given the established role of adoption in the success of health information technologies, understanding how to abstract and analyze user adoption for consumer digital mental health apps is also an emerging priority.

    JMIR Ment Health 2020;7(6):e15942

    doi:10.2196/15942



    Introduction

    Background

    Transition-aged youth, youth between the ages of 15 and 29 years, may face difficult challenges during the transition from childhood to adulthood [1]. The ongoing challenges to provide adequate support for transition-aged youth are reflected by the observed increase in child and adolescent mental health problems [2-5]. In fact, over half of all mental health disorders among adults begin during childhood and adolescence [6,7]. Currently, mental health and substance use disorders are responsible for 25% of all years lived with disability [8]. In Canada, 11% of youth aged 15-24 years reported experiencing symptoms of depression, and 14% reported experiencing suicidal thoughts in the past [9]. These issues are further complicated by the ongoing stigma related to mental health issues [10,11] and the lower likelihood among this population to seek help [12-14]. Novel approaches to support the mental health needs and demands of this population are warranted to address these challenges [15].

    Mobile Health Interventions

    Mobile health (mHealth) interventions have been identified as a promising avenue to bridge the gap between seeking help and accessing mental health resources for youth [16-18]. Youth are well acquainted with the use of technology [18], and mobile phone use is deeply embedded in their daily lives [19]. In addition, qualitative interviews with youth show positive perceptions of mHealth interventions for mental health needs [20]. Reviews on digital mHealth interventions for youth in general [21] and college students as a group [22] have suggested that mHealth interventions can be powerful platforms for improving overall well-being or enhancing mental health treatments among this population. The promising effectiveness of mHealth as a means of delivering mental health interventions among youth has led to the proliferation of digital technologies.

    Despite these advances, the proliferation of and interest in youth-oriented mental health apps do not always directly translate to real-world outcomes. Numerous explanations exist, and a major barrier identified by Torous et al [23] is the low engagement or uptake of these digital interventions. Since engagement is often considered a prelude to the effectiveness of digital interventions, this warrants a closer examination of how end user engagement is being measured across evaluations of youth-targeted mHealth interventions [24]. As highlighted in a recent review of this area by Pham et al [25], the unique challenges in measuring and evaluating engagement are not limited to the heterogeneous terms used in reporting engagement levels (ie, adherence, usage, feasibility, adoption, and activity) but extend to the depth and breadth of analytics metrics being selected for measurement. This creates difficulty in selecting, interpreting, comparing, and aggregating data on engagement metrics related to these youth-targeted mHealth interventions. To determine the current state of engagement reporting and inform future efforts, we performed a scoping review of analytic metrics that were measured, reported, and used to inform the evaluation of youth-targeted digital mental health interventions. This study was used to inform the development of a randomized controlled trial (RCT) to evaluate Thought Spot [26-29], a mobile app designed to foster mental health and wellness help-seeking in transition-aged youth across the Greater Toronto Area.


    Methods

    Overview

    We conducted a scoping review using the framework proposed by Arksey and O’Malley [30], which consists of 5 main processes: (1) identifying the research question, (2) identifying relevant papers, (3) selecting studies, (4) extracting the data, and (5) collating and summarizing the data and results.

    Identifying the Research Question

    The main objective of this scoping review was to explore how analytics metrics are measured, reported, and used to evaluate mHealth interventions that target mental health–related issues among transition-aged youth. Under this overarching objective, we sought to answer the following 2 research questions to guide the review and analysis:

    1. Which analytics metrics are used in evaluation studies for mHealth interventions that target transition-aged youth?
    2. How do user activity and usage metrics contribute to the interpretation of study data?

    Identifying Relevant Papers

    A search strategy was developed in consultation with a specialist librarian and the research team. A search using key terms such as adoption, evaluation, mental health, transition-aged youth, and mHealth was conducted in July 2018 across 5 databases: Cumulative Index of Nursing and Allied Health Literature (CINAHL), EMBASE, Medical Literature Analysis and Retrieval System (MEDLINE), PsycINFO, and Cochrane Central Register of Controlled Trials (CENTRAL). The full search strategy for one of the databases (MEDLINE) is presented in Multimedia Appendix 1. Given the timeline of mHealth proliferation, it was deemed adequate to include only papers from the past 10 years (2008-2018) in the initial search [31]. All peer-reviewed literature was included in the review. The search was updated in June 2019, and additional papers were included in the analysis.

    Selecting Studies

    After removal of duplicate papers, titles and abstracts were screened independently by 2 authors (BL and JS). There were 3 inclusion criteria: the papers examined mHealth interventions for mental health designed for transition-aged youth (aged 15-29 years), included an analysis of engagement or user activity metrics, and were published in English. For studies that did not indicate a specific age range, the mean and SD of the sample were used to determine eligibility for inclusion. Studies containing only self-reported usage data or those that were developed for clinician use only were excluded, along with reviews, conference reports, and dissertations. No restrictions were placed on the type of research design or the method of comparison to ensure that a broad selection of studies was captured for this review.

    Inter-rater reliability in screening by both reviewers was enhanced by first conducting a pilot screen of 100 papers. Pilot screens were completed independently until a satisfactory kappa statistic >0.7 [32] was reached. Two reviewers then conducted title and abstract screening independently for all papers using Abstrackr (Center for Evidence Synthesis in Health, Brown University) [33]. Any discrepancies were resolved through discussion, resulting in a consensus.
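    The kappa threshold used in the pilot screens can be illustrated with a minimal sketch of Cohen's kappa for two reviewers' include/exclude decisions (this is not the authors' screening code; the function and sample decisions are hypothetical):

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' screening decisions (1=include, 0=exclude)."""
    n = len(rater_a)
    # Observed agreement: share of papers with identical decisions.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal include/exclude rates.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((count_a[k] / n) * (count_b[k] / n) for k in count_a)
    # Kappa: observed agreement corrected for chance.
    return (p_o - p_e) / (1 - p_e)

# Hypothetical pilot screen of 10 papers: raw agreement is 8/10 = 0.80,
# but kappa corrects for agreement expected by chance.
decisions_a = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
decisions_b = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]
kappa = cohen_kappa(decisions_a, decisions_b)  # 0.6, below the 0.7 bar
```

    Under the review's threshold, a pilot round like this hypothetical one (kappa = 0.6) would prompt another independent pilot screen before proceeding to full screening.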

    Extracting Data

    A data extraction form adapted from the Cochrane data extraction template [34] was used to extract data from each paper. A summary of the data extracted from each paper is given in Multimedia Appendix 2. This process was also performed independently by 2 authors (BL and JS) to ensure that all relevant data were accurately captured. The data were then collated into a spreadsheet for further analysis. Extracted study details included study design (eg, RCT, observational), sample size, participant characteristics, targeted condition, main research question, and primary and secondary outcomes. The target condition was extracted and categorized using the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders [35], and interventions were classified using the Classification of Digital Health Interventions [36] from the World Health Organization (WHO). Additional data were also gathered based on the research questions outlined above, such as usage metrics, terminology, results, and method of analysis.

    Summarizing the Data

    Quantitative and qualitative analyses of the data were conducted. Descriptive statistics (eg, means, medians) were computed and used to characterize and describe each included paper. We identified the metrics of our included papers using a similar approach to that of another recently published review [25]. The themes were reviewed and extracted using a content analysis approach [37] to address the second research question. Two members of the research team (BL and JS) reviewed the data and ensured comprehensiveness and accuracy.


    Results

    Selection of Included Studies

    An overview of the selection of included studies is presented using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses [38] diagram in Figure 1. The database search identified 1784 nonduplicated papers. The kappa statistic for the pilot screen, a measure of inter-rater reliability, was 0.72, meeting the above-defined threshold. Following title and abstract review, 229 papers were fully reviewed and assessed against the inclusion and exclusion criteria. This full-text screening was conducted independently by 2 authors (BL and JS).

    Figure 1. Preferred Reporting Items for Systematic Reviews and Meta-Analyses diagram of included studies.

    Characteristics of Included Papers

    The characteristics of and references to the included studies are described in Table 1. Of the 49 papers included in this scoping review, there were 40 full reports that collected and reported usage data and 9 protocol papers that indicated an analysis of usage data in the evaluation. The findings identified 40 unique digital mental health interventions that were evaluated. Although the search strategy included studies published between 2008 and 2019, 80% (39/49) of the studies identified for inclusion were published within the last 6 years (2014-2019). The largest number of studies were conducted in the United States (21/49) and Australia (12/49).

    A large number of studies examined the efficacy or effectiveness [39-69] of the intervention in supporting individuals (eg, reducing symptoms) as their primary objective. Studies that included other constructs (eg, feasibility [51,66,70-73], acceptability [51,57,66,69-75]) as primary objectives were typically in the earlier stages of development. In terms of methodology, the majority of studies (38/49) used or intended to use RCT methodology to conduct their evaluation. The remaining 11 studies used quasi-experimental approaches, including observational studies and single-group pretest-posttest designs; these studies also tended to be in the earlier stages of development. The sample size (or expected sample size) had a median of 100 participants and ranged from 10 to 8242 participants. Approximately 51% (25/49) of the included studies had up to 100 participants for analysis. The length of exposure to the intervention was also captured: 47 of the 49 studies reported the duration of the intervention. Approximately 74% (36/49) of the studies evaluated participants after they had used the intervention for 1 to 3 months, and about 22% (11/49) asked users to use the app for more than 4 months.

    Classification using the WHO Classification of Digital Health Interventions [36] revealed that several mental health apps (n=11) exhibited more than one function. The three features most commonly found among the evaluated interventions were targeted client communication (n=39), client-to-client communication (n=8), and on-demand information services (n=5). These interventions also targeted a range of mental health–related conditions, with depressive disorders as the most common condition (n=13), followed by interventions targeting participants’ overall well-being (n=11). Other targeted conditions included anxiety disorders (n=2), family violence (n=1), feeding and eating disorders (n=6), neurodevelopmental disorders (n=1), schizophrenia spectrum and other psychotic disorders (n=4), and substance use and addictive disorders (n=6). The paper on neurodevelopmental disorders is a study conducted by Backman et al [76] that aimed to deliver psychoeducation to adolescents with autism spectrum disorder. Five studies developed interventions that targeted comorbid conditions such as cooccurring depression and alcohol use.

    Table 1. Characteristics of included studies (N=49).

    Analytics Metrics and Analysis

    All studies examined usage data in some aspect of their evaluation, and most papers ascribed usage data to one or more of the following constructs: user adherence [40,46,49-51,53,55,62,64,75,80,82,85], engagement [47,48,52,56,61,65,67,77,79,84], or acceptability [45,57,71]. Although adoption was included in our search, none of the reviewed studies used analytics as a proxy to measure that construct.

    A summary of the findings for analytics metrics and analysis is provided in Table 2. Fifty percent (25/49) of the studies collected 1 or 2 metrics to examine user activity, and 29% (14/49) collected 4 or more metrics. Most studies therefore had only a limited amount of user activity data available to explore and understand how participants used their digital interventions.

    Across the reviewed studies, different types of metrics were used to evaluate the level of user activity in these interventions. Overall, a heterogeneous selection of metrics was collected, with the number of modules completed (n=27) and the session duration (n=24) being the most common metrics. Several studies collected metrics on the number of times different features were used (n=18). Because these features were highly specific to the intervention being evaluated (eg, number of likes, number of journal entries), these unique metrics were collapsed into a general category called number of features used. The other category consisted of metrics that did not fall into the common categories, including the bounce rate, the referral source, and the number of characters. The bounce rate and the referral source were collected by 1 study [83] that used Google Analytics [88].

    In terms of how metrics were analyzed, many studies [40,42,44-48,50,54,57-59,62,63,65,66,68-70,72-77,79,80,82,85,86] presented usage metrics separately from the main study data using descriptive methods, including overall counts, frequencies, means, or SDs. Several studies [39,84] had the main objective of examining how different interventions (eg, behavioral activation) affected user engagement with the app; these studies conducted analyses using hypothesis testing techniques (eg, chi-square tests, t tests). Other studies [43,53,60,61,78,81] examined whether different levels of user activity had a differential effect on the main study outcomes. For example, Paschall et al [43] analyzed reductions in alcohol use by stratifying the course completion rate into 3 categories: high users (70%-100%), medium users (30%-69%), and low users (0%-29%), and reported differential effects within each group. Still other studies employed regression analysis and identified varying results; for example, Bidargaddi et al [56] evaluated an app on resource finding and found that the number of log-ins was not associated with the ecological momentary assessment outcomes.
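    The stratification approach described by Paschall et al could be sketched as follows (the cut points come from the categories reported above; the function name and grouping code are our own illustration, not the study's analysis script):

```python
def usage_stratum(completion_rate):
    """Map a course completion rate (percent, 0-100) to a usage stratum,
    mirroring the 3 categories described by Paschall et al [43]."""
    if completion_rate >= 70:
        return "high"
    if completion_rate >= 30:
        return "medium"
    return "low"

# Hypothetical participants: outcomes can then be compared across strata.
participants = [("p1", 85), ("p2", 45), ("p3", 10)]
strata = {pid: usage_stratum(rate) for pid, rate in participants}
# strata == {"p1": "high", "p2": "medium", "p3": "low"}
```

    Differential effects are then estimated within each stratum, which is one way usage data can move beyond purely descriptive reporting.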

    Table 2. Types of usage metrics collected in included studies (N=49).

    Although some studies similarly found no relationship between usage and outcomes [49,51,84], associations between specific analytic measures and outcomes have been identified in others. For example, Saekow et al [51] noted correlations between the number of log-ins and outcome measures of the Eating Disorder Examination Questionnaire. Logsdon et al [61] also found that the duration of a depression intervention was associated with participants' reported attitudes. Another study on CATCH-IT [78], a module-based intervention for depression, found that increased duration on the website was associated with changes in depressive symptoms.

    Two studies incorporated more sophisticated metrics. Stallman and Kavanagh [74] leveraged the Google Analytics engine to examine whether the referral source influenced their outcomes. They found that different referral sources resulted in different expectations and disparities in user activities. In addition, Schlosser et al [63] separated their user activity data into active use and passive use by calculating the active use rate. The authors [63] defined active engagement as interacting with the features of the app (eg, posting a moment, completing a challenge, participating in peer interactions) and passive use as logging onto the app but not interacting with a specific feature. However, their exploration did not find any significant relationships between these user activity metrics and changes in primary and secondary outcomes.
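    The active/passive distinction drawn by Schlosser et al could be operationalized roughly as follows (a sketch under our own assumptions; we assume per-session counts of feature interactions, which is not necessarily how the study computed its rate):

```python
def active_use_rate(session_interactions):
    """Fraction of log-in sessions that were 'active' (at least one
    feature interaction, eg, posting a moment or completing a challenge)
    rather than 'passive' (a log-in with no feature use).

    `session_interactions` is a list with one interaction count per session.
    """
    if not session_interactions:
        return 0.0
    active_sessions = sum(1 for count in session_interactions if count > 0)
    return active_sessions / len(session_interactions)

# Hypothetical user with 4 log-ins, 2 of which involved feature use:
rate = active_use_rate([3, 0, 1, 0])  # 0.5
```

    A derived metric like this illustrates how raw log-in counts can be decomposed into qualitatively different kinds of engagement before being related to outcomes.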

    The findings of this scoping review will also be used to inform the selection of analytics metrics to be examined as a potential exploratory component of a larger RCT evaluating an mHealth intervention, Thought Spot [26].


    Discussion

    Principal Findings

    This scoping review explores how usage data are characterized and analyzed in evaluations of digital mental health innovations for transition-aged youth. There is an unprecedented demand to address current concerns and gaps in adolescent mental health [89,90], and the increasing ubiquity of mobile apps provides a unique opportunity to do so [91]. The recent nature of the 49 papers included in this review suggests a growing movement toward integrating analytics into mHealth evaluations [25]. In fact, the observed diversity in objectives, sample size, and duration of exposure suggests that analytics can bring value to efficacy, effectiveness, and feasibility studies alike. Analytics is also of particular interest in the mental health domain because of its potential role in digital phenotyping, a growing area of study that explores the intersection of behavior and passively collected data [92-94]. As a result, understanding the value and significance of analytics can help to enhance our understanding and applications in digital psychiatry [95]. However, despite the increasing ubiquity of analytics in evaluations, the overall findings of the current review highlight several gaps in evidence [84].

    Foremost, there was significant heterogeneity in the construct that analytics was ascribed to. For example, Bidargaddi et al [56] used the term engagement, whereas Rickhi et al [54] used the term program use, even though both studies used the number of log-ins as their usage metric. Similar variability was found in a recent review of analytic indicators by Pham et al [25], who suggested that distinctions between certain constructs, such as acceptance and engagement, are emerging. Engagement was the most common construct used to describe analytics, which is unsurprising given the significant body of literature on conceptualizing this term [25,96,97]. As such, the emerging demarcation of these terminologies will likely foster guidance on the role of analytics in measuring these constructs.

    In addition, although adoption was included in the search strategy, none of the included papers examined adoption specifically or used analytics as a proxy to abstract the construct. This is interesting in contrast to the clinical informatics literature, where adoption is often an end goal of technology implementation [98-100] and is well described in implementation frameworks [101]. For example, the clinical adoption metamodel [101] uses the definition from Hall [102] and describes adoption as the process (eg, activities and decisions) of integrating a specific technology into an organization. Similarly, Rogers' diffusion of innovations theory defines adoption as “the decision to make full use of an innovation as the best course of action” [103]. The disparity between the clinical informatics literature and the consumer informatics literature is intriguing. Increasing awareness of the potential of digital mental health in connecting care and empowering patients to participate in their own care [104] requires users to actually use the technology for the intended benefits to be realized [23]. Addressing this discordance is particularly important given that Torous et al [105] noted that interest in technology does not necessarily result in its uptake and usage. As such, similar to clinical informatics, it may be valuable to understand and characterize the successful adoption of consumer health information technology [63].

    Furthermore, understanding the process of how transition-aged youth decide to adopt technology would provide further insight into the characteristics of successful adoption [23,39,84]. Transition-aged youth are part of a generation for whom access to the internet and the use of mobile apps are fundamental to daily life [106]. As commented by Bewick et al [39], it is unclear how consumers choose to engage with and take up technologies. Although approaches such as user-centered design [107] and gamification [108] have made significant progress in closing the gap between perceived and actual user needs, real-world engagement remains fairly low and heterogeneous [109]. As such, identifying predictors of uptake by understanding how this population adopts these types of technology may help inform the development of apps that the intended population will actually adopt [49,55,74,80,84].

    The other key finding of this review is the significant heterogeneity in the number and types of metrics collected. Most studies collected 1 or 2 metrics, usually including the number of modules completed and the session duration. These findings contrast with a similar review by Pham et al [25] on chronic diseases, which found that the number of recorded measures and the frequency of interactions were the most prevalent metrics collected. Given that many of the included studies evaluated computerized cognitive behavioral therapy programs, which are modular in nature, it is not surprising that module completion was the most common metric in this review. Nevertheless, the significance and value of each of these metrics remain unclear. At the time of writing, the review by Pham et al [25] had attempted to address this issue by categorizing metrics into different components of engagement (amount, breadth, depth, and duration) and by providing guidance on how to select appropriate metrics for an evaluation. Although significant progress has been made on some of these issues [25], we believe that the importance and pervasiveness of this issue warrant more in-depth discussion.

    The heterogeneity in analytics metrics between studies also generates another level of complexity for analysis. Across the various metrics selected, many studies characterized their analytics findings as constituting high usage [45,50,58,63,72,73,79] or low usage [39,73,80]. However, there does not seem to be a standard for what constitutes these demarcations [81]. Although the boundaries of high or low usage are likely relative to authors' expectations, the lack of a standardized definition can generate confusion. We found that many studies referenced other studies to define high or successful usage levels [50,70,71,81]. However, it is unclear whether the threshold used for defining high usage in a single evaluation has sufficient external validity to be used by others. Several studies also referred to the construct of dosage [54,80,81], particularly with respect to whether users received a sufficient dosage to reap the benefits of the app. Although usage data may provide insight into the amount of usage, it is unclear how to determine whether usage is sufficient, insufficient, or excessive [61,80-82]. Identifying robust methods to evaluate the complexity of analytic data, and understanding how these usage levels are demarcated, would contribute to the validity of such evaluations.

    Similar to the observations made by Pham et al [25], many analyses of analytics in our included studies were limited to descriptive analysis, with only a few studies examining the relationship between analytic metrics and primary outcomes. These results were mixed, and it is unclear what significance the identified correlations had in the overall evaluation. Studies that identified a correlation [78,81] suggested that this finding indicated that usage of the app was beneficial for the specific outcome. In other studies where no relationship was identified between analytics and outcomes, some authors questioned whether this finding reflected a lack of efficacy of the mHealth app [56]. A few studies also conducted a per-protocol analysis in addition to the traditional intention-to-treat approach [110]. Although intention-to-treat analysis [110] is the gold standard for RCTs, given the high attrition typical of electronic health (eHealth) solutions, it may be too conservative for evaluating the actual impact on users who sustained their use of a solution [43].

    Limitations

    Several limitations should be noted when interpreting the findings of this scoping review. Because consumer health informatics has become popularized only over the last decade [31], we focused our review on the literature from 2008 onward. Although our findings suggest that most papers have been published within the last 6 years, it is possible that we missed papers outside of this window. In addition, because of our scope of work on Thought Spot [26,27,29], our search was limited to interventions designed exclusively for transition-aged youth. Because we excluded papers evaluating interventions for other populations (eg, adults) and disease areas (eg, cardiology), we may have missed evidence and guidance on the use of analytics in other areas of health care [25]. Evaluating how analytics is applied in other domains of health care may provide more comprehensive insight into its role in consumer health informatics. It is also interesting to note that most of the papers included in this review came from North America, Europe, and Australia; there was a lack of studies from other regions, including Asia and Africa. An evaluation of how adoption and engagement are measured in studies conducted in these regions is warranted.

    A second limitation is that, as per guidelines established for scoping reviews [30], we did not evaluate the quality of the included studies. In addition, our review focused on scientific literature and did not include interviews with key stakeholders or a survey of the gray literature. Future research should explore these other sources of literature and evaluate the quality of the studies identified for subsequent systematic reviews.

    Finally, it should be noted that, because of the limited number of papers in this scoping review, we did not stratify our analyses by mental health condition or type of research design. In particular, it would be useful to examine how analytics is used to support the evaluation of different solutions and objectives. Studies such as those by Torous et al [111] and Connolly et al [112] have identified that mental health conditions and cultural differences can affect individuals' engagement with technology. Future work should examine how these differences should be explored in the evaluation of adoption of and engagement with technology.

    Future Directions

    This scoping review provides preliminary insight into the role of analytics in evaluating mHealth apps for transition-aged youth and identifies both the progress that has been made and areas for future exploration. Most importantly, this review adds to the sparse literature on analytics and highlights the need for researchers to assess and standardize how analytics is integrated into their evaluation plans [25]. Additional case studies on analytics may reveal emergent patterns that clarify how transition-aged youth decide to adopt a solution to address their needs [14].

    In addition, several areas of exploration have been identified for researchers. As Pham et al [25] suggest, there is currently no guidance on how to maximize the value of these data. In other fields, such as marketing, analytics is a well-established component of evaluating the success of products and solutions [113]. Thus, it is important to understand how analytic findings should be analyzed and interpreted. This may include knowing how to identify thresholds that distinguish high users from low users, as well as how to assess the significance of correlations between analytic metrics and primary outcomes. Given that these thresholds may differ across clinical conditions and methodologies, subanalyses examining how adoption and engagement are explored for different populations (eg, cultures) and research designs may be of interest. Additionally, similar to the progress made in clinical informatics [98-100], understanding the characteristics and predictors of successful adoption of a consumer technology would provide insights into developing a roadmap toward successful mHealth development for transition-aged youth [114].
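    To make the notions of user thresholds and metric-outcome correlations concrete, the following sketch classifies users through a median split of session counts and computes a Pearson correlation between an engagement metric and a primary outcome. All data, variable names, and the median-split heuristic are hypothetical illustrations for this discussion, not methods drawn from the studies reviewed; as noted above, appropriate thresholds may differ by clinical population and study design.

```python
import numpy as np

def classify_users(session_counts, threshold=None):
    """Label each user as a 'high' or 'low' user of an app.

    A median split is one simple heuristic for choosing the
    threshold; it is used here purely for illustration.
    """
    counts = np.asarray(session_counts, dtype=float)
    if threshold is None:
        threshold = float(np.median(counts))
    labels = ["high" if c > threshold else "low" for c in counts]
    return labels, threshold

def engagement_outcome_correlation(metric, outcome):
    """Pearson correlation between an analytic metric (eg, weekly
    sessions) and a primary outcome (eg, symptom-score change)."""
    return float(np.corrcoef(metric, outcome)[0, 1])

# Hypothetical usage data for 6 participants (sessions per week)
sessions = [2, 15, 7, 1, 22, 9]
labels, cutoff = classify_users(sessions)

# Hypothetical symptom-improvement scores (higher = better)
improvement = [1.0, 6.5, 3.0, 0.5, 8.0, 4.0]
r = engagement_outcome_correlation(sessions, improvement)
```

In practice, a correlation alone does not establish that engagement drives outcomes; confounding by baseline severity or motivation would need to be addressed in the study design.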

    Conclusions

    This scoping review provided initial insights into the role of analytics in evaluating mobile health (mHealth) apps for mental health in transition-aged youth. Our analysis of 49 studies published between 2008 and 2019 revealed that the use of analytics is becoming increasingly common in evaluating mHealth apps for transition-aged youth. Despite recent progress, there is still heterogeneity in understanding the significance and value of analytics in these evaluations. In addition, the lack of guidance on metric selection and analysis warrants future exploration. As digital mental health care continues to grow in popularity, particularly for transition-aged youth, understanding analytics and its impact on evaluations would help streamline the journey toward using digital interventions to foster better mental health care for this population.

    Acknowledgments

    This study was supported through the eHealth Innovations Partnership Program grant (Reference # EH1-143558) from the Canadian Institutes of Health Research and through in-kind support from the Centre for Addiction and Mental Health. We thank Howard Wong for assistance with preparation of the data, Vincci Lui from the Institute of Health Policy, Management and Evaluation at the University of Toronto for her support in developing the search strategy, and Hema Zbogar at the Centre for Addiction and Mental Health for copyediting the paper.

    Authors' Contributions

    DW, JS, AJ, AA-J, and EH conceived the idea for this project. BL and JS developed the search strategy and conducted the screening process. BL and JS conducted data extraction and analysis with support from DW, AJ, AA-J, and EH. BL and JS led the writing of the manuscript. EH and DW made substantial edits to the manuscript before submission. All authors have reviewed and approved this manuscript.

    Conflicts of Interest

    None declared.

    Multimedia Appendix 1

    Sample search strategy for scoping review.

    DOCX File , 16 KB

    Multimedia Appendix 2

    Data extraction table for scoping review.

    XLSX File (Microsoft Excel File), 34 KB

    References

    1. Mandarino K. Transitional-age youths: barriers to accessing adult mental health services and the changing definition of adolescence. J Hum Behav Soc Environ 2014 Apr 30;24(4):462-474. [CrossRef]
    2. Sawyer SM, Afifi RA, Bearinger LH, Blakemore S, Dick B, Ezeh AC, et al. Adolescence: a foundation for future health. Lancet 2012 Apr 28;379(9826):1630-1640. [CrossRef] [Medline]
    3. Bor W, Dean AJ, Najman J, Hayatbakhsh R. Are child and adolescent mental health problems increasing in the 21st century? A systematic review. Aust N Z J Psychiatry 2014 Jul;48(7):606-616. [CrossRef] [Medline]
    4. Lipson SK, Lattie EG, Eisenberg D. Increased rates of mental health service utilization by US college students: 10-year population-level trends (2007-2017). Psychiatr Serv 2019 Jan 1;70(1):60-63 [FREE Full text] [CrossRef] [Medline]
    5. Burstein B, Agostino H, Greenfield B. Suicidal attempts and ideation among children and adolescents in US emergency departments, 2007-2015. JAMA Pediatr 2019 Jun 1;173(6):598-600. [CrossRef] [Medline]
    6. Kessler RC, Wang PS. The descriptive epidemiology of commonly occurring mental disorders in the United States. Annu Rev Public Health 2008;29:115-129. [CrossRef] [Medline]
    7. Paus T, Keshavan M, Giedd JN. Why do many psychiatric disorders emerge during adolescence? Nat Rev Neurosci 2008 Dec;9(12):947-957 [FREE Full text] [CrossRef] [Medline]
    8. Erskine HE, Moffitt TE, Copeland WE, Costello EJ, Ferrari AJ, Patton G, et al. A heavy burden on young minds: the global burden of mental and substance use disorders in children and youth. Psychol Med 2015 May;45(7):1551-1563 [FREE Full text] [CrossRef] [Medline]
    9. Findlay L. Depression and suicidal ideation among Canadians aged 15 to 24. Health Rep 2017 Jan 18;28(1):3-11 [FREE Full text] [Medline]
    10. Clement S, Schauman O, Graham T, Maggioni F, Evans-Lacko S, Bezborodovs N, et al. What is the impact of mental health-related stigma on help-seeking? A systematic review of quantitative and qualitative studies. Psychol Med 2015 Jan;45(1):11-27. [CrossRef] [Medline]
    11. Nearchou FA, Bird N, Costello A, Duggan S, Gilroy J, Long R, et al. Personal and perceived public mental-health stigma as predictors of help-seeking intentions in adolescents. J Adolesc 2018 Jul;66:83-90. [CrossRef] [Medline]
    12. MacKinnon N, Colman I. Factors associated with suicidal thought and help-seeking behaviour in transition-aged youth versus adults. Can J Psychiatry 2016 Dec;61(12):789-796 [FREE Full text] [CrossRef] [Medline]
    13. Townsend E. Time to take self-harm in young people seriously. Lancet Psychiatry 2019 Apr;6(4):279-280 [FREE Full text] [CrossRef] [Medline]
    14. Fleming T, Merry S, Stasiak K, Hopkins S, Patolo T, Ruru S, et al. The importance of user segmentation for designing digital therapy for adolescent mental health: findings from scoping processes. JMIR Ment Health 2019 May 8;6(5):e12656 [FREE Full text] [CrossRef] [Medline]
    15. Davey CG, McGorry PD. Early intervention for depression in young people: a blind spot in mental health care. Lancet Psychiatry 2019 Mar;6(3):267-272. [CrossRef]
    16. Price M, Yuen EK, Goetter EM, Herbert JD, Forman EM, Acierno R, et al. mHealth: a mechanism to deliver more accessible, more effective mental health care. Clin Psychol Psychother 2014;21(5):427-436 [FREE Full text] [CrossRef] [Medline]
    17. Chandrashekar P. Do mental health mobile apps work: evidence and recommendations for designing high-efficacy mental health mobile apps. Mhealth 2018;4:6 [FREE Full text] [CrossRef] [Medline]
    18. Aschbrenner KA, Naslund JA, Tomlinson EF, Kinney A, Pratt SI, Brunette MF. Adolescents' use of digital technologies and preferences for mobile health coaching in public mental health settings. Front Public Health 2019;7:178 [FREE Full text] [CrossRef] [Medline]
    19. Fedele DA, Cushing CC, Fritz A, Amaro CM, Ortega A. Mobile health interventions for improving health outcomes in youth: a meta-analysis. JAMA Pediatr 2017 May 1;171(5):461-469 [FREE Full text] [CrossRef] [Medline]
    20. Mar MY, Neilson EK, Torchalla I, Werker GR, Laing A, Krausz M. Exploring e-mental health preferences of generation Y. J Technol Hum Serv 2014 Nov 26;32(4):312-327. [CrossRef]
    21. Seko Y, Kidd S, Wiljer D, McKenzie K. Youth mental health interventions via mobile phones: a scoping review. Cyberpsychol Behav Soc Netw 2014 Sep;17(9):591-602. [CrossRef] [Medline]
    22. Lattie EG, Adkins EC, Winquist N, Stiles-Shields C, Wafford QE, Graham AK. Digital mental health interventions for depression, anxiety, and enhancement of psychological well-being among college students: systematic review. J Med Internet Res 2019 Jul 22;21(7):e12869 [FREE Full text] [CrossRef] [Medline]
    23. Torous J, Nicholas J, Larsen ME, Firth J, Christensen H. Clinical review of user engagement with mental health smartphone apps: evidence, theory and improvements. Evid Based Ment Health 2018 Aug;21(3):116-119. [CrossRef] [Medline]
    24. Yardley L, Choudhury T, Patrick K, Michie S. Current issues and future directions for research into digital behavior change interventions. Am J Prev Med 2016 Nov;51(5):814-815. [CrossRef] [Medline]
    25. Pham Q, Graham G, Carrion C, Morita PP, Seto E, Stinson JN, et al. A library of analytic indicators to evaluate effective engagement with consumer mhealth apps for chronic conditions: scoping review. JMIR Mhealth Uhealth 2019 Jan 18;7(1):e11941 [FREE Full text] [CrossRef] [Medline]
    26. Wiljer D, Abi-Jaoude A, Johnson A, Ferguson G, Sanches M, Levinson A, et al. Enhancing self-efficacy for help-seeking among transition-aged youth in postsecondary settings with mental health and/or substance use concerns, using crowd-sourced online and mobile technologies: the thought spot protocol. JMIR Res Protoc 2016 Nov 4;5(4):e201 [FREE Full text] [CrossRef] [Medline]
    27. van Heerwaarden N, Ferguson G, Abi-Jaoude A, Johnson A, Hollenberg E, Chaim G, et al. The optimization of an ehealth solution (thought spot) with transition-aged youth in postsecondary settings: participatory design research. J Med Internet Res 2018 Mar 6;20(3):e79 [FREE Full text] [CrossRef] [Medline]
    28. Kaur A, Isaranuwatchai W, Jaffer A, Ferguson G, Abi-Jaoude A, Johnson A, et al. A web- and mobile-based map of mental health resources for postsecondary students (thought spot): protocol for an economic evaluation. JMIR Res Protoc 2018 Mar 29;7(3):e83 [FREE Full text] [CrossRef] [Medline]
    29. Sennah S, Shi J, Hollenberg E, Johnson A, Ferguson G, Abi-Jaoudé A, et al. Thought spot: embedding usability testing into the development cycle. Stud Health Technol Inform 2019;257:375-381. [Medline]
    30. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol 2005 Feb;8(1):19-32. [CrossRef]
    31. Xu W, Liu Y. mHealthApps: a repository and database of mobile health apps. JMIR Mhealth Uhealth 2015 Mar 18;3(1):e28 [FREE Full text] [CrossRef] [Medline]
    32. McHugh ML. Interrater reliability: the kappa statistic. Biochem Med (Zagreb) 2012;22(3):276-282 [FREE Full text] [CrossRef] [Medline]
    33. Wallace BC, Small K, Brodley C, Lau J, Trikalinos T. Deploying an Interactive Machine Learning System in an Evidence-Based Practice Center: Abstrackr. In: Proceedings of the 2nd ACM SIGHIT International Health Informatics Symposium. 2012 Presented at: IHI'12; January 28-30, 2012; New York, USA p. 819-824. [CrossRef]
    34. Cochrane Public Health. Data Extraction and Assessment Template   URL: https://ph.cochrane.org/sites/ph.cochrane.org/files/public/uploads/CPHG%20Data%20extraction%20template_0.docx [accessed 2019-08-15]
    35. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders (DSM-5). Fifth Edition. Washington, DC: American Psychiatric Association Publication; 2013.
    36. World Health Organization. 2018. Classification of Digital Health Interventions v1.0: A Shared Language to Describe the Uses of Digital Technology for Health   URL: https://www.who.int/reproductivehealth/publications/mhealth/classification-digital-health-interventions/en/ [accessed 2019-08-15]
    37. Hsieh H, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res 2005 Nov;15(9):1277-1288. [CrossRef] [Medline]
    38. Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 2009 Jul 21;6(7):e1000097 [FREE Full text] [CrossRef] [Medline]
    39. Bewick BM, Trusler K, Mulhern B, Barkham M, Hill AJ. The feasibility and effectiveness of a web-based personalised feedback and social norms alcohol intervention in UK university students: a randomised control trial. Addict Behav 2008 Sep;33(9):1192-1198. [CrossRef] [Medline]
    40. van Voorhees BW, Fogel J, Reinecke MA, Gladstone T, Stuart S, Gollan J, et al. Randomized clinical trial of an internet-based depression prevention program for adolescents (Project CATCH-IT) in primary care: 12-week outcomes. J Dev Behav Pediatr 2009 Feb;30(1):23-37 [FREE Full text] [CrossRef] [Medline]
    41. Christensen H, Griffiths KM, Mackinnon AJ, Kalia K, Batterham PJ, Kenardy J, et al. Protocol for a randomised controlled trial investigating the effectiveness of an online ehealth application for the prevention of generalised anxiety disorder. BMC Psychiatry 2010 Mar 21;10:25 [FREE Full text] [CrossRef] [Medline]
    42. Shandley K, Austin D, Klein B, Kyrios M. An evaluation of 'reach out central': an online gaming program for supporting the mental health of young people. Health Educ Res 2010 Aug;25(4):563-574. [CrossRef] [Medline]
    43. Paschall MJ, Antin T, Ringwalt CL, Saltz RF. Evaluation of an internet-based alcohol misuse prevention course for college freshmen: findings of a randomized multi-campus trial. Am J Prev Med 2011 Sep;41(3):300-308 [FREE Full text] [CrossRef] [Medline]
    44. Jacobi C, Völker U, Trockel MT, Taylor CB. Effects of an internet-based intervention for subthreshold eating disorders: a randomized controlled trial. Behav Res Ther 2012 Feb;50(2):93-99. [CrossRef] [Medline]
    45. Stice E, Rohde P, Durant S, Shaw H. A preliminary trial of a prototype internet dissonance-based eating disorder prevention program for young women with body image concerns. J Consult Clin Psychol 2012 Oct;80(5):907-916 [FREE Full text] [CrossRef] [Medline]
    46. Saulsberry A, Marko-Holguin M, Blomeke K, Hinkle C, Fogel J, Gladstone T, et al. Randomized clinical trial of a primary care internet-based intervention to prevent adolescent depression: one-year outcomes. J Can Acad Child Adolesc Psychiatry 2013 May;22(2):106-117 [FREE Full text] [Medline]
    47. Deady M, Teesson M, Kay-Lambkin F, Mills KL. Evaluating a brief, internet-based intervention for co-occurring depression and problematic alcohol use in young people: protocol for a randomized controlled trial. JMIR Res Protoc 2014 Feb 27;3(1):e6 [FREE Full text] [CrossRef] [Medline]
    48. Richards D, Timulak L, Doherty G, Sharry J, McLoughlin O, Rashleigh C, et al. Low-intensity internet-delivered treatment for generalized anxiety symptoms in routine care: protocol for a randomized controlled trial. Trials 2014 Apr 27;15:145 [FREE Full text] [CrossRef] [Medline]
    49. Kass AE, Trockel M, Safer DL, Sinton MM, Cunning D, Rizk MT, et al. Internet-based preventive intervention for reducing eating disorder risk: a randomized controlled trial comparing guided with unguided self-help. Behav Res Ther 2014 Dec;63:90-98 [FREE Full text] [CrossRef] [Medline]
    50. Nahum M, Fisher M, Loewy R, Poelke G, Ventura J, Nuechterlein KH, et al. A novel, online social cognitive training program for young adults with schizophrenia: a pilot study. Schizophr Res Cogn 2014 Mar 1;1(1):e11-e19 [FREE Full text] [CrossRef] [Medline]
    51. Saekow J, Jones M, Gibbs E, Jacobi C, Fitzsimmons-Craft EE, Wilfley D, et al. StudentBodies-eating disorders: a randomized controlled trial of a coached online intervention for subclinical eating disorders. Internet Interv 2015 Nov;2(4):419-428. [CrossRef]
    52. Morris RR, Schueller SM, Picard RW. Efficacy of a web-based, crowdsourced peer-to-peer cognitive reappraisal platform for depression: randomized controlled trial. J Med Internet Res 2015 Mar 30;17(3):e72 [FREE Full text] [CrossRef] [Medline]
    53. Perry Y, Calear AL, Mackinnon A, Batterham PJ, Licinio J, King C, et al. Trial for the prevention of depression (TriPoD) in final-year secondary students: study protocol for a cluster randomised controlled trial. Trials 2015 Oct 12;16:451 [FREE Full text] [CrossRef] [Medline]
    54. Rickhi B, Kania-Richmond A, Moritz S, Cohen J, Paccagnan P, Dennis C, et al. Evaluation of a spirituality informed e-mental health tool as an intervention for major depressive disorder in adolescents and young adults-a randomized controlled pilot trial. BMC Complement Altern Med 2015 Dec 24;15:450 [FREE Full text] [CrossRef] [Medline]
    55. Richards K, Marko-Holguin M, Fogel J, Anker L, Ronayne J, van Voorhees BW. Randomized clinical trial of an internet-based intervention to prevent adolescent depression in a primary care setting (Catch-It): 2.5-year outcomes. J Evid Based Psychother 2016 Sep;16(2):113-134 [FREE Full text] [Medline]
    56. Bidargaddi N, Musiat P, Winsall M, Vogl G, Blake V, Quinn S, et al. Efficacy of a web-based guided recommendation service for a curated list of readily available mental health and well-being mobile apps for young people: randomized controlled trial. J Med Internet Res 2017 May 12;19(5):e141 [FREE Full text] [CrossRef] [Medline]
    57. Lattie EG, Duffecy JL, Mohr DC, Kashima K. Development and evaluation of an online mental health program for medical students. Acad Psychiatry 2017 Oct;41(5):642-645 [FREE Full text] [CrossRef] [Medline]
    58. Tighe J, Shand F, Ridani R, Mackinnon A, de la Mata N, Christensen H. Ibobbly mobile health intervention for suicide prevention in Australian indigenous youth: a pilot randomised controlled trial. BMJ Open 2017 Jan 27;7(1):e013518 [FREE Full text] [CrossRef] [Medline]
    59. Podina IR, Fodor LA, Cosmoiu A, Boian R. An evidence-based gamified mHealth intervention for overweight young adults with maladaptive eating habits: study protocol for a randomized controlled trial. Trials 2017 Dec 12;18(1):592 [FREE Full text] [CrossRef] [Medline]
    60. Lima-Serrano M, Martínez-Montilla JM, Lima-Rodríguez JS, Mercken L, de Vries H. Design, implementation and evaluation of a web-based computer-tailored intervention to prevent binge drinking in adolescents: study protocol. BMC Public Health 2018 Apr 4;18(1):449 [FREE Full text] [CrossRef] [Medline]
    61. Logsdon MC, Myers J, Rushton J, Gregg JL, Josephson AM, Davis DW, et al. Efficacy of an internet-based depression intervention to improve rates of treatment in adolescent mothers. Arch Womens Ment Health 2018 Jun;21(3):273-285 [FREE Full text] [CrossRef] [Medline]
    62. Harrer M, Adam SH, Fleischmann RJ, Baumeister H, Auerbach R, Bruffaerts R, et al. Effectiveness of an internet- and app-based intervention for college students with elevated stress: randomized controlled trial. J Med Internet Res 2018 Apr 23;20(4):e136 [FREE Full text] [CrossRef] [Medline]
    63. Schlosser DA, Campellone TR, Truong B, Etter K, Vergani S, Komaiko K, et al. Efficacy of PRIME, a mobile app intervention designed to improve motivation in young people with schizophrenia. Schizophr Bull 2018 Aug 20;44(5):1010-1020 [FREE Full text] [CrossRef] [Medline]
    64. Karyotaki E, Klein AM, Riper H, Wit LD, Krijnen L, Bol E, et al. Examining the effectiveness of a web-based intervention for symptoms of depression and anxiety in college students: study protocol of a randomised controlled trial. BMJ Open 2019 May 14;9(5):e028739 [FREE Full text] [CrossRef] [Medline]
    65. Hides L, Dingle G, Quinn C, Stoyanov SR, Zelenko O, Tjondronegoro D, et al. Efficacy and outcomes of a music-based emotion regulation mobile app in distressed young people: randomized controlled trial. JMIR Mhealth Uhealth 2019 Jan 16;7(1):e11482 [FREE Full text] [CrossRef] [Medline]
    66. Palacios JE, Richards D, Palmer R, Coudray C, Hofmann SG, Palmieri PA, et al. Supported internet-delivered cognitive behavioral therapy programs for depression, anxiety, and stress in university students: open, non-randomised trial of acceptability, effectiveness, and satisfaction. JMIR Ment Health 2018 Dec 14;5(4):e11467 [FREE Full text] [CrossRef] [Medline]
    67. Wilksch SM, O'Shea A, Taylor CB, Wilfley D, Jacobi C, Wade TD. Online prevention of disordered eating in at-risk young-adult women: a two-country pragmatic randomized controlled trial. Psychol Med 2018 Sep;48(12):2034-2044 [FREE Full text] [CrossRef] [Medline]
    68. Brunette MF, Ferron JC, Robinson D, Coletti D, Geiger P, Devitt T, et al. Brief web-based interventions for young adult smokers with severe mental illnesses: a randomized, controlled pilot study. Nicotine Tob Res 2018 Sep 4;20(10):1206-1214 [FREE Full text] [CrossRef] [Medline]
    69. Eustis EH, Hayes-Skelton SA, Orsillo SM, Roemer L. Surviving and thriving during stress: a randomized clinical trial comparing a brief web-based therapist-assisted acceptance-based behavioral intervention versus waitlist control for college students. Behav Ther 2018 Nov;49(6):889-903. [CrossRef] [Medline]
    70. Alvarez-Jimenez M, Bendall S, Lederman R, Wadley G, Chinnery G, Vargas S, et al. On the HORYZON: moderated online social therapy for long-term recovery in first episode psychosis. Schizophr Res 2013 Jan;143(1):143-149. [CrossRef] [Medline]
    71. Rice S, Gleeson J, Davey C, Hetrick S, Parker A, Lederman R, et al. Moderated online social therapy for depression relapse prevention in young people: pilot study of a 'next generation' online intervention. Early Interv Psychiatry 2018 Aug;12(4):613-625. [CrossRef] [Medline]
    72. Schlosser D, Campellone T, Kim D, Truong B, Vergani S, Ward C, et al. Feasibility of PRIME: a cognitive neuroscience-informed mobile app intervention to enhance motivated behavior and improve quality of life in recent onset schizophrenia. JMIR Res Protoc 2016 Apr 28;5(2):e77 [FREE Full text] [CrossRef] [Medline]
    73. Deady M, Mills KL, Teesson M, Kay-Lambkin F. An online intervention for co-occurring depression and problematic alcohol use in young people: primary outcomes from a randomized controlled trial. J Med Internet Res 2016 Mar 23;18(3):e71 [FREE Full text] [CrossRef] [Medline]
    74. Stallman HM, Kavanagh DJ. Development of an internet intervention to promote wellbeing in college students. Aust Psychol 2016 Aug 28;53(1):60-67. [CrossRef]
    75. Juniar D, van Ballegooijen W, Karyotaki E, van Schaik A, Passchier J, Heber E, et al. Web-based stress management program for university students in Indonesia: systematic cultural adaptation and protocol for a feasibility study. JMIR Res Protoc 2019 Jan 25;8(1):e11493 [FREE Full text] [CrossRef] [Medline]
    76. Backman A, Mellblom A, Norman-Claesson E, Keith-Bodros G, Frostvittra M, Bölte S, et al. Internet-delivered psychoeducation for older adolescents and young adults with autism spectrum disorder (SCOPE): an open feasibility study. Res Autism Spec Dis 2018 Oct;54:51-64. [CrossRef]
    77. Bailey JV, Pavlou M, Copas A, McCarthy O, Carswell K, Rait G, et al. The Sexunzipped trial: optimizing the design of online randomized controlled trials. J Med Internet Res 2013 Dec 11;15(12):e278 [FREE Full text] [CrossRef] [Medline]
    78. Gladstone T, Marko-Holguin M, Henry J, Fogel J, Diehl A, van Voorhees BW. Understanding adolescent response to a technology-based depression prevention program. J Clin Child Adolesc Psychol 2014;43(1):102-114 [FREE Full text] [CrossRef] [Medline]
    79. Levin ME, Pistorello J, Seeley JR, Hayes SC. Feasibility of a prototype web-based acceptance and commitment therapy prevention program for college students. J Am Coll Health 2014;62(1):20-30 [FREE Full text] [CrossRef] [Medline]
    80. Lillevoll KR, Vangberg HC, Griffiths KM, Waterloo K, Eisemann MR. Uptake and adherence of a self-directed internet-based mental health intervention with tailored e-mail reminders in senior high schools in Norway. BMC Psychiatry 2014 Jan 21;14:14 [FREE Full text] [CrossRef] [Medline]
    81. Witkiewitz K, Desai SA, Bowen S, Leigh BC, Kirouac M, Larimer ME. Development and evaluation of a mobile intervention for heavy drinking and smoking among college students. Psychol Addict Behav 2014 Sep;28(3):639-650 [FREE Full text] [CrossRef] [Medline]
    82. Kruger JR, Kim P, Iyer V, Marko-Holguin M, Fogel J, DeFrino D, et al. Evaluation of protective and vulnerability factors for depression following an internet-based intervention to prevent depression in at-risk adolescents. Int J Ment Health Promot 2017 Apr 19;19(2):69-84. [CrossRef]
    83. van Rosmalen-Nooijens K, Lo Fo Wong S, Prins J, Lagro-Janssen T. Young people, adult worries: randomized controlled trial and feasibility study of the internet-based self-support method 'feel the vibe' for adolescents and young adults exposed to family violence. J Med Internet Res 2017 Jun 12;19(6):e204 [FREE Full text] [CrossRef] [Medline]
    84. Jaffe AE, Bountress KE, Metzger IW, Maples-Keller JL, Pinsky HT, George WH, et al. Student engagement and comfort during a web-based personalized feedback intervention for alcohol and sexual assault. Addict Behav 2018 Jul;82:23-27 [FREE Full text] [CrossRef] [Medline]
    85. Takahashi K, Takada K, Hirao K. Feasibility and preliminary efficacy of a smartphone application intervention for subthreshold depression. Early Interv Psychiatry 2019 Feb;13(1):133-136. [CrossRef] [Medline]
    86. Sagon AL, Danitz SB, Suvak MK, Orsillo SM. The mindful way through the semester: evaluating the feasibility of delivering an acceptance-based behavioral program online. J Contextual Behav Sci 2018 Jul;9:36-44. [CrossRef]
    87. Moir F, Fernando AT, Kumar S, Henning M, Moyes SA, Elley CR. Computer assisted learning for the mind (CALM): the mental health of medical students and their use of a self-help website. N Z Med J 2015 Mar 27;128(1411):51-58. [Medline]
    88. Google Analytics. 2019.   URL: https://analytics.google.com [accessed 2019-08-15]
    89. Moffitt TE, Caspi A. Psychiatry's opportunity to prevent the rising burden of age-related disease. JAMA Psychiatry 2019 May 1;76(5):461-462. [CrossRef] [Medline]
    90. Gunnell D, Kidger J, Elvidge H. Adolescent mental health in crisis. Br Med J 2018 Jun 19;361:k2608. [CrossRef] [Medline]
    91. Grist R, Porter J, Stallard P. Mental health mobile apps for preadolescents and adolescents: a systematic review. J Med Internet Res 2017 May 25;19(5):e176 [FREE Full text] [CrossRef] [Medline]
    92. Onnela J, Rauch SL. Harnessing smartphone-based digital phenotyping to enhance behavioral and mental health. Neuropsychopharmacology 2016 Jun;41(7):1691-1696 [FREE Full text] [CrossRef] [Medline]
    93. Insel TR. Digital phenotyping: technology for a new science of behavior. J Am Med Assoc 2017 Oct 03;318(13):1215-1216. [CrossRef] [Medline]
    94. Torous J, Gershon A, Hays R, Onnela J, Baker JT. Digital phenotyping for the busy psychiatrist: clinical implications and relevance. Psych Ann 2019 May 1;49(5):196-201. [CrossRef]
    95. Torous J, Keshavan M, Gutheil T. Promise and perils of digital psychiatry. Asian J Psychiatr 2014 Aug;10:120-122. [CrossRef] [Medline]
    96. Short CE, de Smet A, Woods C, Williams SL, Maher C, Middelweerd A, et al. Measuring engagement in ehealth and mhealth behavior change interventions: viewpoint of methodologies. J Med Internet Res 2018 Nov 16;20(11):e292 [FREE Full text] [CrossRef] [Medline]
    97. Yardley L, Spring BJ, Riper H, Morrison LG, Crane DH, Curtis K, et al. Understanding and promoting effective engagement with digital behavior change interventions. Am J Prev Med 2016 Nov;51(5):833-842. [CrossRef] [Medline]
    98. Vedel I, Lapointe L, Lussier M, Richard C, Goudreau J, Lalonde L, et al. Healthcare professionals' adoption and use of a clinical information system (CIS) in primary care: insights from the Da Vinci study. Int J Med Inform 2012 Feb;81(2):73-87. [CrossRef] [Medline]
    99. Ash JS, Bates DW. Factors and forces affecting EHR system adoption: report of a 2004 ACMI discussion. J Am Med Inform Assoc 2005;12(1):8-12 [FREE Full text] [CrossRef] [Medline]
    100. Strudwick G. Predicting nurses' use of healthcare technology using the technology acceptance model: an integrative review. Comput Inform Nurs 2015 May;33(5):189-98; quiz E1. [CrossRef] [Medline]
    101. Price M, Lau F. The clinical adoption meta-model: a temporal meta-model describing the clinical adoption of health information systems. BMC Med Inform Decis Mak 2014 May 29;14:43 [FREE Full text] [CrossRef] [Medline]
    102. Hall GE. A Developmental Conceptualization of the Adoption Process Within Educational Institutions. Texas, USA: Research and Development Center for Teacher Education; 1973.
    103. Rogers EM. Diffusion of Innovations. New York, USA: Free Press; 1995.
    104. Singh K, Meyer SR, Westfall JM. Consumer-facing data, information, and tools: self-management of health in the digital age. Health Aff (Millwood) 2019 Mar;38(3):352-358. [CrossRef] [Medline]
    105. Torous J, Wisniewski H, Liu G, Keshavan M. Mental health mobile phone app usage, concerns, and benefits among psychiatric outpatients: comparative survey study. JMIR Ment Health 2018 Nov 16;5(4):e11715 [FREE Full text] [CrossRef] [Medline]
    106. Wartella E, Rideout V, Montague H, Beaudoin-Ryan L, Lauricella A. Teens, health and technology: a national survey. Media Comm 2016 Jun 16;4(3):13. [CrossRef]
    107. McCurdie T, Taneva S, Casselman M, Yeung M, McDaniel C, Ho W, et al. mHealth consumer apps: the case for user-centered design. Biomed Instrum Technol 2012;Suppl:49-56. [CrossRef] [Medline]
    108. Cheng VW, Davenport T, Johnson D, Vella K, Hickie IB. Gamification in apps and technologies for improving mental health and well-being: systematic review. JMIR Ment Health 2019 Jun 26;6(6):e13717 [FREE Full text] [CrossRef] [Medline]
    109. Fleming T, Bavin L, Lucassen M, Stasiak K, Hopkins S, Merry S. Beyond the trial: systematic review of real-world uptake and engagement with digital self-help interventions for depression, low mood, or anxiety. J Med Internet Res 2018 Jun 6;20(6):e199 [FREE Full text] [CrossRef] [Medline]
    110. Gupta SK. Intention-to-treat concept: a review. Perspect Clin Res 2011 Jul;2(3):109-112 [FREE Full text] [CrossRef] [Medline]
    111. Torous J, Staples P, Slaters L, Adams J, Sandoval L, Onnela JP, et al. Characterizing smartphone engagement for schizophrenia: results of a naturalist mobile health study. Clin Schizophr Relat Psychoses 2017 Aug 4 epub ahead of print. [CrossRef] [Medline]
    112. Connolly SL, Miller CJ, Koenig CJ, Zamora KA, Wright PB, Stanley RL, et al. Veterans' attitudes toward smartphone app use for mental health care: qualitative study of rurality and age differences. JMIR Mhealth Uhealth 2018 Aug 22;6(8):e10748 [FREE Full text] [CrossRef] [Medline]
    113. Chaffey D, Patron M. From web analytics to digital marketing optimization: increasing the commercial value of digital analytics. J Direct Data Digit Mark Pract 2012 Aug 8;14(1):30-45. [CrossRef]
    114. Bakken S. The importance of consumer-and patient-oriented perspectives in biomedical and health informatics. J Am Med Inform Assoc 2019 Jul 1;26(7):583-584. [CrossRef] [Medline]


    Abbreviations

    eHealth: electronic health
    MEDLINE: Medical Literature Analysis and Retrieval System Online
    mHealth: mobile health
    RCT: randomized controlled trial
    WHO: World Health Organization


    Edited by G Eysenbach; submitted 20.08.19; peer-reviewed by P Cheng, H Xie; comments to author 01.10.19; revised version received 20.11.19; accepted 10.02.20; published 25.06.20

    ©Brian Lo, Jenny Shi, Elisa Hollenberg, Alexxa Abi-Jaoudé, Andrew Johnson, David Wiljer. Originally published in JMIR Mental Health (http://mental.jmir.org), 25.06.2020.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on http://mental.jmir.org/, as well as this copyright and license information must be included.