This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on http://mental.jmir.org/, as well as this copyright and license information must be included.
Consumer-facing digital health interventions provide a promising avenue to bridge gaps in mental health care delivery. To evaluate these interventions, understanding how the target population uses a solution is critical to the overall validity and reliability of the evaluation. As a result, usage data (analytics) can serve as a proxy for engagement with a solution. However, there is a paucity of guidance on how usage data or analytics should be used to assess and evaluate digital mental health interventions.
This review aimed to examine how usage data are collected and analyzed in evaluations of mental health mobile apps for transition-aged youth (15-29 years).
A scoping review was conducted using the Arksey and O’Malley framework. A systematic search was conducted on 5 journal databases using keywords related to usage and engagement, mental health apps, and evaluation. A total of 1784 papers from 2008 to 2019 were identified and screened to ensure that they included analytics and evaluated a mental health app for transition-aged youth. After full-text screening, 49 papers were included in the analysis.
Of the 49 papers included in the analysis, 40 unique digital mental health innovations were evaluated, and about 80% (39/49) of the papers were published over the past 6 years. About 80% involved a randomized controlled trial and evaluated apps with information delivery features. The construct that analytics was ascribed to varied considerably across studies, the top 3 being engagement, adherence, and acceptability. There was also a significant spread in the number of metrics collected by each study: 35% (17/49) of the papers collected only 1 metric, whereas 29% (14/49) collected 4 or more analytic metrics. The number of modules completed, the session duration, and the number of log-ins were the most common usage metrics collected.
This review of current literature identified significant variability and heterogeneity in using analytics to evaluate digital mental health interventions for transition-aged youth. The large proportion of publications from the last 6 years suggests that user analytics is increasingly being integrated into the evaluation of these apps. Numerous gaps related to selecting appropriate and relevant metrics and defining successful or high levels of engagement have been identified for future exploration. Although long-term use or adoption is an important precursor to realizing the expected benefits of an app, few studies have examined this issue. Researchers would benefit from clarification and guidance on how to measure and analyze app usage when evaluating digital mental health interventions for transition-aged youth. Given the established role of adoption in the success of health information technologies, understanding how to abstract and analyze user adoption for consumer digital mental health apps is also an emerging priority.
Transition-aged youth, youth between the ages of 15 and 29 years, may face difficult challenges during the transition from childhood to adulthood [
Mobile health (mHealth) interventions have been identified as a promising avenue to bridge the gap between seeking help and accessing mental health resources for the youth [
Despite these advances, the proliferation of and interest in youth-oriented mental health apps do not always directly translate to real-world outcomes. Numerous explanations exist, and a major barrier identified by Torous et al [
We conducted a scoping review using the framework proposed by Arksey and O’Malley [
The main objective of this scoping review was to explore how analytics metrics are measured, reported, and used to evaluate mHealth interventions that target mental health–related issues among transition-aged youth. Under this overarching main research objective, we sought to answer the following 2 research questions to guide the review and analysis:
Which analytics metrics are used in evaluation studies for mHealth interventions that target transition-aged youth?
How do user activity and usage metrics contribute to the interpretation of study data?
A search strategy was developed in consultation with a specialist librarian and the research team. A search using key terms such as adoption, evaluation, mental health, transition-aged youth, and mHealth was conducted in July 2018 in 5 databases: Cumulative Index of Nursing and Allied Health Literature (CINAHL), EMBASE, Medical Literature Analysis and Retrieval System (MEDLINE), PsycINFO, and Cochrane Central Register of Controlled Trials (CENTRAL). The full search strategy for one of the databases (MEDLINE) is presented in
After removal of duplicate papers, titles and abstracts were screened independently by 2 authors (BL and JS). There were 3 inclusion criteria: the papers examined mental health mHealth interventions designed for transition-aged youth (aged 15-29 years), included an analysis of engagement or user activity metrics, and were published in English. For studies that did not indicate a specific age range, the mean and SD of the sample were used to determine eligibility for inclusion. Studies containing only self-reported usage data or those that were developed for clinician use only were excluded, along with reviews, conference reports, and dissertations. No restrictions were placed on the type of research design or the method of comparison being made to ensure that a broad selection of studies was captured for this review.
Inter-rater reliability in screening by both reviewers was enhanced by first conducting a pilot screen of 100 papers. Pilot screens were completed independently until a satisfactory kappa statistic >0.7 [
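The kappa statistic used as the screening threshold can be computed directly from the two reviewers' include/exclude decisions. The sketch below is illustrative only, with hypothetical screening decisions rather than data from this review:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for 2 raters' categorical decisions (eg, include/exclude)."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired, nonempty ratings"
    n = len(rater_a)
    # Observed proportion of screening decisions on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal label frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[label] * counts_b[label] for label in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical pilot screen of 100 abstracts: the reviewers agree on 90
rater_a = ["include"] * 40 + ["exclude"] * 50 + ["include"] * 5 + ["exclude"] * 5
rater_b = ["include"] * 40 + ["exclude"] * 50 + ["exclude"] * 5 + ["include"] * 5
print(round(cohens_kappa(rater_a, rater_b), 2))  # 0.8, above the 0.7 threshold
```

Kappa corrects raw percentage agreement for the agreement expected by chance, which is why it is preferred over simple agreement for screening reliability.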
A data extraction form adapted from the Cochrane data extraction template [
Quantitative and qualitative analyses of data were conducted. Descriptive statistics (eg, means, medians) were collected and used to characterize and describe each included paper. We identified the metrics of our included papers using a similar approach to that of another recently published review [
An overview of the selection of included studies is presented using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses [
Preferred Reporting Items for Systematic Reviews and Meta-Analyses diagram of included studies.
The characteristics of and references to the included studies are described in
A large number of studies have examined the efficacy or effectiveness [
Classification using the WHO Classification of Digital Health Interventions [
Characteristics of included studies (N=49).

Characteristic | Number of studies, n (%) | References |
Publication type | | |
Full | 40 (82) | [ |
Protocol | 9 (18) | [ |
Year of publication | | |
2008 | 1 (2) | [ |
2009 | 1 (2) | [ |
2010 | 2 (4) | [ |
2011 | 1 (2) | [ |
2012 | 2 (4) | [ |
2013 | 3 (6) | [ |
2014 | 8 (16) | [ |
2015 | 5 (10) | [ |
2016 | 4 (8) | [ |
2017 | 6 (12) | [ |
2018 | 12 (25) | [ |
2019 | 4 (8) | [ |
Country | | |
Australia | 12 (25) | [ |
Canada | 1 (2) | [ |
Germany | 2 (4) | [ |
Ireland | 2 (4) | [ |
Japan | 1 (2) | [ |
The Netherlands | 3 (6) | [ |
New Zealand | 1 (2) | [ |
Norway | 1 (2) | [ |
Romania | 1 (2) | [ |
Spain | 1 (2) | [ |
Sweden | 1 (2) | [ |
United Kingdom | 2 (4) | [ |
United States | 21 (43) | [ |
Sample size | | |
≤100 | 25 (51) | [ |
101-1000 | 17 (35) | [ |
>1000 | 6 (12) | [ |
Unknown | 1 (2) | [ |
Study duration | | |
≤1 month | 10 (20) | [ |
2-3 months | 26 (53) | [ |
4-12 months | 8 (16) | [ |
>12 months | 3 (6) | [ |
Unknown | 2 (4) | [ |
App features a | | |
On-demand information services | 5 (10) | [ |
Client-to-client communication | 8 (16) | [ |
Personal health tracking | 4 (8) | [ |
Targeted client communication | 39 (80) | [ |
Untargeted client communication (eg, modular delivery) | 4 (8) | [ |
Study design | | |
Randomized controlled trials | 38 (78) | [ |
Quasi-experimental designs | 11 (22) | [ |
Mental health focus | | |
Anxiety disorders | 2 (4) | [ |
Depressive disorders | 13 (27) | [ |
Feeding and eating disorders | 6 (12) | [ |
Other (family violence) | 1 (2) | [ |
Overall well-being | 11 (22) | [ |
Schizophrenia spectrum and other psychotic disorders | 4 (8) | [ |
Neurodevelopmental disorders | 1 (2) | [ |
Substance use and addictive disorders | 6 (12) | [ |
Mixed | 5 (10) | [ |

a On the basis of the World Health Organization Classification of Digital Health Interventions [
All studies examined usage data in some aspect of their evaluation, and most papers ascribed usage data to one or more of the following constructs: user adherence [
A summary of the findings for analytics metrics and analysis is provided in
Across the reviewed studies, different types of metrics were used to evaluate the level of user activity in these interventions. Overall, a heterogeneous selection of metrics was collected, with the number of modules completed (27/49, 55%) and the session duration (24/49, 49%) being the most common metrics. Notably, several studies (18/49, 37%) collected metrics on the number of times different features were used. Because these features were highly specific to the intervention being evaluated (eg, number of likes, number of journal entries), these unique metrics were collapsed into a general category called
In terms of how metrics were analyzed, many studies [
Types of usage metrics collected in included studies (N=49).

Characteristic | Number of studies, n (%) | References |
Number of metrics collected | | |
1 | 17 (35) | [ |
2 | 8 (16) | [ |
3 | 10 (20) | [ |
4+ | 14 (29) | [ |
Type of metric | | |
Number of clicks | 1 (2) | [ |
Number of features used | 18 (37) | [ |
Number of log-ins | 20 (41) | [ |
Number of modules | 27 (55) | [ |
Number of page views | 10 (20) | [ |
Number of posts | 5 (10) | [ |
Number of sessions | 9 (18) | [ |
Rate of return | 4 (8) | [ |
Session duration | 24 (49) | [ |
Other (eg, calculated metrics) | 14 (29) | [ |
Although some studies have also reported similar results [
Two studies incorporated more sophisticated metrics. Stallman and Kavanagh [
The findings of this scoping review will also be used to inform the selection of analytics metrics to be examined as a potential exploratory component of a larger RCT evaluating an mHealth intervention, Thought Spot [
This scoping review explores how usage data are characterized and analyzed in evaluations of digital mental health innovations for transition-aged youth. There is an unprecedented demand to address current concerns and gaps in adolescent mental health [
Foremost, there was significant heterogeneity in the construct that analytics was ascribed to. For example, Bidargaddi et al [
In addition, although adoption was included in the search strategy, no studies in our included papers examined adoption specifically and/or used analytics as a proxy to abstract the construct. This contrasts with the clinical informatics literature, in which adoption is often an end goal of technology implementation [
Furthermore, understanding the process of how transition-aged youth decide to adopt technology would provide further insight into the characteristics of successful adoption [
The other key finding of this review is the significant heterogeneity in the number and types of metrics collected. We observed that most studies collected 1 or 2 metrics and usually included the number of modules completed and the session duration. These findings are in contrast to a similar review by Pham et al [
The heterogeneity in analytics metrics between studies also generates another level of complexity for analysis. For the various metrics selected, many studies often characterized their analytics findings as constituting high usage [
Similar to the observations made by Pham et al [
Several limitations should be noted when interpreting the findings of this scoping review. Because consumer health informatics has gained popularity only over the last decade [
A second limitation is that, as per guidelines established for scoping reviews [
Finally, it should be noted that, due to the limited number of papers in this scoping review, we did not limit our analyses by mental health conditions and types of research designs. In particular, it would be useful to examine how analytics is used to support the evaluation of different solutions and objectives. Studies such as Torous et al [
This scoping review provides a preliminary insight into the role of analytics in evaluating mHealth apps for transition-aged youth and identifies both progress that has been made and future areas of exploration. Most importantly, this review adds to the sparse literature on analytics and highlights the need for researchers to assess and standardize how to integrate analytics into their evaluation plans [
In addition, several areas of exploration have been identified for researchers. As suggested by Pham et al [
This scoping review provided initial insights into the role of analytics in evaluating mobile health (mHealth) apps for mental health among transition-aged youth. Our analysis of 49 studies published between 2008 and 2019 revealed that the use of analytics is becoming increasingly ubiquitous in evaluating mHealth apps for transition-aged youth. Despite recent progress in the use of analytics, there is still heterogeneity in understanding the significance and value of analytics in these evaluations. In addition, the lack of guidance on metric selection and analysis warrants future exploration. As digital mental health care continues to grow in popularity, particularly for transition-aged youth, understanding analytics and its impact on evaluations would help to streamline the journey toward using digital interventions to foster better mental health care for this population.
Sample search strategy for scoping review.
Data extraction table for scoping review.
eHealth: electronic health
MEDLINE: Medical Literature Analysis and Retrieval System
mHealth: mobile health
RCT: randomized controlled trial
WHO: World Health Organization
This study was supported through the eHealth Innovations Partnership Program grant (Reference # EH1-143558) from the Canadian Institutes of Health Research and through in-kind support from the Centre for Addiction and Mental Health. We thank Howard Wong for assistance with preparation of the data; Vincci Lui from the Institute of Health Policy, Management and Evaluation at the University of Toronto for her support in developing the search strategy; and Hema Zbogar at the Centre for Addiction and Mental Health for copyediting the paper.
DW, JS, A Johnson, AA-Jaoude, and EH conceived the idea for this project. BL and JS developed the search strategy and conducted the screening process. BL and JS conducted data extraction and analysis with support from DW, AJ, AA-Jaoude, and EH. BL and JS led the writing of the manuscript. EH and DW made substantial edits to the manuscript before submission. All authors have reviewed and approved this manuscript.
None declared.