Published on 20.08.2018 in Vol 5, No 3 (2018): Jul-Sept

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/8594.
A Process Evaluation of a Web-Based Mental Health Portal (WalkAlong) Using Google Analytics

Original Paper

1Department of Psychiatry, University of British Columbia, Vancouver, BC, Canada

2Centre for Health Evaluation and Outcome Sciences, St Paul's Hospital, Vancouver, BC, Canada

3Department of Pharmacology, Faculty of Medicine, Masaryk University, Brno, Czech Republic

4Centre for Applied Research in Mental Health and Addictions, Faculty of Health Sciences, Simon Fraser University, Burnaby, BC, Canada

5School of Population and Public Health, University of British Columbia, Vancouver, BC, Canada

Corresponding Author:

Michael Jae Song, MPH

Department of Psychiatry

University of British Columbia

430-5950 University Boulevard

David Strangway Building

Vancouver, BC, V6T1Z3

Canada

Phone: 1 778 899 5205

Email: michaeljaesong@gmail.com


Background: Despite the increasing amount of research on Web-based mental health interventions with proven efficacy, high attrition rates decrease their effectiveness. Continued process evaluations should be performed to maximize the target population’s engagement. Google Analytics has been used to evaluate various health-related Web-based programs and may also be useful for Web-based mental health programs.

Objective: The objective of our study was to evaluate WalkAlong.ca, a youth-oriented mental health Web-portal, using Google Analytics to inform the improvement strategy for the platform and to demonstrate the use of Google Analytics as a tool for process evaluation of Web-based mental health interventions.

Methods: Google Analytics was used to monitor user activity during WalkAlong’s first year of operation (Nov 13, 2013-Nov 13, 2014). Selected Google Analytics variables included overall website engagement (eg, pages visited per session), utilization rates of specific features, and user access mode and location.

Results: The results included data from 3076 users viewing 29,299 pages. Users spent less time on average on Mindsteps (35 seconds) and the self-help exercises (1 minute 8 seconds), which are important self-help tools, than on the Screener tool (3 minutes 4 seconds). Of all visits, 82.3% (4378/5318) were made from desktop computers, followed by 12.7% (677/5318) from mobile phones and 5.0% (263/5318) from tablets. Both direct traffic (access via URL) and email referrals yielded more than 7 pages viewed per session and an average session duration longer than 6 minutes. The majority of users (67%) accessed the platform from Canada.

Conclusions: Engagement and feature utilization rates are higher among people who receive personal invitations to visit the site. Low utilization rates for specific features offer a starting point for further exploration of users in order to identify the root causes. The data provided by Google Analytics, although informative, can be supplemented by other evaluation methods (eg, qualitative methods) in order to better determine the modifications required to improve user engagement. Google Analytics can play a vital role in highlighting the preferences of those using Web-based mental health tools.

JMIR Ment Health 2018;5(3):e50

doi:10.2196/mental.8594

Introduction

As technologies such as the internet, mobile phones, and computers have become ubiquitous, Web-based interventions have become one of the major treatment and preventative tools for mental disorders. More than 100 randomized controlled trials have been published to demonstrate the efficacy of internet interventions for psychiatric disorders [1]. These tools have been shown to be effective for a range of mental illnesses including panic disorder, depression, posttraumatic stress disorder (PTSD), perceived stress in schizophrenia, stress, insomnia, and eating disorders [2]. The potential of Web-based mental health interventions to effectively and efficiently treat and prevent mental illnesses has attracted many health care providers and researchers to explore using them as one of the major components of the mental health care system.

Despite their promising benefits, online mental health interventions face problems in engagement [3-5]. Compared with taking a drug, Web-based interventions are more vulnerable to disengagement because they lack close supervision, are easy to discontinue, and offer no immediate health benefits [5]. Dropout rates for Web-based interventions are up to 50% in guided interventions and up to 74% in unguided interventions [6,7]. With such high dropout rates, these interventions may achieve only limited outcomes regardless of their proven efficacy [8,9].

Engagement is especially challenging in the context of mental health. Dropout from traditional treatment among those with mental illness is already a cause for concern [10]. The chronic nature of mental illness requires frequent assessments in conjunction with long-term management that is tailored to individuals’ needs [11]. In addition, disorder-specific features such as symptom severity, emotional distress, and medication side effects have been shown to predict adherence [12,13]. In a technological context, being engaged can be described as “a category of user experience characterized by attributes of challenge, positive affect, endurability, aesthetic and sensory appeal, attention, feedback, variety or novelty, interactivity, and perceived user control” [14]. In other words, scientists and developers have the responsibility of ensuring not only the efficacy of their intervention but also user engagement. Strategies to increase user engagement may include continual platform improvement and improving outreach or marketing strategies.

A key step to improve engagement is identifying ways to improve intervention uptake in real-world settings through process evaluation [5]. A process evaluation is a type of assessment that determines whether program activities have been implemented as intended [15]. The goal is to inform strategies toward achieving optimal engagement and effectiveness of an intervention [16-18]. By understanding how users engage with the intervention, such as the specific pages visited or how long they used the website, process evaluation can inform the adaptation of the intervention in order to maximize user exposure to the tools and the knowledge available in Web-based interventions [18]. Ideally, this would involve a mixed-methods approach where both quantitative and qualitative indicators can collectively measure user perceptions or behavior [17]. However, this is not always possible for asynchronous, open-access, Web-based interventions, particularly when resources are limited.

In this context, one tool that can be used is Google Analytics, which is an open tool that provides free quantitative data on website usage that can be leveraged for continual website improvements (Figure 1). Although this tool is designed to provide insights from a marketing perspective, numerous variables about the webpage traffic are collected that can inform the process evaluation of Web-based interventions. Indeed, Google Analytics has already been used in health research as part of process evaluation [19-21]. For example, this tool has been used to assess the usage of a website about sexual health [19], an internet-delivered genetics education resource developed for nurses [21], and a Web-based tool to encourage the proper use of antibiotics [16], as well as websites related to osteoporosis and fractures [22], smoking cessation [23], and knowledge translation [24]. These studies have presented various indicators available from Google Analytics to show overall user engagement with their platforms. However, to our knowledge, the use of Google Analytics has not yet been demonstrated for Web-based mental health platforms that provide direct support for youth with mental health challenges. Since Web-based mental health platforms may face challenges with adherence, Google Analytics could be used to better understand user behavior as part of process evaluation and to come up with strategies that would improve adherence.

Figure 1. Google Analytics can be used as a tool for process evaluation by receiving information on user traffic and subsequently informing website improvement. This process can also continue as a cycle for continual improvement of the website.
Figure 2. WalkAlong home page.

The goal of this study was to evaluate WalkAlong, a Web-based mental health platform, using Google Analytics. This platform is a youth-oriented mental health Web-portal designed to provide young people with tools and resources required to manage their own mental health (Figure 2; Multimedia Appendix 1). The Web-portal focuses on supporting mood and anxiety disorders and has received funding under Bell Canada’s Let’s Talk mental health initiative. The portal is freely accessible to all users with an internet connection and contains information, links to resources, and self-help tools including a description and link to MoodGYM, a Web-based resource for depression and anxiety. Self-help tools available directly on the platform include the “Mind Steps” page, which consists of regularly posted short tips for helping users get through the day, and “The Self-Help Exercises” section, displaying a menu with the individual self-help exercise pages. Assessment screeners for depression and anxiety (Patient Health Questionnaire-9 and Generalized Anxiety Disorder 7-item scale) are also available. Users may also create a password-protected, secure account that grants them access to additional resources including the PTSD CheckList—Civilian Version, community resources, and a Life-Chart tracking tool that tracks mood and behaviors.

Our objective was to use Google Analytics as a tool for conducting a process evaluation of the WalkAlong platform. As part of the process evaluation, the following questions were asked using Google Analytics: (1) How engaged (ie, time spent on the website) are users with the WalkAlong platform? (2) How can the WalkAlong platform be improved to better engage users? (3) How can the marketing strategy be shaped to engage and reach more users? Another objective of this project was to extend work done on other health interventions to a mental health platform and to inform website design and marketing strategies that effectively influence user behavior [19,21].

Methods

Google Analytics

Google Analytics was used to access user data over the first year of WalkAlong’s operation (Nov 13, 2013-Nov 13, 2014). Focusing on the first year of operation was considered the most appropriate approach to capture a snapshot of web traffic following the initial launch. The Google Analytics data do not contain any personally identifiable information and are presented as aggregate data, making Google Analytics an accessible tool for research settings that does not raise ethical concerns [20,21].
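
Although such figures can be read directly from the Google Analytics dashboard, the same aggregate metrics for a fixed reporting window can, in principle, also be exported programmatically. The sketch below shows one possible way to do this with the legacy Core Reporting API (v3) through the google-api-python-client and google-auth libraries; the service account key file and the view (profile) ID are placeholders rather than WalkAlong’s actual credentials.

```python
# Sketch: exporting aggregate metrics for the evaluation window via the
# Google Analytics Core Reporting API v3 (legacy Universal Analytics).
# The key file path and view ID are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
credentials = service_account.Credentials.from_service_account_file(
    "service-account-key.json", scopes=SCOPES
)
analytics = build("analytics", "v3", credentials=credentials)

# One row per device category, with the engagement metrics used in this study
response = analytics.data().ga().get(
    ids="ga:00000000",  # placeholder view (profile) ID
    start_date="2013-11-13",
    end_date="2014-11-13",
    metrics=("ga:users,ga:sessions,ga:pageviews,"
             "ga:bounceRate,ga:avgSessionDuration"),
    dimensions="ga:deviceCategory",
).execute()

for row in response.get("rows", []):
    print(row)
```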

The research team installed Google Analytics by adding a tracking tag for WalkAlong [20]. These tracking tags are snippets of JavaScript code, a computer programming language used to build websites. This code allows various forms of data related to user behavior to be collected as soon as the user visits the website. The data come from various sources such as the URL of the page the user is viewing, the language or name of the browser, and the device used to access the site. The code also collects information on the nature of the visit, such as the contents viewed, the length of the session, and the channels used to access the platform (eg, Google, direct URL entry, social media, and email link). This information is summarized in a real-time, interactive dashboard, which can be accessed by logging in.
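
To illustrate what a single tracked page view contains, the sketch below reconstructs an approximate hit as it would be sent to Google Analytics’ collection endpoint via the Universal Analytics Measurement Protocol, which is the mechanism the JavaScript tag uses behind the scenes. The tracking ID, client ID, and page path are placeholders, not WalkAlong’s actual values.

```python
# Sketch: the approximate payload one page view sends to Google Analytics,
# expressed as a Universal Analytics Measurement Protocol hit.
# The tracking ID (tid), client ID (cid), and page path are placeholders.
import requests

payload = {
    "v": "1",                       # protocol version
    "tid": "UA-00000000-1",         # property (tracking) ID
    "cid": "35009a79-1a05-49d7",    # anonymous client ID stored in a cookie
    "t": "pageview",                # hit type
    "dh": "walkalong.ca",           # document hostname
    "dp": "/self-help-exercises",   # path of the page being viewed
    "dt": "Self-Help Exercises",    # document title
}

# Google Analytics aggregates hits like this one into the dashboard metrics
# (sessions, pages per session, bounce rate, and so on).
requests.post("https://www.google-analytics.com/collect", data=payload)
```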

Overall Engagement

Several indicators from Google Analytics that allow a level of engagement to be inferred were calculated. These indicators include the number of returning users (n), bounce rate (%), number of pages accessed per session (n), mean session duration (minutes, seconds), and goal conversion rate (%).

The number of returning users refers to the number of sessions initiated from a client ID that has previously visited the site. A high number of returning users has been used as an indicator of a strong level of engagement with the platform [21,25].

The bounce rate is the percentage of sessions in which only a single page was visited. A high bounce rate could indicate minimal exposure to the intervention due to minimal interaction, but it could also indicate users exiting because they found what they were looking for right away. Generally, however, a low bounce rate can be considered indicative of high overall engagement, especially for a multicomponent platform like WalkAlong [19]. For instance, the home page of the platform alone offers few resources that would provide mental health support, as it simply presents an overview; users will often need to interact with various tools and webpages in order to obtain the necessary information.

The number of pages per session refers to the number of webpages within the platform that the user viewed in a single session, and the mean session duration (minutes, seconds) refers to the mean duration of time the users spent on the platform. There are limitations to ascertaining engagement through these indicators since they allow for multiple interpretations: a high number of pages per session could result from an increased engagement, but it could also result from a superficial exploration of several pages; similarly, a long session duration can result from increased engagement, but it could also result from a user keeping the webpage open while engaging in other irrelevant activities. Nevertheless, despite these caveats, traffic information provides an approximation of the level of exposure the users had with the platform [25].

The goal conversion rate measures the proportion of sessions that achieved a goal out of the total sessions. The goal was predefined as creating an account but can be defined as any activity the web developer or owner chooses (eg, buying a product). As discussed above, users who create an account are able to access more resources than those who do not (anonymous users). Thus, creating an account was assumed to indicate a stronger level of engagement. Overall, a high number of returning users, low bounce rate, high number of pages viewed per session, high mean session duration, and high goal conversion rates collectively translate to an estimate of a strong level of engagement [20,25].
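
To make these definitions concrete, the following sketch computes the same indicators over a small, hypothetical table of session records (client ID, pages viewed, duration, and whether an account was created). Google Analytics itself reports only the aggregated equivalents of these values; the Session class and the example data below are illustrative.

```python
# Sketch: the engagement indicators, computed over hypothetical session records.
from dataclasses import dataclass

@dataclass
class Session:
    client_id: str          # anonymous ID assigned via the tracking cookie
    pages_viewed: int       # pages within the platform viewed in this session
    duration_s: float       # session duration in seconds
    created_account: bool   # the predefined goal for WalkAlong

def engagement_indicators(sessions: list[Session]) -> dict:
    n = len(sessions)
    seen, returning = set(), 0
    for s in sessions:
        if s.client_id in seen:
            returning += 1  # a previously seen client ID counts as a return visit
        seen.add(s.client_id)
    return {
        "returning_sessions": returning,
        "bounce_rate_pct": 100 * sum(s.pages_viewed == 1 for s in sessions) / n,
        "pages_per_session": sum(s.pages_viewed for s in sessions) / n,
        "mean_session_duration_s": sum(s.duration_s for s in sessions) / n,
        "goal_conversion_rate_pct": 100 * sum(s.created_account for s in sessions) / n,
    }

# Example: one visitor who bounces and then returns to create an account,
# plus a second visitor with a short non-bounce session.
example = [
    Session("a", 1, 20, False),   # bounce
    Session("a", 7, 410, True),   # return visit that completes the goal
    Session("b", 3, 95, False),
]
print(engagement_indicators(example))
```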

Platform Improvement

Several indicators from Google Analytics that can inform the improvement of the platform were also selected. Indicators of user behavior such as page views, mean duration of visit, and bounce rate when accessing self-help tools (eg, the Mindsteps page, the Self-Help Exercises page, and the Screener) were analyzed. In addition, the most visited pages were examined in terms of their overall entrance, exit, and bounce rates to understand which tools or pages were most used or viewed. The entrance rate represents the proportion of sessions starting on a given page, while the exit rate represents the proportion of views of a given page that were the last page of a session. Entrance rates indicate which webpage serves as the first impression for users, and exit rates may indicate the point at which users became disengaged or, conversely, had obtained the information they needed for the session.
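
A brief sketch of how these page-level rates relate to the raw sequence of pages viewed in each session is shown below; the sessions and page paths are hypothetical. Here the exit rate is computed as exits divided by views of the page, and the page-level bounce rate as single-page sessions divided by entrances on that page.

```python
# Sketch: entrance, exit, and bounce rates per page, derived from the ordered
# pages of each session. The sessions and page paths are hypothetical.
from collections import Counter

sessions = [
    ["/home", "/mindsteps", "/screener"],
    ["/home"],                       # single-page session: a bounce on /home
    ["/depression-in-canada"],
    ["/home", "/self-help-exercises"],
]

views, entrances, exits, bounces = Counter(), Counter(), Counter(), Counter()
for pages in sessions:
    views.update(pages)
    entrances[pages[0]] += 1         # session started on this page
    exits[pages[-1]] += 1            # session ended on this page
    if len(pages) == 1:
        bounces[pages[0]] += 1       # entered and left without a second page

total_sessions = len(sessions)
for page in views:
    entrance_rate = 100 * entrances[page] / total_sessions
    exit_rate = 100 * exits[page] / views[page]           # exits / page views
    bounce_rate = (100 * bounces[page] / entrances[page]
                   if entrances[page] else float("nan"))  # bounces / entrances
    print(f"{page}: entrance {entrance_rate:.1f}%, "
          f"exit {exit_rate:.1f}%, bounce {bounce_rate:.1f}%")
```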

Google Analytics also provided data on the type of devices used for access. Such information can allow us to consider whether developing a mobile app for WalkAlong would be helpful or not. The three main devices of interest to the current investigation were desktops, tablets, and mobile phones (counted here as mobile devices).

Marketing Strategy

Google Analytics was also used to inform our marketing strategy, with the goal of reaching as many users as possible. At the outset, the research team had reached out to different youth and university organizations, especially around Vancouver. Twitter and Facebook accounts were also created to spread awareness about the platform. To improve the marketing strategy, the channels used to access the platform were observed. The channels were direct traffic (ie, typing the web URL directly into a browser); organic search (ie, entry through a search engine); and referrals via another website, via social media, and via email. Understanding which channels are underutilized and which channels result in the highest level of engagement can help improve the marketing strategy. Locations of users in different countries around the world were also observed.

Results

Overall Engagement

The first year of operation for the WalkAlong platform saw a total of 3076 users, amounting to 5318 sessions and 29,299 page views (Figure 3). On average, users visited 5.51 pages per session, with an average session duration of 5 minutes 6 seconds. The average bounce rate, in which users viewed only a single page, was 42.9%, and 31.7% (976/3076) of users created an account (goal completion).

In terms of the frequency of visits, 80% (4259/5318) of sessions came from users visiting fewer than nine times, indicating a level of disengagement after a certain number of visits (Table 1).

The number of sessions during the study period decreased with an increasing number of visits. However, there was a slight increase at the upper end among high-frequency visitors: sessions in the 26-50 visit range accounted for 5.8% (311/5318) and those in the 51-100 visit range for 4.7% (250/5318) over the time period. The number of sessions also decreased with longer session durations (Table 2). These results indicate that the majority of sessions, 65.4% (3477/5318), resulted in disengagement within the first minute.

Figure 3. WalkAlong overview presented in Google Analytics.
Table 1. Proportion of total sessions and number of visits.
Visits | Sessions (N=5318), n (%)
1 | 3066 (57.65)
2 | 550 (10.3)
3 | 241 (4.5)
4 | 139 (2.6)
5 | 100 (1.9)
6 | 67 (1.3)
7 | 50 (0.9)
8 | 46 (0.9)
9-14 | 180 (3.4)
15-25 | 205 (3.9)
26-50 | 311 (5.8)
51-100 | 250 (4.7)
101-200 | 112 (2.1)
201+ | 1 (0.0)
Table 2. Duration of session.
Session duration (in minutes) | Sessions (N=5318), n (%)
≤1 | 3477 (65.4)
1-3 | 527 (9.9)
3-10 | 581 (10.9)
>10 | 733 (13.8)

Platform Improvement

Visits to the Mindsteps tool comprised 11.9% (3493/29,299) of total page views, with a mean duration of 35 seconds. Visits to the Self-Help Exercises page comprised 6.13% (1797/29,299) of total page views, with a mean duration of 1 minute 8 seconds. Visits to the Screener comprised only 3.36% (983/29,299) of total page views, but the mean duration spent on the Screener was 3 minutes 4 seconds. Table 3 presents the entrance and exit rates for the most viewed pages, which included the Self-Help Exercises, Mindsteps, and the Screener. The WalkAlong home page, which acts as the landing page, accounted for 65.6% (3487/5308) of all entries.

A list of devices used by WalkAlong’s users to access the site is presented in Table 4, indicating that the platform was accessed mostly via desktops (4378/5318, 82.3%). Furthermore, sessions completed via desktops had a lower bounce rate (39.6%), higher pages per session (6.17), and a higher conversion rate (22.7%) than those completed via other devices.

Marketing Strategy

Direct traffic accounted for the highest proportion (2420/5318, 45.5%) of all visits to the site (Table 5). The combination of a high bounce rate (50.4%) and low conversion rate (11.3%) among organic searches suggests that these particular users did not engage much with the site content. Visits via referrals or social media sites had relatively less traffic at 16.0% (849/5318) and 13.5% (717/5318), respectively, and both had bounce rates above 45%. Although email referrals accounted for only 1.4% of all sessions, they had a low bounce rate of 17.1%, a long average session duration of 7 minutes 39 seconds, and a conversion rate of 25%, collectively indicating relatively strong engagement.

Approximately two-thirds of the users, 67.6% (2079/3076), were located in Canada. Users from Canada also had a relatively low bounce rate (34.35%), a high number of pages viewed per session (6.57), and a long average session duration (6 minutes 10 seconds). Nevertheless, users accessed the platform from around the world (Figure 4).

Table 3. Entrance and exit rates for the most viewed pages.
Page | Entrances, n (%)a | Exits (%)b | Bounce rate (%)
Home page | 3487 (65.6) | 29.2 | 37.6
Depression in Canada | 115 (2.2) | 73.4 | 86.1
Self-Help Exercises | 70 (1.3) | 14 | 47.9
Mindsteps | 62 (1.2) | 5.4 | 24.2
Screener | 55 (1.0) | 32.0 | 50.9

aThe numbers do not add up to 100% because only several of the most viewed pages are included in the table.

bThe exit rate is calculated as the number of exits divided by the number of times that page was viewed. Thus, the percentages can sum to more than 100% because each row has a different number of exits and page views.

Table 4. Devices used to access WalkAlong.
Device | Sessions (N=5318), n (%) | Bounce rate (%) | Pages per session, n | Mean session duration | Conversion rate (%)
Desktop | 4378 (82.32) | 39.6 | 6.17 | 5 min 43 s | 22.7
Mobile phone | 677 (12.7) | 61.6 | 2.15 | 1 min 53 s | 7.2
Tablet | 263 (5.0) | 48.7 | 3.08 | 3 min 15 s | 11.8
Table 5. Proportion of total sessions for each type of channel.
Channel | Sessions (N=5318), n (%) | Bounce rate (%) | Pages per session, n | Mean session duration | Conversion rate (%)
Direct traffic | 2,420 (45.51) | 36.6 | 7.4 | 6 min 38 s | 24.4
Organic search | 1,256 (23.62) | 50.4 | 3.5 | 3 min 42 s | 11.3
Referrals | 849 (16.0) | 46.9 | 3.5 | 3 min 46 s | 22.6
Social media | 717 (13.5) | 48.8 | 4.6 | 3 min 44 s | 18.0
Email | 76 (1.4) | 17.1 | 10.5 | 7 min 39 s | 25.0
Figure 4. Map overlay about locations of users from Google Analytics.

Discussion

Overall Engagement

The first year of operation for the WalkAlong platform saw a total of 3076 users, amounting to 5318 sessions of 5 minutes 6 seconds on average, 29,299 page views, and a 31.7% goal completion rate. However, the high proportion of sessions comprising first visits and the short session durations suggest a degree of user disengagement (Tables 1 and 2). Although the reason for the disengagement is unclear, as users could also have acquired the information they needed early on, this finding does not contradict pre-existing concerns about the lack of engagement with Web-based mental health interventions [3-5]. These results call for further efforts to continuously improve the platform and make it more engaging.

Platform Improvement

The platform can also be improved based on user behavior. For instance, there could be further efforts to improve engagement with tools such as the “Mindsteps” and “Self-Help Exercises.” The mean durations of 35 seconds spent on Mindsteps and 1 minute 8 seconds spent on Self-Help Exercises may be deemed too short considering that these are important components of WalkAlong intended to improve mental health outcomes. The relatively short time spent on these tools needs to be addressed through better engagement strategies, such as improved web design or involving youth in the design process [26]. The WalkAlong home page had the highest entrance rate, indicating that most users started their session from this page. This reinforces the role of the home page in creating the first impression and shaping subsequent user behavior [27].

In terms of the devices used to access the WalkAlong platform, the site was viewed mostly via desktops. However, as mobile phone use is ubiquitous among youth, future improvements to WalkAlong may benefit from making the platform more accessible and engaging for mobile phone users [28,29]. A next step could involve developing a native mobile phone app version of the WalkAlong website, which would allow users to access the platform wherever they are without the need for a desktop.

Marketing Strategy

The data indicate that some form of personal referral, indicated by either email or prior knowledge of the URL (direct traffic), results in relatively stronger engagement (ie, longer average duration, more pages viewed per session, etc) than less personal channels such as referrals through social media or organic searches. In other words, direct referrals such as word-of-mouth strategies among peers could help increase the number of engaged users [30]. This may also include engaging with health care professionals so that they can share the platform’s URL with their clients.

When observing the location of the users, 67.6% (2079/3076) of users were located in Canada. This finding aligns with the marketing strategy, which was largely limited to Vancouver. WalkAlong can be used in any English-speaking country, and it can also serve as a template for platform development internationally. The WalkAlong team could consider spreading awareness beyond Canada to ensure that such a resource is available to as many young people as possible.

Using Google Analytics as a Tool for Process Evaluation

Although Google Analytics has provided promising data on the usage patterns of the WalkAlong platform, the tool should be used with careful consideration. For example, comparing results across various interventions is currently difficult because they serve different purposes and have different standards for the number of users, sessions, and page views [19,25]. For instance, Crutzen et al’s website about sexual health recorded 850,895 visitors with an average visiting time of 5 minutes 6 seconds from March 2009 to December 2010 [19]. This is a much higher number of visitors than WalkAlong, with a similar average visit duration. However, with different periods of time being evaluated for websites serving different purposes, it is difficult to establish standards for success. Instead, the overall engagement numbers may best be used to observe trends in usage across different time periods, which is why continual evaluation of the platform is encouraged.

Google Analytics also conforms to a marketing perspective of Web-based behavior rather than to a full evaluation of user behavior [20]. Thus, some of the available variables and information may not reflect scientific inquiry. This is further complicated by the fact that Google Analytics provides aggregate data, so testing for statistical significance for rigorous research purposes can be difficult. Furthermore, the validity of various indicators in measuring user engagement is yet to be established. The number of users may be inaccurate because a new client ID is assigned every time the user deletes the browser cookies, switches devices, or uses a different browser. This may result in the same user being counted as a new user [20]. In addition, a long session duration may not, in fact, indicate that users are engaging with the content, and a high bounce rate may not indicate that users quickly exited the page due to disinterest, as they could simply have found the relevant information they needed right away. A more detailed analysis of longitudinal user data or a mixed-methods assessment to supplement Google Analytics will be important for a more comprehensive process evaluation [31]. For instance, as this study looks only at the first year of operation of WalkAlong, a future study could examine subsequent years, comparing internet traffic following changes to the platform, some of which are based on the recommendations mentioned in this paper. Overall, Google Analytics, in combination with other evaluation methods (eg, focus groups, surveys, etc), will provide more accurate interpretations when conducting process evaluation.

Conclusion

Google Analytics was helpful in informing the process evaluation of an open-access Web-based mental health platform. The process evaluation provided information about marketing strategies as well as the aspects of the platform that required improvement. Ideas for future improvements may include marketing the WalkAlong platform outside Canada to get more users from other countries and making the platform more accessible and engaging for mobile users. The rich aggregate data, when combined with other evaluation methods, may provide more accurate interpretations to reinforce or challenge these ideas. Therefore, future studies should focus on developing a mixed methodology that includes Google Analytics to conduct process evaluation of open-access Web-based mental health platforms. With high-quality process evaluation, Web-based mental health interventions may be not only effective but also engaging.

Acknowledgments

The Bell Canada Let’s Talk mental health initiative supported the development of WalkAlong.ca.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Screenshots of pages from WalkAlong.

PPTX File, 1MB

  1. Hedman E, Ljótsson B, Lindefors N. Cognitive behavior therapy via the Internet: a systematic review of applications, clinical efficacy and cost-effectiveness. Expert Rev Pharmacoecon Outcomes Res. Dec 2012;12(6):745-764. [CrossRef] [Medline]
  2. Griffiths KM, Christensen H. Review of randomised controlled trials of Internet interventions for mental disorders and related conditions. Clin Psychol. Mar 2006;10(1):16-29. [CrossRef]
  3. Rice SM, Goodall J, Hetrick SE, Parker AG, Gilbertson T, Amminger GP, et al. Online and social networking interventions for the treatment of depression in young people: a systematic review. J Med Internet Res. Sep 16, 2014;16(9):e206. [FREE Full text] [CrossRef] [Medline]
  4. Andersson G, Titov N. Advantages and limitations of Internet-based interventions for common mental disorders. World Psychiatry. Feb 2014;13(1):4-11. [FREE Full text] [CrossRef] [Medline]
  5. Eysenbach G. The law of attrition. J Med Internet Res. Mar 31, 2005;7(1):e11. [FREE Full text] [CrossRef] [Medline]
  6. Christensen H, Griffiths KM, Farrer L. Adherence in internet interventions for anxiety and depression. J Med Internet Res. Apr 24, 2009;11(2):e13. [FREE Full text] [CrossRef] [Medline]
  7. Richards D, Richardson T. Computer-based psychological treatments for depression: a systematic review and meta-analysis. Clin Psychol Rev. Jun 2012;32(4):329-342. [CrossRef] [Medline]
  8. Calear AL, Christensen H, Mackinnon A, Griffiths KM. Adherence to the MoodGYM program: outcomes and predictors for an adolescent school-based population. J Affect Disord. May 2013;147(1-3):338-344. [CrossRef] [Medline]
  9. Fuhr K, Schröder J, Berger T, Moritz S, Meyer B, Lutz W, et al. The association between adherence and outcome in an Internet intervention for depression. J Affect Disord. Dec 15, 2018;229:443-449. [CrossRef] [Medline]
  10. Bruwer B, Sorsdahl K, Harrison J, Stein DJ, Williams D, Seedat S. Barriers to mental health care and predictors of treatment dropout in the South African Stress and Health Study. Psychiatr Serv. Jul 2011;62(7):774-781. [FREE Full text] [CrossRef] [Medline]
  11. Byrne L, Schoeppe S, Bradshaw J. Recovery without autonomy: Progress forward or more of the same for mental health service users? Int J Ment Health Nurs. Feb 15, 2018. [CrossRef] [Medline]
  12. Davis MJ, Addis ME. Predictors of attrition from behavioral medicine treatments. Ann Behav Med. Dec 1999;21(4):339-349. [CrossRef] [Medline]
  13. Pampallona S, Bollini P, Tibaldi G, Kupelnick B, Munizza C. Patient adherence in the treatment of depression. Br J Psychiatry. Feb 2002;180:104-109. [Medline]
  14. O'Brien HL, Toms EG. What is user engagement? A conceptual framework for defining user engagement with technology. J Am Soc Inf Sci Technol. Apr 2008;59(6):938-955. [CrossRef]
  15. National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention. Centers for Disease Control and Prevention. URL: https://www.cdc.gov/std/Program/pupestd/Types%20of%20Evaluation.pdf [accessed 2018-02-26] [WebCite Cache]
  16. Bhattacharya A, Hopkins S, Sallis A, Budd EL, Ashiru-Oredope D. A process evaluation of the UK-wide Antibiotic Guardian campaign: developing engagement on antimicrobial resistance. J Public Health (Oxf). Jun 01, 2017;39(2):e40-e47. [CrossRef] [Medline]
  17. Steckler A, Linnan L, editors. Process evaluation for public health interventions and research. San Francisco, CA. Jossey-Bass; Dec 01, 2004.
  18. Volker D, Zijlstra-Vlasveld MC, Brouwers EPM, van der Feltz-Cornelis CM. Process evaluation of a blended Web-based intervention on return to work for sick-listed employees with common mental health problems in the occupational health setting. J Occup Rehabil. Jun 2017;27(2):186-194. [FREE Full text] [CrossRef] [Medline]
  19. Crutzen R, Roosjen JL, Poelman J. Using Google Analytics as a process evaluation method for Internet-delivered interventions: an example on sexual health. Health Promot Int. Mar 2013;28(1):36-42. [FREE Full text] [CrossRef] [Medline]
  20. Clark D, Nicholas D, Jamali HR. Evaluating information seeking and use in the changing virtual world: the emerging role of Google Analytics. Learn Pub. Jul 01, 2014;27(3):185-194. [CrossRef]
  21. Kirk M, Morgan R, Tonkin E, McDonald K, Skirton H. An objective approach to evaluating an internet-delivered genetics education resource developed for nurses: using Google Analytics™ to monitor global visitor engagement. J Res Nurs. Oct 29, 2012;17(6):557-579. [CrossRef]
  22. McCloskey EV, Johansson H, Harvey NC, Compston J, Kanis JA. Access to fracture risk assessment by FRAX and linked National Osteoporosis Guideline Group (NOGG) guidance in the UK-an analysis of anonymous website activity. Osteoporos Int. Dec 2017;28(1):71-76. [CrossRef] [Medline]
  23. Danaher BG, Boles SM, Akers L, Gordon JS, Severson HH. Defining participant exposure measures in Web-based health behavior change programs. J Med Internet Res. 2006;8(3):e15. [FREE Full text] [CrossRef] [Medline]
  24. Makkar SR, Howe M, Williamson A, Gilham F. Impact of tailored blogs and content on usage of Web CIPHER - an online platform to help policymakers better engage with evidence from research. Health Res Policy Syst. Dec 01, 2016;14(1):85. [FREE Full text] [CrossRef] [Medline]
  25. Vona P, Wilmoth P, Jaycox LH, McMillen JS, Kataoka SH, Wong M, et al. A web-based platform to support an evidence-based mental health intervention: lessons from the CBITS web site. Psychiatr Serv. Nov 01, 2014;65(11):1381-1384. [FREE Full text] [CrossRef] [Medline]
  26. Thabrew H, Stasiak K, Merry S. Protocol for Co-Design, Development, and Open Trial of a Prototype Game-Based eHealth Intervention to Treat Anxiety in Young People With Long-Term Physical Conditions. JMIR Res Protoc. Sep 22, 2017;6(9):e171. [FREE Full text] [CrossRef] [Medline]
  27. Lindgaard G, Fernandes G, Dudek C, Brown J. Attention web designers: You have 50 milliseconds to make a good first impression!. Behav Inform Technol. Mar 2006;25(2):115-126. [CrossRef]
  28. Rainie L, Smith A. Pew Research Center. Washington, D.C.; Oct 18, 2013. URL: http://www.pewinternet.org/files/old-media//Files/Reports/2013/PIP_Tablets%20and%20e-readers%20update_101813.pdf [accessed 2017-07-28] [WebCite Cache]
  29. Luxton DD, McCann RA, Bush NE, Mishkind MC, Reger GM. mHealth for mental health: Integrating smartphone technology in behavioral healthcare. Prof Psychol:Res Pract. 2011;42(6):505-512. [CrossRef]
  30. Crutzen R, de Nooijer J, Brouwer W, Oenema A, Brug J, de Vries N. Effectiveness of online word of mouth on exposure to an Internet-delivered intervention. Psychol Health. Jul 01, 2009;24(6):651-661. [CrossRef] [Medline]
  31. Kaushik A. Web analytics: an hour a day. Indianapolis, IN. Wiley Publishing, Inc; 2007.

Abbreviations

PTSD: posttraumatic stress disorder


Edited by J Torous; submitted 07.09.17; peer-reviewed by A Mavragani, F Akbar, T Irizarry, F Solano Zapata; comments to author 01.10.17; revised version received 26.02.18; accepted 19.05.18; published 20.08.18.

Copyright

©Michael Jae Song, John Ward, Fiona Choi, Mohammadali Nikoo, Anastasia Frank, Farhud Shams, Katarina Tabi, Daniel Vigo, Michael Krausz. Originally published in JMIR Mental Health (http://mental.jmir.org), 20.08.2018.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on http://mental.jmir.org/, as well as this copyright and license information must be included.