JMIR Publications

JMIR Mental Health


Published on 23.11.16 in Vol 3, No 4 (2016): Oct-Dec


    Original Paper

    Development of a Mobile Phone App to Support Self-Monitoring of Emotional Well-Being: A Mental Health Digital Innovation

    1Emotion and Well-being Research Unit, School of Psychological Sciences, Monash University, Clayton, Australia

    2Centre for Positive Psychology, Melbourne Graduate School of Education, University of Melbourne, Melbourne, Australia

    *all authors contributed equally

    Corresponding Author:

    Nikki Rickard, BBSc (Hons), PhD

    Emotion and Well-being Research Unit

    School of Psychological Sciences

    Monash University

    Building 18

    Wellington Rd

    Clayton, 3800

    Australia

    Phone: 61 400 191 768

    Fax: 61 3 9905 3948

    Email:


    ABSTRACT

    Background: Emotional well-being is a primary component of mental health and well-being. Monitoring changes in emotional state daily over extended periods is, however, difficult using traditional methodologies. Providing mental health support is also challenging when only approximately 1 in 2 people with mental health issues seek professional help. Mobile phone technology offers a sustainable means of enhancing self-management of emotional well-being.

    Objective: This paper aims to describe the development of a mobile phone tool designed to monitor emotional changes in a natural everyday context and in real time.

    Methods: This evidence-informed mobile phone app monitors emotional mental health and well-being, and it provides links to mental health organization websites and resources. The app obtains data via self-report psychological questionnaires, experience sampling methodology (ESM), and automated behavioral data collection.

    Results: Feedback from 11 individuals (age range 16-52 years; 4 males, 7 females), who tested the app over 30 days, confirmed via survey and focus group methods that the app was functional and usable.

    Conclusions: Recommendations for future researchers and developers of mental health apps to be used for research are also presented. The methodology described in this paper offers a powerful tool for a range of potential mental health research studies and provides a valuable standard against which development of future mental health apps should be considered.

    JMIR Ment Health 2016;3(4):e49

    doi:10.2196/mental.6202


    Introduction

    Background

    Emotional well-being is broadly defined by the Mental Health Foundation as “a positive sense of well-being and an underlying belief in our own and others’ dignity and worth” ([1], p. 8). Consistent with dual models of well-being, it encompasses both positive functioning (happiness, a sense of control and self-efficacy, and social connectedness) and an absence of stress and depression [2,3]. Monitoring changes in emotional well-being is fundamental to mental health, with increases in emotional well-being associated with resilience, creative thinking, social connectivity, and physical health [4-9]. In contrast, significant and sustained decreases in emotional well-being are associated with the development of affective disorders such as depression and anxiety, and with reduced physical health [4,5,7].

    Monitoring for such changes is crucial for early detection of mental health problems. Rapid response to early risk indicators is one of the key predictors of better health outcomes, enabling preventative health approaches to be initiated early [10]. Regular monitoring of emotional health indices is therefore recommended by various national guidelines [11,12]. In practice, however, it remains difficult for clinicians or professional mental health service providers to obtain frequent monitoring in real time [13,14]. A priority challenge facing the health care system is to achieve practicable and sustainable means of supporting self-management of health and well-being. Self-monitoring is a particularly attractive goal for mental health care, given that many individuals with mental health needs do not seek professional health care support [15-17]. In addition, self-monitoring may develop an individual’s insight into their need to seek help. In particular, young people consistently indicate that they prefer nonprofessional or self-managed strategies for addressing mental health issues [18,19]. Obtaining temporally sensitive (eg, daily) information on significant changes in emotional state has the potential to profoundly improve the capacity to promote emotional health [12].

    Experience sampling methodologies (ESMs), or ecological momentary assessments, involve the systematic collection of self-report data from individuals at multiple time points throughout their everyday lives [20]. ESMs have been used to monitor changes in affective state and to predict mental health with some success [21,22]. In particular, the variability in emotional state over time provides more substantial information for understanding the causes and nature of psychopathology than do cross-sectional “snapshot” assessments. For example, when sampled multiple times a day for 6 days, negative affect was found to vary more across the day in patients diagnosed with major depressive disorder than in controls [23]. ESM assessments in individuals diagnosed with panic disorder also revealed that the expectation of a panic attack was a significant precursor for the occurrence of a panic attack [24]. Ben-Zeev et al [25] also found that patients diagnosed with a major depressive disorder retrospectively reported higher levels of symptoms relating to anhedonia, suicidality, and sadness than captured in their ESM reports, highlighting the biases of traditional survey methods. To date, however, it has been methodologically difficult and obtrusive to obtain temporally regular and precise measures of emotional state [21]. The resources required to obtain such information repeatedly over lengthy time frames have made such intensive monitoring prohibitive. In addition, the use of palm pilots and pagers (which were never as familiar to users as mobile phones have become) to prompt users for this information can be intrusive, and makes it less likely that users will continue to use this form of monitoring for extended periods [26].

    Mobile phone technology offers an unprecedented opportunity to unobtrusively track everyday behavior and changes in emotional state, all in real time [27,28]. Mobile phone health tools also offer the potential of immediate response to the outcome of this monitoring via delivery of mental health information contingent on changes in real-time emotional state [29]. This technology has not yet been fully leveraged for these purposes, despite mobile phones being one of the few pieces of technology that most people carry on their person every day [30]. This pervasiveness means that mobile phones offer a highly natural and regular means by which information on emotional state could be obtained. Mobile phones now penetrate 77%, 72%, and 68% of the Australian, US, and UK population, respectively [31], and are a cost-effective means of seeking help for mental health issues that may overcome socioeconomic and geographic boundaries [32,33].

    Mobile phone health technology holds great potential for facilitating the management of emotional health through its ability to deliver flexible, user-oriented intervention and self-management tools; a feature particularly relevant for young people, who often report fear of stigma associated with seeking professional services for sensitive mental health issues [34,35]. In a 2010 study, 76% of an Australian sample reported being interested in using mobile phones to monitor and manage their own mental health [32]. A large number of mobile phone apps are currently available that claim to promote mental health and well-being [36,37], and a subset of these also attempt to track mood or emotional state over time. However, empirical support for the efficacy of these apps is extremely limited [36]. For instance, in a systematic review of 5464 mental health app abstracts, fewer than 5 apps were found to have experimental evidence [37]. In addition, few have capitalized on the benefits enabled by mobile phone technology, such as experience sampling and automated data collection, for identifying and evaluating potential time-sensitive behavioral indicators of mental health change [36].

    Among the mobile phone mental health programs that have utilized ESM to track mood over time, several favorable outcomes have been reported. For example, Reid et al [28,38] found that the majority of their adolescent sample using the mobile phone-based mental health app, mobiletype, completed their self-assessments, and that use of the app increased the practitioners’ understanding of their patients’ mental health. Harrison et al [29] reported that use of the mobile phone-accessed, Web-based cognitive behavioral therapy (CBT) course MyCompass for 6 weeks significantly reduced symptoms of depression and anxiety and improved self-efficacy. One of the barriers to sustained user engagement in such programs, however, is that they require extensive voluntary input from the user. When evaluated, a common theme is initial compliance followed by high dropout and poor self-reporting rates (eg, less than 10% of the sample trialing MyCompass reported using it every day) [29]. Reasons for discontinued use include problems understanding how to use the program, invasiveness of the questions, the need for repetitive completion of questionnaires, insufficient personalization of the mental health advice, and little motivation to engage with the program [28,29].

    An innovative way to meet this challenge is to monitor indices of emotional health using methods that require minimal insight or subjective report from the user. Mobile phones contain a range of embedded sensors and features, including accelerometers and global positioning systems and apps, which can automatically record information about a user’s behavior [39]. Two recent studies have obtained a combination of data from mobile phones in an attempt to predict participants’ self-reported mood. LiKamWa et al [40] found that up to 93% of mood scores were accurately predicted by social activity, physical activity, and general mobile phone use data collected from mobile phones. Asselbergs et al [41] attempted to predict self-reported mood of 27 participants from metadata of 6 mobile phone indices (phone calls, text messages, screen time, app usage, accelerometer, and phone camera events). Although the accuracy of the models was no greater than models obtained without mobile phone data, the methodology was demonstrated to be technically feasible and to hold promise. The authors recommended that inclusion of more meaningful or relevant features from mobile phone data may be the key to improving prediction.

    Interestingly, young people use mobile phones for music listening, fitness, and social networking more than any other demographic [42], and these are among the most effective strategies for optimizing emotional health [43-46]. For example, the frequency of app-switching and the content of social network messages were found to predict depression [43] even prior to its onset [47]. Music listening patterns also appear to predict emotional health [48-50] and given that approximately two thirds of music listening by young people is via mobile devices such as mobile phones [31], it is surprising that relatively few apps have attempted to use music for this purpose [27]. Vocal expression too has been found to be a useful index of emotional state [51,52]. Short voice samples have been found to demonstrate 70% accuracy for simple affect recognition [53]. Monitoring a combination of behavioral indices such as physical activity, online social interactions, and music choices therefore offers a promising means of nonintrusive but sensitive assessment of affective state. Advances in statistical methods available through machine learning also enable powerful analysis of this more complex level of individualized multilevel modeling [52,54].

    Another limitation of most mental health apps currently available is that they tend to oversimplify the emotional well-being spectrum, with positive and negative affect as anchors on a unidimensional rating scale. Contemporary conceptualizations of well-being, however, clearly show that optimal “emotional health and well-being” does not emerge from an absence of affective disorder alone, but also requires a state of positive functioning [2,55,56]. Although positive and negative emotional functioning are correlated, there is substantial evidence that they are distinct constructs [57]. Mobile phone technology that differentiates the quadrants created by crossing mental illness or languishing with mental health or flourishing [3,55] is therefore encouraged.
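The quadrant categorization this dual-continuum view implies can be sketched in a few lines. The sketch below is purely illustrative; the scores, cutoffs, and quadrant labels are hypothetical and are not the instruments or thresholds MoodPrism used:

```python
def wellbeing_quadrant(symptom_score, functioning_score,
                       symptom_cut=6, functioning_cut=50):
    """Cross two separate axes (symptom level and positive functioning)
    to yield one of 4 well-being quadrants. All scales and cutoffs
    here are hypothetical stand-ins."""
    illness = "high symptoms" if symptom_score >= symptom_cut else "low symptoms"
    positive = "flourishing" if functioning_score >= functioning_cut else "languishing"
    return (illness, positive)
```

A unidimensional mood slider collapses these two axes into one score; keeping them separate, as above, is the distinction the dual-continuum model draws.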

    Objective

    In this paper, we capitalized on the extraordinary role that mobile phones play in people’s lives to develop a tool with the potential to significantly extend the understanding of emotional health and well-being. The aim of this paper was to describe the design of the mobile phone app, MoodPrism, which was developed to monitor emotional well-being in context and in real time, and to provide personalized feedback on the full spectrum of emotional well-being. The paper describes in detail the design and data collection functions of the app, which were incorporated to address major challenges for mental health research and practice, and presents feedback from a small sample of trial users (beta-testers) who tested the functionality and usability of the app.


    Methods

    Design and Development of the App

    MoodPrism was designed and developed in collaboration with a commercial digital creation studio, Two Bulls (Melbourne, Australia). The app was prepared for both the iOS and Android mobile phone platforms and was distributed by the Web-based Apple and Google Play stores, respectively. The term “MoodPrism” was selected to reflect its primary purpose of collecting emotional state data across the entire spectrum of emotional health and well-being and converting this into an array of color-coded feedback to the user.

    The development of MoodPrism involved designing 3 different methods of data collection within the software: (1) automated monitoring of selected online behavior, (2) experience sampling of emotional well-being self-reports, and (3) psychological assessment questionnaires. This triangulation of data collection is considered crucial for advancing the measurement of emotional state [58]. As part of the sign-up procedure for the app, permissions for sensitive data had to be obtained. Incentives to continue collecting data over an extended period were also generated.

    The development of MoodPrism was completed in March 2015. The required forms of data collection were achieved by developing a suite of app components, which were then collated into a cohesive app. The outcomes of this development process are described in the following.

    Sign Up

    As part of the sign-up procedure for the app, users were offered options to provide the app with access to social networking and music apps as well as general (postcode) location. These data were then collected continuously, without the need for user input, over the month-long research period. After sign-up and consent procedures, MoodPrism administered the initial surveys, which could be completed in multiple sittings and required 30-60 min in total. The participants were then requested to use the app for at least 30 days, during which they would be prompted daily to answer a set of short questions and weekly to complete a short audio recording. If they were unable to respond to a daily prompt immediately, MoodPrism advised that they could complete it at a more convenient time until midnight that day, or alternatively ignore it. At the end of the 30 days, users were invited to complete a final set of surveys, which required 15-30 min in total.

    Users were incentivized to continue using MoodPrism through 3 strategies. First, daily mood and mental health feedback was provided to the user, with additional feedback unlocked after sustained use (Multimedia Appendix 2). This promoted engagement by rewarding users and encouraging feelings of achievement, adhering to principles of gamification [59], which is recommended in mental health apps [36]. Second, completion of daily reports as well as the final surveys generated entries into a draw for 1 of 4 AU $100 (approximately US $75) gift vouchers. Third, users were informed that their data were contributing to research into the value of mobile phone apps for monitoring mental health and well-being.

    Automated Monitoring

    MoodPrism acted as a portal for data accessed via several mobile phone sensors and apps. Two validated predictors of emotional state change were targeted: music use and web-based social network site activity. As a part of the sign-up process, users were invited to give permission for the app to access Facebook, Twitter, the user’s music library, and location (postcode only).

    Facebook, Twitter, and music use data were collected once every 24 h; the information collected is listed in Multimedia Appendix 1. Data were accessed from Facebook and Twitter through their respective application programming interfaces (APIs), which allow third-party access to selected data collected by both platforms. Facebook and Twitter content was analyzed automatically and locally on the user’s phone using several linguistic dictionaries from the Linguistic Inquiry and Word Count (LIWC) [60]. Summaries were obtained for frequencies of emotion words, supplemented with a range of emoticons and Internet slang expressions for emotions. Social words and personal pronoun counts were also obtained. A word count for the target categories in the dictionary was extracted, and these counts were uploaded to the server. This was repeated every 24 h to collect the posts that occurred across the duration of MoodPrism use. The post content temporarily stored by MoodPrism was then deleted.
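The on-device aggregation step described above can be sketched roughly as follows. This is a minimal illustration, not the actual LIWC implementation: the category word lists are invented stand-ins (the real LIWC dictionaries are proprietary and contain thousands of entries per category), and the tokenizer is deliberately crude:

```python
import re
from collections import Counter

# Miniature stand-in dictionaries; the real LIWC lexicons are
# proprietary and far larger.
CATEGORIES = {
    "positive_emotion": {"happy", "glad", "love", ":)", "lol"},
    "negative_emotion": {"sad", "angry", "hate", ":("},
    "social": {"friend", "family", "talk", "we"},
    "personal_pronouns": {"i", "me", "my", "we", "you"},
}

def aggregate_counts(posts):
    """Tokenize each post locally and count category hits.
    Only these aggregate counts would ever leave the device;
    the post text itself is never uploaded."""
    counts = Counter({name: 0 for name in CATEGORIES})
    total_words = 0
    for post in posts:
        # Crude tokenizer: words plus a couple of emoticon shapes.
        tokens = re.findall(r"[:;][()]|[\w']+", post.lower())
        total_words += len(tokens)
        for name, vocab in CATEGORIES.items():
            counts[name] += sum(1 for t in tokens if t in vocab)
    counts["total_words"] = total_words
    return dict(counts)
```

Because only the resulting counts are uploaded, the raw post content stays on the phone, which is the privacy property the design relies on.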

    Experience Sampling

    MoodPrism utilized ESM to deliver a short set of questions to users daily (Figure 1). Prompts were delivered at a quasi-random time between user-defined hours (eg, 9:00 am-9:00 pm) for 30 days.
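A scheduler of this kind can be sketched as below. MoodPrism's actual scheduling code is not published, so the uniform sampling within the window and the function shape are assumptions made for illustration:

```python
import random
from datetime import datetime, time, timedelta

def schedule_prompts(start=time(9, 0), end=time(21, 0), days=30, seed=None):
    """Return one quasi-random prompt datetime per day, drawn
    uniformly from the user-defined window (start to end)."""
    rng = random.Random(seed)
    first_day = datetime.now().date()
    window_s = (datetime.combine(first_day, end)
                - datetime.combine(first_day, start)).total_seconds()
    prompts = []
    for d in range(days):
        day = first_day + timedelta(days=d)
        # A uniform draw over the window gives an unpredictable but
        # bounded prompt time each day.
        offset = timedelta(seconds=rng.uniform(0, window_s))
        prompts.append(datetime.combine(day, start) + offset)
    return prompts
```

Randomizing the prompt time within a user-chosen window is what makes the sampling "quasi-random": unpredictable enough to capture varied contexts, but never outside hours the user has agreed to.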

    The questions captured a real-time assessment of the user’s emotional well-being, event-related experiences, and their context. Emotional state questions comprised 4 questions on psychological ill-health (depression and anxiety), 4 on emotional state (positive and negative affect, arousal, and control), and 4 on positive functioning (social connection, motivation, meaning, and self-esteem). Positive and negative event-related experiences were assessed by the type of event experienced and a rating of the event’s affective strength (from “slightly” to “extremely” positive or negative). The type of event was selected from a range of options drawn from stressor event questionnaires [61-65] and modified into a short list of the most common event domains (eg, school or work, physical health, material possessions, or social experience). Context was assessed via 2 questions: 1 for social context (who the user was with at the time of the report) and 1 for environmental context (where they were at the time of the report). Specific questions are given in Table 1.

    In addition, a weekly prompt was delivered that requested a short voice recording to serve as an implicit measure of emotional state [51,53]. Users were prompted to read a standardized piece of text at the start and the end of the recording, and within that window to describe freely how they were feeling at that time.

    Psychological Assessment Questionnaires

    A number of questionnaires were available for completion at the onset of app use, providing baseline measures of emotional well-being as well as data on potential moderators or confounding variables (see Figure 2). These questionnaires were categorized into survey “blocks” and displayed on the MoodPrism homescreen until their completion. This served to organize the questionnaires into manageable chunks that users could complete in their own time. A subset of these questionnaires was also delivered at the end of the month-long period to enable assessment of whether the app may have affected the well-being measures. A description of these questionnaires is provided in Table 2.

    Figure 1. Screen shots from app showing experience sampling method.
    Figure 2. Screenshots showing examples of longer psychological questionnaires.
    Table 1. Qualitative feedback: questions guiding qualitative feedback forums.
    Table 2. Sample feedback provided by beta-testers.
    Feedback

    The final design feature of MoodPrism was the provision of a range of feedback to the user on their emotional well-being and mental health. This feedback was organized in consultation with the Australian mental health organizations beyondblue [66] and headspace [67], research literature on mental health and well-being, and expert advice on currently available mental health apps.

    The feedback was available at several stages (see Multimedia Appendix 2):

    • On the completion of a survey block, users were provided a summary of their general score on one of the surveys within that block.
    • On completion of each daily report, users were provided with a color-coded brief description and custom emoticon representing their emotional state on that day. Weekly and monthly overviews were also available when multiple ESMs were completed.
    • On completion of 1 week’s worth of ESMs, positive mental health data were collated to provide individualized feedback (based on the user’s positive health responses), which included links to positive health websites and apps.
    • On completion of 2 weeks’ worth of ESMs, depression and anxiety data were collated to provide individualized feedback on mental illness risk (based on their PHQ-4 responses). Recommendations and supporting links to mental health websites or contacts were also provided, as well as advice suitable to the user’s emotional functioning over the past 2 weeks.
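The unlock thresholds listed above amount to a simple mapping from days of completed ESM reports to feedback tiers. The function below is an illustrative sketch of that logic (the tier names and function shape are assumptions; the 1-week and 2-week thresholds are from the text):

```python
def unlocked_feedback(esm_days_completed):
    """Map sustained ESM completion to unlocked feedback tiers:
    1 week unlocks positive mental health feedback, and
    2 weeks unlock PHQ-4-based risk feedback."""
    tiers = ["daily color-coded mood summary"]
    if esm_days_completed >= 7:
        tiers.append("positive mental health feedback")
    if esm_days_completed >= 14:
        tiers.append("PHQ-4-based mental illness risk feedback")
    return tiers
```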

    Database Security and Storage

    With such extensive and potentially identifiable information being collected by MoodPrism, data storage and data security were a major priority. The following considerations were made regarding data storage, in adherence with industry and University [68] standards, the Privacy and Data Protection Act 2014, and the Guidelines for Ethical Practice in Psychological Research Online as outlined by the British Psychological Society [69].

    Immediately following survey collection, data were stored on the user’s mobile phone before being uploaded, in encrypted form, to a secure database every 24 h. All data uploaded from the user’s phone were stored on an Amazon Web Services server. This database was protected by a firewall and regularly updated security protocols. The data stored were anonymized at the point of upload: all potentially identifiable information was removed, and only the device ID was retained (functioning as a randomly generated participant code). Data were only accessible online by authorized users via Secure Shell (SSH), which authenticates server access with digital certificates and encrypted passwords. All communication between authorized users and the server also occurred through HTTPS. This ensured that all information passed between the server and the researchers was encrypted and could not be accessed or manipulated by a third party.
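The anonymize-at-upload step can be illustrated as follows. The field names, and the choice to hash the device ID rather than use it directly, are assumptions made for this sketch; the paper states only that identifying information was removed and the device ID retained as a participant code:

```python
import hashlib

# Hypothetical identifying fields that would be stripped before upload.
IDENTIFYING_FIELDS = {"name", "email", "phone_number", "contacts"}

def anonymize_for_upload(record, device_id):
    """Drop potentially identifying fields from a record and attach
    a hashed device ID as the participant code, so the same device
    maps to the same code without exposing the raw ID."""
    clean = {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}
    clean["participant_code"] = hashlib.sha256(device_id.encode()).hexdigest()[:16]
    return clean
```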

    With regard to social media data, explicit consent to access Facebook or Twitter accounts (“opt-in”) was provided by the user. Their social media credentials were stored locally on the phone and were never uploaded to the server. All Facebook and Twitter post content was processed locally in the mobile phone’s memory, and aggregated word counts were generated. Only the aggregate word counts were uploaded to the storage server.


    Results

    The app was initially tested by both the researchers and the app developers for minor issues and bugs. A small convenience sample of independent, nonclinical users (N=11; age range 16-52 years; 4 males, 7 females) was then recruited to test the app and provide the researchers and app developers with feedback on its functionality and usability. They used MoodPrism daily over a 30-day period and kept notes on their user experience. Information about the study was provided to the participants, and electronic consent was required before the app could be used.

    The test sample was invited to provide more intensive qualitative feedback by either Web-based questionnaire (n=5) or via attendance at a focus group session (n=6). Focus group participants also provided quantitative feedback by completing the Mobile Application Rating Scale (MARS) [70]. The MARS is a multidimensional measure for trialing and rating the quality of mobile phone apps, and has demonstrated interrater reliability and internal consistency. All beta-testers were also invited to discuss or provide emailed notes on their user experience. Broad questions were posed, and prompts were provided where necessary (see Table 1). (No attempt was made to analyze the emotional well-being data from the beta-testers, as the sample was small, and this aim was beyond the scope of the current paper, the primary aim of which was to provide information on the development of the app.)

    Themes extracted from the comments provided via the focus group or Web-based feedback are presented in Table 2.

    The testing of the app with this sample was approved by the Monash University Human Research Ethics Committee (Approval # CF14/968 – 2014000398). App development was completed in 2015, and the app was tested over June-July 2015. The app was then revised in response to the feedback received, and the final version was prepared. The app was then released on the Google Play (Android) and Apple (iOS) stores. Future publications will report empirical data from this app; the scope of the current publication is limited to the development process only.

    Feedback about the functionality and usability of the app was obtained from 11 beta-testers, who completed a standard survey of app usability, the MARS. The results are presented in Figure 3.

    MARS ratings for the MoodPrism app exceeded the average rating for the 50 apps reviewed by Stoyanov et al [70] on each MARS subscale. The highest satisfaction ratings were obtained for items relating to the app’s graphics quality (eg, buttons, icons), gestural design (eg, swipes, scrolls), ease of use (eg, clear menus), credibility of the information sources, layout aesthetics, and increased awareness of mood. The lowest ratings were obtained for entertainment value (eg, fun to use), customization options, likelihood to change behavior, motivation to address mood, interest, and likelihood to recommend the app to others.

    The results from the focus group sessions and emailed responses from all 11 beta-testers are also summarized in Table 2.

    The majority of issues identified by the beta-testers were addressed in the final version of the app. For instance, the order of positively or negatively worded options was made consistent across all questionnaires; additional information on how location and social networking data would be used was provided, along with reassurance that the information collected was deidentified; and an explanatory key was provided for interpreting colors and emoticons. The only issues that could not be addressed related to the integrity of psychometrically validated questionnaires (whose wording could not be altered), the inclusion of negative content (which was important to the primary purpose of the app), or installation difficulties (which related to the trial version only and would not be present in the Apple and Android Web-based stores).

    Figure 3. Quantitative feedback: beta-tester ratings on the Mobile Application Rating Scale (MARS) subscales (N=11).
    View this figure

    Discussion

    Principal Findings

    In this paper, we demonstrated how mobile phone technology can be harnessed to overcome several challenges in current mental health research and practice. Key needs we aimed to meet by developing this tool included the following: real-time monitoring of emotional functioning, assessment of the full spectrum of emotional well-being, confidential access to mental health support and information when required, and reduced obtrusiveness of regular monitoring.

    MoodPrism was developed on both iOS and Android mobile phone platforms as an app to monitor emotional well-being in real time. It achieved this using ESM and collection of behavioral data via mobile phone apps (addressing challenge 1). It included assessment of daily positive psychological functioning (or “flourishing” [55]) in addition to more traditional assessment of negative psychological functioning (depression and anxiety) (addressing challenge 2). MoodPrism offered users a range of resources and links to enhance mental health literacy and access to professional mental health support, which varied depending on their current emotional functioning (addressing challenge 3). MoodPrism also incorporated voice monitoring, social networking site, and music playlist data collection as first steps toward less obtrusive monitoring of emotional well-being over extended periods (addressing challenge 4), although extensive algorithmic modeling will be necessary to achieve this goal. In sum, MoodPrism successfully responded to 4 key challenges in the emotional mental health domain. A number of important lessons were also learned during this project, which may be helpful to outline for future researchers considering developing a mental health app [36].

    Considerations When Developing a Research-Based Mental Health App

    Development of mental health apps is a relatively young field, and the guidelines to support researchers and app developers are not yet widespread. During the development of MoodPrism, a number of key issues were identified that could be helpful to researchers developing apps for mental health research and practice. These issues are briefly outlined in the following and then recommendations for consideration in future research are summarized in Figure 1.

    First, it is important to recognize the different priorities of app developers and researchers (and mental health practitioners). For example, the MoodPrism researchers’ main goals were database integrity, psychometrically sound questionnaires, and ethical administration of sensitive content, whereas the app developers’ main goals were an enjoyable user experience, good design, a simple user interface, brief page content, and anonymous data storage. Identifying these goals and agreeing on how they should be prioritized can help in designing an app that combines functionality (so that it will be used by participants) with integrity (so that the data are suitable for analysis). With MoodPrism, the researchers’ priority of maintaining the psychometric properties of questionnaires conflicted with the app developers’ priority of good user interface and design. The challenge of administering long questionnaires was overcome by creating brief checkpoints or “blocks” of surveys to complete, each with a portion of feedback provided as a reward to incentivize completion. Similarly, the developers’ database priorities were guided by industry standards for data collection and storage. At times, this conflicted with the researchers’ need to obtain sufficient detail; for example, the anonymity of social media posts initially prevented the integrity of coding processes from being verified. Coding solutions were eventually achieved, but considerable delays could have been avoided if the database requirements had been thoroughly discussed at the project’s outset. When these conflicting priorities were identified, a solution was often achieved that produced the unexpected benefit of optimizing outcomes for both stakeholders. For example, the chunking of questionnaires not only improved the user experience but also was likely to improve the validity of the data, as participants were less likely to fatigue or resort to nonserious responding.

    Second, sufficient time should be quarantined at the outset for planning, and at the completion for beta testing and revision. App developers’ schedules can overlook the details involved in translating research requirements into the app space and, as a result, underestimate the time involved. Database APIs for commercial apps also tend to have simpler output requirements than is often essential for advanced statistical analyses. A failure to identify the more complex necessities of the app’s function at the outset can result in an overly simplistic translation of features into the app, and subsequent delays while revisions are made to meet research needs. Time spent presenting the entire app’s contents clearly up front to app developers will help avoid significant delays during development. Sufficient time should also be allowed at the outset for complete storyboarding and wireframing of the app to ensure both parties agree on its format and presentation. Aesthetics that work well in commercial apps do not always translate well to research content, which may out of necessity include lengthier content or inflexible formatting or labeling of items (eg, traditional Likert-type scales in psychological questionnaires). Samples of similar app presentations that are known to work effectively with this type of content should, where possible, be reviewed and their best features identified. Allowing sufficient time for planning should also ensure that clear milestone dates are set, after which no further changes or additional content can be made by researchers or practitioners until trialing. Ongoing modifications can magnify delays for app developers and confuse the versions being delivered. Sufficient time when the app is being finalized is also critical. Beta testers should be given a long enough trial period to test the app in various contexts, and the schedule should ensure that they are able to report back both individually and, where possible, as part of a group discussion. Focus groups are invaluable for identifying common themes across users, as well as allowing more singular experiences to emerge.

    Third, communication between app developers and researchers or practitioners should be managed centrally. A flexible Web-based platform (such as “Basecamp”) provides project management tools such as discussion threads, allocation of tasks, a central file repository, and reminders. Progress on tasks should be monitored regularly and updates provided when item check-off is delayed. Clear assignment of tasks prevents tasks from being overlooked and ensures accountability.

    Fourth, methods to evaluate the app should be included within the app itself. Commercial apps may contain simple “thumbs up” or star ratings, but these are unlikely to be sufficiently informative for research or practitioner needs. Importantly, it is helpful to obtain assessments of the various aspects of the app, including commercial considerations such as aesthetics and functionality as well as those of central interest to researchers, such as ethics, trust, and integrity. Published app assessment measures, such as the Mobile App Rating Scale (MARS) for health apps [70], should be used where possible. This allows standardization and comparability across apps in the mental health space, and helps build an evidence base for improving mental health apps over time.
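    The MARS yields subscale ratings (eg, engagement, functionality, aesthetics, information quality) that are typically averaged into an overall quality score. The sketch below illustrates that aggregation with invented ratings; it should not be read as the official MARS scoring procedure.

```python
# Illustrative aggregation of MARS-style ratings (items scored 1-5).
# Subscale names echo the published MARS, but the item counts and
# scores here are invented for illustration.

def mars_quality_score(ratings):
    """ratings: {subscale: [item scores]} -> (subscale means, overall mean)."""
    subscale_means = {s: sum(items) / len(items)
                      for s, items in ratings.items()}
    overall = sum(subscale_means.values()) / len(subscale_means)
    return subscale_means, overall

subscale_means, overall = mars_quality_score({
    "engagement":    [4, 3, 4, 5, 3],
    "functionality": [5, 4, 4, 4],
    "aesthetics":    [4, 4, 3],
    "information":   [4, 3, 4, 4, 3, 4, 4],
})
```

    Reporting subscale means alongside the overall score preserves the distinction between commercial qualities (aesthetics, functionality) and research-relevant ones (information quality).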

    Our experiences researching and developing mental health apps have yielded a number of important practical insights of value to researchers in this field. The issues highlighted during the development of MoodPrism, taken together with our recommendations documented elsewhere [36], are summarized in Figure 4.

    Potential Applications of MoodPrism in Psychological Research

    The development of a research mobile phone tool such as MoodPrism has enormous potential within the mental health field. Several applications of MoodPrism currently in progress are summarized in the following to illustrate the power of flexible, real-time monitoring using this platform.

    Figure 4. Recommended steps for researchers engaging in the app development process.
    Automated Prediction of Mental Health Risk

    One of the most exciting promises of data-rich apps like MoodPrism is the development of algorithms that allow automated prediction of emotional health. Such modeling could determine the minimum number of constructs required to reliably predict significant changes in emotional well-being, which could in turn inform a more streamlined and user-friendly app. Importantly, it is unlikely that any 1 or 2 variables will provide reliable prediction of such changes; a strength of MoodPrism is that it provides a breadth of variables that can be used to answer diverse and important research questions. Various algorithms may be identified, for instance, that discriminate between periods of stability and decline; MoodPrism could then unobtrusively monitor for this change and provide targeted mental health support to the user. This extends previous research demonstrating the feasibility of such modeling [40,41,71] by utilizing predictors already established in previous research to be associated with mental health (such as online social networking) rather than only those mobile phone sensors that are convenient to record (such as app use and activity).
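    As a toy illustration of the kind of decline detection described above, the sketch below flags a sustained drop in daily mood ratings relative to a personal baseline. The window sizes, the drop threshold, and the function names are assumptions for illustration, not parameters from the MoodPrism study.

```python
# Illustrative sketch only: flag a sustained decline in daily ESM mood
# ratings (eg, on a 1-10 scale) relative to a personal baseline.
# Window sizes and thresholds are assumptions, not MoodPrism values.

def rolling_mean(xs, window):
    """Means over each consecutive run of `window` values."""
    return [sum(xs[i:i + window]) / window
            for i in range(len(xs) - window + 1)]

def flag_decline(daily_mood, baseline_days=7, window=3, drop=1.5):
    """True if any recent `window`-day mean falls `drop` points below baseline."""
    if len(daily_mood) < baseline_days + window:
        return False  # not enough data to establish a baseline
    baseline = sum(daily_mood[:baseline_days]) / baseline_days
    return any(m < baseline - drop
               for m in rolling_mean(daily_mood[baseline_days:], window))

stable = [6, 7, 6, 7, 6, 7, 6, 6, 7, 6, 7]
declining = [6, 7, 6, 7, 6, 7, 6, 4, 3, 3, 2]
```

    In a deployed app, such a hand-tuned rule would be replaced by a model fitted to the full breadth of variables MoodPrism records.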

    Improving Emotional Self-Awareness, Mental Health Literacy, and Mental Health and Well-Being Outcomes

    Bakker et al [36] detail how mental health apps can be categorized as reflection-, education-, or problem-focused. MoodPrism is largely a reflection-focused app aimed at improving a user’s emotional self-awareness (ESA) by encouraging the user to report their thoughts, feelings, or behaviors and then reflect upon them. There is also an education component in MoodPrism that provides access to mental health information and resources. Use of this type of mental health app may therefore result in improvements in mental health and well-being. Kauer et al [72] found evidence that using a mobile phone app that promotes self-reflection through mood tracking can increase ESA and decrease depressive symptoms. However, rigorous study is still needed to explore the mental health benefits of MoodPrism and other similar reflection-focused or education-focused apps, as very few randomized controlled trials have investigated the efficacy of mental health apps [37]. Importantly, mobile phone technology complements traditional emotion monitoring techniques such as CBT-based recording worksheets [73,74] by capturing subtle changes in behavior in real time. The innovative pairing of changes in emotional well-being with rapid delivery of mental health information has the potential to improve a user’s access to relevant resources, such as Web-based health portals (eg, eheadspace, eHub) or local GPs, when it is needed [75-77].

    Leveraging Behavioral Data on Social Media to Gain Insight Into Mental Health and Social Context

    Users of social networking sites leave rich digital traces of their social behavior, including the structure of their friendship networks and the written interactions between connections [78-80]. The quality of interactions on social network services (SNSs) has been shown to hold important relationships with mental health: positive interactions are associated with better mental health outcomes, whereas negative interactions may exacerbate mental illness [81-83]. However, how individual characteristics might lead a user to benefit or suffer from their SNS use is yet to be clearly described [84]. Answering this question requires access both to SNS data and to psychometrically sound surveys that profile the users of SNSs. By profiling SNS users and better tapping into interindividual variation in SNS use, the accuracy of SNS language models for mental health prediction could be improved [85], and some of the conflicting findings on SNS use and its mental health impact could be disentangled [85]. Furthermore, apps like MoodPrism enable SNS data to be associated in real time with ESM assessments of mood and with psychological surveys. Time-sensitive linking of self-reported mood change and emotional expression in SNS posts may also provide evidence to support the use of SNS data and language analysis as a tool for tracking mood and mental health over time.
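    As a minimal illustration of the language-analysis approach referred to above, the sketch below counts words from emotion categories in an SNS post, in the spirit of LIWC. The tiny word lists are illustrative stand-ins, not the actual LIWC dictionaries.

```python
import re

# Illustrative stand-in word lists; the real LIWC dictionaries are far
# larger and are proprietary.
LEXICON = {
    "positive": {"happy", "great", "love", "excited"},
    "negative": {"sad", "alone", "worried", "tired"},
}

def category_rates(post):
    """Proportion of words in each emotion category for one post."""
    words = re.findall(r"[a-z']+", post.lower())
    total = len(words) or 1  # avoid division by zero on empty posts
    return {cat: sum(w in vocab for w in words) / total
            for cat, vocab in LEXICON.items()}

rates = category_rates("So tired and worried today, feeling alone")
```

    Rates like these, computed per post and timestamped, are what could then be linked to concurrent ESM mood reports.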

    Predicting Resilience Patterns to Everyday Significant Events

    Event-based resilience research explores individual capacities to maintain healthy psychological functioning in response to naturally occurring stressor events [86,87]. Previous research has typically used cross-sectional designs and relied on retrospective reports [88-90]. These provide only partial snapshots of an individual’s capacity for resilient responding and can be subject to recall biases. MoodPrism’s collection of daily reports of psychological well-being, together with the presence or absence of stressor events, is therefore pertinent to advancing event-based resilience research methodologies. Such approaches allow multiple snapshots of mood responding that, when compiled, create a more representative, real-time observation of the dynamic fluctuations in an individual’s mood responses to stressor events. Such data will permit more accurate exploration and identification of the heterogeneous mood trajectories that individuals display following stressor experiences [85,87,91-93]. Favorable patterns of responding, reflecting the maintenance of psychological functioning, can be identified and profiled to explore the factors that discriminate resilient individuals from groups showing less-resilient patterns of responding.
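    To make the trajectory idea concrete, the sketch below labels a post-stressor mood series as resilient when it returns to, and stays near, the pre-stressor baseline within a few days. The tolerance and recovery window are illustrative assumptions, not criteria from the resilience literature cited here.

```python
# Illustrative sketch: classify a post-stressor mood trajectory.
# `tolerance` and `recovery_days` are assumptions for illustration only.

def classify_trajectory(pre, post, tolerance=1.0, recovery_days=3):
    """pre/post: daily mood ratings before/after a stressor event."""
    baseline = sum(pre) / len(pre)
    near_baseline = [abs(m - baseline) <= tolerance for m in post]
    # Resilient: back within tolerance of baseline within `recovery_days`,
    # and staying there for the rest of the observation window.
    if any(near_baseline[:recovery_days]) and all(near_baseline[recovery_days:]):
        return "resilient"
    return "non-resilient"
```

    With daily ESM data, trajectories labeled this way could be compiled across many stressor events and users, and the labels compared against the psychological profiles collected in-app.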

    Conclusions

    Mental health apps such as MoodPrism maximize health impact by harnessing the opportunities offered by mobile phone technology. Approximately three-quarters of the US and Australian populations own a mobile phone, and around 3 in 4 of those never leave home without their mobile device [31,94]. People check their mobile phones up to 150 times a day [30], demonstrating that mobile devices offer unprecedented access to everyday behavior. Incorporating evidence-based monitoring of emotional health into routine mobile phone apps can provide a powerful and flexible methodology for increasing personal control over one’s own emotional health. Capitalizing on inbuilt tools within mobile phones (such as music players, voice recorders, and social network media) to contribute data further enhances the potential of such apps to sensitively monitor emotional health over extended periods, while remaining unobtrusive. People (particularly young people) often find mobile phone technologies more engaging, more anonymous, and less stigmatizing than other means of accessing help, and are therefore much more likely to use them [16]. The new technologies described in this paper not only complement traditional approaches and educational tools supporting mental health but also have the potential to enhance their reach by overcoming many of the barriers currently challenging the reliable surveillance of emotional well-being.

    Acknowledgments

    This research was funded by beyondblue.

    Conflicts of Interest

    None declared.

    Multimedia Appendix 1

    Details on 3 forms of data (automatic, experience sampling, and psychological surveys) collected from MoodPrism.

    PDF File (Adobe PDF File), 55KB

    Multimedia Appendix 2

    Feedback generated by the subjects while using MoodPrism.

    PDF File (Adobe PDF File), 594KB

    References

    1. St John T, Leon L, McCulloch A. Childhood and adolescent mental health: understanding the lifetime impacts. The Mental Health Foundation. 2015. [FREE Full text] [CrossRef]
    2. Greenspoon PJ, Saklofske DH. Toward an integration of subjective well-being and psychopathology. Soc Indic Res 2001;54(1):81-108.   URL: http://link.springer.com/article/10.1023/A:1007219227883 [WebCite Cache]
    3. Westerhof GJ, Keyes CL. Mental illness and mental health: the two continua model across the lifespan. J Adult Dev 2010 Jun;17(2):110-119 [FREE Full text] [CrossRef] [Medline]
    4. Diener E, Chan MY. Happy people live longer: subjective well-being contributes to health and longevity. Appl Psychol Health Well-being 2011;3(1):1-43. [CrossRef]
    5. Fredrickson BL. What good are positive emotions? Rev Gen Psychol 1998 Sep;2(3):300-319 [FREE Full text] [CrossRef] [Medline]
    6. Folkman S, Moskowitz JT. Positive affect and the other side of coping. Am Psychol 2000 Jun;55(6):647-654. [Medline]
    7. Garland EL, Fredrickson B, Kring AM, Johnson DP, Meyer PS, Penn DL. Upward spirals of positive emotions counter downward spirals of negativity: insights from the broaden-and-build theory and affective neuroscience on the treatment of emotion dysfunctions and deficits in psychopathology. Clin Psychol Rev 2010 Nov;30(7):849-864 [FREE Full text] [CrossRef] [Medline]
    8. Isen AM. Positive affect. In: Dalgeish T, Power MJ, editors. Handboook of Cognition and Emotion. New York: Wiley; 1999:521-539.
    9. Tugade MM, Fredrickson BL, Barrett LF. Psychological resilience and positive emotional granularity; examining the benefits of positive emotions on coping and health. J Pers 2004;72(6):1161-1190. [CrossRef]
    10. Goodwin GM. Time in the course of major depressive disorder. Medicographia 2010;32:126-132. [FREE Full text]
    11. National Institute for Health and Care Excellence. Depression: management of depression in primary and secondary care. 2016.   URL: https://www.nice.org.uk/guidance/CG23 [accessed 2016-06-15] [WebCite Cache]
    12. McDermott B, Baigent M, Chanen A. beyondblue Expert Working Committee Clinical Practice Guidelines: Depression in Adolescents and Young Adults. Melbourne: beyondblue; 2010.
    13. Hetrick SE, Thompson A, Yuen K, Finch S, Parker AG. Is there a gap between recommended and 'real world' practice in the management of depression in young people? A medical file audit of practice. BMC Health Services 2012;12:178. [CrossRef]
    14. Morrato EH, Libby AM, Orton HD, Degruy 3rd FV, Brent DA, Allen R, et al. Frequency of provider contact after FDA advisory on risk of pediatric suicidality with SSRIs. Am J Psychiatry 2008 Jan;165(1):42-50. [CrossRef] [Medline]
    15. Cooper PJ, Goodyer I. A community study of depression in adolescent girls: I. Estimates of symptom and syndrome prevalence. Br J Psychiatry 1993;163:369-74, 379-80. [Medline]
    16. Rickwood D, Deane FP, Wilson CJ, Ciarrochi JV. Young people’s help-seeking for mental health problems. Aust eJournal Adv Ment Health 2005;4(3):218-251.   URL: http://ro.uow.edu.au/hbspapers/2106/ [WebCite Cache]
    17. Rickwood DF, Deane FP, Wilson CJ. When and how do young people seek professional help for mental health problems? Med J Aust 2007;187(7 Suppl):S35-S39. [Medline]
    18. Burns JR, Rapee RM. Adolescent mental health literacy: young people's knowledge of depression and help seeking. J Adolesc 2006 Apr;29(2):225-239. [CrossRef] [Medline]
    19. Jorm AF, Wright A, Morgan AJ. Where to seek help for a mental disorder? National survey of the beliefs of Australian youth and their parents. Med J Aust 2007 Nov 19;187(10):556-560. [Medline]
    20. aan het Rot M, Hogenelst K, Schoevers RA. Mood disorders in everyday life: a systematic review of experience sampling and ecological momentary assessment studies. Clin Psychol Rev 2012;32(6):510-523. [Medline]
    21. Myin-Germeys I, Oorschot M, Collip D, Lataster J, Delespaul P, van Os J. Experience sampling research in psychopathology: opening the black box of daily life. Psychol Med 2009 Sep;39(9):1533-1547. [CrossRef] [Medline]
    22. Telford C, McCarthy-Jones S, Corcoran R, Rowse G. Experience Sampling Methodology studies of depression: the state of the art. Psychol Med 2012 Jun;42(6):1119-1129. [CrossRef] [Medline]
    23. Peeters F, Berkhof J, Delespaul P, Rottenberg J, Nicolson NA. Diurnal mood variation in major depressive disorder. Emotion 2006;6(3):383-391. [CrossRef] [Medline]
    24. Kenardy J, Fried L, Kraemer H, Taylor CB. Psychological precursors of panic attacks. Br J Psychiatry 1992 May;160:668-673. [Medline]
    25. Ben-Zeev D, Young MA. Accuracy of hospitalized depressed patients' and healthy controls' retrospective symptom reports: an experience sampling study. J Nerv Ment Dis 2010;198(4):280-285. [Medline]
    26. Miller G. The smartphone psychology manifesto. Perspect Psychol Sci 2012 May;7(3):221-237. [CrossRef] [Medline]
    27. Randall WM, Rickard NS. Development and trial of a mobile Experience Sampling Method (m-ESM) for personal music listening. Music Perception Interdisciplinary J 2012;31(2):157-170.   URL: http://mp.ucpress.edu/content/31/2/157 [WebCite Cache] [CrossRef]
    28. Reid SC, Kauer SD, Khor AS, Hearps SJ, Sanci LA, Kennedy AD, et al. Using a mobile phone application in youth mental health - an evaluation study. Aust Fam Physician 2012 Sep;41(9):711-714. [Medline]
    29. Harrison V, Proudfoot J, Wee PP, Parker G, Pavlovic DH, Manicavasagar V. Mobile mental health: review of the emerging field and proof of concept study. J Ment Health 2011 Dec;20(6):509-524. [CrossRef] [Medline]
    30. Google. Our mobile planet: United States of America. 2013 May.   URL: http://services.google.com/fh/files/misc/omp-2013-us-en.pdf [WebCite Cache]
    31. Proudfoot J, Parker G, Hadzi PD, Manicavasagar V, Adler E, Whitton A. Community attitudes to the appropriation of mobile phones for monitoring and managing depression, anxiety, and stress. J Med Internet Res 2010;12(5):e64 [FREE Full text] [CrossRef] [Medline]
    32. Quine S, Bernard D, Booth M, Kang M, Usherwood T, Alperstein G, et al. Health and access issues among Australian adolescents: a rural-urban comparison. Rural Remote Health 2003;3(3):245 [FREE Full text] [Medline]
    33. Poushter J. Smartphone ownership and Internet usage continues to climb in emerging economies. Pew Research Center. 2016 Feb 22.   URL: http:/​/www.​pewglobal.org/​2016/​02/​22/​smartphone-ownership-and-internet-usage-continues-to-climb-in-emerging-economies/​ [accessed 2016-06-14] [WebCite Cache]
    34. Burns JM, Durkin LA, Nicholas J. Mental health of young people in the United States: what role can the internet play in reducing stigma and promoting help seeking? J Adolesc Health 2009 Jul;45(1):95-97. [CrossRef] [Medline]
    35. Rickwood D. Promoting youth mental health through computer-mediated communication. Int J Ment Health Promotion 2010;12(3):32-44.
    36. Bakker D, Kazantzis N, Rickwood D, Rickard NS. Mental health smartphone apps: review and evidence-based recommendations for future developments. JMIR Ment Health 2016;3(1):e7. [CrossRef] [Medline]
    37. Donker T, Petrie K, Proudfoot J, Clarke J, Birch MR, Christensen H. Smartphones for smarter delivery of mental health programs: a systematic review. J Med Internet Res 2013;15(11):e247. [CrossRef] [Medline]
    38. Reid SC, Kauer SD, Hearps SJ, Crooke AH, Khor AS, Sanci LA, et al. A mobile phone application for the assessment and management of youth mental health problems in primary care: health service outcomes from a randomised controlled trial of mobiletype. BMC Fam Pract 2013 Jun 19;14:84 [FREE Full text] [CrossRef] [Medline]
    39. Lathia N, Pejovic V, Rachuri KK, Mascolo C, Musolesi M, Rentfrow PJ. Smartphones for large scale behavior change interventions. IEEE CS. 2013   URL: http://www.cs.bham.ac.uk/~pejovicv/docs/Lathia2013IEEEPervasive.pdf [accessed 2016-11-02] [WebCite Cache]
    40. LiKamWa R, Liu Y, Lane N, Zhong L. MoodScope: building a mood sensor from smartphone usage patterns. In: Proceedings of the 11th Annual International Conference on Mobile Systems, Applications, and Services. 2013 Presented at: MobiSys '13, the 11th Annual International Conference on Mobile Systems, Applications, and Services; Jun 25-28, 2013; Taipei, Taiwan. [CrossRef]
    41. Asselbergs J, Ruwaard J, Ejdys M, Schrader N, Sijbrandij M, Riper H. Mobile phone-based unobtrusive ecological momentary assessment of day-to-day mood: an explorative study. J Med Internet Res 2016 Mar;18(3):e72 [FREE Full text] [CrossRef] [Medline]
    42. Nielsen. A Nielsen report on the myths and realities of teen media trends. 2009.   URL: http://blog.nielsen.com/nielsenwire/reports/nielsen_howteensusemedia_june09.pdf./ [accessed 2016-06-14] [WebCite Cache]
    43. De Choudhury M, Gamon M, Counts S, Horvitz E. Predicting depression via social media. Microsoft. 2013.   URL: http://research.microsoft.com/apps/pubs/default.aspx?id=192721 [accessed 2016-06-15] [WebCite Cache]
    44. Kross E, Verduyn P, Demiralp E, Park J, Lee DS, Lin N, et al. Facebook use predicts declines in subjective well-being in young adults. PLoS One 2013;8(8):e69841 [FREE Full text] [CrossRef] [Medline]
    45. Motl RW, Birnbaum AS, Kubik MY, Dishman RK. Naturally occurring changes in physical activity are inversely related to depressive symptoms during early adolescence. Psychosom Med 2004;66(3):336-342. [Medline]
    46. Thayer RE, Newman JR, McClain TM. Self-regulation of mood: strategies for changing a bad mood, raising energy, and reducing tension. J Pers Soc Psychol 1994 Nov;67(5):910-925. [Medline]
    47. De Choudhury M. Predicting postpartum changes in emotion and behavior via social media. In: CHI '13 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2013 Presented at: Conference on Human Factors in Computing Systems; 2013; Paris, France p. 3267-3276. [CrossRef]
    48. Miranda D. Music listening and mental health: variations on internalizing psychopathology. In: Macdonald R, Kreutz G, Mitchell L, editors. Music, Health, and Well-being. Oxford: Oxford University Press; 2012.
    49. North A, Hargreaves D. The Social and Applied Psychology of Music. Oxford: Oxford University Press; 2008.
    50. Primack BA, Silk JS, DeLozier CR, Shadel WG, Dillman Carpentier FR, Dahl RE, et al. Using ecological momentary assessment to determine media use by individuals with and without major depressive disorder. Arch Pediatr Adolesc Med 2011 Apr;165(4):360-365 [FREE Full text] [CrossRef] [Medline]
    51. Ooi KE, Lech M, Allen NB. Multichannel weighted speech classification system for prediction of major depression in adolescents. IEEE Trans Biomed Eng 2013 Feb;60(2):497-506. [CrossRef] [Medline]
    52. Scherer K, Bänziger T, Roesch EB. Blueprint for Affective Computing: A Sourcebook. Oxford: Oxford University Press; 2010.
    53. Weninger F, Eyben F, Schuller BW, Mortillaro M, Scherer KR. On the acoustics of emotion in audio: what speech, music, and sound have in common. Front Psychol 2013;4:292 [FREE Full text] [CrossRef] [Medline]
    54. Calvo RA, D'Mello S, Gratch J, Kappas A. The Oxford Handbook of Affective Computing. Oxford: Oxford University Press; 2015.
    55. Keyes CL. Mental illness and/or mental health? Investigating axioms of the complete state model of health. J Consult Clin Psychol 2005 Jun;73(3):539-548. [CrossRef] [Medline]
    56. Suldo SM, Shaffer EJ. Looking beyond psychopathology: the dual-factor model of mental health in youth. School Psychology Review 2008;37:52-68.
    57. Huppert FA. Psychological well-being: evidence regarding its causes and consequences. Appl Psychol Health Well-Being 2009;1(2):137-164.
    58. Scherer K. Ways to study the nature and frequency of our daily emotions: reply to the commentaries on 'Emotions in everyday life'. Soc Sci Inf 2004;43:667-689.
    59. Kapp KM. The gamification of learning and instruction: Game-Based Methods and Strategies for Training and Education. 1st edition. San Francisco, CA: Pfeiffer; 2012.
    60. Pennebaker JW, Chung C, Ireland M, Gonzales A, Booth RJ. The LIWC 2007 Application. LIWC. 2016.   URL: http://liwc.wpengine.com/ [accessed 2016-06-15] [WebCite Cache]
    61. Kroenke K, Spitzer RL, Williams JB, Löwe B. An ultra-brief screening scale for anxiety and depression: the PHQ-4. Psychosomatics 2009;50(6):613-621. [CrossRef] [Medline]
    62. Mehrabian A, Russell J. An Approach to Environmental Psychology. 1st edition. Cambridge, MA: MIT Press; 1974.
    63. Russell JA. A circumplex model of affect. J Pers Soc Psychol 1980;39(6):1161-1178.
    64. Bech P, Kjoller OR, Rasmussen NK. Measuring well-being rather than the absence of distress symptoms: a comparison of the SF-36 Mental Health subscale and the WHO-Five Well-Being Scale. Int J Methods Psychiatr Res 2003;12(2):85-91. [Medline]
    65. Kroenke K, Spitzer RL, Williams JB. The PHQ-9: validity of a brief depression severity measure. J Gen Intern Med 2001 Sep;16(9):606-613 [FREE Full text] [Medline]
    66. beyondblue. beyondblue: Depression, Anxiety   URL: https://www.beyondblue.org.au/ [WebCite Cache]
    67. Headspace. Welcome to headspace.   URL: https://headspace.org.au/ [WebCite Cache]
    68. Monash University. Privacy policy. 2016   URL: http://www.privacy.monash.edu/ [accessed 2016-06-15] [WebCite Cache]
    69. The British Psychological Society. Research guidelines & policy documents. 2016   URL: http:/​/www.​bps.org.uk/​publications/​policy-and-guidelines/​research-guidelines-policy-documents/​research-guidelines-poli [accessed 2016-06-15] [WebCite Cache]
    70. Stoyanov S, Hides L, Kavanagh D, Zelenko O, Tjondronegoro D, Mani M. Mobile App Rating Scale: a new tool for assessing the quality of health mobile apps. JMIR mHealth uHealth 2015;3(1):e27. [CrossRef] [Medline]
    71. Picard RW, Vyza E, Healey J. Toward machine emotional intelligence: analysis of affective physiological state. IEEE Trans Pattern Analysis Machine Intell 2001;23(10):1175-1191.
    72. Kauer SD, Reid SC, Crooke AH, Khor A, Hearps SJ, Jorm AF, et al. Self-monitoring using mobile phones in the early stages of adolescent depression: randomized controlled trial. J Med Internet Res 2012;14(3):e67. [CrossRef] [Medline]
    73. Bakker G. Practical CBT: Using Functional Analysis and Standardised Homework in Everyday Therapy. Bowen Hills, Qld: Australian Academic Press; 2008.
    74. Kazantzis N, MacEwan J, Dattilio FM. A guiding model for practice. In: Kazantzis N, Deane FP, Ronan KR, L'Abate L, editors. Using Homework Assignments in Cognitive Behavior Therapy. New York: Routledge; 2005:359-407.
    75. Bennett K, Reynolds J, Christensen H, Griffiths KM. e-hub: an online self-help mental health service in the community. Med J Aust 2010;192(11 Suppl):S48-S52. [Medline]
    76. McGorry P, Bates T, Birchwood M. Designing youth mental health services for the 21st century: examples from Australia, Ireland, the UK. Br J Psychiatry 2013;202(s54):s30-s35. [CrossRef]
    77. Rickwood DJ, Telford NR, Mazzer KR, Parker AG, Tanti CJ, McGorry PD. The services provided to young people through the headspace centres across Australia. Med J Aust 2015 Jun 1;202(10):533-536. [Medline]
    78. Ellison N, Boyd DM. Sociality through social network sites. In: Dutton WF, editor. The Oxford Handbook of Internet Studies. Oxford: Oxford University Press; 2013:151-172.
    79. Kramer AD, Guillory JE, Hancock JT. Experimental evidence of massive-scale emotional contagion through social networks. In: Proc Natl Acad Sci U S A. 2014 Jun 17 Presented at: Proceedings of the National Academy of Sciences, United States; 2014 Jun 17; p. 8788-8790. [CrossRef]
    80. Park G, Schwartz HA, Eichstaedt JC, Kern ML, Kosinski M, Stillwell DJ, et al. Automatic personality assessment through social media language. J Pers Soc Psychol 2015;108(6):934-952. [CrossRef] [Medline]
    81. Davila J, Hershenberg R, Feinstein BA, Gorman K, Bhatia V, Starr LR. Frequency and quality of social networking among young adults: associations with depressive symptoms, rumination, and corumination. Psychol Popular Media Cult 2012;1(2):72-86. [CrossRef]
    82. Feinstein BA, Bhatia V, Hershenberg R, Davila J. Another venue for problematic interpersonal behavior: the effects of depressive and anxious symptoms on social networking experiences. J Soc Clin Psychol 2012 Apr;31(4):356-382. [CrossRef]
    83. Valkenburg PM, Peter J, Schouten AP. Friend networking sites and their relationship to adolescents' well-being and social self-esteem. Cyberpsychol Behav 2006 Oct;9(5):584-590. [CrossRef] [Medline]
    84. Seabrook E, Kern M, Rickard NS. Social networking sites, depression, and anxiety: a systematic review. JMIR Mental Health 2016 (forthcoming).
    85. Bonanno GA. Loss, trauma, and human resilience: have we underestimated the human capacity to thrive after extremely aversive events? Am Psychol 2004 Jan;59(1):20-28. [CrossRef] [Medline]
    86. Preotiuc-Pietro D, Eichstaedt JC, Park G, Sap M, Smith L, et al. The role of personality, age, and gender in tweeting about mental illness. 2015 Presented at: Proceedings of the Workshop on Computational Linguistics and Clinical Psychology: From Linguistic Signal to Clinical Reality, NAACL; 2015; Denver, Colorado.
    87. Bonanno GA, Diminich ED. Annual research review: positive adjustment to adversity—trajectories of minimal-impact resilience and emergent resilience. J Child Psychol Psychiatry 2013 Apr;54(4):378-401 [FREE Full text] [CrossRef] [Medline]
    88. Dumont M, Provost MA. Resilience in adolescents: protective role of social support, coping strategies, self-esteem, and social activities on experience of stress and depression. Journal of Youth and Adolescence 1999;28:343. [CrossRef]
    89. Herman-Stahl M, Petersen AC. The protective role of coping and social resources for depressive symptoms among young adolescents. J Youth Adolescence 1996;25:733. [CrossRef]
    90. Noor NM, Alwi A. Stressors and well-being in low socio-economic status Malaysian adolescents: the role of resilience resources. Asian Journal of Social Psychology 2013;16:292-306. [CrossRef]
    91. Bonanno GA, Field NP, Kovacevic A, Kaltman S. Self-enhancement as a buffer against extreme adversity: civil war in Bosnia and traumatic loss in the United States. Personality and Social Psychology Bulletin 2002;28(2):184-196. [CrossRef]
    92. Ryff CD, Singer B. From social structure to biology: integrative science in pursuit of human health and well-being. In: Snyder CR, Lopez SJ, editors. Handbook of Positive Psychology. New York: Oxford University Press; 2002:541-555.
    93. Ryff CD, Singer BH, Dienberg Love G. Positive health: connecting well-being with biology. Philos Trans Royal Soc Lond B Biol Sci 2004;359(1449):1383-1394. [CrossRef] [Medline]
    94. Meeker M, Wu L. KPCB Internet trends 2013. Slideshare. 2013.   URL: http://www.slideshare.net/kleinerperkins/kpcb-internet-trends-2013 [accessed 2016-06-14] [WebCite Cache]


    Abbreviations

    APIs: application programming interfaces
    CBT: cognitive behavioral therapy
    ESA: emotional self-awareness
    ESM: experience sampling methodology
    LIWC: Linguistic Inquiry and Word Count
    MARS: Mobile App Rating Scale
    SNS: social network service
    SSH: Secure Shell


    Edited by J Strauss; submitted 14.06.16; peer-reviewed by M Prietula, D Rickwood, J Asselbergs, S Rubinelli, TR Soron; comments to author 10.07.16; revised version received 10.08.16; accepted 04.10.16; published 23.11.16

    ©Nikki Rickard, Hussain-Abdulah Arjmand, David Bakker, Elizabeth Seabrook. Originally published in JMIR Mental Health (http://mental.jmir.org), 23.11.2016.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on http://mental.jmir.org/, as well as this copyright and license information must be included.