Published on 20.07.2020 in Vol 7, No 7 (2020): July

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/16338.
Implementation Determinants and Outcomes of a Technology-Enabled Service Targeting Suicide Risk in High Schools: Mixed Methods Study

Original Paper

1Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, United States

2Seattle Children's Research Institute, Seattle, WA, United States

3Qntfy, Arlington, VA, United States

4The Dartmouth Institute for Health Policy and Clinical Practice, Geisel School of Medicine at Dartmouth, Hanover, NH, United States

Corresponding Author:

Molly Adrian, PhD

Department of Psychiatry and Behavioral Sciences

University of Washington

6200 NE 74th St

Suite 110

Seattle, WA, 98115

United States

Phone: 1 206 221 1689

Email: adriam@uw.edu


Background: Technology-enabled services (TESs), which integrate human service and digital components, are popular strategies to increase the reach and impact of mental health interventions, but large-scale implementation of TESs has lagged behind their potential.

Objective: This study applied a mixed qualitative and quantitative approach to gather input from multiple key user groups (students and educators) and to understand the factors that support successful implementation (implementation determinants) and implementation outcomes of a TES for universal screening, ongoing monitoring, and support for suicide risk management in the school setting.

Methods: A total of 111 students in the 9th to 12th grade completed measures regarding implementation outcomes (acceptability, feasibility, and appropriateness) via an open-ended survey. A total of 9 school personnel (school-based mental health clinicians, nurses, and administrators) completed laboratory-based usability testing of a dashboard tracking the suicide risk of students, quantitative measures, and qualitative interviews to understand key implementation outcomes and determinants. School personnel were presented with a series of scenarios and common tasks focused on the basic features and functions of the dashboard. Directed content analysis based on the Consolidated Framework for Implementation Research was used to extract multilevel determinants (ie, the barriers or facilitators at the levels of the outer setting, inner setting, individuals, intervention, and implementation process) related to positive implementation outcomes of the TES.

Results: Overarching themes related to implementation determinants and outcomes suggest that both student and school personnel users view TESs for suicide prevention as moderately feasible and acceptable based on the Acceptability of Intervention Measure and Feasibility of Intervention Measure and as needing improvements in usability based on the System Usability Scale. Qualitative results suggest that students and school personnel view passive data collection based on social media data as a relative advantage to the current system; however, the findings indicate that the TES and the school setting need to address issues of privacy, integration into existing workflows and communication patterns, and options for individualization for student-centered care.

Conclusions: Innovative suicide prevention strategies that rely on passive data collection in the school context are a promising and appealing idea. Usability testing identified key issues for revision to facilitate widespread implementation.

JMIR Ment Health 2020;7(7):e16338

doi:10.2196/16338


Introduction

Background

Suicide is the second leading cause of death for adolescents, and the rate of suicide in the United States has increased in recent years [1,2]. Suicidal thoughts and behaviors (including suicidal ideation, nonsuicidal self-injury, and suicide attempts) increase dramatically during adolescence and are unfortunately common. Data suggest that 17% of US high school students seriously consider suicide each year, and approximately 9% report a suicide attempt [3]. Despite this being a common health concern, our ability to predict suicide is poor [4].

High schools provide a convenient and accessible setting to promote mental health, as most children attend high schools, removing practical barriers for mental health services and promoting care for traditionally underserved groups [5-9]. Thus, schools have the potential to play an essential role in supporting the identification and treatment of youth with mental health difficulties. School is also the most common community setting where suicidal ideation is identified [10]; however, the majority of schools demonstrate poor adherence to gold standard practices related to the identification and treatment of youth at risk [11,12]. Many high schools do not adopt recommended suicide prevention strategies because of practical concerns, such as the capacity to manage false positives, lack of knowledge of evidence-based practices, fear, stigma, as well as legal and ethical issues [11,13].

Technology-enabled services (TESs) hold promise for adolescent suicide prevention because of their capacity to support best practices while achieving population health, and they may help address many of the practical concerns that act as barriers to the adoption of suicide prevention strategies (ie, cognitive load, burden, and costs). TESs are characterized by having both a human service component (eg, therapist-delivered psychotherapies) and a digital component (eg, a dashboard app) that supports, or is supported by, the service [14]. A broad range of supports falls under the category of TES, including web- or app-based supports for posttraumatic stress disorder treatment (eg, Prolonged Exposure Coach), interventions for insomnia (eg, Cognitive Behavioral Therapy for Insomnia Coach), and substance use disorders (eg, reSET) [15,16]. Efficacy trials indicate that TESs yield effects commensurate with well-established psychological interventions for depression, anxiety, and suicidal ideation and behavior [17-19]. Nearly all teens are heavy users of smartphones and other digital technologies and use these technologies for health-related concerns [20]. Furthermore, adolescents are seeking TESs to manage their health [21]. TESs that extend suicide prevention programs via text messaging (eg, Text4Strength, an extension of Sources of Strength) show initial feasibility, safety, and utility from the adolescent perspective [22].

High schools represent a key setting in which TESs can be effectively adapted and applied. An important suicide prevention research priority is to evaluate how TESs may address many of the challenges that schools face in adopting suicide prevention strategies. For example, a TES that automatically and continuously monitors social media (SM) data and requires virtually no staff or resources, such as time in the classroom, to execute may be valuable in reducing the burden of universal screening. Strategies that rely on student-generated SM data may also reduce common concerns about specific suicide or emotional health screening tools [23,24]. Furthermore, because such a TES does not rely on explicit reports of suicidal ideation or behavior, the potential for student stigmatization is substantially reduced. A platform that allows for passive data aggregation and monitoring is ongoing rather than linked to one particular assessment time point, facilitating an identification approach that aligns with the episodic nature of suicidality [23-25]. For these reasons, strategic monitoring of SM has the potential to considerably impact public health via scalable early detection and intervention to decrease adolescent suicide rates.

Although research on the efficacy of TES for mental health broadly—and suicide prevention specifically—is promising [21], school settings have experienced few benefits from TES. Much of this is likely because of insufficient attention to (1) end users’ priorities and experiences regarding aspects of the technology and the human service and (2) the implementation strategies that promote their adoption and sustained use [14,26]. To improve TES implementation, developers are increasingly turning to the methods of human-centered design (HCD) to identify and address problematic system design and its impact on otherwise appealing and effective products [27]. HCD includes a set of approaches that ground the development process in information about the needs and desires of people who will ultimately use a product, with the goal of creating compelling, intuitive, and effective innovations [28,29]. Usability testing, a hallmark of HCD, provides an opportunity for representative end users to interact with the technology, complete specific tasks, and generate information about the functionality and presentation. Owing to the potential impact of TES usability on implementation outcomes [30], we prioritized this determinant in the study design.

Objectives

The goal of this research was to evaluate the implementation outcomes of acceptability, feasibility, and appropriateness from the primary users of a TES (Quinn Therapeutic) for school-based suicide prevention (ie, school staff and students). These perceived implementation outcomes are critical precursors to adoption and use [30] and can be most effectively assessed at early project stages, before actual implementation occurs [31]. As articulated by Proctor et al [31], acceptability is defined as perceptions that an innovation is agreeable, palatable, or satisfactory; feasibility is the extent to which a new innovation can be successfully used or carried out within a given agency or setting; and appropriateness refers to the innovation’s fit, relevance, and compatibility with the setting, staff, and target problem. We also evaluated implementation determinants, including usability factors, driving our primary implementation outcomes. Our specific research questions were as follows: (1) What key, multilevel factors in the school context should drive the adaptation and implementation of the student-monitoring dashboard interface of Quinn Therapeutic? and (2) What changes to the digital dashboard interface are needed to maximize its acceptability, feasibility, appropriateness, and ultimate usability for school systems? We focused on the dashboard interface in applying HCD because it is the most salient, user-facing feature of the TES.


Methods

Participants and Procedures

Participants were drawn from an urban area in the Pacific Northwest. Recruitment occurred through past and ongoing research partnerships. Interested principals and/or school counselor leads were presented with information regarding the study procedures in an initial meeting and then recontacted if they were interested in offering students and school personnel the opportunity to participate. Recruitment of clinician participants occurred in the winter of 2018, with user testing procedures completed in February and March; student recruitment and data collection procedures were performed in May 2018. All procedures were approved by our institutional review board (study 3246).

Procedures for School Personnel: User Testing

In total, 9 school personnel whose responsibilities are most proximal to school-based suicide screening—school nurses, counselors, social workers, and administrators—were invited to participate in the laboratory-based user testing of our student-monitoring dashboard.

School personnel were included to identify key aspects of the Quinn Therapeutic system that were most in need of redesign. Drawing from established models of user testing [32], participants were presented with a series of scenarios containing common tasks to accomplish. Tasks focused on the primary features of the dashboard system (ie, open exploration, identifying a student at risk for suicide, and identifying the risk status of a new student). During task completion, participants used a think-aloud data collection technique [33], describing their processes and experiences as they navigated the system. Anticipated and actual task difficulties were assessed consistent with Albert and Dixon’s [34] method. During system testing, participants rated the expected difficulty of each task on a 5-point scale (ranging from 1=very easy to 5=very difficult). Following task completion, participants rated the experienced difficulty on the same scale. We used prompts such as "What are you thinking about?", "What details are you looking for?", and "What is your impression of this task?" to elicit information. Following completion of each task and the posttask rating, we asked several follow-up questions, including "Why did you answer the way you did?" and "Was there anything confusing, unusual, or difficult to understand about this task?", as well as additional questions specific to individualized tasks. Each session concluded with a qualitative open-ended interview to gather additional feedback about the system and implementation determinants as well as completion of standardized measures of implementation outcomes and system usability (System Usability Scale [SUS]) [35].
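To make the expectation-measure analysis concrete, the sketch below compares anticipated with experienced difficulty ratings per task, in the spirit of Albert and Dixon's method. It is illustrative only, not the study's analysis code; the task names and ratings are invented.

```python
# Minimal sketch (assumed data) of the Albert and Dixon expectation measure:
# compare each task's anticipated (pre) and experienced (post) difficulty.
from statistics import mean

# Ratings on a 5-point scale (1=very easy, 5=very difficult); values invented.
ratings = {
    "open exploration":         {"pre": [2, 3, 2], "post": [2, 2, 1]},
    "identify at-risk student": {"pre": [3, 3, 4], "post": [3, 4, 4]},
    "isolate date range":       {"pre": [3, 2, 3], "post": [4, 5, 4]},
}

for task, r in ratings.items():
    pre, post = mean(r["pre"]), mean(r["post"])
    # post > pre means the task was harder than participants anticipated,
    # flagging a likely usability problem in that part of the dashboard
    print(f"{task}: pre={pre:.2f} post={post:.2f} delta={post - pre:+.2f}")
```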

Procedures for Students: Social Media Data and Survey

Following school administrator approval, 111 students were recruited from a private high school in an urban area of the Pacific Northwest. Students were eligible to participate if they attended high school and used 1 of 5 popular SM platforms (Twitter, Facebook, Instagram, Reddit, and Tumblr) on a weekly basis.

The opportunity for participation was presented in an all-class assembly. A brief orientation to the procedures was given, and assent was obtained from youth. Parents/caregivers received a letter from the principal informing them of the study procedures and offering the option to opt their high school student out of the study procedures. Youth who provided assent were given access to the University of Washington OurDataHelps website to donate SM data and complete questionnaires regarding their preferences for the Quinn Therapeutic dashboard. Students opted into study participation through the OurDataHelps website [36], which provided study information and a web-based data collection platform. Student data collection comprised SM data donation and questionnaires regarding emotional health, implementation outcomes, and SM use and preferences, completed via a web or mobile platform. Following the presentation, eligible students were given 1 week to access the survey platform and complete study procedures. Students received US $30 for participation.

Materials: Quinn Therapeutic Dashboard

Quinn Therapeutic is a TES that aggregates patient-generated SM data to detect and monitor suicide risk, visualize data over time, and provide feedback to clinicians, which may be particularly suitable in the high school context [37-39]. Quinn Therapeutic’s core digital and human service features are outlined below (Table 1). The technology relies on deep learning (a subset of machine learning algorithms that enables a computer to discover and use patterns in data) trained and optimized using SM data [40]. The aggregated SM data and risk ratings based on machine learning algorithms are presented in a student-monitoring dashboard interface, which clinicians log into to view estimated student risk status, data over time, and the SM content driving ratings of risk. The dashboard’s purpose is to allow school personnel to monitor student suicide risk over time. Core functions are represented in the web-based supplement. A timeline shows the overall data contribution for the population; there is a search function for finding specific students, and tabs categorize students at high risk for ease of viewing the at-risk population with past suicidal ideation and self-harm. School personnel can also view individuals, including their suicide risk level over time (ie, risk level graph), the source SM data that generate the risk ratings, the strength and valence of sentiment (ie, sentiment graph), and students’ self-reported diagnostic and suicide risk information. Quinn Therapeutic’s predictive algorithms were developed and evaluated using data from users who donated their data to an online portal as well as publicly available data from Twitter and demonstrated impressive accuracy in distinguishing those with self-reported suicide attempts from those who did not report this history [41]. The algorithms demonstrated the capability to separate users who would attempt suicide from neurotypical controls. Evaluating SM data from the month before a suicide attempt, the area under the curve (AUC) from receiver operating characteristics for this binary decision task was 0.89; for all available SM data, the AUC was 0.94 (an AUC of 1 is perfect prediction) [42]. Quinn Therapeutic’s human service components include measurement-based care, crisis prevention planning, and risk management, all of which align with recommendations for the identification and management of adolescent suicide risk [43]. As a first step in evaluating the application of the Quinn Therapeutic TES in the school setting, the current project focused only on an early prototype of Quinn Therapeutic’s digital technology component.
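For readers unfamiliar with AUC-based evaluation, the following sketch shows how such a binary decision task could be scored with standard tooling. The labels and model scores are synthetic; this is not Qntfy's model or data, only an illustration of the metric reported above.

```python
# Hedged sketch: evaluating a binary at-risk classifier with ROC AUC,
# as in the evaluation reported for the Quinn Therapeutic algorithms.
# All labels and scores below are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=200)  # 1 = self-reported suicide attempt
# Hypothetical model output: posterior probability that a user's SM
# language resembles at-risk language (higher for true positives here).
y_score = np.clip(0.6 * y_true + rng.normal(0.2, 0.25, size=200), 0, 1)

auc = roc_auc_score(y_true, y_score)  # 0.5 = chance, 1.0 = perfect prediction
print(f"AUC = {auc:.2f}")
```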

Table 1. Quinn Therapeutic technology specifications.

Data aggregation: Ongoing data capture from five SMa platforms (Facebook, Instagram, Twitter, Reddit, and Tumblr)

Digital platform components
  Risk prediction: Relies on deep learning, specifically refined for suicide-specific predictions from multiple cohorts
  Visualization of progress: Student-monitoring dashboard interface for selected school personnel
  Data security and privacy: Health Insurance Portability and Accountability Act–compliant cloud-based server, opt-in participation, and meets recommendations for ethical use of SM data [44]

Service components
  Measurement-based care: Ongoing monitoring through passive data collection
  Crisis prevention planning: Use of real-time data to understand past prompting events and plan for the future
  Suicide-specific assessment and treatment: Maintains top priority of safety at the time it is needed

aSM: social media.

For user testing sessions, school personnel viewed the student-monitoring dashboard populated with dummy data. The dashboard allows for visualization of the posterior probability (range 0-1) that each individual SM post or status update was written by someone at risk of suicide, as estimated by machine learning algorithms developed in prior work [42]. Cohort data, individual monitoring data (time series and risk rating), and the source content (SM posts/behavior) that generated the ratings were viewed via the dashboard.
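As one illustration of how per-post posterior probabilities might be rolled up into the student-level risk time series the dashboard plots, consider the sketch below. The column names and the weekly-maximum summary rule are our assumptions for illustration, not the documented behavior of Quinn Therapeutic.

```python
# Assumed sketch: aggregate per-post posterior probabilities (range 0-1)
# into a per-student weekly risk time series for dashboard plotting.
import pandas as pd

posts = pd.DataFrame({
    "student_id": ["s1", "s1", "s1", "s2", "s2"],
    "timestamp": pd.to_datetime(
        ["2018-02-01", "2018-02-03", "2018-02-12", "2018-02-02", "2018-02-10"]),
    "p_risk": [0.12, 0.71, 0.34, 0.05, 0.09],  # per-post posterior probability
})

# Weekly maximum per student: a conservative summary that surfaces the
# single most concerning post in each window.
weekly = (posts.set_index("timestamp")
               .groupby("student_id")["p_risk"]
               .resample("W").max()
               .dropna())
print(weekly)
```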

Measures

Multimedia Appendix 1 provides information regarding the measures by reporter as well as sample items.

Demographics

Participants self-reported their age, ethnicity/race, sexual orientation, and gender. In addition, school personnel reported their role in the school context and the years of experience in that role.

Implementation Outcomes
Acceptability

The 4-item Acceptability of Intervention Measure [45] was used to assess school personnel’s perceptions of acceptability, including liking, approving of, and welcoming use of the dashboard. Items were rated on a 5-point Likert scale (1=completely disagree and 5=completely agree). Prior psychometric evaluation suggested acceptable measurement model fit and high reliability [45], and internal consistency in this study was strong (α=.93).

Appropriateness

The 4-item Intervention Appropriateness Measure [45] was used to assess school personnel’s perceptions of fit, with items related to fit for the setting, applicability to their work, and match with the needs of users. Items were rated on a 5-point Likert scale (1=completely disagree and 5=completely agree). Prior psychometric evaluation suggested acceptable measurement model fit and high reliability [45], and internal consistency in this study was excellent (α=.97).

Feasibility

The 4-item Feasibility of Intervention Measure [45] was used to assess school personnel’s perceptions of feasibility, including whether use of the dashboard seemed possible, doable, and easy. Items were rated on a 5-point Likert scale (1=completely disagree and 5=completely agree). Internal consistency in this study was strong (α=.91).
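For clarity on how these 4-item measures are scored, the sketch below computes per-respondent scale scores and Cronbach's alpha from invented item responses. It illustrates the standard formulas rather than this study's analysis code.

```python
# Assumed sketch: scoring a 4-item implementation outcome measure
# (items rated 1=completely disagree to 5=completely agree) and
# computing Cronbach's alpha. Item responses below are invented.
import numpy as np

items = np.array([  # rows = respondents, columns = the 4 items
    [4, 4, 5, 4],
    [3, 3, 3, 4],
    [5, 5, 5, 5],
    [2, 3, 2, 2],
])

scale_scores = items.mean(axis=1)  # per-respondent scale score (1-5)
k = items.shape[1]
# alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
alpha = k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                       / items.sum(axis=1).var(ddof=1))
print(f"mean={scale_scores.mean():.2f} "
      f"SD={scale_scores.std(ddof=1):.2f} alpha={alpha:.2f}")
```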

Implementation Determinants
Usability

School personnel completed the SUS following user testing. The SUS is a 10-item measure with scores ranging from 0 to 100; scores greater than 70 are considered acceptable. The SUS is the best-researched and most sensitive usability measure available [35]. Internal consistency in this study was strong (α=.83).
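As a reference for how SUS scores on the 0-100 scale are derived, the sketch below implements Brooke's standard scoring procedure; the example responses are invented.

```python
# Standard SUS scoring: odd items contribute (response - 1), even items
# contribute (5 - response), and the sum is multiplied by 2.5 to map
# onto 0-100. Example responses below are invented.
def sus_score(responses):
    """responses: list of 10 item ratings, each 1-5, in questionnaire order."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # 0-100; >70 is conventionally considered acceptable

print(sus_score([4, 2, 4, 2, 3, 3, 4, 2, 3, 3]))  # -> 65.0
```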

Additional Determinants

Following user testing sessions, school personnel participants were asked a series of open-ended questions about what they saw as the positive aspects, the negative aspects, and specific suggestions for improvement based on other technologies with which they had interacted. Questions focused on acceptability, feasibility, and appropriateness were asked to understand the reasons for their interview responses. Students completed the Preferences, Relationships, and Interventions using Social Media questionnaire, a 22-item measure developed by this team that assessed the use and frequency of SM platforms, priorities regarding intervention options, and, via open-ended questions, ways to improve system alignment with the needs and expectations of students in their school.

Data Analysis Plan

Descriptive statistics, including means and SDs, were calculated for quantitative measures. Qualitative content was coded using the Consolidated Framework for Implementation Research (CFIR) [46]. The CFIR is a commonly used framework that organizes constructs that have been associated with effective implementation. It has been widely used as a practical guide to evaluate implementation efforts in preparation for or during active studies [46]. The codebook template was used to understand the multilevel determinants of implementation. Determinants include aspects of the innovation (eg, evidence strength and relative advantage), the outer context (eg, external policies and incentives), the inner organizational context (eg, implementation climate and tension for change), characteristics of the individuals operating within target settings (eg, attitudes and efficacy), and the process of change in the organization (eg, engagement strategies and change agents) [47-51]. School personnel interviews were audio recorded, transcribed, and coded with directed content analysis. In total, 4 coders were trained to conduct directed content analysis based on the CFIR codebook by reviewing the codebook and example codes, reviewing the school personnel’s responses to each question from the same two transcripts, identifying potential codes, and coding independently. Consensus among the four coders for the two transcripts was achieved through open dialog [52]. Following consensus on the two transcripts, the coders were split into two groups. The remaining transcripts were coded independently, and the two groups then met to review codes in consensus meetings. A consensus coding process was used to reduce biases, groupthink, and errors [53]. This coding approach was used because many qualitative researchers consider it more valid for analyzing human communication than interrater reliability: it explicitly uses coding ambiguities to prompt discussion and increases confidence in complex data [54].
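To illustrate the kind of code-frequency summary reported in the Results (eg, proportions of comments per CFIR domain), the following sketch tallies consensus-coded excerpts; the coded excerpts here are fabricated placeholders, not the study's data.

```python
# Toy sketch: tally consensus-coded excerpts by CFIR domain and report
# proportions, as in "161/350 (46.0%) of school personnel comments".
from collections import Counter

coded_excerpts = [  # (CFIR domain, construct) per coded comment; fabricated
    ("inner setting", "available resources"),
    ("inner setting", "implementation climate"),
    ("innovation characteristics", "relative advantage"),
    ("outer setting", "external policy"),
    ("inner setting", "culture"),
]

by_domain = Counter(domain for domain, _ in coded_excerpts)
total = sum(by_domain.values())
for domain, n in by_domain.most_common():
    print(f"{domain}: {n}/{total} ({100 * n / total:.1f}%)")
```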


Results

Participants

The demographic characteristics of the participants are included in Tables 2 and 3.

Table 2. Summary of demographics and clinical characteristics for student participants (N=111).

Sex at birth, n (%)
  Male: 40 (36.0)
  Female: 71 (64.0)
  Intersex: 0 (0.0)

Gender, n (%)
  Male: 40 (36.0)
  Female: 71 (64.0)
  Transgender male: 0 (0.0)
  Transgender female: 0 (0.0)

Age (years), mean (SD): 16.5 (1.13)

Sexual orientation, n (%)
  Asexual: 1 (0.9)
  Bisexual or pansexual: 16 (14.4)
  Gay or lesbian: 4 (3.6)
  Heterosexual or straight: 83 (74.8)
  Othera: 2 (1.8)
  Prefer not to say: 5 (4.5)

Ethnicity (Hispanic or Latino), n (%)
  Not Hispanic or Latino: 103 (92.8)
  Hispanic, of Spanish origin, or Latino: 5 (4.5)
  Prefer not to answer: 3 (2.7)

Race, n (%)
  White: 63 (56.8)
  Black or African American: 10 (9.0)
  American Indian or Alaska Native: 0 (0.0)
  Asian: 15 (13.5)
  Native Hawaiian or Other Pacific Islander: 2 (1.8)
  Other, not specified above: 7 (6.3)
  Unknown or prefer not to answer: 2 (1.8)
  Multiracial: 12 (10.8)

Number of suicide attempts, n (%)
  0: 106 (95.5)
  1: 2 (1.8)
  2: 3 (2.7)

aSexual orientation, other: heterosexual and bicurious (n=1) and questioning (n=1).

Table 3. Summary of demographics for school personnel participants (N=9). All values are n (%).

Gender
  Male: 2 (22)
  Female: 7 (78)
  Other: 0 (0)

Age (years)
  25-34: 4 (44)
  35-44: 4 (44)
  55-64: 1 (11)

Ethnicity (Hispanic or Latino)
  Not Hispanic or Latino: 9 (100)
  Hispanic, of Spanish origin, or Latino: 0 (0)

Race
  White: 7 (78)
  Black or African American: 0 (0)
  American Indian or Alaska Native: 0 (0)
  Asian: 2 (22)
  Native Hawaiian or Other Pacific Islander: 0 (0)
  Other, not specified above: 0 (0)
  Multiracial: 0 (0)

Degree
  Bachelor’s: 2 (22)
  Master’s: 7 (78)

Professional role
  School counselor: 3 (33)
  Mental health counselor: 2 (22)
  School administrator: 1 (11)
  Othera: 3 (33)

Years in role
  1-3: 3 (33)
  4-6: 3 (33)
  7-9: 2 (22)
  ≥20: 1 (11)

aProfessional role, other: school nurse (n=2) and community-based behavioral health partner (n=1).

Implementation Outcomes

School personnel gave the student-monitoring dashboard moderate scores, on average, for acceptability (mean 3.69, SD 0.85; range 2.25-5.00), appropriateness (mean 3.72, SD 1.09; range 2.00-5.00), and feasibility (mean 3.78, SD 0.75; range 2.25-4.75) of implementation in their setting, indicating that school personnel viewed the student-monitoring dashboard as moderately appropriate for the school setting.

Determinants of Implementation

To understand the reasons for ratings of core implementation outcomes, qualitative themes at the level of the innovation, outer setting, inner setting, individual characteristics, and engagement were summarized (Multimedia Appendix 2). The following three themes emerged: (1) compatibility with culture, values, and norms in the school setting; (2) additional attention needed to confidentiality and privacy; and (3) flexibility in the way to support students. The majority of the qualitative codes related to the first theme, that is, the organizational context, culture, resources, and structure (161/350, 46.0% of school personnel comments and 118/222, 53.2% of student comments). Specific comments highlight positive aspects of the system being compatible with culture and values/norms in the school setting. For example, a student indicated:

The system would be great if it helps a student personally and on their phone, and includes lots of student choice.
[coded innovation characteristics, adaptability]

However, both participant groups reported difficulty in managing confidentiality and privacy within this context and adequately managing the workflow. For example, 1 clinician stated:

A barrier to implementation in that we are not an organization that is accessible. This level of oversight is appealing in some ways and so I wonder if it creates an expectation of supervision or the impression of supervision where it’s not always available.
[coded inner setting, available resources]

A student’s perspective highlighted:

If people feel like they can’t be themselves on the social media because they don’t trust the system to keep their confidentiality then I don’t think they’d use it. If students didn’t use the social media then the system wouldn’t work at all.
[coded outer setting, external policy]

The second theme highlighted the need for careful attention to how information would be used within the school setting and remain confidential. Some expressed uncertainty about the extent to which machine learning can discern the complexities of unstructured text and nuanced communication occurring on SM platforms:

In today’s society the young generation us [sic] tend to make jokes about suicide in a way to relieve stress so I’m afraid something like that will be taken the wrong way.
[coded innovation characteristics, evidence strength]

School personnel and students noted wanting clarity on how the TES would impact internal communications and other external systems outside the school, including the district and outside resources (eg, therapists outside of the school and crisis responding).

The third theme related to the potential for an approach such as this to expand options for youth at risk. Overall, 34.0% of school personnel comments and 18.0% of student comments highlighted the innovative aspects of using passive and ongoing data collection in this way. Positive comments related to the perceived relative advantage of a technology-based solution compared with the status quo as well as to the ability to provide individualized solutions and options for youth who appear distressed and/or suicidal. One student noted:

It would allow social media to be safer and less stressful for people who have a lot of anxiety about it.
[coded innovation characteristics, relative advantage]

Another student highlighted flexibility as an asset:

I think that the best thing this system could do was just be an option for people who are struggling to go and have someone or something that could help them and be there for them.
[coded inner setting, available resources]

Prototype Interface Usability

Task Difficulty

Most tasks were rated as moderately easy (mean ratings ranging from 3.78 to 4.22), with the exception of isolating a date range (pretask mean 3.00 and posttask mean 2.22). Users found the task of free exploration and navigation of the dashboard similar to or easier than they had anticipated. The majority of users found the task of identifying posts within a set time frame more difficult than they had anticipated.

Task Success

All participants (100%) identified the students and their risk level. Two-thirds of the participants correctly identified previous suicide risk, and approximately half were able to flag concerning posts.

System Usability Scale Scores

School personnel’s SUS scores for the platform ranged from 22.5 to 75, with a mean of 54.17 (SD 16.58), reflecting divergent opinions among school personnel on the system’s overall usability and an overall unacceptable rating of the current usability of the prototype dashboard interface (acceptable ratings >70). Feedback after each testing scenario and during the formal qualitative interview highlighted several common issues, themes, and modifications that participants identified as needed for a subsequent version of the interface (eg, difficulty in isolating specific periods within the interface).


Discussion

Principal Findings

In this study, high school students and school personnel provided feedback on implementation determinants and outcomes to facilitate the redesign of a TES to support suicide risk identification and prevention. Universal emotional health screening is recognized as an essential component of a multitier system of support and behavioral health framework [55]. Universal emotional health screening may facilitate the identification of undetected difficulties [56]; however, emotional health screening is rarely conducted in school settings because of feasibility concerns, burden on school personnel, and lack of knowledge of best practices. A solution that supports accurate, ongoing, and passive screening for youth risk, clinical decision making, and improved communication, and that fits within the school context, would be a great asset toward facilitating the identification and triage of students at risk for suicide. Through mixed qualitative and quantitative approaches, our study identified a number of strengths of the digital component of the Quinn Therapeutic TES. We additionally identified several challenges related to the school context and concerns regarding fit within the workflow and the network of communications around protected health information such as suicidality. The three primary themes identified by students and school personnel were (1) compatibility with culture, values, and norms in the school setting; (2) additional attention needed to confidentiality and privacy; and (3) flexibility in the way students are supported. With regard to compatibility with the culture, students and school personnel highlighted that this approach aligned with the school’s value of supporting well-being and helped achieve goals related to caring for the whole student. However, both stakeholder groups requested additional information about the data and the processes for analysis, interpretation, communication, and human response. Student concerns regarding confidentiality centered on how school personnel would manage communications rather than on sharing data with a company for the purposes of suicide prevention. Consistent with other researchers, we conclude that before a system supported by existing data sources such as SM gains widespread acceptance, additional education is needed regarding the validity with which psychologically relevant data can be measured via SM language. Finally, both school personnel and students found appeal in the TES’s flexibility, including multiple options for enrollment and strategies to support students.

Several researchers have suggested that digital innovations relying on machine learning strategies similar to the one evaluated in this research would provide significant advances in the field [4,57-60]. In addition, programs that rely on suicide risk prediction algorithms have been deployed in the Veterans Health Administration. This program, called Recovery Engagement and Coordination for Health-Veterans Enhanced Treatment, uses medical record data and applies machine learning to identify those at statistically elevated risk for suicide or other adverse outcomes. At present, there is an active clinical trial of the program (NCT03280225). The application of machine learning algorithms to medical records for the prediction of suicide attempts has demonstrated good performance [61]. However, few programs have been designed to provide rigorous evaluation of the usability and other implementation determinants of a technology-based solution to universal suicide screening in the school context. Several limitations must be considered when interpreting the results. First, the sample of users was small and not representative, as it was limited to participants from a small private school in an urban area. Second, we coded perceptions of the intervention implementation following scenario-based user testing, not actual implementation of the intervention; these perceptions, therefore, may not be valid for real-world implementation. Finally, we included only primary end users, that is, students and school personnel who would be involved in responding directly to suicide risk, and therefore did not include other important stakeholders such as teachers and parents.

Conclusions

Strategies to make suicide prevention efforts in high schools scalable, sustainable, and supportive may benefit from attention to how technology can facilitate and aid human efforts. This research evaluated a system that aggregates existing data sources (SM data) to provide ongoing monitoring of suicide risk based on machine learning algorithms. Primary users, that is, high school students and school personnel, highlighted the potential advantages of providing individualized solutions and options for youth compared with the current suicide prevention strategy within the school (which included gatekeeper training, a mental health awareness group, and onsite counseling support). However, the management of private and sensitive communications in the school context and the limited functionality of the prototype dashboard dampened enthusiasm for widespread implementation. Although further investment in an improved user interface may address some of these concerns, the larger fundamental challenge facing this and similar TESs is the lack of understanding and policy surrounding the privacy and use of sensitive communications in the school context. Widespread agreement on community norms and commonly accepted guidelines for how and when to use this sort of data will be necessary for the widespread adoption of any similar TES.

Acknowledgments

This research was supported by an American Foundation for Suicide Prevention Linked Standard Grant and the Agency for Healthcare Research and Quality (grant no. K12HS022982).

Conflicts of Interest

GC is an employee and a shareholder of Qntfy.

Multimedia Appendix 1

Measures, QT user function, and qualitative coding examples.

DOCX File , 83 KB

Multimedia Appendix 2

Qualitative coding reference table.

XLSX File (Microsoft Excel File), 25 KB

  1. Curtin SC, Warner M, Hedegaard H. Increase in suicide in the United States, 1999-2014. NCHS Data Brief. Apr 2016;(241):1-8. [FREE Full text] [Medline]
  2. Quickstats: suicide rates* for teens aged 15-19 years, by sex - United States, 1975-2015. MMWR Morb Mortal Wkly Rep. Aug 4, 2017;66(30):816. [FREE Full text] [CrossRef] [Medline]
  3. Kann L, McManus T, Harris WA, Shanklin SL, Flint KH, Queen B, et al. Youth risk behavior surveillance - United States, 2017. MMWR Surveill Summ. Jun 15, 2018;67(8):1-114. [FREE Full text] [CrossRef] [Medline]
  4. Franklin JC, Ribeiro JD, Fox KR, Bentley KH, Kleiman EM, Huang X, et al. Risk factors for suicidal thoughts and behaviors: a meta-analysis of 50 years of research. Psychol Bull. Feb 2017;143(2):187-232. [CrossRef] [Medline]
  5. Allison MA, Crane LA, Beaty BL, Davidson AJ, Melinkovich P, Kempe A. School-based health centers: improving access and quality of care for low-income adolescents. Pediatrics. Oct 2007;120(4):e887-e894. [CrossRef] [Medline]
  6. Juszczak L, Melinkovich P, Kaplan D. Use of health and mental health services by adolescents across multiple delivery sites. J Adolesc Health. Jun 2003;32(6 Suppl):108-118. [CrossRef] [Medline]
  7. Pullmann MD, Daly BP, Sander MA, Bruns EJ. Improving the impact of school-based mental health and other supportive programs on students' academic outcomes: how do we get there from here? Adv Sch Ment Health Promot. 2014;7(1):1-4. [CrossRef]
  8. Pullmann MD, VanHooser S, Hoffman C, Heflinger CA. Barriers to and supports of family participation in a rural system of care for children with serious emotional problems. Community Ment Health J. Jun 2010;46(3):211-220. [FREE Full text] [CrossRef] [Medline]
  9. Lyon AR, Ludwig KA, Stoep AV, Gudmundsen G, McCauley E. Patterns and predictors of mental healthcare utilization in schools and other service sectors among adolescents at risk for depression. School Ment Health. Aug 1, 2013;5(3). [FREE Full text] [CrossRef] [Medline]
  10. Farmer EM, Burns BJ, Phillips SD, Angold A, Costello EJ. Pathways into and through mental health services for children and adolescents. Psychiatr Serv. Jan 2003;54(1):60-66. [CrossRef] [Medline]
  11. Hallfors D, Brodish PH, Khatapoush S, Sanchez V, Cho H, Steckler A. Feasibility of screening adolescents for suicide risk in 'real-world' high school settings. Am J Public Health. Feb 2006;96(2):282-287. [CrossRef] [Medline]
  12. Bradshaw CP, Pas ET, Goldweber A, Rosenberg MS, Leaf PJ. Integrating school-wide positive behavioral interventions and supports with tier 2 coaching to student support teams: The PBISplus model. Adv Sch Ment Health Promot. Jul 2012;5(3):177-193. [CrossRef]
  13. Hayden D, Lauer P. Prevalence of suicide programs in schools and roadblocks to implementation. Suicide Life Threat Behav. 2000;30(3):239-251. [Medline]
  14. Mohr DC, Lyon AR, Lattie EG, Reddy M, Schueller SM. Accelerating digital mental health research from early design and creation to successful implementation and sustainment. J Med Internet Res. May 10, 2017;19(5):e153. [FREE Full text] [CrossRef] [Medline]
  15. Reger GM, Hoffman J, Riggs D, Rothbaum BO, Ruzek J, Holloway KM, et al. The 'PE coach' smartphone application: an innovative approach to improving implementation, fidelity, and homework adherence during prolonged exposure. Psychol Serv. Aug 2013;10(3):342-349. [CrossRef] [Medline]
  16. Campbell AN, Nunes EV, Matthews AG, Stitzer M, Miele GM, Polsky D, et al. Internet-delivered treatment for substance abuse: a multisite randomized controlled trial. Am J Psychiatry. Jun 2014;171(6):683-690. [FREE Full text] [CrossRef] [Medline]
  17. Richards D, Richardson T. Computer-based psychological treatments for depression: a systematic review and meta-analysis. Clin Psychol Rev. Jun 2012;32(4):329-342. [CrossRef] [Medline]
  18. Cuijpers P, Marks IM, van Straten A, Cavanagh K, Gega L, Andersson G. Computer-aided psychotherapy for anxiety disorders: a meta-analytic review. Cogn Behav Ther. 2009;38(2):66-82. [CrossRef] [Medline]
  19. Torok M, Han J, Baker S, Werner-Seidler A, Wong I, Larsen ME, et al. Suicide prevention using self-guided digital interventions: a systematic review and meta-analysis of randomised controlled trials. Lancet Digit Health. Jan 2020;2(1):e25-e36. [CrossRef]
  20. Pew Research Center. Jun 12, 2019. URL: http://www.pewinternet.org/data-trend/mobile/cell-phone-and-smartphone-ownership-demographics/ [accessed 2020-03-19]
  21. Radovic A, McCarty CA, Katzman K, Richardson LP. Adolescents' perspectives on using technology for health: qualitative study. JMIR Pediatr Parent. 2018;1(1):e2. [FREE Full text] [CrossRef] [Medline]
  22. Thiha P, Pisani AR, Gurditta K, Cherry E, Peterson DR, Kautz H, et al. Efficacy of web-based collection of strength-based testimonials for text message extension of youth suicide prevention program: randomized controlled experiment. JMIR Public Health Surveill. Nov 9, 2016;2(2):e164. [FREE Full text] [CrossRef] [Medline]
  23. Dever BV, Raines TC, Barclay CM. Chasing the unicorn: practical implementation of universal screening for behavioral and emotional risk. Sch Psychol. 2012;6(4):108-118. [FREE Full text]
  24. Fox J, Halpern L, Forsyth J. Mental health checkups for children and adolescents: a means to identify, prevent, and minimize suffering associated with anxiety and mood disorders. Clin Psychol Sci Pract. Sep 2008;15(3):182-211. [CrossRef]
  25. Adrian M, Miller AB, McCauley E, Stoep AV. Suicidal ideation in early to middle adolescence: sex-specific trajectories and predictors. J Child Psychol Psychiatry. May 2016;57(5):645-653. [FREE Full text] [CrossRef] [Medline]
  26. Lyon AR, Wasse JK, Ludwig K, Zachry M, Bruns EJ, Unützer J, et al. The Contextualized Technology Adaptation Process (CTAP): optimizing health information technology to improve mental health systems. Adm Policy Ment Health. May 2016;43(3):394-409. [FREE Full text] [CrossRef] [Medline]
  27. Littlejohns P, Wyatt J, Garvican L. Evaluating computerised health information systems: hard lessons still to be learnt. Br Med J. Apr 19, 2003;326(7394):860-863. [FREE Full text] [CrossRef] [Medline]
  28. International Organization for Standardization. 2010. URL: https://www.iso.org/standard/52075.html [accessed 2020-03-26]
  29. Courage C, Baxter K. Understanding Your Users: A Practical Guide to User Requirements Methods, Tools, and Techniques. Burlington, Massachusetts, United States. Morgan Kaufmann; 2005.
  30. Lyon AR, Bruns EJ. User-centered redesign of evidence-based psychosocial interventions to enhance implementation-hospitable soil or better seeds? JAMA Psychiatry. Jan 1, 2019;76(1):3-4. [CrossRef] [Medline]
  31. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. Mar 2011;38(2):65-76. [FREE Full text] [CrossRef] [Medline]
  32. Rubin J, Chisnell D. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. Indianapolis, IN. John Wiley & Sons; 2008.
  33. Benbunan-Fich R. Using protocol analysis to evaluate the usability of a commercial web site. Inf Manag. 2001;39(2):151-163. [CrossRef]
  34. Albert W, Dixon E. Is This What You Expected? The Use of Expectation Measures in Usability Testing. In: Proceedings of the Usability Professionals’ Association 12th Annual Conference. 2003. Presented at: UPA'03; June 25, 2003; Scottsdale, AZ. URL: https://tinyurl.com/y7t5zbfb
  35. Sauro J. A Practical Guide to the System Usability Scale: Background, Benchmarks & Best Practices. Scotts Valley, California, US. CreateSpace; 2011.
  36. OurDataHelps. URL: https://ourdatahelps.org/ [accessed 2019-07-17]
  37. Coppersmith G, Hilland C, Frieder O, Leary R. Scalable Mental Health Analysis in the Clinical Whitespace via Natural Language Processing. In: Proceedings of the 2017 IEEE EMBS International Conference on Biomedical & Health Informatics. 2017. Presented at: BHI'17; February 16-19, 2017; Orlando, FL, USA. [CrossRef]
  38. Loveys K, Crutchley P, Wyatt E, Coppersmith G. Small but Mighty: Affective Micropatterns for Quantifying Mental Health from Social Media Language. In: Proceedings of the Fourth Workshop on Computational Linguistics and Clinical Psychology — From Linguistic Signal to Clinical Reality. 2017. Presented at: CLPsych'17; August 3, 2017:85-95; Vancouver, BC. [CrossRef]
  39. Amir S, Coppersmith G, Carvalho P, Silva MJ, Wallace BC. Quantifying mental health from social media with neural user embeddings. arXiv preprint. 2017. [FREE Full text]
  40. Michalski RS, Carbonell JG, Mitchell TM. Machine Learning: An Artificial Intelligence Approach. Burlington, Massachusetts, United States. Morgan Kaufmann; 2014.
  41. OurDataHelps. URL: https://ourdatahelps.org/ [accessed 2020-03-30]
  42. Coppersmith G, Leary R, Crutchley P, Fine A. Natural language processing of social media as screening for suicide risk. Biomed Inform Insights. 2018;10:1178222618792860. [FREE Full text] [CrossRef] [Medline]
  43. Brodsky BS, Spruch-Feiner A, Stanley B. The zero suicide model: applying evidence-based suicide prevention practices to clinical care. Front Psychiatry. 2018;9:33. [FREE Full text] [CrossRef] [Medline]
  44. Benton A, Coppersmith G, Dredze M. Ethical Research Protocols for Social Media Health Research. In: Proceedings of the First ACL Workshop on Ethics in Natural Language Processing. 2017. Presented at: EthNLP'17; February 27, 2017:94-102; Valencia, Spain. [CrossRef]
  45. Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. Aug 29, 2017;12(1):108. [FREE Full text] [CrossRef] [Medline]
  46. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. Aug 7, 2009;4:50. [FREE Full text] [CrossRef] [Medline]
  47. Lyon AR, Whitaker K, Locke J, Cook CR, King KM, Duong M, et al. The impact of inter-organizational alignment (IOA) on implementation outcomes: evaluating unique and shared organizational influences in education sector mental health. Implement Sci. Feb 7, 2018;13(1):24. [FREE Full text] [CrossRef] [Medline]
  48. Ehrhart M, Aarons G, Farahnak L. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS). Implement Sci. Oct 23, 2014;9:157. [FREE Full text] [CrossRef] [Medline]
  49. Aarons GA, Ehrhart MG, Farahnak LR. The Implementation Leadership Scale (ILS): development of a brief measure of unit level implementation leadership. Implement Sci. Apr 14, 2014;9(1):45. [FREE Full text] [CrossRef] [Medline]
  50. Gifford W, Graham I, Ehrhart MG, Davies B, Aarons G. Ottawa model of implementation leadership and implementation leadership scale: mapping concepts for developing and evaluating theory-based leadership interventions. J Healthc Leadersh. 2017;9:15-23. [FREE Full text] [CrossRef] [Medline]
  51. Aarons GA, Sommerfeld DH, Walrath-Greene CM. Evidence-based practice implementation: the impact of public versus private sector organization type on organizational support, provider attitudes, and adoption of evidence-based practice. Implement Sci. Dec 31, 2009;4:83. [FREE Full text] [CrossRef] [Medline]
  52. Hill CE, Knox S, Thompson BJ, Williams EN, Hess SA, Ladany N. Consensual qualitative research: an update. J Couns Psychol. Apr 2005;52(2):196-205. [CrossRef]
  53. Hill CE. Consensual Qualitative Research: A Practical Resource for Investigating Social Science Phenomena. New York, USA. American Psychological Association; 2012.
  54. Hill CE, Thompson BJ, Williams EN. A guide to conducting consensual qualitative research. Couns Psychol. 1997;25(4):517-572. [CrossRef]
  55. Siceloff ER, Bradley WJ, Flory K. Universal behavioral/emotional health screening in schools: overview and feasibility. Rep Emot Behav Disord Youth. 2017;17(2):32-38. [FREE Full text] [Medline]
  56. Guo S, Kim JJ, Bear L, Lau AS. Does depression screening in schools reduce adolescent racial/ethnic disparities in accessing treatment? J Clin Child Adolesc Psychol. 2017;46(4):523-536. [CrossRef] [Medline]
  57. Christensen H, Cuijpers P, Reynolds CF. Changing the direction of suicide prevention research: a necessity for true population impact. JAMA Psychiatry. May 1, 2016;73(5):435-436. [CrossRef] [Medline]
  58. Franco-Martín MA, Muñoz-Sánchez JL, Sainz-de-Abajo B, Castillo-Sánchez G, Hamrioui S, de la Torre-Díez I. A systematic literature review of technologies for suicidal behavior prevention. J Med Syst. Mar 5, 2018;42(4):71. [CrossRef] [Medline]
  59. Luxton DD, June JD, Fairall JM. Social media and suicide: a public health perspective. Am J Public Health. May 2012;102(Suppl 2):S195-S200. [CrossRef] [Medline]
  60. Kreuze E, Jenkins C, Gregoski M, York J, Mueller M, Lamis DA, et al. Technology-enhanced suicide prevention interventions: a systematic review. J Telemed Telecare. 2017;23(6):605-617. [CrossRef]
  61. Walsh CG, Ribeiro JD, Franklin JC. Predicting suicide attempts in adolescents with longitudinal clinical data and machine learning. J Child Psychol Psychiatry. Dec 2018;59(12):1261-1270. [CrossRef] [Medline]


AUC: area under the curve
CFIR: Consolidated Framework for Implementation Research
HCD: human-centered design
SM: social media
SUS: system usability scale
TES: technology-enabled services


Edited by G Eysenbach; submitted 19.09.19; peer-reviewed by B Sainz-de-Abajo, J Richards, V Carli; comments to author 03.12.19; revised version received 31.01.20; accepted 21.02.20; published 20.07.20.

Copyright

©Molly Adrian, Jessica Coifman, Michael D Pullmann, Jennifer B Blossom, Casey Chandler, Glen Coppersmith, Paul Thompson, Aaron R Lyon. Originally published in JMIR Mental Health (http://mental.jmir.org), 20.07.2020.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on http://mental.jmir.org/, as well as this copyright and license information must be included.