Original Paper
Abstract
Background: Research suggests that direct exposure to suicidal behaviors and acts of self-harm through social media may increase suicidality through imitation and modeling, particularly in more vulnerable populations. One example of a social media phenomenon that demonstrates how self-harming behavior could potentially be propagated is the blue whale challenge. In this challenge, adolescents and young adults are encouraged to engage in self-harm and eventually kill themselves.
Objective: This paper aimed to investigate the way individuals portray the blue whale challenge on social media, with an emphasis on factors that could pose a risk to vulnerable populations.
Methods: We first used a thematic analysis approach to code 60 publicly posted YouTube videos, 1112 comments on those videos, and 150 Twitter posts that explicitly referenced the blue whale challenge. We then deductively coded the YouTube videos based on the Suicide Prevention Resource Center (SPRC) safe messaging guidelines as a metric for the contagion risk associated with each video.
Results: The thematic analysis revealed that social media users post about the blue whale challenge to raise awareness and discourage participation, express sorrow for the participants, criticize the participants, or describe a relevant experience. The deductive coding of the YouTube videos showed that most of the videos violated at least 50% of the SPRC safe and effective messaging guidelines.
Conclusions: These posts might have the problematic effect of normalizing the blue whale challenge through repeated exposure, modeling, and reinforcement of self-harming and suicidal behaviors, especially among vulnerable populations such as adolescents. More effort is needed to educate social media users and content generators on safe messaging guidelines and factors that encourage versus discourage contagion effects.
doi:10.2196/15973
Keywords
Introduction
Background
Adolescents and young adults are the largest population using the internet, as it has become essential for their schoolwork, information collection, and socializing [ - ]. Communication tools on the internet, including emails, direct texting, and blogging, have become fundamental in adolescents’ social development [ - ]. More recently, web-based social networks, also known as social media, such as Facebook, Instagram, Twitter, and YouTube, have become increasingly popular among adolescents as sites where they can create public accounts or profiles, connect with other individuals, and see their lists of connections and posts [ , - ].

Although social media has created a number of opportunities for individuals to garner social support on the web [ , , - ], it also has the potential to negatively impact vulnerable individuals. Recent studies have highlighted how social media can be used to harass, discriminate [ , ], dox [ ], and socially disenfranchise individuals [ - ]. Some research even suggests that social media use may be a contributing factor to the significant increase in suicide rates and depressive symptoms among adolescents in the past decade [ , , ]. Evidence suggests that suicidal behavior can be propagated through social contagion effects, which occur when self-harming behavior is modeled, normalized, and reinforced in media depictions [ - ].

The public health and psychological literature has established that nonsuicidal self-injury (NSSI), or the purposeful infliction of damage to one’s body through cutting, burning, or bruising, can be propagated through social modeling, that is, imitating the behaviors of those we observe [ , ]. Similarly, direct exposure to suicidal behaviors through peers and/or media leads to an increase in suicidality through imitation and modeling [ , ]. These effects are referred to as suicide contagion and are most notable in adolescents and young adults [ , , ]. There is a strong relationship between stories of suicide in traditional media and a subsequent increase in suicide rates [ , ], especially for prominent stories [ ]. Vulnerable adolescents with preexisting mental health conditions and suicide risk factors are at a higher risk of perceiving maladaptive self-injurious behavior as an effective coping strategy [ ], particularly when they see others use these behaviors to achieve an attractive goal such as garnering the attention of others [ ].

Depictions of suicide and self-harm in traditional media have been shown to have harmful effects on vulnerable individuals, even when they describe a false or fictional behavior [ , ]. Through social contagion, these harmful behaviors may occur more frequently because of the repetitive exposure and modeling via social media, especially when such content goes viral [ - ]. Virality is defined as “achieving a large number of views in a short time period” [ ]. Such widespread exposure to harmful or suggestive content is particularly detrimental to vulnerable individuals, including adolescents and young adults [ , ]. As a result of research indicating contagion effects related to media portrayals of suicide and self-harm behavior, the Suicide Prevention Resource Center (SPRC) developed a list of safe and effective messaging guidelines to advise media outlets on means for reducing the contagion risks and harmful effects associated with media portrayals of suicidal behavior [ ]. However, these guidelines are not consistently applied across media outlets, and they have not been specifically tailored for messaging via social media.

In the social media literature, Facebook’s controversial emotional contagion study provides large-scale empirical evidence that indirect interactions via social media can also unknowingly influence one’s emotional state [ - ]. For instance, researchers interviewed 90 inpatient adolescents with a history of NSSI, finding that most of the participants were exposed to NSSI via traditional or social media before engaging in self-harming behaviors [ ]. Recent attention has focused on how social media, in particular, may play a role in mental health [ ] and suicide risk [ , ]. Social media may influence suicidality via factors such as cyberbullying, peer pressure in forums to engage in self-injurious behavior, and glamorized graphic videos and images depicting lethal means used in such behavior [ , ]. In addition, exposure to certain methods of self-injury was directly related to engaging in this behavior, and the more frequently youth were exposed to self-injury–related content in the media, the more frequently they engaged in self-injurious activities [ ]. However, exposure to self-injury alone does not relate to engagement in this behavior; rather, it is the frequency of exposure to self-injury as a coping mechanism that can result in engagement [ ]. Studying this contagion effect is especially necessary in the era of social media because harmful content can spread and propagate through web-based social networks more rapidly and widely than through traditional media [ ]. Such widespread exposure to harmful or suggestive content is particularly detrimental to vulnerable populations, including adolescents and young adults [ , ]. Thus, suicide contagion is the theoretical framework that motivates this study.

Most of the literature regarding adolescent web safety focuses on sexual or aggressive behaviors that put youth at risk [ ]. Unfortunately, few studies have examined how social media can influence adolescents and young adults to engage in self-harming behavior [ , , ]. Therefore, interdisciplinary research is beginning to highlight the urgent need to form a cohesive research agenda around digital self-harm [ ], web-based NSSI [ ], the use of social media to discuss deliberate self-harm acts [ , ], and acts of cybersuicide [ , ]. Most of this work, however, is in its early stages, with few studies drawing upon the developmental aspects of vulnerable adolescents’ social media use.
Objective
One example of viral self-harming behavior that has generated significant media attention is the blue whale challenge. Allegedly, in this challenge, adolescents and young adults are encouraged to engage in self-harm and eventually kill themselves. The media claims that this challenge, which first appeared in Russia in 2013, includes a series of 50 tasks sent directly to teens, each with increasing levels of self-harm and isolation [ ]. According to the media, the creator of the game, Philipp Budeikin, a Russian psychologist, wanted to “cleanse the society of biological waste,” meaning that the people who are willing to kill themselves for the game are biological waste [ ]. Although there is insufficient evidence to support the existence of the challenge, we believe its portrayal on social media (or any similar medium) has the potential to impact vulnerable individuals. Research exploring ethical concerns related to the blue whale challenge, the effects the game may have on adolescents, and potential governmental interventions is needed. However, we are unaware of any research that examines potential suicide and NSSI contagion risks regarding the challenge. To address this gap in the literature, this study uses qualitative research techniques to provide empirical evidence of the risk of self-harm and suicide contagion associated with the portrayal of the blue whale challenge in YouTube and Twitter posts. More specifically, the purpose of this study was to explore the following questions:
Research question 1: How is the blue whale challenge presented and described on YouTube and Twitter?
Research question 2: To what extent are YouTube videos compliant with the safe and effective suicide messaging guidelines provided by the SPRC?
Methods
Study Overview
In this study, we selected 2 social media platforms for data collection, YouTube and Twitter, as these are among the most popular social media platforms used by youth and young adults [ ]. We identified the common themes of YouTube videos, comments on those videos, and Twitter posts by conducting a thematic analysis [ ] on the data extracted from these platforms. The methodology we followed is an iterative qualitative coding process that starts with multiple researchers reviewing the raw data and ends with a set of themes that represents the majority of the data [ ]. In addition to the thematic analysis, we deductively coded the YouTube videos based on the SPRC safe and effective messaging guidelines to explore whether these videos violated or adhered to these guidelines, the extent of violation, and which guidelines were violated more frequently than others. To protect the identity of the users, we have not included any specific information regarding the users’ names or web identifiers in this paper. In addition, the quotes included in the manuscript have been paraphrased without changing their meaning so that they are difficult to trace through search. Next, we describe in detail how we collected the data from the social media platforms, how we conducted the thematic analysis, and how we coded the YouTube videos based on the SPRC guidelines.
Data Collection
YouTube and Twitter were selected as appropriate sources for data collection for 2 reasons. First, both platforms are ranked among the top 20 most popular social media sites. Second, posts on these platforms are normally open to the public [ , , ]. The videos were searched using the YouTube search engine with the key phrase blue whale challenge. Next, we sorted the results by relevance, an option provided by YouTube that ranks the videos in descending order relative to the keyword queries based on several factors, including “how well the title, description, and video content match the query and which videos have driven the most engagement for the query” [ ]. Covington et al [ ] described the algorithm used by YouTube in detail. We collected information from the first 60 videos on the list, which, combined, have a length of around 12 hours. The process of collecting and coding the data was iterative, an approach suggested by Saunders et al [ ] to ensure data saturation [ ], which indicates that “on the basis of the data that have been collected or analyzed hitherto, further data collection and/or analysis are unnecessary” [ ]. We began the coding process with 40 videos and continued collecting videos until no new codes emerged from the data, which occurred by the 60th video. Interestingly, only 5 of these 60 videos required age verification. The following information was collected for each video: the link, the number of views, and the first 30 comments sorted by top comments (if present), as data saturation was achieved at this point through iterative collection and coding [ ]. Top comments are identified by YouTube based on how many users like versus dislike a comment. A total of 1112 comments were collected for coding. Inclusion criteria for the videos were that the video (1) related to the blue whale challenge and (2) was in English, was translated into English, or contained English subtitles. Inclusion criteria for the comments were that the comment (1) was in English and (2) contained words. This data collection strategy was chosen to mimic typical user search behavior [ - ].
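For readers who wish to script a comparable collection, the sketch below is a minimal illustration using the YouTube Data API v3 rather than the public search interface used in this study; the API key is a placeholder, and the API's relevance ordering approximates, but does not exactly reproduce, what a signed-out user of the site sees.

# Minimal sketch (not the study's actual procedure): collecting relevance-ranked
# videos and their top comments for the query "blue whale challenge" via the
# YouTube Data API v3. API_KEY is a placeholder the reader must supply.
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError

API_KEY = "YOUR_API_KEY"  # hypothetical credential
youtube = build("youtube", "v3", developerKey=API_KEY)

def search_videos(query, limit=60):
    """Return up to `limit` video IDs ordered by YouTube's relevance ranking."""
    video_ids, page_token = [], None
    while len(video_ids) < limit:
        response = youtube.search().list(
            part="snippet", q=query, type="video",
            order="relevance", maxResults=50, pageToken=page_token,
        ).execute()
        video_ids += [item["id"]["videoId"] for item in response["items"]]
        page_token = response.get("nextPageToken")
        if not page_token:
            break
    return video_ids[:limit]

def top_comments(video_id, limit=30):
    """Return up to `limit` comments ranked by relevance (YouTube's top comments)."""
    try:
        response = youtube.commentThreads().list(
            part="snippet", videoId=video_id, order="relevance", maxResults=limit,
        ).execute()
    except HttpError:
        return []  # comments disabled or unavailable for this video
    return [item["snippet"]["topLevelComment"]["snippet"]["textDisplay"]
            for item in response.get("items", [])]

videos = search_videos("blue whale challenge", limit=60)
corpus = {video_id: top_comments(video_id) for video_id in videos}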
In addition to these YouTube videos, 150 Twitter posts were randomly collected using the social media monitoring and analytics tool Salesforce Radian6 [ ], which listens to, tracks, and analyzes conversations across different web-based channels based on the provided keywords [ ]. Although Radian6 can only analyze posts in 10 languages and is the most expensive social media monitoring tool, it can analyze almost all types of complex posts, including forums, news, media, blogs, and microblogs, along with geography, history, and sentiment [ ]. These posts were collected from February 2012 to February 2018 using the keywords and inclusion criteria provided in the table below. This timeline was chosen to cover the period during which the blue whale challenge is believed to have been most active [ ]. We also conducted a Google Trends analysis with the key words blue whale challenge and found that the challenge was searched for most frequently from 2017 to 2018, a period that overlaps with our data collection window. We initially collected 100 Twitter posts for analysis and continued to collect and code posts until no new codes emerged, meaning all posts collected reflected the existing codebook, with data saturation [ ] occurring at 150 posts.

Variable | Term or criteria
Keywords |
Hashtags |
Inclusion criteria |
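Radian6 is a commercial monitoring service, so its query interface is not reproduced here. The fragment below is only a generic sketch, under the assumption of simple keyword matching, of the kind of filtering that the keywords, hashtags, date window, and inclusion criteria above imply; the keyword list and field names are placeholders rather than the study's actual search terms.

# Generic sketch of keyword/date filtering comparable to the Radian6 query used in
# the study; the keywords, field names, and dates below are placeholders.
import random
from datetime import date

KEYWORDS = {"blue whale challenge", "#bluewhalechallenge"}  # placeholder terms
START, END = date(2012, 2, 1), date(2018, 2, 28)            # collection window

def matches(post):
    """post: dict with 'text' (str), 'created' (date), and 'lang' (str)."""
    text = post["text"].lower()
    return (
        post["lang"] == "en"                      # inclusion criterion: English
        and START <= post["created"] <= END       # inside the collection window
        and any(kw in text for kw in KEYWORDS)    # mentions the challenge
    )

def sample_posts(posts, n=150, seed=0):
    """Randomly sample n matching posts, mirroring the random selection of 150 tweets."""
    eligible = [p for p in posts if matches(p)]
    random.seed(seed)
    return random.sample(eligible, min(n, len(eligible)))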
Thematic Analysis
We conducted a thematic analysis following the 6 phases recommended by Braun and Clarke [ ]. First, the researchers familiarized themselves with the data (phase 1), reading and rereading the data to note any initial observations. Then, the researchers started the coding process (phase 2) to identify the subthemes (codes) within the YouTube videos, YouTube comments, and Twitter posts. Because of the different policies, technical affordances, and data types of these platforms [ , , ], we first analyzed each data type separately and then compared similar findings across platforms.

Initial codes were developed by dividing the top 40 videos, their top 30 related comments, and 100 Twitter posts between the 2 researchers. Each researcher then independently identified a set of codes describing the data found in the videos, comments, and Twitter posts. The 2 researchers then met to combine their individual sets of codes into 3 comprehensive sets (1 for each type of data being analyzed); we called each resulting set of codes a codebook. The researchers conducted a pilot analysis on the top 40 YouTube videos, 695 YouTube comments under those videos, and 100 Twitter posts to ensure that the initial codebooks sufficiently summarized the data. Following the completion of the pilot analysis, the codebooks were modified by both researchers together to more accurately summarize the data.
Once the codebooks were finalized, the full analysis of the 60 YouTube videos, 1112 YouTube comments under those videos, and 150 Twitter posts was conducted. The 2 researchers coded all the data separately, without collaboration, to minimize bias. Double-coding was allowed for the YouTube videos but not for the YouTube comments and Twitter posts. After coding was complete, they compared their codes to identify conflicts. When the 2 researchers disagreed, they used a consensus-forming process [ ], discussing the underlying differences in their interpretation until they agreed on which theme best suited the data. The final codebooks and results are provided in the multimedia appendices. All coding results were entered into Atlas.ti [ ], a qualitative coding software, to conduct axial coding and develop emergent themes across the 3 data types (phase 3). Both researchers worked together to tie the related codes into overarching themes. Then, the researchers reviewed the themes to ensure that they represented the data (phase 4). Finally, the researchers named, defined, and wrote about the themes (phases 5 and 6).
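For illustration only, the following sketch shows one way the comparison step could be represented programmatically: two coders' label assignments are merged and disagreements are flagged for the consensus discussion described above. The post IDs and codes are hypothetical, not the study's data.

# Hypothetical illustration of comparing two coders' assignments and flagging
# conflicts for consensus discussion; IDs and codes are made up.
coder_a = {"post_01": "raising_awareness", "post_02": "criticizing", "post_03": "sorrow"}
coder_b = {"post_01": "raising_awareness", "post_02": "providing_experience", "post_03": "sorrow"}

agreements = {pid: code for pid, code in coder_a.items() if coder_b.get(pid) == code}
conflicts = {pid: (coder_a[pid], coder_b[pid])
             for pid in coder_a if coder_b.get(pid) != coder_a[pid]}

print(f"{len(agreements)} posts agreed, {len(conflicts)} sent to consensus discussion")
for pid, (code_a, code_b) in conflicts.items():
    print(f"{pid}: coder A = {code_a}, coder B = {code_b}")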
Coding Based on Safe and Effective Messaging Guidelines
The SPRC developed 9 safe and effective messaging guidelines, based on best practices from research, to reduce the risk of inducing self-harm or suicide-related contagion in those who view a message (see the guideline descriptions below) [ , , ]. Adherence to or violation of these guidelines therefore serves as a metric for the contagion risk associated with a social media post. Each of the YouTube videos was compared with the 9 SPRC safe messaging guidelines by 2 researchers to determine whether it adhered to or violated each guideline. The researchers conducted 2 rounds of deductive coding [ ] based on these guidelines. In the first round, they coded the first 40 YouTube videos individually and then met to discuss any discrepancies in their coding strategy. In the second round, they coded all 60 YouTube videos. The interrater reliability, measured by Cohen κ, was 0.66. If the 2 researchers disagreed on a code, the video was set aside for further consensus coding.

Guideline | Description
Emphasize seeking help and provide information on where to find it | Provide steps for finding mental health treatment. Advise that help is available through local service providers and crisis centers and through the National Suicide Prevention Lifeline (1-800-273-8255).
Emphasize prevention | Highlight that suicide is avoidable and preventable and that there are actions for individuals who have suicidal thoughts to prevent them from acting on those thoughts [ ].
List the warning signs as well as the risk and protective factors of suicide | List the warning signs like the ones developed by the American Association of Suicidology. List what could both reduce and increase the risk of suicide; these can be found on pages 35-36 in the National Strategy for Suicide Prevention. Educate people on how to identify a person with self-harming thoughts.
Highlight effective treatments for underlying mental health problems | A total of 46% of people who died by suicide from 2014 to 2016 had a known mental health condition, with 67% of them having a history of treatment for substance abuse. Among them, 46.7% had recently been released from a psychiatric facility [ ], a percentage which can be reduced by providing improved access to effective treatments and social support [ ].
Avoid glorifying or romanticizing suicide or people who died by suicide | Vulnerable younger adults may relate to the attention given to and sympathy for a person who died by suicide [ ].
Avoid normalizing suicide by presenting it as a common event | Suicide ideation is not seen as normal by most people, and they do not consider it an option. However, presenting suicide as a common event may remove this bias [ ].
Avoid presenting suicide as an inexplicable act or explain it as a result of stress only | Doing so may encourage identification with the victim as well as ignoring the complexity and preventability of suicide. Presenting suicide as explainable or a result of stress only misleads vulnerable individuals to believe that it is a normal response to common life situations [ , - ].
Avoid focusing on personal details of people who died by suicide | Vulnerable younger adults may feel they are like the person who died by suicide, eventually leading them to consider taking their lives in the same way [ ].
Avoid presenting overly detailed descriptions of suicide victims or methods of suicide | Including pictures and/or descriptions of where and how an individual died by suicide may lead a vulnerable person to imitate the act [ , ].
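For reference, Cohen κ for 2 coders is (p_o - p_e)/(1 - p_e), where p_o is the observed agreement and p_e is the agreement expected by chance. The snippet below is a self-contained illustration of this computation with made-up adhere/violate labels; it is not the study's coding data.

# Illustrative Cohen kappa computation for two coders' adhere/violate labels.
# The label sequences are made up; they are not the study's coding results.
from collections import Counter

def cohen_kappa(labels_a, labels_b):
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n          # p_o
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    categories = set(freq_a) | set(freq_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)   # p_e
    return (observed - expected) / (1 - expected)

coder_a = ["adhere", "violate", "violate", "adhere", "violate", "adhere"]
coder_b = ["adhere", "violate", "adhere", "adhere", "violate", "violate"]
print(round(cohen_kappa(coder_a, coder_b), 2))  # 0.33 for this made-up example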
Results
Overview
This research focused on 2 main components: first, exploring the types of messages social media users share about the blue whale challenge via YouTube and Twitter, and second, examining the extent to which YouTube videos violate the SPRC guidelines. We identified the common themes across YouTube videos, YouTube comments, and Twitter posts using an inductive coding approach. Then, we used a deductive coding approach to explore how many of the YouTube videos violated the SPRC guidelines. We subsequently compared these violations with the videos’ numbers of views and, finally, examined which of the guidelines tended to be violated more often than others. In this section, we first present the 4 identified themes and then describe the violations of the SPRC guidelines.
Common Themes Across Social Media Platforms (Research Question 1)
We identified 4 common themes among all the posts in the 3 data types included in this study.
The table below presents the themes, the corresponding codes (code definitions can be found in the multimedia appendices), and the percentages of posts of each data type falling under each theme. Below, we define each theme and provide examples from the extracted data. Note that we paraphrased illustrative verbatim quotes without changing their meaning for the following reasons: (1) to make it difficult to identify the users who made the posts and (2) to fix any grammatical inaccuracies or typographical errors made by the user who wrote the post or comment.

Theme | Relevant codes from each codebook | YouTube videos, n (%) | YouTube comments, n (%) | Twitter posts, n (%)
Raising awareness about blue whale challenge and discouraging participation | | 50 (83) | 315 (28.33) | 103 (68.7)
Expressing sorrow for people with mental health issues | | 28 (47) | 123 (11.11) | 5 (3.3)
Criticizing or making jokes about the participants or the challenge | | 6 (10) | 530 (47.66) | 24 (16.0)
Providing experiences and asking to play | | 36 (60) | 178 (16.01) | 1 (0.7)
Theme 1: Raising Awareness About the Blue Whale Challenge and Discouraging Participation
This theme included social media posts in which users were trying to raise awareness of and warn parents about this dangerous phenomenon. YouTube videos in this context were either news reports or blogger videos that began by listing the different names for the blue whale challenge and statistics on how many people had died by suicide because of this game. These videos then discussed the tasks involved in the blue whale challenge, including the claim that the only way out of the game is suicide. They frequently included clips from interviews with victims’ parents or pictures of the victims while describing the blue whale challenge.
The YouTube comments in the awareness theme were against the blue whale challenge and suggested that parents and authorities should pay more attention to children’s safety. For example, 1 of the comments that falls into this category is:
This is getting ridiculous. The government, the Federal Bureau of Investigation, or someone needs to do something about it.
The Twitter posts in this theme centered on awareness of this dangerous game. Users were trying to spread the word so that others would become aware of the challenge and read more about it. One example of the Twitter posts in this theme is:
Please understand and beware of the Blue Whale Challenge. This is for the parents.
This theme was most common in YouTube videos (approximately 83%) and Twitter posts (approximately 68.7%), suggesting that the majority of users on these 2 platforms are trying to spread the word about the danger of the blue whale challenge.
Theme 2: Expressing Sorrow for People With Mental Health Issues
This theme included social media posts in which users expressed sorrow for those who participated in the blue whale challenge or for people with mental health issues. The YouTube videos in this theme primarily presented pictures of these people, accompanied by sad music. YouTube comments in the expressing sorrow theme included words of encouragement and support for people with mental health issues. An example of these supportive comments is:
Love you girl. So many people care about you. Depression is so horrible. Stay strong girl. We love you.
The Twitter posts in the expressing sorrow theme primarily mentioned that people participate in the blue whale challenge because they have mental health issues, urges to self-harm, and suicidal ideation. One example of the Twitter posts reflecting this theme is:
The Blue Whale Challenge: Hey Mr. Scribe, Brush Up Your Archaic Knowledge of Mental Health.
A high percentage of YouTube videos (approximately 47%) fell under this theme, but fewer YouTube comments (approximately 11%) and Twitter posts (approximately 3.3%) did.
Theme 3: Criticizing or Making Jokes About the Participants or the Challenge
The third theme included social media posts from individuals who either criticized those who participated in the blue whale challenge or made sarcastic comments about them or the challenge. The 6 YouTube videos that fell under this theme (10%) asserted that adolescents participate in the blue whale challenge just to “show off” and criticized them harshly or made sarcastic comments about them. The 530 YouTube comments in this theme (47.66%) criticized the blue whale challenge participants by saying things such as:
People who play this game are more stupid than the game itself. How can someone lose his sense and be manipulated by others like that. Grow up guys. You have brains to think and decide what is good and what is bad for you.
The 24 Twitter posts in the criticizing theme (16.0%) were primarily sarcastic posts about the challenge itself rather than the people who participated in it. For example, 1 of these Twitter posts is:
Husband silently downloaded the Blue Whale Game on his wife's phone. Fifty days later the blue whale committed suicide.
As seen in the table above, this theme was predominantly found in the YouTube comments, with only a few YouTube videos and Twitter posts falling under it.

Theme 4: Providing Experiences and Asking to Play
The last theme we identified included social media posts in which users spoke in detail about someone who had participated in the challenge or asked to play the game themselves. The providing experiences theme was very common in the YouTube videos (60%); these 36 videos included interviews with parents and pictures of the participants’ bodies showing instances of self-harm. This theme was slightly less common among the YouTube comments (approximately 16%), and these 178 comments were mainly about the experiences of acquaintances or came from users asking to participate in the blue whale challenge. An example of a comment from someone asking to participate in the blue whale challenge is:
I want to play the Blue Whale Game. Please give me the link.
Only 1 Twitter post fell into the providing experience theme (approximately 0.7%), an acquaintance story. The user posted the following on Twitter:
We just had a meeting here at work and this lady told us that her 10-year-old niece committed suicide because of this Blue Whale Challenge.
A large proportion of YouTube videos fell under this theme, in contrast to the comparatively few YouTube comments and Twitter posts that did.
Safe and Effective Messaging Guidelines (Research Question 2)
Of the 60 YouTube videos evaluated based on the SPRC safe messaging guidelines, 22 (37%) adhered to 3 or fewer of the 9 safe messaging guidelines, meaning that these videos were considered more unsafe than safe. Approximately 50% (30 videos) were considered neutral, meaning they adhered to 4-6 of the guidelines, whereas the remaining 13% (8 videos) were considered more safe than unsafe because they adhered to 7 or more of the guidelines.
When compared with the number of views of each video, 50% (10 videos) of the top 20 most viewed videos were more unsafe than safe, meaning that they violated at least 6 of the 9 criteria for safe and effective messaging about suicide. Only 10% (2 videos) of these 20 videos were considered more safe than unsafe, and 40% (8 videos) were considered neutral. The top 20 most viewed videos had 46,099,923 views in total. Of the middle 20 most viewed videos, 30% (6 videos) were considered more unsafe than safe, 25% (5 videos) more safe than unsafe, and 45% (9 videos) neutral. The middle 20 videos had 1,169,054 views in total. Of the 20 least viewed videos, 30% (6 videos) were considered more unsafe than safe, 5% (1 video) more safe than unsafe, and 65% (13 videos) neutral. The 20 least viewed videos had only 123,450 views in total. All videos met at least 1 of the 9 guidelines, and only 1 video met all 9. The total number of views for all 60 videos was 47,392,427, with the top 20 videos accounting for 97.27% of the total views.
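As a worked check of the figures above, the short calculation below reproduces the reported view share and illustrates how adherence counts map to the 3 safety categories; the view totals are those reported above, whereas the adherence counts in the example are placeholders rather than the study's per-video data.

# Recomputing the reported view share and illustrating the safety categorization.
# View totals are those reported above; the adherence counts are placeholders.
top20, middle20, bottom20 = 46_099_923, 1_169_054, 123_450
total = top20 + middle20 + bottom20                      # 47,392,427 views overall
print(f"Top 20 share of views: {top20 / total:.2%}")     # prints ~97.27%

def safety_category(guidelines_adhered_to):
    """Classify a video by how many of the 9 SPRC guidelines it adheres to."""
    if guidelines_adhered_to <= 3:
        return "more unsafe than safe"
    if guidelines_adhered_to <= 6:
        return "neutral"
    return "more safe than unsafe"

adherence_counts = [2, 5, 7, 3, 4, 9]                    # placeholder values
for count in adherence_counts:
    print(count, "->", safety_category(count))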
To better understand which guidelines were most frequently violated, we report the number of videos that violated each guideline in the table below. The guidelines most frequently violated were “Highlight effective treatments for underlying mental health problems” and “Emphasize seeking help and provide information on where to find it.” As the theme analysis above shows, most YouTube videos were intended to raise awareness and warn parents and society about the blue whale challenge. It is expected and advisable for these types of videos to provide steps on how to treat such problems or information on where these details can be found, such as the National Suicide Prevention Lifeline [ ]. However, as our results indicate, these videos failed to do so.

In addition, most of the videos violated the guidelines “Avoid presenting overly detailed descriptions of suicide victims or methods of suicide,” “Avoid glorifying or romanticizing suicide or people who died by suicide,” and “Avoid focusing on personal details of people who died by suicide.” These videos tended to include personal pictures of the victims, such as pictures of an arm with a whale carved on it. Most of the videos also mentioned that Philipp Budeikin, a 22-year-old dropout psychology student, is the curator of the challenge. They included pictures and quotes from the curator of the game, with the most frequently occurring quote being:
The people who have died by the blue whale challenge are biological waste, and I was cleaning the society from them.
A smaller but still substantial percentage of the videos included clips from interviews with the victims’ families and mock conversations between a victim and the curator showing how a person can participate in the challenge. Additionally, they mentioned that the challenge has other names, including, but not limited to, The Silent House, A Sea of Whales, Wake Me Up at 4:20 AM, and F57. The majority of the videos described and provided details about the tasks and methods for attempting suicide in the challenge, including jumping off a high building, jumping in front of a train, and hanging. These videos also mentioned specific names of suicide cults and web-based suicide groups or forums on social media that contributed to the spread of this challenge.
Approximately 67% (40 videos) violated the guideline “Avoid normalizing suicide by presenting it as a common event.” These videos provided general facts and statistics about the blue whale challenge, such as the number of suicides related to the challenge and the countries in which they occurred. On the other hand, only a few videos provided support hotlines or recommendations to adolescents and parents on how to avoid this game.
Guideline | Count, n (%) |
Highlight effective treatments for underlying mental health problems | 54 (90) |
Emphasize seeking help and provide information on where to find it | 47 (78) |
Avoid presenting overly detailed descriptions of suicide victims or methods of suicide | 45 (75) |
List the warning signs as well as the risk and protective factors of suicide | 43 (72) |
Avoid normalizing suicide by presenting it as a common event | 40 (67) |
Avoid glorifying or romanticizing suicide or people who died by suicide | 36 (60)
Emphasize prevention | 30 (50) |
Avoid focusing on personal details of people who died by suicide | 30 (50) |
Avoid presenting suicide as an inexplicable act or explain it as a result of stress only | 20 (33) |
Discussion
Overview
To the best of our knowledge, our study is the first to systematically document the quality, portrayal, and reach of the blue whale challenge on social media. In this study, we found that it is easy for adolescents to access almost any post about the blue whale challenge on social media, as only 5 of the 60 YouTube videos blocked minors from viewing the content. We assessed the portrayal of the blue whale challenge on social media by investigating the common themes of the videos and posts and the videos’ adherence to the safe and effective messaging guidelines. We found that social media users post about the blue whale challenge to raise awareness and discourage participation, to express sorrow for the participants, to criticize the participants, or to describe a relevant experience. Moreover, we found that the majority of the videos on YouTube violated at least 50% of the SPRC safe and effective messaging guidelines. In this section, we discuss in detail the themes found in the results and the potential consequences of the violated SPRC guidelines. We conclude this section by providing design implications and recommendations for social media platforms and future directions for this research.
Common Themes Across Social Media Platforms (Research Question 1)
The raising awareness theme was the most dominant theme and included posts that were meant to raise awareness and warn parents and society about the blue whale challenge. These posts were primarily anti-blue whale challenge, as opposed to pro-blue whale challenge. This finding implies that it is difficult to find information on how to participate in the blue whale challenge, or prosuicide information more generally, compared with antisuicide information on the internet, as found in a previous study [ ]. It also implies that it is difficult to find videos from actual participants of the blue whale challenge. This finding could partially be due to the nature of the challenge in that it encourages participants to carry out self-harm secretly.

Another topic highlighted in blue whale challenge–related social media posts was that users felt sorrow for people who participated in the blue whale challenge or people with mental illness (the expressing sorrow for people with mental health issues theme). This finding parallels our first 2 implications: it is not easy to find pro-blue whale challenge information on social media, and it is even harder to find posts from actual participants. However, posts of this kind may make viewers think that a significant number of others are participating in the blue whale challenge. This conclusion could lead them to believe that suicide and self-harm are normal responses to common life situations (eg, stress) rather than outcomes that can be prevented [ , ]. On the other hand, as seen in the criticizing theme, there were many posts in which people either criticized the adolescents who participated in the blue whale challenge or made fun of them, agreeing with the purported purpose of the blue whale challenge to cleanse “society of people with mental issues.” When people post critical comments about individuals with mental illness, they contribute to the stigma surrounding mental illness and discourage help-seeking behavior [ ].

Finally, the providing experience theme showed that a large number of social media users tend to speak in detail about someone who participated in the blue whale challenge, providing their demographics or interviewing their parents or acquaintances, who give further details about the participant’s personal life. This theme was also found by other researchers who examined traditional media posts about suicide [ ]. It is possible that reporting this level of detail might contribute to social modeling and contagion by leading vulnerable adolescents to feel that they are similar to the adolescents who participated in the blue whale challenge, making them more likely to participate or harm themselves [ ].

Safe and Effective Messaging Guidelines (Research Question 2)
We suggest that videos similar to those we examined could contribute to the spread of these risky challenges instead of meeting their intended purpose of raising awareness and decreasing participation. Most posts romanticized people who have died by following this challenge, and younger vulnerable individuals may see those victims as role models, possibly leading them to end their lives in the same way [ , ]. In violation of the guidelines, the videos presented statistics about the number of suicides believed to be related to this challenge in a way that made suicide seem common [ ]. In addition, the videos presented extensive personal information about the people who had died by suicide while playing the blue whale challenge. They also provided detailed descriptions of the final task, including pictures of self-harm and means of suicide, material that may encourage vulnerable adolescents to consider ending their lives and provide them with methods for doing so [ , ]. At the same time, these videos failed to emphasize prevention or highlight effective treatments for mental health problems, and they failed to encourage individuals with mental health problems to seek help or to provide information on where to find it. For these reasons, we believe that these videos could contribute to the propagation of self-injury and suicidality through social modeling and imitation [ , - ], which is known in the literature as suicide contagion [ , ]. Suicide contagion is commonly promoted through traditional media [ , ], and we believe it is even more likely to occur when social media is employed to depict self-harming behavior, as the content can easily go viral [ - ].

The SPRC safe and effective messaging guidelines were developed to help people working in suicide prevention, mental health promotion, or other forms of media ensure that messages about suicide are safe, positive, and strategic [ ]. Thus, these guidelines are applicable for assessing the appropriateness and safety of messages in suicide campaigns and of content discussing suicide across a variety of platforms [ , ]. Portrayals of both suicide and self-harm on social media platforms have the ability to increase that behavior in viewers regardless of their self-harm history, making the monitoring of self-harm and suicidal content crucial [ ]. As adolescents often look for emotional support on the internet amid a plethora of potentially harmful content, it is critical that web-based resources follow these guidelines, as not doing so may contribute to and/or increase the likelihood of someone with suicidal thoughts attempting suicide [ , - ].

Implications and Recommendations
Although there are no studies investigating how portrayals of the blue whale challenge can contribute to self-harm contagion, messages on social media may affect an individual’s perception, belief, or behavior regarding self-harm. There are also no studies investigating similar challenges that are said to encourage self-harm and suicidal behavior, such as the momo challenge [ ]. However, studies have clearly shown the harmful effects of newspaper, movie, and television portrayals that violate safe messaging guidelines, which have led to increased rates of suicide and self-harm [ , - ]. For example, Cooper et al [ ] and Ayers et al [ ] reported that suicide admission counts increased coincident with the release of the Netflix series 13 Reasons Why, which depicts suicide in a highly graphic and unsafe manner. Suicide contagion has been shown across a variety of entertainment and communication media, and social media has been the focus of recent research interest as a medium for contagion because of the ease with which harmful content can spread [ - ].

As 2 of the most popular websites and social media platforms, YouTube and Twitter are potentially capable of influencing countless adolescents [ , , , , , ]. With most of the blue whale challenge-related posts on these platforms found to be potentially harmful to vulnerable populations, our findings suggest that it is urgent to monitor social media posts related to the challenge and similar self-harming challenges. The SPRC should appropriately inform social media users, particularly those with greater influence (eg, celebrities and news anchors), on how to address suicide or self-harm in a safe way to reduce contagion. This could be supported by an algorithm that detects harmful content in a video before posting and recommends that the user edit or remove such content [ - ]. Although most YouTube users found the blue whale challenge to be a threat and forwarded a message as a means to warn others or speak out against it, they may unintentionally contribute to self-harm contagion. Therefore, it is critical for social media users to evaluate sources before creating and sharing information. They should also be educated on how to respond to unsafe posts and how to report such posts to social media administrators.
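No such screening algorithm is prescribed by the SPRC guidelines; the fragment below is only a hypothetical sketch of a rule-based pre-upload check against 2 of the guidelines, using made-up keyword heuristics and a helpline test rather than any validated detection model.

# Hypothetical, rule-based sketch of a pre-upload safe-messaging check.
# The keyword heuristics below are illustrative only and are not a validated
# detector of unsafe content.
import re

METHOD_TERMS = ["how to hang", "jump off", "in front of a train"]   # made-up list
HELPLINE_PATTERN = re.compile(r"1-800-273-8255|suicide prevention lifeline", re.I)

def review_description(text):
    """Return human-readable warnings for a draft video description."""
    warnings = []
    lowered = text.lower()
    if any(term in lowered for term in METHOD_TERMS):
        warnings.append("Describes a suicide method; consider removing details "
                        "(SPRC: avoid overly detailed descriptions of methods).")
    if not HELPLINE_PATTERN.search(text):
        warnings.append("No helpline information found; consider adding the "
                        "National Suicide Prevention Lifeline (1-800-273-8255).")
    return warnings

draft = "Step 50 asks players to jump off a building. Watch the full story."
for warning in review_description(draft):
    print(warning)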
YouTube has the potential to become a powerful positive information dissemination platform with the contribution of mental health professionals and organizations [ , ]. Professionals and organizations can do so by actively participating in YouTube channels, creating and uploading videos that are safe, accurate, and trustworthy. Moreover, although only minimal barriers can realistically be applied to video uploads given the nature of such platforms, there is a need to develop advanced algorithms and interfaces that highlight information that is safe, accurate, and trustworthy. In addition, integrating approved information available from federally funded organizations such as the SPRC might increase the veracity and safety of the information available. Greater effort is needed to educate the community about mental health and factors that could lead to self-harm, in addition to educating users on how to respond to unsafe posts. This could be accomplished by adding resources to suicide prevention campaigns or to social media platforms, for example, a link on suicide-related videos that educates the public on how to identify and respond to unsafe videos. In addition, crowdsourcing features (obtaining information from internet users), such as asking users to report inaccurate and misleading information, could be integrated into social media sites to prevent the spread of misinformation.
There are several limitations to our study, many of them related to the data used. We only studied videos and posts about the blue whale challenge that were publicly available; more harmful videos could be posted on private pages or personal accounts. In addition, we only used hashtags that explicitly mentioned the blue whale challenge to collect the data. It is possible that there are unofficial hashtags used by individuals at high risk of self-harm, allowing them to find related content without censorship. Additionally, the YouTube videos were selected according to their relevance to the topic based on 1 key phrase; ideally, videos should be randomly selected using several keywords and multiple hashtags. Moreover, because YouTube changes rapidly by nature, our sample may differ from videos sampled at a later time. Furthermore, this study focused on 2 social media platforms, whereas people could be posting about the challenge on other platforms including, but not limited to, Facebook, VKontakte, Snapchat, and Instagram. Finally, our study focused on 1 self-harm and suicide challenge, the blue whale challenge, although there are many other self-harm challenges, such as the tide pod challenge, and other suicide challenges, such as the momo challenge.
Future research could include a comparison of the characteristics of posts about different challenges that exemplify various levels of self-harm to better understand the reasons for their viral spread. Understanding these factors will help build simulation models, such as agent-based models, to visualize the spread of these challenges. In addition, understanding these factors will help inform improved policies and interventions for eliminating the spread of harmful challenges among adolescents, such as those found by Luxton et al [
]. Integrating these policies into simulation models will help identify the policies that are most effective in reducing these challenges at minimal cost.
Conclusions
We investigated YouTube and Twitter posts about the blue whale challenge and the characteristics of these posts that could make them potentially harmful to vulnerable populations. Through our qualitative analysis, we found that although most videos and Twitter posts attempt to raise awareness about the challenge and inform parents, they may have unintentional harmful effects, as most violated the SPRC safe messaging guidelines. We conclude that safe messaging guidelines should be more widely disseminated. Our data show that the majority of posters were not professionals and were likely to violate safe messaging guidelines. More effort is needed to disseminate these guidelines to average social media users and content generators and to educate them about the factors that encourage versus discourage contagion effects.
Resources for Seeking Help
If you or a loved one has thoughts of suicide or self-harm, please contact the following resource: Suicide Prevention Hotline: 1-800-273-8255.
Acknowledgments
This work was supported by a grant from the United States National Science Foundation, Division of Information and Intelligent Systems, Cyber-Human Systems program under grant #1832904.
Conflicts of Interest
None declared.
Codebook for YouTube videos.
PDF File (Adobe PDF File), 54 KB

Codebook for YouTube comments.
PDF File (Adobe PDF File), 72 KB

Codebook for Twitter posts.
PDF File (Adobe PDF File), 67 KB

References
- Lerman BI, Lewis SP, Lumley M, Grogan GJ, Hudson CC, Johnson E. Teen depression groups on Facebook: a content analysis. J Adolesc Res. Oct 22, 2016;32(6):719-741. [CrossRef]
- Gutowski E, White A, Liang B, Diamonti A, Berado D. How stress influences purpose development: the importance of social support. J Adolesc Res. Oct 30, 2017;33(5):571-597. [CrossRef]
- Subrahmanyam K, Lin G. Adolescents on the net: internet use and well-being. Adolescence. 2007;42(168):659-677. [Medline]
- Baker R, White K. Predicting adolescents’ use of social networking sites from an extended theory of planned behaviour perspective. Comput Hum Behav. Nov 2010;26(6):1591-1597. [FREE Full text] [CrossRef]
- Ponathil A, Agnisarman S, Khasawneh A, Narasimha S, Madathil KC. An Empirical Study Investigating the Effectiveness of Decision Aids in Supporting the Sensemaking Process on Anonymous Social Media. In: Proceedings of Human Factors and Ergonomics Society Annual Meeting. 2017. Presented at: HFES'17; October 9-13, 2017:798-802; Austin, TX. URL: https://journals.sagepub.com/doi/abs/10.1177/1541931213601693 [CrossRef]
- Khasawneh A, Ponathil A, Ozkan NF, Madathil KC. How Should I Choose My Dentist? A Preliminary Study Investigating the Effectiveness of Decision Aids on Healthcare Online Review Portals. In: Proceedings of Human Factors and Ergonomics Society Annual Meeting. 2018. Presented at: HFES'18; October 1-5, 2018:1694-1698; Philadelphia, PA. URL: https://journals.sagepub.com/doi/abs/10.1177/1541931218621383 [CrossRef]
- Lenhart A. Pew Research Center. 2009. URL: https://www.pewresearch.org/internet/2009/04/10/teens-and-social-media-an-overview/ [accessed 2020-04-20]
- Anderson M, Jiang J. Pew Research Center. 2018. URL: https://www.pewresearch.org/internet/2018/05/31/teens-social-media-technology-2018/ [accessed 2020-04-20]
- Scharett E, Madathil K, Lopes S, Rogers H, Agnisarman S, Narasimha S, et al. An investigation of the information sought by caregivers of Alzheimer's patients on online peer support groups. Cyberpsychol Behav Soc Netw. Oct 2017;20(10):640-657. [CrossRef] [Medline]
- Boyd D, Ellison N. Social network sites: definition, history, and scholarship. IEEE Eng Manag Rev. 2010;38(3):16-31. [CrossRef]
- Madathil KC, Greenstein JS, Juang KA, Neyens DM, Gramopadhye AK. An Investigation of the Informational Needs of Ovarian Cancer Patients and Their Supporters. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2013. Presented at: HFES'13; September 1, 2013:748-752; San Diego, CA. URL: https://journals.sagepub.com/doi/abs/10.1177/1541931213571163 [CrossRef]
- Jackson S, Bailey M, Welles BF. #GirlsLikeUs: trans advocacy and community building online. New Media Soc. Jun 9, 2017;20(5):1868-1888. [FREE Full text] [CrossRef]
- Blackwell L, Hardy J, Ammari T, Veinot T, Lampe C, Schoenebeck S. LGBT Parents and Social Media: Advocacy, Privacy, and Disclosure during Shifting Social Movements. In: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 2016. Presented at: CHI'16; May 7-12, 2016; San Jose, USA. [CrossRef]
- Gonzales AL. Disadvantaged minorities’ use of the internet to expand their social networks. Communic Res. Jan 13, 2015;44(4):467-486. [FREE Full text] [CrossRef]
- Lawson CE. Platform vulnerabilities: harassment and misogynoir in the digital attack on Leslie Jones. Inf Commun Soc. Feb 16, 2018;21(6):818-833. [CrossRef]
- Fritz N, Gonzales A. Privacy at the margins-not the normal trans story: negotiating trans narratives while crowdfunding at the margins. Int J Commun. 2018;12:20. [FREE Full text]
- Wood M, Rose E, Thompson C. Viral justice: online justice-seeking, intimate partner violence and affective contagion. Theor Criminol. Jan 8, 2018;23(3):375-393. [CrossRef]
- Page X, Wisniewski P, Knijnenburg B, Namara M. Social media’s have-nots: an era of social disenfranchisement. Internet Res. Oct 2, 2018;28(5):1253-1274. [FREE Full text] [CrossRef]
- Linabary J, Corple D. Privacy for whom?: a feminist intervention in online research practice. Inf Commun Soc. Feb 19, 2018;22(10):1447-1463. [CrossRef]
- Flores-Yeffal N, Vidales G, Martinez G. #WakeUpAmerica, #IllegalsAreCriminals: the role of the cyber public sphere in the perpetuation of the Latino cyber-moral panic in the US. Inf Commun Soc. Oct 20, 2017;22(3):402-419. [FREE Full text] [CrossRef]
- Mitchell KJ, Wells M, Priebe G, Ybarra ML. Exposure to websites that encourage self-harm and suicide: prevalence rates and association with actual thoughts of self-harm and thoughts of suicide in the United States. J Adolesc. Dec 2014;37(8):1335-1344. [CrossRef] [Medline]
- Guan L, Hao B, Cheng Q, Yip PS, Zhu T. Identifying chinese microblog users with high suicide probability using internet-based profile and linguistic features: classification model. JMIR Ment Health. 2015;2(2):e17. [FREE Full text] [CrossRef] [Medline]
- Hilton CH. Unveiling self-harm behaviour: what can social media site Twitter tell us about self-harm? A qualitative exploration. J Clin Nurs. Jun 2017;26(11-12):1690-1704. [CrossRef] [Medline]
- Zhu L, Westers NJ, Horton SE, King JD, Diederich A, Stewart SM, et al. Frequency of exposure to and engagement in nonsuicidal self-injury among inpatient adolescents. Arch Suicide Res. 2016;20(4):580-590. [CrossRef] [Medline]
- Berry N, Bucci S, Lobban F. Use of the internet and mobile phones for self-management of severe mental health problems: qualitative study of staff views. JMIR Ment Health. Nov 1, 2017;4(4):e52. [FREE Full text] [CrossRef] [Medline]
- Jarvi S, Jackson B, Swenson L, Crawford H. The impact of social contagion on non-suicidal self-injury: a review of the literature. Arch Suicide Res. 2013;17(1):1-19. [CrossRef] [Medline]
- Insel BJ, Gould MS. Impact of modeling on adolescent suicidal behavior. Psychiatr Clin North Am. Jun 2008;31(2):293-316. [CrossRef] [Medline]
- Gould M, Petrie K, Kleinman M, Wallenstein S. Clustering of attempted suicide: New Zealand national data. Int J Epidemiol. Dec 1994;23(6):1185-1189. [CrossRef] [Medline]
- Cheng Q, Li H, Silenzio V, Caine ED. Suicide contagion: a systematic review of definitions and research utility. PLoS One. 2014;9(9):e108724. [FREE Full text] [CrossRef] [Medline]
- Young R, Subramanian R, Miles S, Hinnant A, Andsager JL. Social representation of cyberbullying and adolescent suicide: a mixed-method analysis of news stories. Health Commun. Sep 2017;32(9):1082-1092. [CrossRef] [Medline]
- Stack S. The effect of the media on suicide: evidence from Japan, 1955-1985. Suicide Life Threat Behav. 1996;26(2):132-142. [Medline]
- Stack S. The media and suicide: a nonadditive model, 1968-1980. Suicide Life Threat Behav. 1993;23(1):63-66. [Medline]
- Gould M, Jamieson P, Romer D. Media contagion and suicide among the young. Am Behav Sci. Jul 27, 2016;46(9):1269-1284. [FREE Full text] [CrossRef]
- Nock MK, Borges G, Bromet EJ, Cha CB, Kessler RC, Lee S. Suicide and suicidal behavior. Epidemiol Rev. 2008;30:133-154. [FREE Full text] [CrossRef] [Medline]
- Purington A, Whitlock J. Non-suicidal self-injury in the media. Prev Res. 2010;17(1):11-14. [FREE Full text]
- Hawton K, Townsend E, Arensman E, Gunnell D, Hazell P, House A, et al. Psychosocial versus pharmacological treatments for deliberate self harm. Cochrane Database Syst Rev. 2000;(2):CD001764. [CrossRef] [Medline]
- Tellis GJ, MacInnis DJ, Tirunillai S, Zhang Y. What drives virality (sharing) of online digital content? The critical role of information, emotion, and brand prominence. J Mark. Apr 24, 2019;83(4):1-20. [CrossRef]
- Klingle K, van Vliet KJ. Self-compassion from the adolescent perspective: a qualitative study. J Adolesc Res. Aug 14, 2017;34(3):323-346. [CrossRef]
- Suicide Prevention Resource Center. 2006. URL: https://www.sprc.org/sites/default/files/migrate/library/SafeMessagingrevised.pdf [accessed 2020-04-20]
- Kramer AD, Guillory JE, Hancock JT. Experimental evidence of massive-scale emotional contagion through social networks. Proc Natl Acad Sci U S A. Jun 17, 2014;111(24):8788-8790. [FREE Full text] [CrossRef] [Medline]
- Harris IM, Roberts LM. Exploring the use and effects of deliberate self-harm websites: an internet-based study. J Med Internet Res. Dec 20, 2013;15(12):e285. [FREE Full text] [CrossRef] [Medline]
- Lachmar E, Wittenborn A, Bogen K, McCauley H. #MyDepressionLooksLike: examining public discourse about depression on Twitter. JMIR Ment Health. Oct 18, 2017;4(4):e43. [FREE Full text] [CrossRef] [Medline]
- Hilton CE. 'It's the symptom of the problem, not the problem itself': a qualitative exploration of the role of pro-anorexia websites in users' disordered eating. Issues Ment Health Nurs. Oct 2018;39(10):865-875. [CrossRef] [Medline]
- Berrouiguet S, Larsen M, Mesmeur C, Gravey M, Billot R, Walter M, HUGOPSY Network, et al. Toward mhealth brief contact interventions in suicide prevention: case series from the suicide intervention assisted by messages (SIAM) randomized controlled trial. JMIR Mhealth Uhealth. Jan 10, 2018;6(1):e8. [FREE Full text] [CrossRef] [Medline]
- Schlichthorst M, King K, Turnure J, Sukunesan S, Phelps A, Pirkis J. Influencing the conversation about masculinity and suicide: evaluation of the man up multimedia campaign using Twitter data. JMIR Ment Health. Feb 15, 2018;5(1):e14. [FREE Full text] [CrossRef] [Medline]
- Robert A, Suelves JM, Armayones M, Ashley S. Internet use and suicidal behaviors: internet as a threat or opportunity? Telemed J E Health. Apr 2015;21(4):306-311. [CrossRef] [Medline]
- Luxton DD, June JD, Fairall JM. Social media and suicide: a public health perspective. Am J Public Health. May 2012;102(Suppl 2):S195-S200. [CrossRef] [Medline]
- Wong QJ, Werner-Seidler A, Torok M, van Spijker B, Calear AL, Christensen H. Service use history of individuals enrolling in a web-based suicidal ideation treatment trial: analysis of baseline data. JMIR Ment Health. Apr 2, 2019;6(4):e11521. [FREE Full text] [CrossRef] [Medline]
- Livingstone S, Smith P. Annual research review: harms experienced by child users of online and mobile technologies: the nature, prevalence and management of sexual and aggressive risks in the digital age. J Child Psychol Psychiatry. Jun 2014;55(6):635-654. [CrossRef] [Medline]
- Pater J, Mynatt E. Jessica Pater. 2017. URL: http://jesspater.com/wp-content/uploads/2014/12/CSCW-2017-FINAL.pdf [accessed 2020-04-08]
- Lewis S, Seko Y. A double-edged sword: a review of benefits and risks of online nonsuicidal self-injury activities. J Clin Psychol. Mar 2016;72(3):249-262. [CrossRef] [Medline]
- Patton GC, Coffey C, Romaniuk H, Mackinnon A, Carlin JB, Degenhardt L, et al. The prognosis of common mental disorders in adolescents: a 14-year prospective cohort study. Lancet. Apr 19, 2014;383(9926):1404-1411. [CrossRef] [Medline]
- Muñoz-Sánchez JL, Delgado C, Parra-Vidales E, Franco-Martín M. Facilitating factors and barriers to the use of emerging technologies for suicide prevention in Europe: multicountry exploratory study. JMIR Ment Health. Jan 24, 2018;5(1):e7. [FREE Full text] [CrossRef] [Medline]
- Mukhra R, Baryah N, Krishan K, Kanchan T. 'Blue whale challenge': a game or crime? Sci Eng Ethics. Feb 2019;25(1):285-291. [CrossRef] [Medline]
- Thaploo M. U4UVoice. 2017. URL: https://u4uvoice.com/blue-whale-jammu-techie-has-decoded-the-mystery-5-important-points/ [accessed 2018-10-17]
- Maina A. Small Business Trends. 2016. URL: https://smallbiztrends.com/2016/05/popular-social-media-sites.html [accessed 2018-11-28]
- Clarke V, Braun V. Teaching thematic analysis: overcoming challenges and developing strategies for effective learning. Psychologist. 2013;26(2):120-123. [FREE Full text]
- Pater J, Haimson O, Andalibi N, Mynatt E. 'Hunger Hurts but Starving Works': Characterizing the Presentation of Eating Disorders Online. In: Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing. 2016. Presented at: CSCW'16; February 27-March 2, 2016; San Francisco, USA. URL: https://dl.acm.org/citation.cfm?id=2820030 [CrossRef]
- Pater J, Miller A, Mynatt E. This Digital Life: A Neighborhood-Based Study of Adolescents' Lives Online. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. 2015. Presented at: CHI'15; April 18, 2015; Seoul, Republic of Korea. URL: https://dl.acm.org/citation.cfm?id=2702534 [CrossRef]
- YouTube Creator Academy. URL: https://creatoracademy.youtube.com/page/lesson/discovery [accessed 2019-11-06]
- Covington P, Adams J, Sargin E. Deep Neural Networks for YouTube Recommendations. In: Proceedings of the 10th ACM Conference on Recommender Systems. 2016. Presented at: RecSys'16; September 15-17, 2016; Boston, MA, USA. [CrossRef]
- Saunders B, Sim J, Kingstone T, Baker S, Waterfield J, Bartlam B, et al. Saturation in qualitative research: exploring its conceptualization and operationalization. Qual Quant. 2018;52(4):1893-1907. [FREE Full text] [CrossRef] [Medline]
- Fusch P, Ness L. NSUWorks - Nova Southeastern University. 2015. URL: http://nsuworks.nova.edu/tqr/vol20/iss9/3/ [accessed 2018-11-13]
- Bromberg JE, Augustson EM, Backinger CL. Portrayal of smokeless tobacco in YouTube videos. Nicotine Tob Res. Apr 2012;14(4):455-462. [FREE Full text] [CrossRef] [Medline]
- Richardson A, Vallone DM. YouTube: a promotional vehicle for little cigars and cigarillos? Tob Control. Jan 2014;23(1):21-26. [CrossRef] [Medline]
- Luo C, Zheng X, Zeng DD, Leischow S. Portrayal of electronic cigarettes on YouTube. BMC Public Health. Oct 3, 2014;14:1028. [FREE Full text] [CrossRef] [Medline]
- Web Analytics World. URL: https://www.webanalyticsworld.net/analytics-measurement-and-management-tools/radian-6-overview [accessed 2018-11-28]
- Laine M, Frühwirth C. Monitoring Social Media: Tools, Characteristics and Implications. In: Proceedings of the International Conference of Software Business. 2010. Presented at: ICSOB'10; June 21-23, 2010; Jyväskylä, Finland. [CrossRef]
- Khattar A, Dabas K, Gupta K, Chopra S, Kumaraguru P. arXiv. 2018. URL: http://arxiv.org/abs/1801.05588 [accessed 2019-09-08]
- Saldaña J. The Coding Manual for Qualitative Researchers. Thousand Oaks, California. Sage Publications; 2015.
- ATLAS.ti: The No 1 Software for Qualitative and Mixed Methods Data Analysis. URL: https://atlasti.com/ [accessed 2019-11-06]
- Chambers DA, Pearson JL, Lubell K, Brandon S, O'Brien K, Zinn J. The science of public messages for suicide prevention: a workshop summary. Suicide Life Threat Behav. Apr 2005;35(2):134-145. [CrossRef] [Medline]
- US Department of Health and Human Services. National Strategy for Suicide Prevention: Goals and Objectives for Action. Scotts Valley, California, US. CreateSpace Independent Publishing; 2010.
- Stone DM, Simon TR, Fowler KA, Kegler SR, Yuan K, Holland KM, et al. Vital signs: trends in state suicide rates-United States, 1999-2016 and circumstances contributing to suicide-27 states, 2015. MMWR Morb Mortal Wkly Rep. Jun 8, 2018;67(22):617-624. [FREE Full text] [CrossRef] [Medline]
- Baldessarini RJ, Tondo L, Hennen J. Effects of lithium treatment and its discontinuation on suicidal behavior in bipolar manic-depressive disorders. J Clin Psychiatry. 1999;60(Suppl 2):77-84; discussion 111. [FREE Full text] [Medline]
- Fekete S, Schmidtke A. The impact of mass media reports on suicide and attitudes toward self-destruction: previous studies and some new data from Hungary and Germany. In: Mishara BL, editor. The Impact of Suicide (Springer Series on Death and Suicide). New York, USA. Springer Publishing; 1995:142-155.
- Cialdini RB. Crafting normative messages to protect the environment. Curr Dir Psychol Sci. Jun 22, 2016;12(4):105-109. [CrossRef]
- Jacobs D. The Harvard Medical School Guide to Suicide Assessment and Intervention. San Francisco, California. Jossey-Bass; 1999.
- Fekete S, Macsai E. Hungarian suicide models, past and present. In: Ferrari G, editor. Suicidal Behavior and Risk Factors. Bologna. Monduzzi Editore; 1990:149-156.
- Sonneck G, Etzersdorfer E, Nagel-Kuess S. Imitative suicide on the Viennese subway. Soc Sci Med. Feb 1994;38(3):453-457. [CrossRef] [Medline]
- Recupero PR, Harms SE, Noble JM. Googling suicide: surfing for suicide information on the internet. J Clin Psychiatry. Jun 2008;69(6):878-888. [Medline]
- Armstrong G, Vijayakumar L, Niederkrotenthaler T, Jayaseelan M, Kannan R, Pirkis J, et al. Assessing the quality of media reporting of suicide news in India against World Health Organization guidelines: a content analysis study of nine major newspapers in Tamil Nadu. Aust N Z J Psychiatry. Sep 2018;52(9):856-863. [CrossRef] [Medline]
- Noble-Carr D, Woodman E. Considering identity and meaning constructions for vulnerable young people. J Adolesc Res. Dec 28, 2016;33(6):672-698. [CrossRef]
- Gould MS, Shaffer D. The impact of suicide in television movies. Evidence of imitation. N Engl J Med. Sep 11, 1986;315(11):690-694. [CrossRef] [Medline]
- Lewis S. CBS News. 2019. URL: https://www.cbsnews.com/news/momo-challenge-resurfaces-police-issue-warning-to-parents/ [accessed 2020-02-15]
- Tanner MA, Murray JA, Phillips DP. The impact of televised movies about suicide. N Engl J Med. Mar 17, 1988;318(11):707-708. [CrossRef] [Medline]
- Cooper Jr MT, Bard D, Wallace R, Gillaspy S, Deleon S. Suicide attempt admissions from a single children's hospital before and after the introduction of Netflix series 13 reasons why. J Adolesc Health. Dec 2018;63(6):688-693. [CrossRef] [Medline]
- Ayers JW, Althouse BM, Leas EC, Dredze M, Allem J. Internet searches for suicide following the release of 13 reasons why. JAMA Intern Med. Oct 1, 2017;177(10):1527-1529. [FREE Full text] [CrossRef] [Medline]
- Ortiz P, Khin EK. Traditional and new media's influence on suicidal behavior and contagion. Behav Sci Law. Mar 2018;36(2):245-256. [CrossRef] [Medline]
- Roth R, Abraham J, Zinzow H, Wisniewski P, Khasawneh A, Chalil Madathil K. Evaluating news media reports on the 'blue whale challenge' for adherence to suicide prevention safe messaging guidelines. Proc ACM Hum Comput Interact. May 28, 2020;4(CSCW1):1-27. [CrossRef]
- Arendt F, Scherr S, Till B, Prinzellner Y, Hines K, Niederkrotenthaler T. Suicide on TV: minimising the risk to vulnerable viewers. Br Med J. Aug 22, 2017;358:j3876. [CrossRef] [Medline]
- Hong V, Ewell Foster CJ, Magness CS, McGuire TC, Smith PK, King CA. 13 reasons why: viewing patterns and perceived impact among youths at risk of suicide. Psychiatr Serv. Feb 1, 2019;70(2):107-114. [CrossRef] [Medline]
- Zimerman A, Caye A, Zimerman A, Salum GA, Passos IC, Kieling C. Revisiting the Werther effect in the 21st century: bullying and suicidality among adolescents who watched 13 reasons why. J Am Acad Child Adolesc Psychiatry. Aug 2018;57(8):610-3.e2. [CrossRef] [Medline]
- Vaterlaus JM, Tulane S, Porter BD, Beckert TE. The perceived influence of media and technology on adolescent romantic relationships. J Adolesc Res. May 31, 2017;33(6):651-671. [CrossRef]
- Swendeman D, Arnold E, Harris D, Fournier J, Comulada W, Reback C, et al. Adolescent Medicine Trials Network (ATN) CARES Team. Text-messaging, online peer support group, and coaching strategies to optimize the HIV prevention continuum for youth: protocol for a randomized controlled trial. JMIR Res Protoc. Aug 9, 2019;8(8):e11165. [FREE Full text] [CrossRef] [Medline]
- Madathil KC, Rivera-Rodriguez AJ, Greenstein JS, Gramopadhye AK. Healthcare information on YouTube: a systematic review. Health Informatics J. Sep 2015;21(3):173-194. [CrossRef] [Medline]
- Khasawneh A, Madathil KC, Dixon E, Wisniewski P, Zinzow H, Roth R. An Investigation on the Portrayal of Blue Whale Challenge on YouTube and Twitter. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2019. Presented at: HFES'19; October 28-November 1, 2019; Seattle, WA. p. 887-888. [CrossRef]
- Khasawneh A. Tiger Prints. 2019. URL: https://tigerprints.clemson.edu/all_dissertations/2526 [accessed 2020-04-20]
- Khasawneh K, Ozsoy M, Donovick C, Ghazaleh NA, Ponomarev D. EnsembleHMD: accurate hardware malware detectors with specialized ensemble classifiers. IEEE Trans Dependable and Secure Comput. 2018:1-1. [FREE Full text] [CrossRef]
- Khasawneh K, Ozsoy M, Donovick C. Ensemble Learning for Low-Level Hardware-Supported Malware Detection. In: Proceedings of the International Symposium on Recent Advances in Intrusion Detection. 2015. Presented at: RAID'15; November 2-4, 2015; Kyoto, Japan. p. 3-25. URL: https://link.springer.com/chapter/10.1007/978-3-319-26362-5_ [CrossRef]
Abbreviations
NSSI: nonsuicidal self-injury
SPRC: Suicide Prevention Resource Center
Edited by J Torous; submitted 22.08.19; peer-reviewed by E Charlotte Hilton, M Alvarez de Mon, C Jacob, Z Ma; comments to author 21.09.19; revised version received 19.12.19; accepted 27.03.20; published 05.06.20.
Copyright © Amro Khasawneh, Kapil Chalil Madathil, Emma Dixon, Pamela Wiśniewski, Heidi Zinzow, Rebecca Roth. Originally published in JMIR Mental Health (http://mental.jmir.org), 05.06.2020.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on http://mental.jmir.org/, as well as this copyright and license information must be included.