This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on http://mental.jmir.org/, as well as this copyright and license information must be included.
The Supporting Our Valued Adolescents (SOVA) intervention aims to use a moderated social media website to encourage peer discussion about negative health beliefs, which may prevent treatment uptake. Web moderators with a background in behavioral health facilitate peer conversation to promote a sense of community, provide social support, and ensure safety.
Although moderation is a core component of this intervention, little is known about best practices for moderators to ensure safety while encouraging engagement. This study sought to describe interactions between moderators and peer users and to understand moderator experiences through individual interviews.
Adolescents and young adults aged 14 to 26 years with a history of depression or anxiety were recruited for a usability study of the SOVA intervention. During this study, 14 moderators were trained to regularly review comments on blog posts for safety, facilitate conversation, and correct misinformation. A total of 110 blog posts and their associated comments were extracted and coded using a codebook based on items from the supportive accountability model and a peer social support analysis. Closing interviews with 12 moderators, assessing their experience of moderating, were conducted, recorded, and transcribed. Blog post text and comments, as well as transcripts of moderator interviews, were assessed using a thematic analysis approach, and blog posts were examined for trends in the content of moderator comments by comparing blog posts that differed in comment contributor order.
There were no safety concerns during the study, and moderators intervened only to remove identifiable information. Web moderators exhibited elements of supportive accountability (such as being perceived as experts, using verbal rewards, and offering informational and emotional support). When moderators provided the last comment under a blog post, potentially ending user contributions, they were at times found to have commented about their own experiences. Moderators interviewed after completing their role described challenges in engaging users. A cohort of moderators who received more extensive training on supportive accountability and peer social support felt that the training improved their ability to engage users.
Moderators of a Web-based support site for adolescents with depression or anxiety were able to ensure safety while promoting user engagement. Moderators can elicit user engagement by offering gratitude and encouragement to users, asking users follow-up questions, and limiting their own opinions and experiences when responding to comments.
Web-based interventions enhanced with social media components offer a novel approach to intervening on key mechanisms which may increase adolescent use of mental health services. These key mechanisms include addressing parents’ and adolescents’ health beliefs [
Social media allows interventions to target individuals who may otherwise be hard to reach, with moderators as a means to reduce risk [
The Supporting Our Valued Adolescents (SOVA) social media website is a Web-based intervention for adolescents and young adults aged 14 to 26 years with depression or anxiety. The site uses daily blog posts to address mental health and increase the perceived need [
Few previous studies have assessed how Web moderators can engage users, especially in adolescent internet support groups. As part of a usability study of the SOVA intervention [
Adolescents and young adults aged 14 to 26 years who self-reported symptoms of depression or anxiety were recruited for a usability study of the SOVA intervention, which is described in detail in a previous paper [
During the usability study, 14 moderators were involved in site moderation, with up to 4 at a time, always including the principal investigator (PI) as a backup. Moderators included research assistants who were also licensed social workers, graduate students in social work, graduate medical students in psychiatry, and graduate students in psychology. The PI was not interviewed for this study.
Moderation occurred through mobile phone notifications from a research study email account, to which new comments posted on SOVA blog posts were forwarded. Moderators worked in shifts, and during a shift, they checked for new emails at least every 3 hours. Any time not actively covered by these individuals was moderated by the PI.
All moderators received a 2-hour live training session from the PI with specific case examples describing potential moderating scenarios. The PI was also available for questions throughout training and while moderating. Both Mohr's and Kraut's work investigating the role of Web moderation informed the training for moderators on the SOVA website and was subsequently used to evaluate the success of this moderating role [
Moderation of the SOVA website involved screening all blog post comments within 3 hours of publication and judging whether to respond. Moderators were instructed to review and respond to comments as necessary to (1) facilitate conversation and indicate to users that someone had read their comment, but only if no other user had commented within 24 hours, so as not to stifle conversation; (2) correct misinformation (eg, incorrect medical advice); (3) remove identifiable information; (4) address cyberbullying; (5) be available to give feedback and advice; and (6) screen for safety. Users were provided with ground rules for SOVA site use (
Blog posts and their associated comments were extracted from the website database for 110 blog posts published from April 2015 to February 2017 and downloaded into NVivo (QSR International) qualitative software. A codebook was developed that included labeling of comment author (user vs moderator) and author order, as well as individual codes based on items retrieved from the supportive accountability model [
The first 50 blog posts were coded independently by 2 coders using an initial codebook. They then met with the PI to review codes and modified the codebook based on this feedback, mostly regarding the definitions of specific codes. The unit of coding was one entire comment, although at times, parts of a blog post could be included as a
The analysis of comments was approved by the University of Pittsburgh Human Research Protection Office (Institutional Review Board [IRB]).
As moderators discontinued their work with SOVA, usually because of finishing a graduate school practicum or a job transition, they were asked to participate in a closing interview conducted by a research assistant. Individual interviews were conducted with 12 moderators, all of whom had moderated for a period of 6 months or more. Interview questions assessed the experience of moderating and perceptions about training. Interviews were recorded, transcribed verbatim excluding filler words (eg, like and um), and double coded until greater than 80% agreement was achieved across all nodes. A prespecified codebook (
We used a template analysis approach for both the blog post comments and the moderator interviews, using a prespecified codebook and a hierarchical approach to coding while remaining open to changes in codes as coding progressed [
This work was approved by the IRB of the University of Pittsburgh (PRO15060158).
Of 363 total blog posts, the 110 that received comments (2 of which were written by a user) were coded and assessed. First, there were no safety concerns throughout moderating. There were fewer than 5 instances in which moderators intervened to remove identifiable information from comments or correct misinformation. There were no instances of cyberbullying or crisis situations. Moderators most often commented to facilitate conversation or to offer feedback and advice.
Of the 110 blog posts, there were 8.1% (9/110) where only the moderator responded; 44.5% (49/110) where only one or more users responded; and 47.3% (52/110) where both the moderator and user(s) responded. Of the 52 blog posts that received both user and moderator comments, the conversation stopped after the moderator responded (moderator stop) in 37% (19/52; 1 user commenting) and 6% (3/52; multiple users commenting). In 58% (30/52), the conversation continued after the moderator responded (moderator continues).
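To make these contributor-order categories concrete, the following is a minimal sketch of the classification logic, assuming a hypothetical Comment structure and field names; the study's actual coding was performed manually in NVivo rather than programmatically:

```python
# Illustrative sketch only (not the study's analysis pipeline): sorting one
# blog post's comment thread into the contributor-order categories above.
from dataclasses import dataclass

@dataclass
class Comment:
    author_role: str  # "moderator" or "user" (hypothetical field)
    text: str

def categorize_thread(comments: list[Comment]) -> str:
    """Return the contributor-order category for one blog post's comments."""
    roles = {c.author_role for c in comments}
    if roles == {"moderator"}:
        return "moderator only"       # 9/110 blog posts
    if roles == {"user"}:
        return "user(s) only"         # 49/110 blog posts
    # Both a moderator and at least 1 user commented (52/110 blog posts):
    # locate the moderator's last comment and check whether any user
    # commented after it.
    last_mod = max(i for i, c in enumerate(comments)
                   if c.author_role == "moderator")
    if any(c.author_role == "user" for c in comments[last_mod + 1:]):
        return "moderator continues"  # conversation continued (30/52)
    return "moderator stop"           # conversation stopped (19 + 3 of 52)

# For example, a thread of [user, moderator] yields "moderator stop",
# whereas [user, moderator, user] yields "moderator continues".
```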
In blog posts where the conversation stopped, moderators were found to be self-disclosing, commenting about their own experiences. In contrast, the conversations that continued (eg, user, moderator, and user) revealed several trends, including the moderator offering emotional support such as thanking the user, asking them a question, asking for their tips, and offering them encouragement. As further detailed below, moderators commonly displayed elements of Mohr's supportive accountability model in conversations that continued after moderator input by (1) sharing or displaying moderator expertise, (2) verbally rewarding the user, and (3) mirroring (eg, mimicking the same emoticon as a user) [
An example of a blog post where the conversation continued after the moderator responded was one called "What Depression Really Looks Like," which addressed the stereotype that people who are depressed will look depressed, when often that is not the case. The question asked at the end of the blog post was, "Do you think the pictures used to portray depression play a role in the stigma around it?" A user responded, after which the moderator responded, and a conversation followed (
Another example of continued conversation addressed negative emotions, specifically, the loss of a loved one. In this case, a user commented, a moderator responded, and a conversation continued (
A final example of continued conversation surrounded a post that discussed depression and how it manifests in the individual. In this case, the same user commented back to the moderator (
An example of a blog post where the moderator stopped the conversation was one called
Moderator continues conversation through rewards and questions.
Moderator continues conversation through expertise and support.
Moderator continues conversation through support and question.
Moderator stops conversation.
In the first example, the moderator's question was not answered by the original user but did not prevent further commenting. In the second example, the moderator provided both emotional and informational support, and neither comment deterred further user interaction. In the third example, the moderator asked the user a question, to which the same user responded, thus fostering further engagement. In the final example, where the conversation stopped, the moderator's commenting about their own experiences may have deterred further user response.
An analysis of the coded comments revealed several trends in the way moderators engaged on SOVA. All codes occurring more than 10 times in blog post comments can be found in
Moderators also offered potential solutions to problems users expressed in 10 comments, often suggesting ways to avoid rumination and adopt more positive attitudes. One example of a moderator providing a solution was advice to a user struggling with sleep, encouraging the user to practice "controlling your exposure to light, creating bedtime rituals that help with relaxation, and keeping the bedroom cool and quiet." Users expressed that they had learned something new from the site on 10 occasions; these instances did not overlap with moderators' solutions but occurred in response to informational posts or tips from other users. In 51 comments, moderators provided users with verbal rewards, including phrases such as "thanks for sharing!" and "great point!"
Users occasionally shared a desire for emotional support or information (2 and 4 comments, respectively). Of these 6 cases, the moderator responded 5 times, and another user responded once. For example, when a user asked, "How would the average person know when someone's having a panic attack?" the moderator responded with details and a link to a website outlining mental health first aid guidelines. Moreover, moderators responded thoroughly to all aspects of users' comments—thanking them for sharing, addressing the story they shared with positive affirmations, answering any questions they may have posed, and occasionally sharing new information or asking a follow-up question to promote discussion. For example, when a user stated, "I have a hard time wording things," the moderator asked, "What are some ways you might try to word what you want to say from reading this article?" which prompted the user to continue the conversation. This demonstrates the moderator reacting to a disclosure by the user and using an open-ended question to further the discussion.
Emotional support was provided most commonly by users, but also by moderators, for a total of 57 times. This support most often took the form of moderators' or users' acknowledgment of what the original user posted ("I agree, journaling can be a great way to deal with things") or of users sharing similar stories of their own, saying things such as "I completely agree [and I, too] want to share my struggle with this." Informational support was also provided by both users and moderators, 167 times in total. This involved sharing resources and strategies with fellow users, including emotional coping strategies, such as links to websites as mentioned, or advice on how to deal with therapists, physicians, stressful situations, or negative thoughts.
In 63 comments, users made positive remarks highlighting the value of the SOVA website, most often responding to blog posts, but also to other users’ comments or moderator’s comments, such as “This was very insightful and helpful” and “I’m definitely going to try [that].”
Moderators shared challenges in engaging users in thoughtful conversation during the usability testing phase. They believed this was because of the limited number of users on the site at one time and iterative updates to website functionality (eg, new article notifications not working). They also attributed challenges to the nature of the users themselves: "I think it just takes a unique type of adolescent to want to engage online with this type of site...it really makes sense that our adolescents do use a lot of online communities and that this is something they would be interested in, but I think it takes a lot of forethought and insight that adolescents might not have about their own mental health." One moderator described the challenge of engaging users: "How do you create a conversation...if you are responding to a comment and you say, '...That's a really good point...' how do you make it into a conversation so that they want to respond more? Or how do you get other users to interact with each other? That is an important role of the moderator to create that space where conversations can happen."

Following concerns from moderators that more guidance on peer engagement and crisis training was needed, training procedures were updated, and later moderators perceived that training was adequate and valuable to performing their duties, especially when previous or existing moderators were accessible and approachable. It often took moderators several weeks to become fully comfortable with their duties, and those with a stronger background in mental health reported feeling more prepared. They felt their role in commenting was relatively unimportant compared with keeping the site and users safe. There were no safety concerns, and moderators found that using a research study mobile phone, with all new site content emailed to the phone, was a feasible way to balance moderating with their other daily tasks so that they could be available 24×7.

Overall, the moderators stated that they most enjoyed interacting with users (including replying to comments), gaining mental health experience and training, and feeling as though they were making a difference through their role. The moderators had many positive things to say about the study: "I like that the role [of the moderator] is important, even though I do not necessarily always have a lot of interactions with the users. I know that the subscribers know that someone is there to make sure the information is accurate and that it is a safe environment to discuss and talk is good. I like knowing my role is needed."
In this study, after examining the role of the moderator in a Web-based intervention for adolescents with depression or anxiety, we found that moderating such an intervention was feasible and resulted in no safety concerns. Additionally, moderators exhibited various approaches that may impact user engagement. Moderators themselves expressed satisfaction with receiving training on techniques which may enhance user engagement and keep users safe on the Web, stating that they found the experience valuable. The findings of this study influenced changes to current moderator training including incorporating more feedback on emergency and safety protocols as well as enhancing feedback on how to increase user engagement by limiting moderator self-disclosure.
The role of the moderator in Web-based behavior interventions for adolescents and young adults is a necessary one to ensure the safety of users and quality of Web content. Moderation has been found to foster a welcoming and safe environment, prevent cyberbullying [
The supportive accountability model considers some moderator behaviors that may be exhibited in an interactive Web coaching scenario in which traditional goal setting and rapport building may occur. As no such expectations were stated to users in our intervention, we did not expect to find some of the code families, including bond, trustworthiness, benevolence, reciprocity, process expectations, and mirroring. Other code families from supportive accountability for which we did expect to find codes included expertise, definition, identifying a problem, interest, reward, cues, social support, and seeking and providing emotional and informational support.
We found that varying moderator approaches may impact user engagement. Our concern that moderators would deter further conversation was not borne out most of the time; however, when moderators exhibited self-disclosure, conversation from users would at times stop. Moderators may experience more success by omitting their own opinions and experiences when responding to comments, as self-disclosure may deter further discussion among users. One explanation for this finding derives from the theory of Rogerian
User engagement is essential for the success of Web-based interventions [
We found that moderators exhibited techniques aligned with supportive accountability and social support and that users did seem to engage when these techniques were used, for example, when moderators would offer encouragement, informational or emotional support, display expertise, or offer verbal rewards. Despite this information on moderator strategies, the way users interact on the Web is, not surprisingly, user-dependent and varies significantly from person to person [
Many therapy-based Web interventions reveal several important aspects of how humans may provide support. First, supported interventions, whether administered by a therapist or a nonhealth professional, are more successful than unsupported ones, even when human resources are low [
This study has several limitations. First, the study was not designed to empirically examine the impact of specific techniques on user engagement. The data presented here are exploratory and observational in nature, and a future experimental study would be needed to confirm our results, as we only describe trends in coding for when conversation continues and when it stops. Supplementary analysis of moderator-user comment order on other social media platforms could further investigate the trends found in this study, and more qualitative research on moderator techniques in other behavioral interventions could further inform intervention design. In addition, the study was conducted during a feasibility and usability study with a smaller sample of users engaged on the site. With a larger sample of active users, there may be additional findings not accounted for in this study. Regardless, initial stakeholder feedback for the design of SOVA raised many concerns about safety [
The high rate of suicide and low rate of mental health treatment among adolescents highlight the need for social media interventions such as SOVA. Moderation is key for this sort of intervention to be both effective and safe. Moderators on the SOVA site elicited user engagement by offering gratitude and encouragement to users, asking users follow-up questions, and limiting their own opinions and experiences when responding to comments. Users commenting on SOVA perceived that it had positive effects on increasing their adoption of healthy attitudes and behaviors. The research described is innovative specifically in investigating strategies a moderator can use to balance a potentially punitive and interfering role (enforcing site rules) with a supportive role (providing social support and facilitating conversation to promote peer-to-peer social support) for an adolescent Web-based support group intervention, all while effectively promoting user engagement.
User ground rules for Supporting Our Valued Adolescents website.
Blog post codebook and number of comments coded.
Moderator interview codebook.
Codes occurring more than 10 times in blog post comments.
eHealth: electronic health
IRB: Institutional Review Board
PI: principal investigator
SOVA: Supporting Our Valued Adolescents
The authors thank Jing Hua and Sharanya Bandla for technical assistance with the SOVA website. The authors thank former social work students and graduates for assistance in site moderation and interview participation. The authors thank and acknowledge the SOVA community members and stakeholders for informing this study and making it possible.
This study was funded by the National Institute of Mental Health (NIMH 1K23MH111922-01A1); UPMC Children's Hospital of Pittsburgh (Student Research Training Program); and the University of Pittsburgh School of Medicine (Dean’s Summer Research Program).
AR conceived the study and was involved in protocol development and gaining ethical approval. CW and MC researched the literature. MC, CW, CL, and LB were involved in patient recruitment and data analysis. CW wrote the first draft of the manuscript. All authors reviewed and edited the manuscript and approved the final version of the manuscript.
None declared.