
Published on 27.04.20 in Vol 7, No 4 (2020): April

Preprints (earlier versions) of this paper are available at http://preprints.jmir.org/preprint/17497, first published Dec 17, 2019.


    Original Paper

    A Web-Based Adaptation of the Quality of Life in Bipolar Disorder Questionnaire: Psychometric Evaluation Study

    1Department of Psychiatry, University of British Columbia, Vancouver, BC, Canada

    2Faculty of Health, Arts and Design, Swinburne University, Hawthorn, Australia

    3University of Guelph, Guelph, ON, Canada

    4Faculty of Medicine, University of British Columbia, Vancouver, BC, Canada

    5Department of Psychology, University of British Columbia, Vancouver, BC, Canada

    6Department of Psychiatry, University of California, San Diego, CA, United States

    Corresponding Author:

    Emma Morton, BSc (Hons), PhD

    Department of Psychiatry

    University of British Columbia

    420-5950 University Blvd

    Vancouver, BC, V6T 1Z3

    Canada

    Phone: 1 604 827 3393

    Email: emma.morton@ubc.ca


    ABSTRACT

    Background: Quality of life (QoL) is considered a key treatment outcome in bipolar disorder (BD) across research, clinical, and self-management contexts. Web-based assessment of patient-reported outcomes offers numerous pragmatic benefits but requires validation to ensure measurement equivalency. A web-based version of the Quality of Life in Bipolar Disorder (QoL.BD) questionnaire was developed (QoL Tool).

    Objective: This study aimed to evaluate the psychometric properties of a web-based QoL self-report questionnaire for BD (QoL Tool). Key aims were to (1) characterize the QoL of the sample using the QoL Tool, (2) evaluate the internal consistency of the web-based measure, and (3) determine whether the factor structure of the original version of the QoL.BD instrument was replicated in the web-based instrument.

    Methods: Community-based participatory research methods were used to inform the development of a web-based adaptation of the QoL.BD instrument. Individuals with BD who registered for an account with the QoL Tool were able to opt in to sharing their data for research purposes. The distribution of scores and internal consistency estimates, as indicated by Cronbach alpha, were inspected. An exploratory factor analysis using maximum likelihood and oblique rotation was conducted. Inspection of the scree plot, eigenvalues, and minimum average partial correlation were used to determine the optimal factor structure to extract.

    Results: A total of 498 people with BD (349/498, 70.1% female; mean age 39.64, SD 12.54 years; 181/498, 36.3% BD type I; 195/498, 39.2% BD type II) consented to sharing their QoL Tool data for the present study. Mean scores across the 14 QoL Tool domains were, in general, significantly lower than those of the original QoL.BD validation sample. Reliability estimates for QoL Tool domains were comparable with those observed for the QoL.BD instrument (Cronbach alpha=.70-.93). Exploratory factor analysis supported the extraction of an 11-factor model, with item loadings consistent with the factor structure suggested by the original study. Findings for the sleep and physical domains differed from the original study, with this analysis suggesting one shared latent construct.

    Conclusions: The psychometric properties of the web-based QoL Tool are largely concordant with those of the original pen-and-paper QoL.BD, although some minor differences in the structure of the sleep and physical domains were observed. Despite this small variation from the factor structure identified in the QoL.BD instrument, the latent factor structure of the QoL Tool largely reproduced the original findings and theoretical structure of QoL areas relevant to people with BD. These findings underscore the research and clinical utility of this instrument, but further comparison of the psychometric properties of the QoL Tool relative to the QoL.BD instrument is warranted. Future adaptations, including the production of an app-based version of the QoL Tool, are also discussed.

    JMIR Ment Health 2020;7(4):e17497

    doi:10.2196/17497


    Introduction

    Background

    Applications of quality of life (QoL) assessment instruments in bipolar disorder (BD) research have grown substantially [1,2] since the introduction of the concept in psychiatric research more generally in the 1980s [3]. Broadly speaking, QoL instruments holistically assess an individual’s satisfaction and functioning across a range of life domains and are, therefore, increasingly used to evaluate BD treatment outcomes beyond symptomatic response. Assessment of QoL may be particularly important in the context of BD, given the chronic course and significant impacts across diverse life domains associated with this mood disorder [1]. Indeed, there is some evidence to indicate that both patients with BD and health care providers view QoL as the most important outcome in the treatment of the condition [4].

    In the study of BD, QoL has been primarily measured with universal or generic instruments (most commonly, the 36-Item Short Form Health Survey and the Quality of Life Enjoyment and Satisfaction Questionnaire [1]). Although generic measures assess areas of life that may be considered fundamentally important [5], patient groups may have unique priorities that are best assessed with disorder-specific instruments [6]. To address this gap, the first condition-specific QoL instrument for BD was developed: the Quality of Life in Bipolar Disorder (QoL.BD) [7]. Informed by consultation with people with lived experience of BD, their family members, and field experts, the resulting scale assesses cardinal life areas directly impacted by BD symptoms (mood, sleep, physical health, and cognition), pragmatic and functional outcomes (home, work, education, leisure, and finances), and more psychosocially orientated constructs (relationships, self-esteem, spirituality, identity, and independence). A decade since its development, the QoL.BD instrument has seen international adoption: it has undergone formal adaptation and validation in Iranian, Chinese, and Chilean populations [8-10] and has been translated into over 20 languages [11]. It has also seen application in diverse research contexts, including clinical trials of psychotherapy [12-16] and pharmacological interventions [17,18]. Materials have also been developed to support the use of the QoL.BD instrument by health care practitioners in a clinical context (eg, case formulation [11]) and by individuals with BD themselves in their self-management practices [19].

    Although the uptake of the QoL.BD instrument has been encouraging, its research and clinical utility may be enhanced with a web-based delivery format. Relative to traditional pen-and-paper instruments, web-based administration formats reduce administrative burden (through, for instance, automatic scoring and practical data storage), data entry and coding errors, and item nonresponse [20]. Web-based questionnaires may also enhance the accessibility of instruments for both researchers and patients: they are cost-effective [21], instantaneously available to potential users with an internet connection (regardless of location), and navigation is user-friendly, with the ability to skip or eliminate irrelevant questions from the view [22]. Respondents may also prefer web-based administration formats [23], and for questionnaires that assess sensitive topics (such as factors related to mental health), web-based questionnaires may potentially reduce social desirability effects [24]. For ongoing self-monitoring purposes, web-based instruments are advantaged by their ability to provide immediate feedback to respondents, reduce the burden of tracking large volumes of data, and potentially lessen experiences of stigma by decreasing the visibility of symptom monitoring. Given the numerous pragmatic benefits and enhanced user-friendliness of web-based self-report questionnaires, adaptation of the pen-and-paper version of the QoL.BD instrument to a web-based interface was undertaken to support utilization of this instrument across research, clinical, and self-management contexts.

    However, simple migration of pen-and-paper scales to web-based formats does not guarantee preservation of a scale’s psychometric properties. A number of factors can impact the way a scale performs when adapted for web-based administration, including modifications to layout, instructions, or changes in item wording and response options [25,26]. The Professional Society for Health Economics and Outcomes Research (ISPOR) guidelines suggest that the evidence needed to support measurement equivalence between pen-and-paper and electronic adaptations varies depending on the extent of modifications, from minor (eg, simply displaying scale text on screen) to more substantive (ranging from moderate alterations, such as splitting the presentation of items over several screens, up to large-scale changes to items, presentation, or response format). Supporting evidence can include usability testing, appraisal of interformat reliabilities, comparable means and standard deviations, and preservation of scale reliability and factor structures across formats. Although the majority of web-adaptation studies have reported interformat reliabilities, informing confidence about the consistency of measurement of self-reported mental health data across formats [27,28], fewer studies have examined whether the original factor structure is replicated in web-based questionnaire formats. As such, exploration of the psychometric properties, particularly factor structure, is needed to support the use of any web-based adaptation of the QoL.BD instrument.

    Objective

    The overarching aim of this study was to compare the psychometric performance of the web-based QoL Tool with the original pen-and-paper version of the QoL.BD scale. To do this, we aimed to (1) describe the means and standard deviations of QoL Tool responses, (2) evaluate the internal consistency of the web-based measure, and (3) determine whether the factor structure of the original QoL.BD could be replicated in the web-based adaptation of this instrument.


    Methods

    Overview

    The project was conducted by the Collaborative RESearch Team to study psychosocial issues in Bipolar Disorder (CREST.BD [19]), a Canada-based network dedicated to collaborative research and knowledge translation (KT) in BD. CREST.BD specializes in community-based participatory research (CBPR), where researchers and knowledge users work collaboratively [29]. Informed by a decade of research and integrated KT, CREST.BD has developed a specific model of CBPR for BD [30]. Funding from the Canadian Institutes of Health Research was granted to extend prior work to design and validate a pen-and-paper QoL questionnaire for BD (described below). This psychometric evaluation follows the development of a web-based adaptation of the QoL.BD instrument using CBPR methods.

    Design and Validation of the Pen-and-Paper Quality of Life in Bipolar Disorder Questionnaire

    The development and validation of the QoL.BD instrument is described in detail elsewhere [7]. In brief, candidate items were generated through (1) qualitative interviews with people with BD, their family members, and field experts and (2) a literature review of existing research on QoL in BD. Following item reduction, preliminary psychometric analyses, and further consultation with field and lived experience experts, a final subset of 56 items was retained. Items are organized into 14 four-item domains: 12 core (physical, sleep, mood, cognitive, leisure, social, spirituality, finances, household, self-esteem, independence, and identity) and 2 optional (work and study, which respondents are directed to complete if they are currently employed or in school). A 12-item brief version was also developed.

    The questionnaire items are presented on a standard 5-point Likert response scale (strongly disagree–strongly agree). Items are all positively worded (ie, describing the presence of a desirable quality) for two reasons: (1) a positive question frame is consistent with the strengths-based approach to QoL adopted as a result of CBPR consultation and (2) reverse-worded items can reduce the reliability and validity of a scale [31]. Domains are scored by summing the responses, for a potential score range of 4 to 16. Calculation of an overall QoL score is possible by summing responses to the 12 core domains. Initial field testing of the QoL.BD instrument indicated that both the full and brief versions of the instrument represent a feasible, reliable, and valid BD-specific QoL measure with solid internal consistency and appropriate test-retest reliability. Factor analysis affirmed that the 12 basic scales were represented in the latent structure of the instrument.
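    The scoring procedure above is straightforward to operationalize. The following Python sketch is illustrative only: the study used the published QoL.BD scoring procedure rather than any code like this, and the domain keys here are shorthand labels of our own, not the instrument's item text.

```python
# Illustrative sketch of QoL.BD domain scoring (not the authors' code):
# each domain comprises 4 Likert-scale items; a domain score is the sum of
# its item responses, and the overall QoL score sums the 12 core domains.
CORE_DOMAINS = [
    "physical", "sleep", "mood", "cognitive", "leisure", "social",
    "spirituality", "finances", "household", "self_esteem",
    "independence", "identity",
]

def score_qol(responses):
    """responses: dict mapping each core domain to its 4 item ratings."""
    domain_scores = {}
    for domain in CORE_DOMAINS:
        items = responses[domain]
        if len(items) != 4:
            raise ValueError(f"{domain}: expected 4 item responses")
        domain_scores[domain] = sum(items)  # sum of the domain's 4 items
    overall = sum(domain_scores.values())   # overall score: 12 core domains
    return domain_scores, overall
```

For example, a respondent giving the midpoint response to every item would score 12 on each domain.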

    Development of the Web-Based Adaptation: The Quality of Life Tool

    A synergistic combination of CBPR and the principles of user-centered design [32] was applied to develop the QoL Tool. The primary goal of the development process for the QoL Tool was to produce a web-based version of the QoL.BD instrument that was as faithful as possible to the original measurement principles of the QoL.BD instrument but also adapted and expanded to enhance user experience and functionality. A priori, we established which features of the QoL.BD instrument were immutable: specifically, preservation of the precise wording of the scale’s 56 items (with one exception, described below), the ordering of the items, and the 5-point Likert response scale (1=strongly disagree; 5=strongly agree). The approach to scoring the domains and the range of potential scores are consistent with the QoL.BD instrument.

    Beyond these parameters, however, it was expected that the web-based version of the scale would differ in some aspects from its pen-and-paper counterpart. One adaptation to the delivery format of the web-based version was made on the basis of user feedback: the name of each domain was made visible to the user (see Figure 1). Furthermore, as the inclusion of graphical feedback of results has been described as a highly prioritized feature for self-management apps for people with BD [32,33], we determined a priori that the addition of a results display feature would be essential (see Figure 2). All other adaptations were identified via user-centered design processes. One minor change in wording was made to a sleep domain item (“woken up” was changed to “awoken”). All other items in the QoL Tool were worded precisely as in the QoL.BD instrument. Substantial changes were made in the web-based version in terms of features and functionality. For example, registered users of the QoL Tool are provided with the option of an interactive results feature that demonstrates their QoL scores over time (Figure 3) and the option to email their results to a health care provider.

    Figure 1. QoL Tool questionnaire screen and response options.
    Figure 2. QoL Tool graphical display of results.
    Figure 3. Users are able to drag a slider to compare their QoL Tool results over time.

    Recruitment

    The final version of the QoL Tool was formally launched on World Bipolar Day (March 30, 2015). The QoL Tool was promoted primarily through social media channels (eg, the CREST.BD Facebook page, Twitter, and website) but was also highlighted as part of a series of knowledge translation events (May-June 2015) focused on sharing knowledge on self-management strategies for BD [34]. Informed consent for data collection for research purposes was provided at the point of registration in the QoL Tool system but was not required for the use of the web-based interface. Inclusion criteria were (1) being older than 19 years; (2) having a self-reported diagnosis of BD I, II, or not otherwise specified (NOS); and (3) being able to communicate in English. No compensation was offered for participants’ completion of the QoL Tool. Ethics approval was obtained from the University of British Columbia’s Behavior Research Ethics Board.

    Data Collection

    Data collection for this study occurred between March 30, 2015, and October 17, 2018. Demographic details and responses to the QoL Tool were saved in a secure database hosted at the University of British Columbia. For the purposes of this study, only baseline responses to the QoL Tool were analyzed to avoid contaminating the extracted factor structure with potential learning effects [35]. Data from the baseline time point of the original QoL.BD validation study (n=224; sample described [7]) were included where relevant for comparison purposes.

    Statistical Analysis

    Internal Consistency

    Internal consistency was evaluated using Cronbach alpha for each of the 14 domains. Scales were deemed to be adequately reliable if Cronbach alpha exceeded .7 [36].
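    Cronbach alpha can be computed directly from item-level responses. A minimal numpy sketch of the statistic is shown below; the study's estimates were produced in SPSS, so this is an illustration of the computation, not the authors' code.

```python
import numpy as np

def cronbach_alpha(items):
    """items: 2D array-like, rows = respondents, columns = the k items
    of one domain. Returns Cronbach alpha for that domain."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the domain sum
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

As a sanity check, four perfectly correlated items yield alpha=1.0; a domain would meet the .7 threshold used in this study if the items covary strongly relative to their individual variances.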

    Exploratory Factor Analysis

    Analyses were carried out on the 12 basic domains of the web-based QoL.BD (ie, all except work and study). Factorability was confirmed using the Kaiser-Meyer-Olkin measure of sampling adequacy and the Bartlett test of sphericity. No corrections for missing data were required, as the design of the QoL Tool does not permit users to submit their data unless responses have been provided for all questions. An exploratory factor analysis (EFA) using maximum likelihood extraction was conducted in SPSS 27 (IBM, Armonk, New York). Given that the variables were assumed to be correlated, oblique (oblimin) rotation was applied. Multiple criteria were reviewed to determine the optimal number of factors to retain. First, visual inspection of the point of inflexion displayed by the scree plot was conducted [37]. Second, the Kaiser criterion was used to determine the number of factors with eigenvalues greater than 1 [38]. Third, the minimum average partial correlation (MAP) test was applied using SPSS [39] to identify the number of components that produces the minimum mean squared partial correlation [40]. Finally, the percentage of variance explained by each factor was considered: the amount of variance explained (with 5% a generally accepted cutoff) may be used as a decision rule [41,42].
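    Of the retention criteria above, the Kaiser criterion is the simplest to illustrate: compute the eigenvalues of the item correlation matrix and count those exceeding 1, with each eigenvalue's share of the total also giving the proportion of variance explained. A hedged numpy sketch (the study used SPSS; the function name is ours):

```python
import numpy as np

def kaiser_retention(data):
    """data: respondents x items matrix. Returns the number of factors with
    correlation-matrix eigenvalues > 1 (Kaiser criterion) and the proportion
    of variance accounted for by each eigenvalue, in descending order."""
    data = np.asarray(data, dtype=float)
    corr = np.corrcoef(data, rowvar=False)    # item correlation matrix
    eigvals = np.linalg.eigvalsh(corr)[::-1]  # eigenvalues, descending
    n_retained = int((eigvals > 1.0).sum())
    variance_explained = eigvals / eigvals.sum()
    return n_retained, variance_explained
```

For instance, two perfectly correlated items produce one eigenvalue of 2 and one of 0, so a single factor is retained and accounts for all the variance.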

    Interpretability of the extracted factors was evaluated by confirming primary factor loadings exceeded 0.4 and that cross-loadings were less than 0.3 [42]. Finally, the item content of each factor was evaluated to confirm whether the domains proposed by the original validation of the QoL.BD instrument were represented.
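    The loading thresholds above (primary loading above 0.4, cross-loadings below 0.3) amount to a simple screening rule over the rotated loading matrix. This helper is purely illustrative of that rule:

```python
def interpretable_loadings(loadings, primary_min=0.4, cross_max=0.3):
    """loadings: one row of factor loadings per item. Flags items whose
    largest absolute loading meets primary_min and whose remaining
    absolute loadings all fall below cross_max."""
    flags = []
    for row in loadings:
        abs_row = sorted((abs(l) for l in row), reverse=True)
        flags.append(abs_row[0] >= primary_min and
                     (len(abs_row) == 1 or abs_row[1] < cross_max))
    return flags
```

An item loading 0.7 on one factor and 0.1 elsewhere passes; an item loading 0.5 and 0.45 on two factors is flagged as cross-loading.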


    Results

    Participants

    A final sample of 498 participants (349/498, 70.1% female; 128/498, 25.7% male; 6/498, 1.2% transgender or nonbinary) with a mean age of 39.64 years (SD 12.54) was included in the analysis. In total, 36.3% (181/498) of the participants reported having a diagnosis of BD type I (BD-I), 39.2% (195/498) reported having a diagnosis of BD type II (BD-II), and 2.2% (11/498) self-identified as having a diagnosis of BD NOS. Individuals with other unspecified BD (50/498, 10.5%), unclear or pending diagnoses (31/498, 6.5%), or rapid-cycling BD (7/498, 1.5%) comprised the remainder. The majority of participants were located in North America (220/498, 44.2% Canadian; 138/498, 27.7% American), with the remainder comprising international respondents (most commonly from Australia, 28/498, 5.6%, or Germany, 16/498, 3.2%). Characteristics of this sample (QoL Tool respondents) and the original pen-and-paper QoL.BD validation sample can be found in Table 1. The two samples did not significantly differ with respect to gender composition, χ2(1, N=690)=1.6, P=.21, or age, t(720)=1.31, P=.19. The two samples did differ with respect to diagnosis, χ2(1, N=590)=53.7, P<.001, with the QoL Tool sample containing a greater proportion of individuals with BD-II than the pen-and-paper sample.
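    The between-sample comparisons above are Pearson chi-square tests on contingency tables of counts. A minimal sketch of the statistic, using made-up counts rather than the study's data:

```python
import numpy as np

def chi_square(table):
    """Pearson chi-square statistic and degrees of freedom for a
    contingency table of observed counts."""
    obs = np.asarray(table, dtype=float)
    row = obs.sum(axis=1, keepdims=True)            # row totals
    col = obs.sum(axis=0, keepdims=True)            # column totals
    expected = row @ col / obs.sum()                # expected under independence
    stat = ((obs - expected) ** 2 / expected).sum()
    df = (obs.shape[0] - 1) * (obs.shape[1] - 1)
    return stat, df
```

A 2x2 table (eg, sample membership by diagnosis) has 1 degree of freedom, matching the comparisons reported here; identical row distributions give a statistic of 0.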

    Table 1. Sample characteristics of the quality of life (QoL) Tool (n=498) and Quality of Life in Bipolar Disorder (n=224) validation sample.

    Distributions by Domain

    Means, standard deviations, and skew values for the 14 domains of the QoL Tool are presented in Table 2, along with comparison data from the first time point of the original QoL.BD validation study. The optional work and study sections were completed by 63.7% (317/498) and 30.9% (154/498) of the web-based sample and 54.0% (121/224) and 23.2% (52/224) of the pen-and-paper sample, respectively. The distribution of QoL Tool domain scores was approximately normal, with all skew values well under the recommended absolute value of 2 [43] and no evidence of floor or ceiling effects. Mean scores for the web-based sample were significantly lower across all domains except finance, relative to the pen-and-paper sample.
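    The skew screening described above can be reproduced from raw domain scores. Below is a sketch of the adjusted Fisher-Pearson skewness coefficient, the version commonly reported by statistical packages; it is an illustration, not the authors' code.

```python
import numpy as np

def sample_skew(x):
    """Adjusted Fisher-Pearson skewness coefficient (G1)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    d = x - x.mean()                               # deviations from the mean
    g1 = (d ** 3).mean() / (d ** 2).mean() ** 1.5  # biased moment estimator
    return g1 * np.sqrt(n * (n - 1)) / (n - 2)     # small-sample adjustment
```

Perfectly symmetric scores yield a skew of 0, and a domain would pass the check used here whenever the absolute value of this statistic stays below 2.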

    Table 2. Descriptive statistics for the 12 basic and 2 optional domains of the quality of life (QoL) Tool.

    Internal Consistency

    Acceptable to excellent reliability estimates were observed for all 14 QoL Tool domains (see Table 3 for Cronbach alpha values for the QoL Tool and comparison data from the first time point of the original QoL.BD validation study). Reliability estimates for the QoL Tool were comparable with those reported for the QoL.BD instrument across all domains.

    Table 3. Internal consistency estimates for the quality of life (QoL) Tool and Quality of Life in Bipolar Disorder.

    Exploratory Factor Analysis

    The suitability of the data for factor analysis was confirmed, with a "very good" sample size [44] and appropriate factorability. The Bartlett test of sphericity was significant, χ2(1128)=15,621.64, P<.001, and the Kaiser-Meyer-Olkin measure of sampling adequacy was .93, exceeding the recommended minimum value of .60 [35].

    Visual inspection of the scree plot, following the Williams guidelines [37], suggested an 11-factor structure. The Kaiser criterion also suggested an 11-factor structure, which accounted for 70.48% of the variance. The MAP test identified that the extraction of 12 components was required to produce the minimum mean squared partial correlation. Weighing these findings together, both an 11-factor structure and a 12-factor structure were considered. Because retention of the 12th factor explained less than 2% of additional variance (below the 5% cutoff used for factor retention [41]), a final 11-factor structure was retained.

    The interpretability of the 11 extracted factors was supported, with the majority of factors (n=9) having at least four items with factor loadings above 0.4. Only one item (“I have felt emotionally balanced”) was observed to have a cross-loading above 0.3 (primary loading mood, secondary loading cognition). The pattern matrix with oblique rotation and factor loadings above 0.3 is shown in Multimedia Appendix 1. The item content of the extracted factors largely aligned with the factor structure suggested in the original validation study, as well as the conceptual labeling of the domains, with the exception of items belonging to the sleep and physical domains. The rotated factor structure suggested that a single latent factor best explained the variance of items belonging to these domains. Furthermore, two items from the original physical domain (“I have had the right amount of exercise for me” and “I have been content with my sex life”) did not demonstrate significant loadings on any extracted factor.


    Discussion

    Principal Findings

    This study describes the psychometric properties of a web-based QoL questionnaire for individuals with BD (the QoL Tool) adapted from a pen-and-paper measure (QoL.BD [7]). Distributions of the core 12 QoL Tool domains were comparable with those found using the QoL.BD instrument in the original validation sample (although QoL Tool respondents reported lower QoL across the majority of domains), standards for appropriate internal consistency were met, and EFA suggested a factor structure that is adequately concordant with the full pen-and-paper version.

    EFA of the core item set of the QoL Tool suggested an 11-factor latent structure, accounting for a similar proportion of variance (70.48%) to the factor structure identified in the original QoL.BD validation study (12-factor structure accounting for 71% of variance). Furthermore, the same items (with the exception of certain sleep and physical items) were observed to load on the domains identified by the original validation study and conceptual structure of the QoL.BD instrument, suggesting that the same constructs are being measured by the web-based and paper-based versions of this instrument [35].

    A range of data suggests that, not surprisingly, directly copying a paper questionnaire into a web page results in negligible change to its psychometric properties. A systematic review conducted by Alfonsson et al [27] on the adaptation of 40 symptom scales into digital format indicated that most web-based instruments appear reliable across administration formats. More specifically, van Ballegooijen et al [28] conducted a focused review of the psychometric data of digitized paper questionnaires measuring symptoms of mood and anxiety disorders, demonstrating adequate psychometric properties of the tools in their web-based formats. Despite a growing body of evidence comparing the psychometric properties of web-based adaptations of questionnaires, three notable limitations of this body of work exist. First, studies have typically examined only the interformat reliability (web-based vs pen-and-paper) of psychosocial instruments. Second, studies that have validated web-based mental health measures have typically used general population samples rather than testing the instrument in a clinical population. Third, few studies have reported the type and extent of modifications made in adapting pen-and-paper questionnaires to a web-based format [27], limiting the ability to make inferences about the effect of delivery mode on degree of similarity. Consequently, this study contributes some initial evidence (couched in the limitations discussed below) supporting that factor structure may be largely preserved in web-based adaptations of patient-reported outcomes involving minor to moderate modifications (as defined by ISPOR recommendations [25]) when tested in a BD sample.

    Although the EFA results support that, overall, the QoL Tool and QoL.BD have concordant factor structures, one point of divergence warrants further discussion. In the QoL Tool sample, EFA results suggest one latent factor may best account for items from both the sleep and physical QoL domains, whereas the original validation study of the QoL.BD instrument supported two distinct factors. This design does not allow us to unpack the determinant of this difference. We can speculate that, potentially, the minor-to-moderate modifications in the web-based delivery of the QoL Tool (changes to the wording of one item, a graphical representation of results, and the ability for users to see the domain names) may have contributed to this small divergence in factor structure. A second candidate explanation that must be considered is differing sample compositions [45]: in this study, similar proportions of individuals reporting BD-I and BD-II diagnoses participated, whereas the original validation study predominantly consisted of individuals with BD-I (see Table 1). Given that the prevalence of various physical health comorbidities [46,47] and the experience of sleep disturbances [48] may vary according to BD subtype, heterogeneity between samples may underpin the differences in factor structure observed over the sleep and physical items vis-à-vis the original validation study.

    There are no clear-cut guidelines regarding the optimal way to respond when faced with points of divergence in the psychometric properties of web-based and pen-and-paper instruments; developers must consider the impact on psychometric properties in light of the numerous advantages of web-based adaptations (discussed above) as well as supporting interpretation of results by preserving the surface structure and face validity of the questionnaire. ISPOR recommendations highlight that fidelity to the original pen-and-paper instrument must be balanced against the potential to improve functionality and performance in web-based adaptations [25], and as such, concrete, universal recommendations about standards of evidence and quantitative cutoffs for acceptable psychometric properties cannot be made. In fact, the meaning of divergence between pen-and-paper instruments and web-based adaptations is not clear-cut and may instead reflect improved data quality and user-friendliness on the part of the web-based instrument, as the potential for social desirability effects or missing data to bias findings is ameliorated. Furthermore, psychometric findings about factor structure are only one piece of evidence that should drive decisions about the surface structure of an instrument. In the case of this study, although it is perhaps unsurprising that a latent factor may underpin items assessing both sleep and physical health, given that some of the physical health items (eg, “I have had plenty of energy”) are likely to be impacted by achieving adequate sleep, there is also evidence to suggest the face validity of distinct domains. Assessing these domains separately is key for the instrument to have clinical and research utility, given that sleep changes in BD are one of the most prominent prognostic indicators of mood destabilization [49], and sleep difficulties require different self-management and clinical interventions relative to physical health comorbidities [50]. In light of this and given strong conceptual arguments for separate sleep and physical domains, we suggest these items continue to be scored according to the original QoL.BD.

    Limitations

    A number of limitations to this study should be noted. First, participants self-reported their diagnosis of BD; although the confirmation of diagnosis by structured psychiatric interview would have been preferable, there is some evidence that people who self-identify as living with BD typically do meet diagnostic criteria [51]. Furthermore, as the sample was self-selected, higher levels of digital literacy may have been present: qualitative interviews with a small subsample of participants who were given the opportunity to use a web-based BD self-management intervention suggest that some participants struggled to access that website because of technological barriers [34]. Care must be taken to evaluate the feasibility and psychometric properties of the QoL Tool in samples with lower levels of digital literacy or those facing a digital divide.

    Finally, the web-based and pen-and-paper versions of the QoL.BD instrument were not directly compared in the same sample. Therefore, we are unable to determine whether factor equivalence was impacted by differing demographic compositions, rather than modifications to the delivery of the instrument itself. Furthermore, we were not able to assess concordance with the pen-and-paper QoL.BD in the form of intraclass correlation coefficients. However, it has been noted that it is typically neither feasible nor warranted to assess test-retest reliability across instrument modes [25] and, indeed, this may introduce confounding learning effects.

    Implications and Future Directions

    This study provides evidence for concordance between the web-based and paper-based versions of the widely adopted QoL.BD questionnaire, supporting confidence in the use of the QoL Tool in research or clinical assessments. Furthermore, there is now also qualitative evidence to suggest that the QoL Tool can be integrated positively into the self-management practices of individuals with BD; respondents described the breadth of areas assessed as enabling them to identify areas of strength as well as areas in need of improvement [52]. This emerging body of evidence for the utility of the QoL Tool, from both a psychometric and a subjective perspective, suggests that further development and dissemination of this measure is warranted.

    One avenue of expansion is the translation of the QoL Tool from a web-based interface to a mobile phone app. The project to develop the QoL Tool was initiated in 2014; significant advances have occurred in the digital mental health landscape since then, and emerging technologies can now be leveraged to enhance the delivery and functionality of this instrument. People with BD have shown interest in digitally supported self-management [32,53], and self-monitoring apps have been found to be feasible and acceptable in this population [54-56]. However, current apps do not adequately meet consumer needs to track a broad spectrum of outcomes [32], with most apps developed to assess domains of well-being, such as sleep or mood, in isolation. Individuals with BD have described resorting to elaborate, self-generated systems to track multiple indicators [57], and between-app integration is a requested feature of apps for BD [32]. The range of wellness outcomes assessed by the QoL Tool means that its adaptation into an app format is likely to meet this consumer need.

    Moving forward, CREST.BD has now initiated a 3-year project to incorporate the evidence and tools held in the Bipolar Wellness Centre [58] and the QoL Tool [19] into a new mobile health app—“Bipolar Bridges.” The project aims to address some of the limitations of existing BD apps by using CBPR approaches to co-design an app that synergistically combines different forms of digital health data (including QoL Tool results), enabling individuals to learn what self-management strategies are most effective for optimizing their QoL.

    Conclusions

    This study provides initial support for the psychometric validity of the QoL Tool, a web-based adaptation of an instrument to measure QoL in BD (the QoL.BD instrument). Specifically, internal consistency estimates, distribution of scores, and factor structure were largely consistent with the pen-and-paper version. Although evidence supporting the overall equivalence of these instruments was observed, the findings of this study suggested a latent structure of 11 domains compared with the 12 basic domains of the original instrument. This 11-factor structure combined items from the sleep and physical domains into a single shared factor. Two explanations for this minor divergence must be considered: changes to the user experience in the web-based interface and differences in sample composition. As the design of this study does not permit separating the influence of these two potential explanations, further research is required. However, in light of the face validity and clinical utility of a distinct sleep domain in QoL in BD, we recommend that these items continue to be treated according to the structure of the original QoL.BD. Given the increasing role of web-based self-report questionnaires in research, clinical, and self-management contexts, findings of overall psychometric equivalence between these QoL instruments validate current applications of the QoL Tool and encourage further efforts to optimize its web-based delivery and associated self-management strategies via a novel mobile phone app.

    Acknowledgments

    This study was supported by a Canadian Institutes of Health Research Knowledge to Action grant, grant #201210KAL-259464. The authors would like to thank the research participants who were involved in this project, the expertise of the CREST.BD Community Advisory Group, and CREST.BD network members.

    Conflicts of Interest

    None declared.

    Multimedia Appendix 1

    Primary factor loadings of the quality of life tool based on an exploratory factor analysis with maximum likelihood extraction and oblique rotation.

    DOCX File , 16 KB

    References

    1. Morton E, Michalak EE, Murray G. What does quality of life refer to in bipolar disorders research? A systematic review of the construct's definition, usage and measurement. J Affect Disord 2017 Apr 1;212:128-137. [CrossRef] [Medline]
    2. Murray G, Michalak EE. The quality of life construct in bipolar disorder research and practice: past, present, and possible futures. Bipolar Disord 2012 Dec;14(8):793-796. [CrossRef] [Medline]
    3. Basu D. Quality-of-life issues in mental health care: past, present, and future. German J Psychiatry 2004;7(3):35-43 [FREE Full text]
    4. Maczka G, Siwek M, Skalski M, Dudek D. [Patients' and doctors' attitudes towards bipolar disorder--do we share our beliefs?]. Psychiatr Pol 2009;43(3):301-312. [Medline]
    5. Joyce CR, O'Boyle CA, McGee H. Introduction: The individual perspective. In: Joyce CR, O'Boyle CA, McGee H, editors. Individual Quality of Life: Approaches to Conceptualisation and Assessment. New York: Routledge; 1999:3-8.
    6. Bowling A. What things are important in people's lives? A survey of the public's judgements to inform scales of health related quality of life. Soc Sci Med 1995 Nov;41(10):1447-1462. [CrossRef] [Medline]
    7. Michalak EE, Murray G, Collaborative RESearch Team to Study Psychosocial Issues in Bipolar Disorder (CREST.BD). Development of the QoL.BD: a disorder-specific scale to assess quality of life in bipolar disorder. Bipolar Disord 2010 Nov;12(7):727-740. [CrossRef] [Medline]
    8. Modabbernia A, Yaghoubidoust M, Lin C, Fridlund B, Michalak EE, Murray G, et al. Quality of life in Iranian patients with bipolar disorder: a psychometric study of the Persian Brief Quality of Life in Bipolar Disorder (QoL.BD). Qual Life Res 2016 Jul;25(7):1835-1844. [CrossRef] [Medline]
    9. Xiao L, Gao Y, Zhang L, Chen P, Sun X, Tang S. Validity and reliability of the Brief version of Quality of Life in Bipolar Disorder (Bref QoL.BD) among Chinese bipolar patients. J Affect Disord 2016 Mar 15;193:66-72. [CrossRef] [Medline]
    10. Morgado C, Tapia T, Ivanovic-Zuvic F, Antivilo A. [Assessment of the quality of life of Chilean bipolar patients: psychometric properties and diagnostic utility of the Chilean version of the Quality of Life in Bipolar Disorder questionnaire (QoL.BD-CL)]. Rev Med Chile 2015 Feb;143(2):213-222. [CrossRef]
    11. CREST.BD. 2016. Quality of Life: Researching and Exchanging Knowledge on Quality of Life in Bipolar Disorder   URL: http://www.crestbd.ca/research/research-areas/quality-of-life/ [accessed 2019-11-27]
    12. Jones SH, Smith G, Mulligan LD, Lobban F, Law H, Dunn G, et al. Recovery-focused cognitive-behavioural therapy for recent-onset bipolar disorder: randomised controlled pilot trial. Br J Psychiatry 2015 Jan;206(1):58-66. [CrossRef] [Medline]
    13. Jones SH, Knowles D, Tyler E, Holland F, Peters S, Lobban F, et al. The feasibility and acceptability of a novel anxiety in bipolar disorder intervention compared to treatment as usual: A randomized controlled trial. Depress Anxiety 2018 Oct;35(10):953-965. [CrossRef] [Medline]
    14. Beck AK, Baker A, Jones S, Lobban F, Kay-Lambkin F, Attia J, et al. Exploring the feasibility and acceptability of a recovery-focused group therapy intervention for adults with bipolar disorder: trial protocol. BMJ Open 2018 Jan 31;8(1):e019203 [FREE Full text] [CrossRef] [Medline]
    15. Tyler E, Lobban F, Sutton C, Depp C, Johnson S, Laidlaw K, et al. Feasibility randomised controlled trial of Recovery-focused Cognitive Behavioural Therapy for Older Adults with bipolar disorder (RfCBT-OA): study protocol. BMJ Open 2016 Mar 3;6(3):e010590 [FREE Full text] [CrossRef] [Medline]
    16. Fletcher K, Foley F, Thomas N, Michalak E, Berk L, Berk M, et al. Web-based intervention to improve quality of life in late stage bipolar disorder (ORBIT): randomised controlled trial protocol. BMC Psychiatry 2018 Jul 13;18(1):221 [FREE Full text] [CrossRef] [Medline]
    17. Calabrese JR, Sanchez R, Jin N, Amatniek J, Cox K, Johnson B, et al. Symptoms and functioning with aripiprazole once-monthly injection as maintenance treatment for bipolar I disorder. J Affect Disord 2018 Feb;227:649-656. [CrossRef] [Medline]
    18. Yatham LN, Mackala S, Basivireddy J, Ahn S, Walji N, Hu C, et al. Lurasidone versus treatment as usual for cognitive impairment in euthymic patients with bipolar I disorder: a randomised, open-label, pilot study. Lancet Psychiatry 2017 Mar;4(3):208-217. [CrossRef] [Medline]
    19. Michalak E, Murray G. CREST.BD Quality of Life Tool. 2015.   URL: https://www.bdqol.com/ [accessed 2019-11-27]
    20. Kongsved SM, Basnov M, Holm-Christensen K, Hjollund NH. Response rate and completeness of questionnaires: a randomized study of Internet versus paper-and-pencil versions. J Med Internet Res 2007 Sep 30;9(3):e25 [FREE Full text] [CrossRef] [Medline]
    21. van Gelder MM, Bretveld RW, Roeleveld N. Web-based questionnaires: the future in epidemiology? Am J Epidemiol 2010 Dec 1;172(11):1292-1298. [CrossRef] [Medline]
    22. Miller ET, Neal DJ, Roberts LJ, Baer JS, Cressler SO, Metrik J, et al. Test-retest reliability of alcohol measures: is there a difference between internet-based assessment and traditional methods? Psychol Addict Behav 2002 Mar;16(1):56-63. [Medline]
    23. Wijndaele K, Matton L, Duvigneaud N, Lefevre J, Duquet W, Thomis M, et al. Reliability, equivalence and respondent preference of computerized versus paper-and-pencil mental health questionnaires. Comput Hum Behav 2007;23(4):1958-1970 [FREE Full text] [CrossRef]
    24. Richman WL, Kiesler S, Weisband S, Drasgow F. A meta-analytic study of social desirability distortion in computer-administered questionnaires, traditional questionnaires, and interviews. J Appl Psychol 1999;84(5):754-775. [CrossRef]
    25. Coons SJ, Gwaltney CJ, Hays RD, Lundy JJ, Sloan JA, Revicki DA, ISPOR ePRO Task Force. Recommendations on evidence needed to support measurement equivalence between electronic and paper-based patient-reported outcome (PRO) measures: ISPOR ePRO Good Research Practices Task Force report. Value Health 2009 Jun;12(4):419-429 [FREE Full text] [CrossRef] [Medline]
    26. Rothman M, Burke L, Erickson P, Leidy NK, Patrick DL, Petrie CD. Use of existing patient-reported outcome (PRO) instruments and their modification: the ISPOR Good Research Practices for Evaluating and Documenting Content Validity for the Use of Existing Instruments and Their Modification PRO Task Force Report. Value Health 2009;12(8):1075-1083 [FREE Full text] [CrossRef] [Medline]
    27. Alfonsson S, Maathz P, Hursti T. Interformat reliability of digital psychiatric self-report questionnaires: a systematic review. J Med Internet Res 2014 Dec 3;16(12):e268 [FREE Full text] [CrossRef] [Medline]
    28. van Ballegooijen W, Riper H, Cuijpers P, van Oppen P, Smit JH. Validation of online psychometric instruments for common mental health disorders: a systematic review. BMC Psychiatry 2016 Feb 25;16:45 [FREE Full text] [CrossRef] [Medline]
    29. Satcher D, Israel BA, Eng EE, Schulz AJ, Parker EA. Methods in Community-Based Participatory Research for Health. San Francisco, CA: Jossey-Bass; 2005.
    30. Michalak E, Lane K, Hole R, Barnes S, Khatri N, Lapsley S, et al. Towards a better future for Canadians with bipolar disorder: principles and implementation of a community-based participatory research model. Engaged Scholar J 2015;1(1):132-147 [FREE Full text] [CrossRef]
    31. Woods CM. Careless responding to reverse-worded items: implications for confirmatory factor analysis. J Psychopathol Behav Assess 2006;28(3):186-191. [CrossRef]
    32. Nicholas J, Fogarty AS, Boydell K, Christensen H. The reviews are in: a qualitative content analysis of consumer perspectives on apps for bipolar disorder. J Med Internet Res 2017 Apr 7;19(4):e105 [FREE Full text] [CrossRef] [Medline]
    33. Daus H, Kislicyn N, Heuer S, Backenstrass M. Disease management apps and technical assistance systems for bipolar disorder: investigating the patients' point of view. J Affect Disord 2018 Mar 15;229:351-357. [CrossRef] [Medline]
    34. Michalak EE, Morton E, Barnes SJ, Hole R, CREST.BD, Murray G. Supporting self-management in bipolar disorder: mixed-methods knowledge translation study. JMIR Ment Health 2019 Apr 15;6(4):e13493 [FREE Full text] [CrossRef] [Medline]
    35. Tabachnick BG, Fidell LS. Using Multivariate Statistics. Boston, MA: Pearson; 2007.
    36. Nunnally JC, Bernstein IH. Psychometric Theory. New York: McGraw-Hill; 1994.
    37. Williams B, Onsman A, Brown T. Exploratory factor analysis: A five-step guide for novices. Australas J Paramedicine 2010;8(3):1-13. [CrossRef]
    38. Kaiser HF. The application of electronic computers to factor analysis. Educ Psychol Meas 1960;20(1):141-151. [CrossRef]
    39. O'Connor BP. SPSS and SAS programs for determining the number of components using parallel analysis and velicer's MAP test. Behav Res Methods Instrum Comput 2000 Aug;32(3):396-402. [CrossRef] [Medline]
    40. Velicer WF. Determining the number of components from the matrix of partial correlations. Psychometrika 1976;41(3):321-327. [CrossRef]
    41. Pett MA, Lackey NR, Sullivan JJ. Making Sense of Factor Analysis: The Use of Factor Analysis for Instrument Development in Health Care Research. Thousand Oaks, CA: Sage; 2003.
    42. Howard MC. A review of exploratory factor analysis decisions and overview of current practices: what we are doing and how can we improve? Int J Hum-Comput Interact 2016;32(1):51-62. [CrossRef]
    43. Kim HY. Statistical notes for clinical researchers: assessing normal distribution (2) using skewness and kurtosis. Restor Dent Endod 2013 Feb;38(1):52-54 [FREE Full text] [CrossRef] [Medline]
    44. Comrey AL, Lee HB. A First Course in Factor Analysis. Second Edition. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc; 1992.
    45. Herrero J, Meneses J. Short Web-based versions of the perceived stress (PSS) and Center for Epidemiological Studies-Depression (CESD) Scales: a comparison to pencil and paper responses among Internet users. Comput Hum Behav 2006;22(5):830-846 [FREE Full text] [CrossRef]
    46. Kemp DE, Sylvia LG, Calabrese JR, Nierenberg AA, Thase ME, Reilly-Harrington NA, LiTMUS Study Group. General medical burden in bipolar disorder: findings from the LiTMUS comparative effectiveness trial. Acta Psychiatr Scand 2014 Jan;129(1):24-34 [FREE Full text] [CrossRef] [Medline]
    47. Forty L, Ulanova A, Jones L, Jones I, Gordon-Smith K, Fraser C, et al. Comorbid medical illness in bipolar disorder. Br J Psychiatry 2014 Dec;205(6):465-472 [FREE Full text] [CrossRef] [Medline]
    48. Lewis KS, Gordon-Smith K, Forty L, Di Florio A, Craddock N, Jones L, et al. Sleep loss as a trigger of mood episodes in bipolar disorder: individual differences based on diagnostic subtype and gender. Br J Psychiatry 2017 Sep;211(3):169-174 [FREE Full text] [CrossRef] [Medline]
    49. Morton E, Murray G. An update on sleep in bipolar disorders: presentation, comorbidities, temporal relationships and treatment. Curr Opin Psychol 2019 Aug 26;34:1-6. [CrossRef] [Medline]
    50. Morton E, Murray G. Assessment and treatment of sleep problems in bipolar disorder-A guide for psychologists and clinically focused review. Clin Psychol Psychother 2020 Feb 18 [Epub ahead of print]. [CrossRef] [Medline]
    51. Kupfer DJ, Frank E, Grochocinski VJ, Cluss PA, Houck PR, Stapf DA. Demographic and clinical characteristics of individuals in a bipolar disorder case registry. J Clin Psychiatry 2002 Feb;63(2):120-125. [CrossRef] [Medline]
    52. Morton E, Hole R, Murray G, Buzwell S, Michalak E. Experiences of a web-based quality of life self-monitoring tool for individuals with bipolar disorder: a qualitative exploration. JMIR Ment Health 2019 Dec 4;6(12):e16121 [FREE Full text] [CrossRef] [Medline]
    53. Todd NJ, Jones SH, Lobban FA. What do service users with bipolar disorder want from a web-based self-management intervention? A qualitative focus group study. Clin Psychol Psychother 2013;20(6):531-543. [CrossRef] [Medline]
    54. Faurholt-Jepsen M, Frost M, Ritz C, Christensen EM, Jacoby AS, Mikkelsen RL, et al. Daily electronic self-monitoring in bipolar disorder using smartphones - the MONARCA I trial: a randomized, placebo-controlled, single-blind, parallel group trial. Psychol Med 2015 Oct;45(13):2691-2704. [CrossRef] [Medline]
    55. Hidalgo-Mazzei D, Mateu A, Reinares M, Murru A, del Mar Bonnín C, Varo C, et al. Psychoeducation in bipolar disorder with a SIMPLe smartphone application: feasibility, acceptability and satisfaction. J Affect Disord 2016 Aug;200:58-66. [CrossRef] [Medline]
    56. Schwartz S, Schultz S, Reider A, Saunders EF. Daily mood monitoring of symptoms using smartphones in bipolar disorder: a pilot study assessing the feasibility of ecological momentary assessment. J Affect Disord 2016 Feb;191:88-93 [FREE Full text] [CrossRef] [Medline]
    57. Murnane EL, Cosley D, Chang P, Guha S, Frank E, Gay G, et al. Self-monitoring practices, attitudes, and needs of individuals with bipolar disorder: implications for the design of technologies to manage mental health. J Am Med Inform Assoc 2016 May;23(3):477-484. [CrossRef] [Medline]
    58. CREST.BD. 2015. Bipolar Wellness Centre   URL: http://www.crestbd.ca/tools/bipolar-wellness-centre/ [accessed 2019-11-27]


    Abbreviations

    BD: bipolar disorder
    BD-I: bipolar disorder type 1
    BD-II: bipolar disorder type 2
    CBPR: community-based participatory research
    CREST.BD: The Collaborative RESearch Team to study psychosocial issues in Bipolar Disorder
    EFA: exploratory factor analysis
    ISPOR: Professional Society for Health Economics and Outcomes Research
    KT: knowledge translation
    MAP: minimum average partial correlation
    NOS: not otherwise specified
    QoL.BD: Quality of Life in Bipolar Disorder
    QoL: quality of life


    Edited by J Torous; submitted 17.12.19; peer-reviewed by B Johnson; comments to author 03.02.20; accepted 09.02.20; published 27.04.20

    ©Emma Morton, Sharon HJ Hou, Oonagh Fogarty, Greg Murray, Steven Barnes, Colin Depp, CREST.BD, Erin Michalak. Originally published in JMIR Mental Health (http://mental.jmir.org), 27.04.2020.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on http://mental.jmir.org/, as well as this copyright and license information must be included.