
Developing a reporting guideline for social and psychological intervention trials

Published online by Cambridge University Press:  02 January 2018

Evan Mayo-Wilson
Affiliation:
Centre for Outcomes Research and Effectiveness, Research Department of Clinical, Educational & Health Psychology, University College London, London, UK
Paul Montgomery
Affiliation:
Centre for Evidence-Based Intervention, University of Oxford, Oxford, UK
Sally Hopewell
Affiliation:
Centre for Statistics in Medicine, University of Oxford, Oxford, UK
Geraldine Macdonald
Affiliation:
Institute of Child Care Research, Queen's University Belfast, Belfast, UK
David Moher
Affiliation:
Clinical Epidemiology Program, Ottawa Hospital Research Institute, Centre for Practice-Changing Research (CPCR), The Ottawa Hospital, Ottawa, Canada
Sean Grant
Affiliation:
Centre for Evidence-Based Intervention, University of Oxford, Oxford, UK

Summary

Social and psychological interventions are often complex. Understanding randomised controlled trials (RCTs) of these complex interventions requires a detailed description of the interventions tested and the methods used to evaluate them. However, RCT reports often omit, or inadequately report, this information. Incomplete and inaccurate reporting hinders the optimal use of research, wastes resources and fails to meet ethical obligations to research participants and consumers. In this paper, we explain how reporting guidelines have improved the quality of reports in medicine, and describe the ongoing development of a new reporting guideline for RCTs: CONSORT-SPI (an extension for social and psychological interventions). We invite readers to participate in the project by visiting our website, in order to help us reach the best-informed consensus on these guidelines (http://tinyurl.com/CONSORT-study).

Type
Special article
Copyright
Copyright © Royal College of Psychiatrists, 2013 

Social and psychological interventions aim to improve physical health, mental health and associated social outcomes. They are often complex, typically involving multiple, interacting intervention components (for example several behaviour change techniques) that may act, and target outcomes, at several levels (for example individual, family, community). 1 Moreover, these interventions may depend on the hard-to-control environments in which they are delivered (for example healthcare settings, correctional facilities). 2 The functions and processes of these interventions may be designed to accommodate particular individuals or contexts, taking on different forms while still aiming to achieve the same objective. 3

Complex interventions are common in public health, psychology, education, social work, criminology and related disciplines. For example, multisystemic therapy is an intensive intervention for juvenile offenders. 4 Based on social ecological and family systems theories, multisystemic therapy providers target a variety of individual, family, school, peer, neighbourhood and community influences on psychosocial and behavioural problems. Treatment teams of professional therapists and case-workers work with individuals, their families and their peer groups to provide tailored services. These services may be delivered in homes, social care and community settings. Other examples of social and psychological interventions may be found in reviews by the Cochrane Collaboration (for example the Developmental, Psychosocial and Learning Problems Group; the Cochrane Public Health Group) and the Campbell Collaboration.

To understand their effects and to keep services up to date, academics, policy makers, journalists, clinicians and consumers rely on research reports of intervention studies in scientific journals. Such reports should explain the methods, including the design, delivery, uptake and context of interventions, as well as the subsequent results. Accurate, complete and transparent reporting is essential for readers to make best use of new evidence, to achieve returns on research investment, to meet ethical obligations to research participants and consumers of interventions, and to minimise waste in research. However, randomised controlled trials (RCTs) are often poorly reported within and across disciplines, including criminology, 5 social work, 6 education, 7 psychology 8 and public health. 9 Biomedical researchers have developed guidelines to improve the reporting of RCTs of health-related interventions. 10 However, many social and behavioural scientists have not fully adopted these guidelines, which may not be wholly adequate for social and psychological interventions in their current form. 5,8 Because of the unique features of these interventions, updated reporting guidance is needed.

This article describes the development of a reporting guideline that aims to improve the quality of reports of RCTs of social and psychological interventions. We explain how reporting guidelines have improved the quality of reports in medicine, and why guidelines have not yet improved the quality of reports in other disciplines. We then introduce a plan to develop a new reporting guideline for RCTs - CONSORT-SPI (an extension for social and psychological interventions) - that will be written using best practices for guideline development and dissemination. 11 Wide stakeholder involvement and consensus are needed to create a useful, acceptable and evidence-based guideline, so we hope to recruit stakeholders from multiple disciplines and professions. Randomised trials are not the only rigorous method for evaluating interventions; many alternatives exist when RCTs are not possible or appropriate because of scientific, practical and ethical concerns. 12 Nonetheless, RCTs are important to policy makers, practitioners, scientists and service users, as they are generally considered the most valid and reliable research method for estimating the effectiveness of interventions. 13 Moreover, many of the issues faced in reporting RCTs also relate to other evaluation designs. As a result, this project will focus on standards for RCTs, which could then also inform the development of future guidelines for other evaluation designs.

Impact of CONSORT guidelines

Reporting guidelines list (in the form of a checklist) the minimum information required to understand the methods and results of studies. They do not prescribe research conduct, but facilitate the writing of transparent reports by authors and the appraisal of reports by research consumers. For example, the Consolidated Standards of Reporting Trials (CONSORT) 2010 Statement is an evidence-based guideline; to identify items, its developers reviewed evidence about aspects of trial design and conduct that could contribute to bias. Using consensus methods, they developed a checklist of 25 items and a flow diagram. 10 CONSORT has improved the reporting of thousands of medical trials. 14 It has been endorsed by over 600 journals and is supported by the Institute of Education Sciences. 7

The need for a new reporting guideline

Despite the impact of CONSORT guidelines in other disciplines, social and psychological interventions remain poorly reported. Information about masking, sequence generation and allocation concealment is rarely reported. 15 Few social and psychological journals ask authors to follow a reporting guideline in their Instructions to Authors. 15 Editors and researchers may consider existing guidelines insufficient for social and psychological intervention trials; although appropriate for drug trials, existing guidelines do not address the complexities inherent in the evaluation of social and psychological interventions. 15 Given that CONSORT is the most rigorously developed guideline for reporting RCTs, and that it has remained more prominent than any other such guideline for over 15 years, any further reporting guidelines for RCTs should, for greatest impact, be developed in collaboration with the CONSORT Group.

Limitations of CONSORT guidelines for social and psychological interventions

Researchers and journal editors in the social and behavioural sciences are generally aware of CONSORT but often object that it is not fully appropriate for social and psychological interventions. 5,8 As a result, uptake of CONSORT guidelines in these disciplines is low. 15 Although some criticisms result from inaccurate perceptions about common features of RCTs across disciplines, many relate to real limitations for social and psychological interventions. 16 For example, CONSORT is most relevant to RCTs in medical disciplines; it was developed by biostatisticians and medical researchers with minimal input from experts in other disciplines. Journal editors, as well as social and behavioural science researchers, believe there is a need to include appropriate stakeholders in developing a new, targeted guideline to improve uptake in their disciplines. 7 The CONSORT Group has produced extensions of the original CONSORT Statement that are relevant to social and psychological interventions, such as additional checklists for cluster, 17 non-pharmacological, 18 pragmatic 19 and quality-of-life RCTs. 20 These extensions provide important insights, but complex social and psychological interventions (which may, for example, include multiple, interacting components at several levels, with various outcomes) would require the use of several extensions at once, creating a barrier to guideline uptake; increasing intervention complexity also raises new issues that are not covered by existing guidelines. Simply disseminating CONSORT guidelines as they stand is therefore insufficient, as this would not address the need for editors and authors to 'buy in' to the process. To improve uptake in these disciplines, CONSORT guidelines need to be extended to address specifically the important features of social and psychological interventions.

Limitations of existing social and psychological reporting guidelines

Social and behavioural scientists have developed other reporting guidelines, including the Workgroup for Intervention Development and Evaluation Research (WIDER) Recommendations for behaviour change interventions, 21 the American Educational Research Association's (AERA) Standards for Reporting Research, 22 the REPOSE guidelines for primary research in education 23 and the Journal Article Reporting Standards (JARS) of the American Psychological Association (APA). 24 Although they address issues not covered by the CONSORT Statement and its extensions, these guidelines (except for JARS 24) do not provide specific guidance for RCTs. Moreover, compared with the CONSORT Statement and its official extensions, guidelines in the social and behavioural sciences have not consistently followed the optimal techniques for guideline development and dissemination recommended by international leaders in the advancement of reporting guidelines, 11 such as the use of systematic literature reviews and formal consensus methods to select reporting standards. 15 Researchers in public health, psychology, education, social work and criminology have noted that these guidelines could be more 'user-friendly' and that their dissemination could benefit from up-to-date knowledge transfer techniques. 6,7,8,21

For example, JARS - a notable and valuable guideline for empirical psychological research - is endorsed by few journals outside the APA, whereas CONSORT is endorsed by hundreds of journals internationally. According to ISI Web of Knowledge and Google Scholar citations, JARS is cited approximately a dozen times annually, whereas CONSORT guidelines are cited hundreds of times per year. Moreover, the APA commissioned a select group of APA journal editors and reviewers to develop JARS, and the group based most of its work on existing CONSORT guidelines. By comparison, official CONSORT extensions have been developed using rigorous consensus methods, have involved a range of international stakeholders in guideline development and dissemination, and have been updated to reflect the most recent scientific literature. Nonetheless, no current CONSORT guideline adequately addresses the unique features of social and psychological interventions. This new CONSORT extension will incorporate lessons from previous extensions, other reporting guidelines and the research literature to aid the critical appraisal, replication and uptake of this research (see online supplement).

Aspects of internal validity

Internal validity is the extent to which a study's results are free from bias. As with other study designs, the validity of RCTs depends on high-quality execution: poorly conducted RCTs can produce more biased results than well-conducted RCTs and well-conducted non-randomised studies. 25 For example, evidence indicates that RCTs that do not adequately conceal the randomisation sequence can exaggerate effect estimates by up to 30%, 26 and that poorly reported RCTs are associated with effect estimates exaggerated by up to 35%. 27 RCTs of social and psychological interventions are susceptible to these risks of bias as well.

Poor reporting of current CONSORT items

Some aspects of internal validity, although included in CONSORT, remain poorly reported - even in the least complex social and psychological intervention studies. Reports of RCTs should describe procedures for minimising selection bias, but they often omit information about random sequence generation and allocation concealment, 28 and psychological journals report methods of sequence generation less frequently than medical journals. 8 A review of educational reports found no studies that adequately reported allocation concealment, 7 and reports in criminology often lack information about randomisation procedures. 5 In addition, RCTs of social and psychological interventions may use non-traditional randomisation techniques, such as stepped-wedge or natural allocation, 29 which need to be thoroughly described. Reports of social and psychological intervention trials also often fail to include details about trial registration, protocols and adverse events, 28 which may include important negative consequences at individual, familial and community levels.
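
By way of illustration, the minimal sketch below (in Python, with hypothetical parameters) generates a permuted-block allocation sequence; a transparent report would describe exactly these choices - the method of sequence generation, the block size, who produced the list and how upcoming assignments were concealed from recruiters.

import random

def permuted_block_sequence(n_participants, block_size=4,
                            arms=("intervention", "control"), seed=None):
    """Generate a permuted-block allocation sequence.

    Reporting items on sequence generation and allocation concealment ask
    authors to describe choices like these: block size, how the random
    numbers were produced, who generated the list and who could see it.
    """
    rng = random.Random(seed)  # hypothetical: seeded here for illustration only
    per_arm = block_size // len(arms)
    sequence = []
    while len(sequence) < n_participants:
        block = list(arms) * per_arm  # equal numbers of each arm per block
        rng.shuffle(block)            # random order within the block
        sequence.extend(block)
    return sequence[:n_participants]

# In practice the list would be produced and held by someone independent of
# recruitment, so that recruiters cannot foresee the next assignment.
print(permuted_block_sequence(10, seed=2013))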

Possible amendments to CONSORT items

Other aspects of CONSORT may require greater emphasis or modification for RCTs of social and psychological interventions. In developing this CONSORT extension, we expect to identify new items and to adapt existing items relating to internal validity. These may include items discussed during the development of previous CONSORT extensions or other guidelines, as well as items suggested by participants in this project. For example, it may not be possible to mask participants and providers of interventions; masking of outcome assessors is often possible yet rarely reported, and few studies explain whether masking was maintained or how a lack of masking was handled. 28 In social and psychological intervention studies, outcome measures are often subjective, variables may relate to latent constructs, and information may come from multiple sources (for example participants, providers). Although subjective outcome measures are an issue in other areas of research, their influence on RCT results has long been highlighted in social and psychological intervention research, given their prevalence in these disciplines. 30 Descriptions of the validity, reliability and psychometric properties of such measures are therefore particularly useful for social and psychological intervention trials, especially when the measures are not widely available or discussed in the research literature. 17 Moreover, multiple measures may be analysed in several ways, so authors need to report transparently which procedures were performed and to explain their rationale.
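
As one illustration of such psychometric detail, the minimal sketch below (in Python, using made-up ratings) computes Cronbach's alpha, a common index of the internal consistency of a multi-item subjective outcome measure; the data and scale are hypothetical.

import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a participants x items matrix of scores."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_var = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of participants' total scores
    return (k / (k - 1)) * (1 - item_var.sum() / total_var)

# Hypothetical ratings from five participants on a four-item subjective scale.
scores = [[3, 4, 3, 4],
          [2, 2, 3, 2],
          [4, 4, 4, 5],
          [1, 2, 1, 2],
          [3, 3, 4, 3]]
print(round(cronbach_alpha(scores), 2))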

Aspects of external validity

External validity is the extent to which a study's results are applicable in other settings or populations. Because RCTs are primarily designed to increase the internal validity of study findings, the CONSORT Statement currently gives relatively little attention to external validity. Although high internal validity is an important precondition for any discussion of an RCT's external validity, updating the CONSORT Statement to include more information about external validity is critical for the relevance and uptake of a CONSORT extension for social and psychological interventions. These interventions may be influenced by context, as different underlying social, institutional, psychological and physical structures may yield different causal and probabilistic relations between interventions and observed outcomes. Contextual information is necessary to compare the effectiveness of an intervention across time and place. A lack of information relevant to external validity may prevent practitioners or policy makers from using evidence appropriately to inform decision-making, yet existing guidelines do not adequately explain how authors should describe (a) how interventions work, (b) for whom, and (c) under what conditions. 31

Details of the intervention and comparator

It is useful for authors to explain the key components of interventions, how those components could be delivered, and how they relate to the outcomes selected. At present, authors can follow current standards for reporting interventions without providing adequate details about complex interventions. 32 Many reports neither contain sufficient information about the interventions tested nor reference treatment manuals. 33 Providing logic models - as described in the Medical Research Council Framework for Complex Interventions 34 - or presenting theories of change can help elucidate links in causal chains that can be tested, identify important mediators and moderators, and facilitate syntheses in reviews. Moreover, interventions are rarely implemented exactly as designed, and complex interventions may be designed to be implemented with some flexibility in order to accommodate differences across participants, 3 so it is important to report how interventions were actually delivered by providers and actually received by participants. Particularly for social and psychological interventions, the integrity with which the intended functions and processes of the intervention were implemented is essential to understand. 3 As RCTs of a particular intervention can yield different relative effects depending on the nature of the control groups, information about delivery and uptake should be provided for all trial arms.

Participant characteristics

Reports should describe recruitment processes and the representativeness of samples. Participants in RCTs of social and psychological interventions are often recruited outside routine practice settings, via processes that differ from routine services. 22 An intervention that works for one group of people may not work for people living in different cultures or physical spaces, or for people with slightly different problems and comorbidities. Enrolling in an RCT can be a complex process that affects the measured and unmeasured characteristics of participants, and recruitment may differ from how users normally access interventions. Well-described RCT reports will include the characteristics of all participants (volunteers, those who enrolled and those who completed) in sufficient detail for readers to assess how comparable the study sample is to the populations seen in everyday services. 22,24

Contextual influences

Given that these interventions often occur in social environments, reports should describe factors in the RCT context that are believed to support, attenuate or frustrate the observed effects. 35 Intervention effects may differ across groups in different social or socioeconomic positions, and equity considerations should be addressed explicitly. 36 Several aspects of setting and implementation may be important to consider, such as administrative support, staff training and supervision, organisational resources, the wider service system and concurrent political or social events. 32 Reporting process evaluations may help readers understand mechanisms and outcomes.

Developing a new CONSORT extension

This new reporting guideline for RCTs of social and psychological interventions will be an official extension of the CONSORT Statement. Optimally, it will help improve the reporting of these studies. Like other official CONSORT extensions, 17-19,37 this guideline will be integrated with the CONSORT Statement and previous extensions, and updates of the CONSORT Statement may incorporate references to this extension.

Guideline developers

The project is being led by an international collaboration of researchers, methodologists, guideline developers, funders, service providers, journal editors and consumer advocacy groups. We will recruit participants in a manner similar to that of other reporting guideline initiatives - identifying stakeholders through literature reviews, the project's International Advisory Group and stakeholder-initiated interest in the project. 10 We hope to recruit stakeholders with expertise from all related disciplines and regions of the world, including low- and middle-income countries. Methodologists will identify items that relate to known sources of bias, as well as items that facilitate systematic reviews and research synthesis. Funders will consider how the guideline can aid the assessment of grant applications for RCTs and methodological innovations in intervention evaluation. Practitioners will identify information that can aid decision-making. Journal editors will identify practical steps to implement the guideline and to ensure its uptake.

Consensus methods

We will use formal consensus techniques to reduce bias in group decision-making and to promote widespread guideline uptake and knowledge translation activities on project completion. 15 As indicated by previous research and the development of existing CONSORT guidelines, these methods are the most appropriate and successful way to synthesise expertise and research evidence for our purposes, and they are beneficial to use in combination. 11 Following a rigorous review of existing guidelines and of current reporting quality, 15 we will conduct a Delphi process to identify a prioritised list of reporting items to consider for the extension. That is, we will invite a group of experts to answer questions about reporting items and to suggest further questions; we will then circulate their feedback to the group and ask a second round of questions. The Delphi process will capture a variety of international perspectives and allow participants to share their views anonymously. Following the Delphi process, we will host a consensus meeting to review the findings and to generate a list of minimal reporting standards, mirroring the development of previous CONSORT guidelines. 10,18,19
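
As a rough illustration of how ratings from such a round might be summarised, the minimal sketch below (in Python) produces a prioritised list of candidate reporting items; the 1-9 scale, the threshold for 'essential' and the items themselves are hypothetical rather than the project's actual criteria.

from statistics import median

def prioritise_items(ratings, essential=7):
    """Summarise one Delphi round: the median rating and the proportion of
    panellists scoring each candidate reporting item as essential (here,
    >= 7 on a hypothetical 1-9 scale), sorted with the most strongly
    supported items first."""
    summary = []
    for item, scores in ratings.items():
        agreement = sum(s >= essential for s in scores) / len(scores)
        summary.append((item, median(scores), agreement))
    return sorted(summary, key=lambda row: (row[2], row[1]), reverse=True)

# Hypothetical panel ratings for three candidate items (1-9 scale).
round_one = {
    "Describe intervention delivery in all trial arms": [9, 8, 7, 9, 8],
    "Report masking of outcome assessors": [7, 8, 6, 9, 7],
    "State availability of a treatment manual": [5, 6, 7, 4, 6],
}
for item, med, agreement in prioritise_items(round_one):
    print(f"{item}: median={med}, agreement={agreement:.0%}")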

Project outputs

Together, participants in this process will create a guideline consisting of a checklist of reporting items and a flow chart for reporting social and psychological intervention RCTs. In addition, we will develop an explanation and elaboration document to explain the scientific rationale for each recommendation and to provide examples of clear reporting; a similar document was developed by the CONSORT Group to help disseminate a better understanding of each included checklist item. 38 This document will help persuade editors, authors and funders of the importance of the guideline. It will also be a useful pedagogical tool, helping students and researchers understand the methods for conducting RCTs of social and psychological interventions, and it will help authors meet the guideline's requirements. 11

Stakeholder participation and uptake

The success of this project depends on widespread involvement and agreement among key international stakeholders in research, policy and practice. For example, previous developers have obtained guideline endorsement by journal editors who require authors and peer reviewers to use the guideline during manuscript submission and who must enforce journal article word limits. 11 Many journal editors have already agreed to participate, and we hope other researchers and stakeholders will volunteer their time and expertise.

Conclusions

Reporting guidelines help us use scarce resources efficiently and ethically. Randomised controlled trials are expensive, and the public have a right to expect returns on their investments through transparent, usable reports. When RCT reports cannot be used, for whatever reason, resources are wasted. Participants contribute their time and put themselves at risk of harm to generate evidence that will help others, and researchers should disseminate that information effectively. Policy makers benefit from research when developing effective, affordable standards of practice and choosing which programmes and services to fund. Administrators and managers need to make contextually appropriate decisions. Transparent reporting of primary studies is essential for their inclusion in the systematic reviews that inform these activities, which require determining whether primary studies are comparable, examining biases within included studies, assessing the generalisability of results and implementing effective interventions. Finally, we hope this guideline will reduce the effort and time required for authors to write reports of RCTs.

Randomised controlled trials are not the only valid method for evaluating interventions, 12 nor are they the only type of research that would benefit from better reporting. Colleagues have identified the importance of reporting standards for other types of research, including observational, 39 quasi-experimental 40 and qualitative studies. 41 This guideline is a first step towards improving reports of the many designs used to evaluate social and psychological interventions, which we hope will be addressed by this and future projects. We invite stakeholders from disciplines that frequently research these interventions to join this important effort and to participate in guideline development by visiting our website, where they can find more information about the project and updates on its progress, and sign up to be involved (http://tinyurl.com/CONSORT-study).

Funding

This project is funded by the UK Economic and Social Research Council (ES/K00087X/1). We thank the Centre for Evidence Based Intervention (Oxford University), the Centre for Outcomes Research and Effectiveness (University College London), and the National Collaborating Centre for Mental Health (NCCMH) for their support. S.G. is supported by a linked Clarendon Fund-Green Templeton College Annual Fund Scholarship to support his doctoral studies and research. D.M. is supported by a University Research Chair.

Acknowledgements

The CONSORT-SPI (Social and Psychological Interventions) International Advisory Group includes: J. Lawrence Aber, Distinguished Professor of Applied Psychology and Public Policy, Steinhardt School of Culture, Education, and Human Development, New York University; Chris Bonell, Professor of Sociology and Social Intervention, Centre for Evidence Based Intervention, University of Oxford; David M. Clark, Chair of Psychology, Department of Experimental Psychology, University of Oxford; Frances Gardner, Professor of Child and Family Psychology, Centre for Evidence Based Intervention, University of Oxford; Steven Hollon, American Psychological Association Guidelines Committee (Chair), Gertrude Conaway Professor of Psychology, Department of Psychology, Vanderbilt University; Jim McCambridge, Senior Lecturer in Behaviour Change, Department of Social and Environmental Health Research, London School of Hygiene and Tropical Medicine; Laurence Moore, Professor of Public Health Improvement, Cardiff School of Social Sciences, Cardiff University; Mark Petticrew, Professor of Public Health Evaluation, Department Social and Environmental Health Research, London School of Hygiene and Tropical Medicine; Lawrence Sherman, Wolfson Professor of Criminology, Cambridge Institute of Criminology, Cambridge University; Steve Pilling, Director, Centre for Outcomes Research and Effectiveness, University College London; James Thomas, Associate Director EPPI-Centre, Reader in Social Policy, Institute of Education, University of London; Elizabeth Waters, Jack Brockhoff Chair of Child Public Health, Melbourne School of Population and Global Health, The University of Melbourne; David Weisburd, Director and Walter E. Meyer Professor of Law and Criminal Justice, Institute of Criminology, Hebrew University Faculty of Law, Jerusalem; Joanne Yaffe, Associate Professor, College of Social Work, University of Utah.

Footnotes

Declaration of interest

None.

References

1 Medical Research Council. A Framework for Development and Evaluation of RCTs for Complex Interventions to Improve Health. MRC, 2008.
2 Bonell C. The utility of randomized controlled trials of social interventions: an examination of two trials of HIV prevention. Crit Public Health 2002; 12: 321-34.
3 Hawe P, Shiell A, Riley T. Complex interventions: how "out of control" can a randomised controlled trial be? BMJ 2004; 328: 1561-3.
4 Henggeler SW, Schoenwald SK, Rowland MD, Cunningham PB. Serious Emotional Disturbances in Children and Adolescents: Multisystemic Therapy. Guilford Press, 2002.
5 Perry AE, Weisburd D, Hewitt C. Are criminologists describing randomized controlled trials in ways that allow us to assess them? Findings from a sample of crime and justice trials. J Exp Criminol 2010; 6: 245-62.
6 Naleppa MJ, Cagle JG. Treatment fidelity in social work intervention research: a review of published studies. Res Soc Work Pract 2010; 20: 674-81.
7 Torgerson CJ, Torgerson DJ, Birks YF, Porthouse J. A comparison of RCTs in health and education. Br Educ Res J 2005; 31: 761-85.
8 Stinson JN, McGrath PJ, Yamada JT. Clinical trials in the Journal of Pediatric Psychology: applying the CONSORT Statement. J Pediatr Psychol 2003; 28: 159-67.
9 Semaan S, Kay L, Strouse D, Sogolow E, Mullen PD, Neumann MS, et al. A profile of U.S.-based trials of behavioral and social interventions for HIV risk reduction. J Acquir Immune Defic Syndr 2002; 30: S30-50.
10 Schulz KF, Altman DG, Moher D. CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials. BMJ 2010; 340: 698-702.
11 Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med 2010; 7: e1000217.
12 Bonell CP, Hargreaves J, Cousens S, Ross D, Hayes R, Petticrew M, et al. Alternatives to randomisation in the evaluation of public health interventions: design challenges and solutions. J Epidemiol Community Health 2011; 65: 582-7.
13 Chalmers I. Trying to do more good than harm in policy and practice: the role of rigorous, transparent, up-to-date evaluations. Ann Am Acad Pol Soc Sci 2003; 589: 22-40.
14 Turner L, Shamseer L, Altman DG, Weeks L, Peters J, Kober T, et al. Consolidated standards of reporting trials (CONSORT) and the completeness of reporting of randomised controlled trials (RCTs) published in medical journals. Cochrane Database Syst Rev 2012; 11: MR000030.
15 Grant S, Mayo-Wilson E, Melendez-Torres GJ, Montgomery P. The reporting quality of social and psychological intervention trials: a systematic review of reporting guidelines and trial publications. PLoS One 2013; 8: e65442.
16 Mayo-Wilson E. Reporting implementation in randomized trials: proposed additions to the consolidated standards of reporting trials statement. Am J Public Health 2007; 97: 630.
17 Campbell MK, Elbourne DR, Altman DG. CONSORT statement: extension to cluster randomised trials. BMJ 2004; 328: 702-8.
18 Boutron I, Moher D, Altman DG, Schulz K, Ravaud P. Methods and processes of the CONSORT group: example of an extension for trials assessing nonpharmacologic treatments. Ann Intern Med 2008; 148: W60-7.
19 Zwarenstein M, Treweek S, Gagnier JJ, Altman DG, Tunis S, Haynes B, et al. Improving the reporting of pragmatic trials: an extension of the CONSORT statement. BMJ 2008; 337: a2390.
20 Calvert M, Blazeby J, Revicki D, Moher D, Brundage M. Reporting quality of life in clinical trials: a CONSORT extension. Lancet 2011; 378: 1684-5.
21 Abraham C. WIDER Recommendations to Improve Reporting of the Content of Behaviour Change Interventions. Abraham C, 2009 (http://interventiondesign.co.uk/wp-content/uploads/2009/02/wider-recommendations.pdf).
22 American Educational Research Association. Standards for reporting on empirical social science research in AERA publications. Educ Res 2006; 35: 33-40.
23 Newman M, Elbourne D. Improving the usability of educational research: guidelines for the REPOrting of primary empirical research Studies in Education (The REPOSE Guidelines). Eval Res Educ 2004; 18: 201-12.
24 APA Publications and Communications Board Working Group on Journal Article Reporting Standards. Reporting standards for research in psychology: why do we need them? What might they be? Am Psychol 2008; 63: 839-51.
25 Prescott RJ, Counsell CE, Gillespie WJ, Grant AM, Russell IT, Kiauka S, et al. Factors that limit the quality, number and progress of randomised controlled trials. Health Technol Assess 1999; 3: 1-143.
26 Schulz KF, Chalmers I, Hayes RJ, Altman DG. Allocation concealment in randomised trials: defending against deciphering. Lancet 1995; 359: 614-7.
27 Moher D, Pham B, Jones A, Cook DJ, Jadad AR, Moher M, et al. Does quality of reports of randomized trials affect estimates of intervention efficacy reported in meta-analyses? Lancet 1999; 352: 609-13.
28 Ladd BO, McCrady BS, Manuel JK, Campbell W. Improving the quality of reporting alcohol outcome studies: effects of the CONSORT statement. Addict Behav 2010; 35: 660-6.
29 Medical Research Council. Using Natural Experiments to Evaluate Population Health Interventions: Guidance for Producers and Users of Evidence. MRC, 2011.
30 Marshall M, Lockwood A, Bradley C, Adams C, Joy C, Fenton M. Unpublished rating scales: a major source of bias in randomised controlled trials of treatments for schizophrenia. Br J Psychiatry 2000; 176: 249-52.
31 Moore L, Moore GF. Public health evaluation: which designs work, for whom and under what circumstances? J Epidemiol Community Health 2011; 65: 596-7.
32 Shepperd S, Lewin S, Straus S, Clarke M, Eccles MP, Fitzpatrick R, et al. Can we systematically review studies that evaluate complex interventions? PLoS Med 2009; 6: e1000086.
33 Glasziou P, Meats E, Heneghan C, Shepperd S. What is missing from descriptions of treatment in trials and reviews? BMJ 2008; 336: 1472-4.
34 Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ 2008; 337: 979-83.
35 Moore L. Research design for the rigorous evaluation of complex educational interventions: lessons from health services research. Build Res Capacity 2002; 1: 4-5.
36 Welch V, Petticrew M, Tugwell P, Moher D, O'Neill J, Waters E, et al. PRISMA-Equity 2012 extension: reporting guidelines for systematic reviews with a focus on health equity. PLoS Med 2012; 9: e1001333.
37 Hopewell S, Clarke M, Moher D, Wager E, Middleton P, Altman DG, et al. CONSORT for reporting randomised trials in journal and conference abstracts. Lancet 2008; 371: 281-3.
38 Moher D, Hopewell S, Schulz KF, Montori V, Gøtzsche PC, Devereaux PJ, et al. CONSORT 2010 Explanation and Elaboration: updated guidelines for reporting parallel group randomised trials. BMJ 2010; 340: c869.
39 von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: guidelines for reporting observational studies. Ann Intern Med 2007; 147: 573-7.
40 Des Jarlais DC, Lyles C, Crepaz N. Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: the TREND statement. Am J Public Health 2004; 94: 361-6.
41 Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care 2007; 19: 349-57.
Supplementary material

Mayo-Wilson et al. supplementary material is available online (PDF, 49 KB).