Authors

  1. Stern, Cindy


Systematic reviews attempt to uncover the evidence on a particular intervention, condition or issue. In health care, they inform practice and policy with the end goal of improving outcomes. The process of conducting a systematic review is recognized internationally and comprises a series of structured yet complex steps.1 Borah et al.2 estimate that the mean time to complete and publish a review is 67.3 weeks, indicating a significant investment of time and effort by the review team. Once a review is complete, the primary goal is to disseminate, transfer and ultimately translate its results at both a local and an international level. Both active and passive dissemination strategies are recommended,3 and much research has been conducted into their strengths and weaknesses.4-6

 

Dissemination is still largely undertaken through publication in academic journals, and each journal has its own guidelines and policies regarding how manuscripts (including systematic reviews) should be reported and presented. Transparency and reproducibility are two core concepts often referred to in the realm of systematic reviews. They imply that a well-reported systematic review should provide enough detail for the reader to follow the process and reach the same or similar conclusions. In other words, the reader should be given enough information about how the reviewers approached each step of the review process, in terms of the methods followed and the decisions made (all of which are outlined in an a priori protocol), to enable the review to be replicated and yield comparable results.

 

A systematic review that follows a recognizable methodology is often considered "rigorous" and "well conducted"; however, this does not necessarily translate into a review that is well reported. Research into the quality of systematic review reporting shows poor results.7,8 So how do reviewers, journal editors, peer reviewers and readers of systematic reviews know what constitutes a well-reported review? What information should be included, and what level of detail is needed for it to be suitable for publication? There is a need for a standardized approach to how systematic reviews should be reported (much like the standardized process for conducting them), to reduce variability and increase transparency. This will enable users of systematic reviews to accurately assess a review's strengths and weaknesses, which in turn will influence their decision to use its results and follow its recommendations.

 

The need for such guidance has led to the establishment of a number of international groups to provide clarity, one of which was the Quality of Reporting of Meta-analyses (QUOROM) group. It consisted of clinical epidemiologists, researchers, statisticians, editors and clinicians committed to addressing the issues of reporting in meta-analyses of randomized controlled trials (RCTs).9 The group was tasked with reviewing the literature and identifying items they thought should be included in a checklist of standards (governed by evidence, where possible). A modified Delphi technique was used to assess the items, and the QUOROM statement was published in 1999.9 To better reflect advances in systematic review methods, the guidance was updated in 2009 and renamed PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses.8

 

PRISMA comprises a set of minimum reporting criteria for systematic reviews and meta-analyses. The checklist consists of 27 items organized into seven sections (title, abstract, introduction, methods, results, discussion and funding), and it is accompanied by a four-phase flow diagram.8 It is directly applicable to reviews of interventions that include RCTs; however, it can also be used for other types of systematic reviews. Several extensions of PRISMA have been developed to provide guidance on different elements related to systematic reviews, such as review protocols (PRISMA-P), individual patient data (PRISMA-IPD), network meta-analyses (PRISMA-NMA), abstracts, equity-focused systematic reviews, reviews including harms outcomes (PRISMA Harms) and diagnostic test accuracy reviews. In addition, a number of others are in development, including guidance for scoping reviews (PRISMA-ScR), reviews involving children (PRISMA-C) and protocols for reviews involving children (PRISMA-PC). All resources are freely available, and a number have been translated into languages other than English. PRISMA, along with a number of other important resources, can be found through the EQUATOR Network, an international initiative that works to improve the reliability of published health research.

 

To support the spread of this information, journals have been invited to formally "endorse" the PRISMA statement. Historically, the JBI Database of Systematic Reviews and Implementation Reports has supported such guidance (albeit informally) but is now strengthening its commitment to high-quality reporting by officially endorsing PRISMA. This is explicitly outlined in the journal's policies; a manuscript that does not adhere to PRISMA will not be considered for publication. In addition, adherence to PRISMA is addressed in the supporting materials provided to peer reviewers for the journal, in the JBI global systematic review training program, and in its supporting online training modules.1 Research into the impact of journal endorsement of reporting standards has commenced, and preliminary results are promising, indicating that journal endorsement of PRISMA (ranging from simply suggesting its use to requiring submission of the checklist) was associated with more complete reporting of systematic reviews overall.10

 

Systematic reviews have played, and continue to play, a crucial role in the evidence-based healthcare revolution, and adherence to standards for both their conduct and their reporting is imperative. However, the availability of such guidance has not translated into consistent, high-quality systematic review publication,1 and as such, journals that support and endorse these initiatives can play an important role in changing practice. On behalf of the JBI Database of Systematic Reviews and Implementation Reports, along with the hundreds of other journals that publish systematic reviews, we call on other editors to lead by example, and we encourage authors, peer reviewers and readers of systematic reviews to visit the PRISMA website to learn more.

 

References

 

1. Stern C, Munn Z, Porritt K, Lockwood C, Peters MDJ, Bellman S, et al. An international educational training course for conducting systematic reviews in health care: the Joanna Briggs Institute's Comprehensive Systematic Review Training Program. Worldviews Evid Based Nurs 2018 [Epub ahead of print].

 

2. Borah R, Brown AW, Capers PL, Kaiser KA. Analysis of the time and workers needed to conduct systematic reviews of medical interventions using data from the PROSPERO registry. BMJ Open 2017;7(2):e012545.

 

3. Munn Z, Stern C, Porritt K, Lockwood C, Aromataris E, Jordan Z. Evidence transfer: ensuring end users are aware of, have access to, and understand the evidence. Int J Evid Based Healthc 2018;16(2):83-89.

 

4. McCormack L, Sheridan S, Lewis M, Boudewyns V, Melvin CL, Kistler C, et al. Communication and dissemination strategies to facilitate the use of health-related evidence. Evid Rep Technol Assess (Full Rep) 2013;(213):1-520.

 

5. Wilson PM, Petticrew M, Calnan MW, Nazareth I. Disseminating research findings: what should researchers do? A systematic scoping review of conceptual frameworks. Implement Sci 2010;5:91.

 

6. Schipper K, Bakker M, De Wit M, Ket JC, Abma TA. Strategies for disseminating recommendations or guidelines to patients: a systematic review. Implement Sci 2016;11(1):82.

 

7. Sacks HS, Reitman D, Pagano D, Kupelnick B. Meta-analysis: an update. Mt Sinai J Med 1996;63(3-4):216-224.

 

8. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 2009;6(7):e1000097.

 

9. Moher D, Cook DJ, Eastwood S, Olkin I, Rennie D, Stroup DF. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement. Quality of reporting of meta-analyses. Lancet 1999;354(9193):1896-1900.

 

10. Stevens A, Shamseer L, Weinstein E, Yazdi F, Turner L, Thielman J, et al. Relation of completeness of reporting of health research to journals' endorsement of reporting guidelines: systematic review. BMJ 2014;348:g3804.