
The PRISMA 2020 statement: an updated guideline for reporting systematic reviews

PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews

  • Matthew J Page , senior research fellow 1 ,
  • Joanne E McKenzie , associate professor 1 ,
  • Patrick M Bossuyt , professor 2 ,
  • Isabelle Boutron , professor 3 ,
  • Tammy C Hoffmann , professor 4 ,
  • Cynthia D Mulrow , professor 5 ,
  • Larissa Shamseer , doctoral student 6 ,
  • Jennifer M Tetzlaff , research product specialist 7 ,
  • Elie A Akl , professor 8 ,
  • Sue E Brennan , senior research fellow 1 ,
  • Roger Chou , professor 9 ,
  • Julie Glanville , associate director 10 ,
  • Jeremy M Grimshaw , professor 11 ,
  • Asbjørn Hróbjartsson , professor 12 ,
  • Manoj M Lalu , associate scientist and assistant professor 13 ,
  • Tianjing Li , associate professor 14 ,
  • Elizabeth W Loder , professor 15 ,
  • Evan Mayo-Wilson , associate professor 16 ,
  • Steve McDonald , senior research fellow 1 ,
  • Luke A McGuinness , research associate 17 ,
  • Lesley A Stewart , professor and director 18 ,
  • James Thomas , professor 19 ,
  • Andrea C Tricco , scientist and associate professor 20 ,
  • Vivian A Welch , associate professor 21 ,
  • Penny Whiting , associate professor 17 ,
  • David Moher , director and professor 22
  • 1 School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia
  • 2 Department of Clinical Epidemiology, Biostatistics and Bioinformatics, Amsterdam University Medical Centres, University of Amsterdam, Amsterdam, Netherlands
  • 3 Université de Paris, Centre of Epidemiology and Statistics (CRESS), Inserm, F 75004 Paris, France
  • 4 Institute for Evidence-Based Healthcare, Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Australia
  • 5 University of Texas Health Science Center at San Antonio, San Antonio, Texas, USA; Annals of Internal Medicine
  • 6 Knowledge Translation Program, Li Ka Shing Knowledge Institute, Toronto, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
  • 7 Evidence Partners, Ottawa, Canada
  • 8 Clinical Research Institute, American University of Beirut, Beirut, Lebanon; Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, Ontario, Canada
  • 9 Department of Medical Informatics and Clinical Epidemiology, Oregon Health & Science University, Portland, Oregon, USA
  • 10 York Health Economics Consortium (YHEC Ltd), University of York, York, UK
  • 11 Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada; School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada; Department of Medicine, University of Ottawa, Ottawa, Canada
  • 12 Centre for Evidence-Based Medicine Odense (CEBMO) and Cochrane Denmark, Department of Clinical Research, University of Southern Denmark, Odense, Denmark; Open Patient data Exploratory Network (OPEN), Odense University Hospital, Odense, Denmark
  • 13 Department of Anesthesiology and Pain Medicine, The Ottawa Hospital, Ottawa, Canada; Clinical Epidemiology Program, Blueprint Translational Research Group, Ottawa Hospital Research Institute, Ottawa, Canada; Regenerative Medicine Program, Ottawa Hospital Research Institute, Ottawa, Canada
  • 14 Department of Ophthalmology, School of Medicine, University of Colorado Denver, Denver, Colorado, United States; Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland, USA
  • 15 Division of Headache, Department of Neurology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts, USA; Head of Research, The BMJ , London, UK
  • 16 Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, Bloomington, Indiana, USA
  • 17 Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK
  • 18 Centre for Reviews and Dissemination, University of York, York, UK
  • 19 EPPI-Centre, UCL Social Research Institute, University College London, London, UK
  • 20 Li Ka Shing Knowledge Institute of St. Michael's Hospital, Unity Health Toronto, Toronto, Canada; Epidemiology Division of the Dalla Lana School of Public Health and the Institute of Health Management, Policy, and Evaluation, University of Toronto, Toronto, Canada; Queen's Collaboration for Health Care Quality Joanna Briggs Institute Centre of Excellence, Queen's University, Kingston, Canada
  • 21 Methods Centre, Bruyère Research Institute, Ottawa, Ontario, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
  • 22 Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
  • Correspondence to: M J Page matthew.page{at}monash.edu
  • Accepted 4 January 2021

The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement, published in 2009, was designed to help systematic reviewers transparently report why the review was done, what the authors did, and what they found. Over the past decade, advances in systematic review methodology and terminology have necessitated an update to the guideline. The PRISMA 2020 statement replaces the 2009 statement and includes new reporting guidance that reflects advances in methods to identify, select, appraise, and synthesise studies. The structure and presentation of the items have been modified to facilitate implementation. In this article, we present the PRISMA 2020 27-item checklist, an expanded checklist that details reporting recommendations for each item, the PRISMA 2020 abstract checklist, and the revised flow diagrams for original and updated reviews.

Systematic reviews serve many critical roles. They can provide syntheses of the state of knowledge in a field, from which future research priorities can be identified; they can address questions that otherwise could not be answered by individual studies; they can identify problems in primary research that should be rectified in future studies; and they can generate or evaluate theories about how or why phenomena occur. Systematic reviews therefore generate various types of knowledge for different users of reviews (such as patients, healthcare providers, researchers, and policy makers). 1 2 To ensure a systematic review is valuable to users, authors should prepare a transparent, complete, and accurate account of why the review was done, what they did (such as how studies were identified and selected), and what they found (such as characteristics of contributing studies and results of meta-analyses). Up-to-date reporting guidance helps authors achieve this. 3

The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement published in 2009 (hereafter referred to as PRISMA 2009) 4 5 6 7 8 9 10 is a reporting guideline designed to address poor reporting of systematic reviews. 11 The PRISMA 2009 statement comprised a checklist of 27 items recommended for reporting in systematic reviews and an “explanation and elaboration” paper 12 13 14 15 16 providing additional reporting guidance for each item, along with exemplars of reporting. The recommendations have been widely endorsed and adopted, as evidenced by the statement's co-publication in multiple journals, citation in over 60 000 reports (Scopus, August 2020), endorsement from almost 200 journals and systematic review organisations, and adoption in various disciplines. Evidence from observational studies suggests that use of the PRISMA 2009 statement is associated with more complete reporting of systematic reviews, 17 18 19 20 although more could be done to improve adherence to the guideline. 21

Many innovations in the conduct of systematic reviews have occurred since publication of the PRISMA 2009 statement. For example, technological advances have enabled the use of natural language processing and machine learning to identify relevant evidence, 22 23 24 methods have been proposed to synthesise and present findings when meta-analysis is not possible or appropriate, 25 26 27 and new methods have been developed to assess the risk of bias in results of included studies. 28 29 Evidence on sources of bias in systematic reviews has accrued, culminating in the development of new tools to appraise the conduct of systematic reviews. 30 31 Terminology used to describe particular review processes has also evolved, as in the shift from assessing “quality” to assessing “certainty” in the body of evidence. 32 In addition, the publishing landscape has transformed, with multiple avenues now available for registering and disseminating systematic review protocols, 33 34 disseminating reports of systematic reviews, and sharing data and materials, such as preprint servers and publicly accessible repositories. Capturing these advances in the reporting of systematic reviews necessitated an update to the PRISMA 2009 statement.

Summary points

To ensure a systematic review is valuable to users, authors should prepare a transparent, complete, and accurate account of why the review was done, what they did, and what they found

The PRISMA 2020 statement provides updated reporting guidance for systematic reviews that reflects advances in methods to identify, select, appraise, and synthesise studies

The PRISMA 2020 statement consists of a 27-item checklist, an expanded checklist that details reporting recommendations for each item, the PRISMA 2020 abstract checklist, and revised flow diagrams for original and updated reviews

We anticipate that the PRISMA 2020 statement will benefit authors, editors, and peer reviewers of systematic reviews, and different users of reviews, including guideline developers, policy makers, healthcare providers, patients, and other stakeholders

Development of PRISMA 2020

A complete description of the methods used to develop PRISMA 2020 is available elsewhere. 35 We identified PRISMA 2009 items that were often reported incompletely by examining the results of studies investigating the transparency of reporting of published reviews. 17 21 36 37 We identified possible modifications to the PRISMA 2009 statement by reviewing 60 documents providing reporting guidance for systematic reviews (including reporting guidelines, handbooks, tools, and meta-research studies). 38 These reviews of the literature were used to inform the content of a survey with suggested possible modifications to the 27 items in PRISMA 2009 and possible additional items. Respondents were asked whether they believed we should keep each PRISMA 2009 item as is, modify it, or remove it, and whether we should add each additional item. Systematic review methodologists and journal editors were invited to complete the online survey (110 of 220 invited responded). We discussed proposed content and wording of the PRISMA 2020 statement, as informed by the review and survey results, at a 21-member, two-day, in-person meeting in September 2018 in Edinburgh, Scotland. Throughout 2019 and 2020, we circulated an initial draft and five revisions of the checklist and explanation and elaboration paper to co-authors for feedback. In April 2020, we invited 22 systematic reviewers who had expressed interest in providing feedback on the PRISMA 2020 checklist to share their views (via an online survey) on the layout and terminology used in a preliminary version of the checklist. Feedback was received from 15 individuals and considered by the first author, and any revisions deemed necessary were incorporated before the final version was approved and endorsed by all co-authors.

The PRISMA 2020 statement

Scope of the guideline

The PRISMA 2020 statement has been designed primarily for systematic reviews of studies that evaluate the effects of health interventions, irrespective of the design of the included studies. However, the checklist items are applicable to reports of systematic reviews evaluating other interventions (such as social or educational interventions), and many items are applicable to systematic reviews with objectives other than evaluating interventions (such as evaluating aetiology, prevalence, or prognosis). PRISMA 2020 is intended for use in systematic reviews that include synthesis (such as pairwise meta-analysis or other statistical synthesis methods) or do not include synthesis (for example, because only one eligible study is identified). The PRISMA 2020 items are relevant for mixed-methods systematic reviews (which include quantitative and qualitative studies), but reporting guidelines addressing the presentation and synthesis of qualitative data should also be consulted. 39 40 PRISMA 2020 can be used for original systematic reviews, updated systematic reviews, or continually updated (“living”) systematic reviews. However, for updated and living systematic reviews, there may be some additional considerations that need to be addressed. Where there is relevant content from other reporting guidelines, we reference these guidelines within the items in the explanation and elaboration paper 41 (such as PRISMA-Search 42 in items 6 and 7, Synthesis without meta-analysis (SWiM) reporting guideline 27 in item 13d). Box 1 includes a glossary of terms used throughout the PRISMA 2020 statement.

Glossary of terms

Systematic review —A review that uses explicit, systematic methods to collate and synthesise findings of studies that address a clearly formulated question 43

Statistical synthesis —The combination of quantitative results of two or more studies. This encompasses meta-analysis of effect estimates (described below) and other methods, such as combining P values, calculating the range and distribution of observed effects, and vote counting based on the direction of effect (see McKenzie and Brennan 25 for a description of each method)

Meta-analysis of effect estimates —A statistical technique used to synthesise results when study effect estimates and their variances are available, yielding a quantitative summary of results 25

Outcome —An event or measurement collected for participants in a study (such as quality of life, mortality)

Result —The combination of a point estimate (such as a mean difference, risk ratio, or proportion) and a measure of its precision (such as a confidence/credible interval) for a particular outcome

Report —A document (paper or electronic) supplying information about a particular study. It could be a journal article, preprint, conference abstract, study register entry, clinical study report, dissertation, unpublished manuscript, government report, or any other document providing relevant information

Record —The title or abstract (or both) of a report indexed in a database or website (such as a title or abstract for an article indexed in Medline). Records that refer to the same report (such as the same journal article) are “duplicates”; however, records that refer to reports that are merely similar (such as a similar abstract submitted to two different conferences) should be considered unique.

Study —An investigation, such as a clinical trial, that includes a defined group of participants and one or more interventions and outcomes. A “study” might have multiple reports. For example, reports could include the protocol, statistical analysis plan, baseline characteristics, results for the primary outcome, results for harms, results for secondary outcomes, and results for additional mediator and moderator analyses

PRISMA 2020 is not intended to guide systematic review conduct, for which comprehensive resources are available. 43 44 45 46 However, familiarity with PRISMA 2020 is useful when planning and conducting systematic reviews to ensure that all recommended information is captured. PRISMA 2020 should not be used to assess the conduct or methodological quality of systematic reviews; other tools exist for this purpose. 30 31 Furthermore, PRISMA 2020 is not intended to inform the reporting of systematic review protocols, for which a separate statement is available (PRISMA for Protocols (PRISMA-P) 2015 statement 47 48 ). Finally, extensions to the PRISMA 2009 statement have been developed to guide reporting of network meta-analyses, 49 meta-analyses of individual participant data, 50 systematic reviews of harms, 51 systematic reviews of diagnostic test accuracy studies, 52 and scoping reviews 53 ; for these types of reviews we recommend authors report their review in accordance with the recommendations in PRISMA 2020 along with the guidance specific to the extension.

How to use PRISMA 2020

The PRISMA 2020 statement (including the checklists, explanation and elaboration, and flow diagram) replaces the PRISMA 2009 statement, which should no longer be used. Box 2 summarises noteworthy changes from the PRISMA 2009 statement. The PRISMA 2020 checklist includes seven sections with 27 items, some of which include sub-items ( table 1 ). A checklist for journal and conference abstracts for systematic reviews is included in PRISMA 2020. This abstract checklist is an update of the 2013 PRISMA for Abstracts statement, 54 reflecting new and modified content in PRISMA 2020 ( table 2 ). A template PRISMA flow diagram is provided, which can be modified depending on whether the systematic review is original or updated ( fig 1 ).

Noteworthy changes to the PRISMA 2009 statement

Inclusion of the abstract reporting checklist within PRISMA 2020 (see item #2 and table 2 ).

Movement of the ‘Protocol and registration’ item from the start of the Methods section of the checklist to a new Other section, with addition of a sub-item recommending authors describe amendments to information provided at registration or in the protocol (see item #24a-24c).

Modification of the ‘Search’ item to recommend authors present full search strategies for all databases, registers and websites searched, not just at least one database (see item #7).

Modification of the ‘Study selection’ item in the Methods section to emphasise the reporting of how many reviewers screened each record and each report retrieved, whether they worked independently, and if applicable, details of automation tools used in the process (see item #8).

Addition of a sub-item to the ‘Data items’ item recommending authors report how outcomes were defined, which results were sought, and methods for selecting a subset of results from included studies (see item #10a).

Splitting of the ‘Synthesis of results’ item in the Methods section into six sub-items recommending authors describe: the processes used to decide which studies were eligible for each synthesis; any methods required to prepare the data for synthesis; any methods used to tabulate or visually display results of individual studies and syntheses; any methods used to synthesise results; any methods used to explore possible causes of heterogeneity among study results (such as subgroup analysis, meta-regression); and any sensitivity analyses used to assess robustness of the synthesised results (see item #13a-13f).

Addition of a sub-item to the ‘Study selection’ item in the Results section recommending authors cite studies that might appear to meet the inclusion criteria, but which were excluded, and explain why they were excluded (see item #16b).

Splitting of the ‘Synthesis of results’ item in the Results section into four sub-items recommending authors: briefly summarise the characteristics and risk of bias among studies contributing to the synthesis; present results of all statistical syntheses conducted; present results of any investigations of possible causes of heterogeneity among study results; and present results of any sensitivity analyses (see item #20a-20d).

Addition of new items recommending authors report methods for and results of an assessment of certainty (or confidence) in the body of evidence for an outcome (see items #15 and #22).

Addition of a new item recommending authors declare any competing interests (see item #26).

Addition of a new item recommending authors indicate whether data, analytic code and other materials used in the review are publicly available and if so, where they can be found (see item #27).

PRISMA 2020 item checklist

PRISMA 2020 for Abstracts checklist*

Fig 1

PRISMA 2020 flow diagram template for systematic reviews. The new design is adapted from flow diagrams proposed by Boers, 55 Mayo-Wilson et al. 56 and Stovold et al. 57 The boxes in grey should only be completed if applicable; otherwise they should be removed from the flow diagram. Note that a “report” could be a journal article, preprint, conference abstract, study register entry, clinical study report, dissertation, unpublished manuscript, government report or any other document providing relevant information.

We recommend authors refer to PRISMA 2020 early in the writing process, because prospective consideration of the items may help to ensure that all the items are addressed. To help keep track of which items have been reported, the PRISMA statement website ( http://www.prisma-statement.org/ ) includes fillable templates of the checklists to download and complete (also available in the data supplement on bmj.com). We have also created a web application that allows users to complete the checklist via a user-friendly interface 58 (available at https://prisma.shinyapps.io/checklist/ and adapted from the Transparency Checklist app 59 ). The completed checklist can be exported to Word or PDF. Editable templates of the flow diagram can also be downloaded from the PRISMA statement website.

We have prepared an updated explanation and elaboration paper, in which we explain why reporting of each item is recommended and present bullet points that detail the reporting recommendations (which we refer to as elements). 41 The bullet-point structure is new to PRISMA 2020 and has been adopted to facilitate implementation of the guidance. 60 61 An expanded checklist, which comprises an abridged version of the elements presented in the explanation and elaboration paper, with references and some examples removed, is available in the data supplement on bmj.com. Consulting the explanation and elaboration paper is recommended if further clarity or information is required.

Journals and publishers might impose word and section limits, and limits on the number of tables and figures allowed in the main report. In such cases, if the relevant information for some items already appears in a publicly accessible review protocol, referring to the protocol may suffice. Alternatively, placing detailed descriptions of the methods used or additional results (such as for less critical outcomes) in supplementary files is recommended. Ideally, supplementary files should be deposited in a general-purpose or institutional open-access repository that provides free and permanent access to the material (such as Open Science Framework, Dryad, figshare). A reference or link to the additional information should be included in the main report. Finally, although PRISMA 2020 provides a template for where information might be located, the suggested location should not be seen as prescriptive; the guiding principle is to ensure the information is reported.

Use of PRISMA 2020 has the potential to benefit many stakeholders. Complete reporting allows readers to assess the appropriateness of the methods, and therefore the trustworthiness of the findings. Presenting and summarising characteristics of studies contributing to a synthesis allows healthcare providers and policy makers to evaluate the applicability of the findings to their setting. Describing the certainty in the body of evidence for an outcome and the implications of findings should help policy makers, managers, and other decision makers formulate appropriate recommendations for practice or policy. Complete reporting of all PRISMA 2020 items also facilitates replication and review updates, as well as inclusion of systematic reviews in overviews (of systematic reviews) and guidelines, so teams can leverage work that is already done and decrease research waste. 36 62 63

We updated the PRISMA 2009 statement by adapting the EQUATOR Network’s guidance for developing health research reporting guidelines. 64 We evaluated the reporting completeness of published systematic reviews, 17 21 36 37 reviewed the items included in other documents providing guidance for systematic reviews, 38 surveyed systematic review methodologists and journal editors for their views on how to revise the original PRISMA statement, 35 discussed the findings at an in-person meeting, and prepared this document through an iterative process. Our recommendations are informed by the reviews and survey conducted before the in-person meeting, theoretical considerations about which items facilitate replication and help users assess the risk of bias and applicability of systematic reviews, and co-authors’ experience with authoring and using systematic reviews.

Various strategies to increase the use of reporting guidelines and improve reporting have been proposed. They include educators introducing reporting guidelines into graduate curricula to promote good reporting habits of early career scientists 65 ; journal editors and regulators endorsing use of reporting guidelines 18 ; peer reviewers evaluating adherence to reporting guidelines 61 66 ; journals requiring authors to indicate where in their manuscript they have adhered to each reporting item 67 ; and authors using online writing tools that prompt complete reporting at the writing stage. 60 Multi-pronged interventions, in which more than one of these strategies are combined, may be more effective (such as completion of checklists coupled with editorial checks). 68 However, of 31 interventions proposed to increase adherence to reporting guidelines, the effects of only 11 have been evaluated, mostly in observational studies at high risk of bias due to confounding. 69 It is therefore unclear which strategies should be used. Future research might explore barriers and facilitators to the use of PRISMA 2020 by authors, editors, and peer reviewers, designing interventions that address the identified barriers, and evaluating those interventions using randomised trials. To inform possible revisions to the guideline, it would also be valuable to conduct think-aloud studies 70 to understand how systematic reviewers interpret the items, and reliability studies to identify items that are interpreted inconsistently.

We encourage readers to submit evidence that informs any of the recommendations in PRISMA 2020 (via the PRISMA statement website: http://www.prisma-statement.org/ ). To enhance accessibility of PRISMA 2020, several translations of the guideline are under way (see available translations at the PRISMA statement website). We encourage journal editors and publishers to raise awareness of PRISMA 2020 (for example, by referring to it in journal “Instructions to authors”), endorsing its use, advising editors and peer reviewers to evaluate submitted systematic reviews against the PRISMA 2020 checklists, and making changes to journal policies to accommodate the new reporting recommendations. We recommend existing PRISMA extensions 47 49 50 51 52 53 71 72 be updated to reflect PRISMA 2020 and advise developers of new PRISMA extensions to use PRISMA 2020 as the foundation document.

We anticipate that the PRISMA 2020 statement will benefit authors, editors, and peer reviewers of systematic reviews, and different users of reviews, including guideline developers, policy makers, healthcare providers, patients, and other stakeholders. Ultimately, we hope that uptake of the guideline will lead to more transparent, complete, and accurate reporting of systematic reviews, thus facilitating evidence based decision making.

Acknowledgments

We dedicate this paper to the late Douglas G Altman and Alessandro Liberati, whose contributions were fundamental to the development and implementation of the original PRISMA statement.

We thank the following contributors who completed the survey to inform discussions at the development meeting: Xavier Armoiry, Edoardo Aromataris, Ana Patricia Ayala, Ethan M Balk, Virginia Barbour, Elaine Beller, Jesse A Berlin, Lisa Bero, Zhao-Xiang Bian, Jean Joel Bigna, Ferrán Catalá-López, Anna Chaimani, Mike Clarke, Tammy Clifford, Ioana A Cristea, Miranda Cumpston, Sofia Dias, Corinna Dressler, Ivan D Florez, Joel J Gagnier, Chantelle Garritty, Long Ge, Davina Ghersi, Sean Grant, Gordon Guyatt, Neal R Haddaway, Julian PT Higgins, Sally Hopewell, Brian Hutton, Jamie J Kirkham, Jos Kleijnen, Julia Koricheva, Joey SW Kwong, Toby J Lasserson, Julia H Littell, Yoon K Loke, Malcolm R Macleod, Chris G Maher, Ana Marušic, Dimitris Mavridis, Jessie McGowan, Matthew DF McInnes, Philippa Middleton, Karel G Moons, Zachary Munn, Jane Noyes, Barbara Nußbaumer-Streit, Donald L Patrick, Tatiana Pereira-Cenci, Ba’ Pham, Bob Phillips, Dawid Pieper, Michelle Pollock, Daniel S Quintana, Drummond Rennie, Melissa L Rethlefsen, Hannah R Rothstein, Maroeska M Rovers, Rebecca Ryan, Georgia Salanti, Ian J Saldanha, Margaret Sampson, Nancy Santesso, Rafael Sarkis-Onofre, Jelena Savović, Christopher H Schmid, Kenneth F Schulz, Guido Schwarzer, Beverley J Shea, Paul G Shekelle, Farhad Shokraneh, Mark Simmonds, Nicole Skoetz, Sharon E Straus, Anneliese Synnot, Emily E Tanner-Smith, Brett D Thombs, Hilary Thomson, Alexander Tsertsvadze, Peter Tugwell, Tari Turner, Lesley Uttley, Jeffrey C Valentine, Matt Vassar, Areti Angeliki Veroniki, Meera Viswanathan, Cole Wayant, Paul Whaley, and Kehu Yang. We thank the following contributors who provided feedback on a preliminary version of the PRISMA 2020 checklist: Jo Abbott, Fionn Büttner, Patricia Correia-Santos, Victoria Freeman, Emily A Hennessy, Rakibul Islam, Amalia (Emily) Karahalios, Kasper Krommes, Andreas Lundh, Dafne Port Nascimento, Davina Robson, Catherine Schenck-Yglesias, Mary M Scott, Sarah Tanveer and Pavel Zhelnov. 
We thank Abigail H Goben, Melissa L Rethlefsen, Tanja Rombey, Anna Scott, and Farhad Shokraneh for their helpful comments on the preprints of the PRISMA 2020 papers. We thank Edoardo Aromataris, Stephanie Chang, Toby Lasserson and David Schriger for their helpful peer review comments on the PRISMA 2020 papers.

Contributors: JEM and DM are joint senior authors. MJP, JEM, PMB, IB, TCH, CDM, LS, and DM conceived this paper and designed the literature review and survey conducted to inform the guideline content. MJP conducted the literature review, administered the survey and analysed the data for both. MJP prepared all materials for the development meeting. MJP and JEM presented proposals at the development meeting. All authors except for TCH, JMT, EAA, SEB, and LAM attended the development meeting. MJP and JEM took and consolidated notes from the development meeting. MJP and JEM led the drafting and editing of the article. JEM, PMB, IB, TCH, LS, JMT, EAA, SEB, RC, JG, AH, TL, EMW, SM, LAM, LAS, JT, ACT, PW, and DM drafted particular sections of the article. All authors were involved in revising the article critically for important intellectual content. All authors approved the final version of the article. MJP is the guarantor of this work. The corresponding author attests that all listed authors meet authorship criteria and that no others meeting the criteria have been omitted.

Funding: There was no direct funding for this research. MJP is supported by an Australian Research Council Discovery Early Career Researcher Award (DE200101618) and was previously supported by an Australian National Health and Medical Research Council (NHMRC) Early Career Fellowship (1088535) during the conduct of this research. JEM is supported by an Australian NHMRC Career Development Fellowship (1143429). TCH is supported by an Australian NHMRC Senior Research Fellowship (1154607). JMT is supported by Evidence Partners Inc. JMG is supported by a Tier 1 Canada Research Chair in Health Knowledge Transfer and Uptake. MML is supported by The Ottawa Hospital Anaesthesia Alternate Funds Association and a Faculty of Medicine Junior Research Chair. TL is supported by funding from the National Eye Institute (UG1EY020522), National Institutes of Health, United States. LAM is supported by a National Institute for Health Research Doctoral Research Fellowship (DRF-2018-11-ST2-048). ACT is supported by a Tier 2 Canada Research Chair in Knowledge Synthesis. DM is supported in part by a University Research Chair, University of Ottawa. The funders had no role in considering the study design or in the collection, analysis, interpretation of data, writing of the report, or decision to submit the article for publication.

Competing interests: All authors have completed the ICMJE uniform disclosure form at http://www.icmje.org/conflicts-of-interest/ and declare: EL is head of research for the BMJ; MJP is an editorial board member for PLOS Medicine; ACT is an associate editor and MJP, TL, EMW, and DM are editorial board members for the Journal of Clinical Epidemiology; DM and LAS were editors in chief, LS, JMT, and ACT are associate editors, and JG is an editorial board member for Systematic Reviews. None of these authors were involved in the peer review process or decision to publish. TCH has received personal fees from Elsevier outside the submitted work. EMW has received personal fees from the American Journal for Public Health, for which he is the editor for systematic reviews. VW is editor in chief of the Campbell Collaboration, which produces systematic reviews, and co-convenor of the Campbell and Cochrane equity methods group. DM is chair of the EQUATOR Network, IB is adjunct director of the French EQUATOR Centre, and TCH is co-director of the Australasian EQUATOR Centre, which advocates for the use of reporting guidelines to improve the quality of reporting in research articles. JMT received salary from Evidence Partners, creator of DistillerSR software for systematic reviews; Evidence Partners was not involved in the design or outcomes of the statement, and the views expressed solely represent those of the author.

Provenance and peer review: Not commissioned; externally peer reviewed.

Patient and public involvement: Patients and the public were not involved in this methodological research. We plan to disseminate the research widely, including to community participants in evidence synthesis organisations.

This is an Open Access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) license, which permits others to distribute, remix, adapt and build upon this work, for commercial use, provided the original work is properly cited. See: http://creativecommons.org/licenses/by/4.0/ .


Writing Quantitative Research Studies

Reference work entry, first online: 13 January 2019

  • Ankur Singh 2 ,
  • Adyya Gupta 3 &
  • Karen G. Peres 4  

Summarizing quantitative data and presenting and discussing it effectively can be challenging for students and researchers. This chapter provides a framework for adequately reporting findings from quantitative analysis in a research study for those contemplating writing a research paper. The rationale underpinning the reporting methods to maintain the credibility and integrity of quantitative studies is outlined. Commonly used terminologies in empirical studies are defined and discussed with suitable examples. Key elements that build consistency between different sections (background, methods, results, and discussion) of a research study using quantitative methods in a journal article are explicated. Specifically, recommended standard guidelines for randomized controlled trials and observational studies for reporting and discussion of findings from quantitative studies are elaborated. Key aspects of methodology that inform the quality of a study from the reviewer’s perspective, including describing the study population, sampling strategy, data collection methods, measurements/variables, and statistical analysis, are described. Effective use of references in the methods section to strengthen the rationale behind specific statistical techniques and choice of measures is highlighted with examples. Ways in which data can be most succinctly and effectively summarized in tables and graphs, according to their suitability and purpose, are also detailed in this chapter. Strategies to present and discuss the quantitative findings in a structured discussion section are also provided. Overall, the chapter provides readers with a comprehensive set of tools to identify key strategies to be considered when reporting quantitative research.


Author information

Authors and affiliations

Ankur Singh, Centre for Health Equity, Melbourne School of Population and Global Health, The University of Melbourne, Melbourne, VIC, Australia

Adyya Gupta, School of Public Health, The University of Adelaide, Adelaide, SA, Australia

Karen G. Peres, Australian Research Centre for Population Oral Health (ARCPOH), Adelaide Dental School, The University of Adelaide, Adelaide, SA, Australia

Corresponding author: Ankur Singh

Editor information

Pranee Liamputtong, School of Science and Health, Western Sydney University, Penrith, NSW, Australia

Copyright © 2019 Springer Nature Singapore Pte Ltd.

About this entry

Singh, A., Gupta, A., Peres, K.G. (2019). Writing Quantitative Research Studies. In: Liamputtong, P. (ed.) Handbook of Research Methods in Health Social Sciences. Springer, Singapore. https://doi.org/10.1007/978-981-10-5251-4_117

Published: 13 January 2019

Print ISBN: 978-981-10-5250-7

Online ISBN: 978-981-10-5251-4


Reporting Guidelines

It is important that your manuscript gives a clear and complete account of the research you have done. Well reported research is more useful, and complete reporting allows editors, peer reviewers, and readers to understand what you did and how you did it.

Poorly reported research can distort the literature, and leads to research that cannot be replicated or used in future meta-analyses or systematic reviews.

You should make sure that your manuscript is written so that the reader knows exactly what you did and could repeat your study without needing any additional information. It is particularly important to give enough detail in the methods section of your manuscript.

To help with reporting your research, there are reporting guidelines available for many different study designs. These contain a checklist of minimum points that you should cover in your manuscript. You should use these guidelines when you are preparing and writing your manuscript, and you may be required to provide a completed version of the checklist when you submit your manuscript. 

The EQUATOR (Enhancing the Quality and Transparency Of health Research) Network is an international initiative that aims to improve the quality of research publications. It provides a comprehensive list of reporting guidelines and other material to help improve reporting. 

A full list of the reporting guidelines endorsed by the EQUATOR Network is available on its website. Some of the reporting guidelines for common study designs are:

  • Randomized controlled trials – CONSORT
  • Systematic reviews – PRISMA
  • Observational studies – STROBE
  • Case reports – CARE
  • Qualitative research – COREQ
  • Pre-clinical animal studies – ARRIVE

Peer reviewers may be asked to use these checklists when assessing your manuscript. If you follow these guidelines, editors and peer reviewers will be able to assess your manuscript more effectively because they will understand exactly what you did. It may also mean that they ask for fewer revisions.


J Korean Med Sci. 2022 Apr 25;37(16)

A Practical Guide to Writing Quantitative and Qualitative Research Questions and Hypotheses in Scholarly Articles

Edward Barroga

1 Department of General Education, Graduate School of Nursing Science, St. Luke’s International University, Tokyo, Japan.

Glafera Janet Matanguihan

2 Department of Biological Sciences, Messiah University, Mechanicsburg, PA, USA.

The development of research questions and the subsequent hypotheses are prerequisites to defining the main research purpose and specific objectives of a study. Consequently, these objectives determine the study design and research outcome. The development of research questions is a process based on knowledge of current trends, cutting-edge studies, and technological advances in the research field. Excellent research questions are focused and require a comprehensive literature search and in-depth understanding of the problem being investigated. Initially, research questions may be written as descriptive questions which could be developed into inferential questions. These questions must be specific and concise to provide a clear foundation for developing hypotheses. Hypotheses are more formal predictions about the research outcomes. These specify the possible results that may or may not be expected regarding the relationship between groups. Thus, research questions and hypotheses clarify the main purpose and specific objectives of the study, which in turn dictate the design of the study, its direction, and outcome. Studies developed from good research questions and hypotheses will have trustworthy outcomes with wide-ranging social and health implications.

INTRODUCTION

Scientific research is usually initiated by posing evidence-based research questions, which are then explicitly restated as hypotheses. 1 , 2 The hypotheses provide directions to guide the study, solutions, explanations, and expected results. 3 , 4 Both research questions and hypotheses are essentially formulated based on conventional theories and real-world processes, which allow the inception of novel studies and the ethical testing of ideas. 5 , 6

It is crucial to have knowledge of both quantitative and qualitative research 2 as both types of research involve writing research questions and hypotheses. 7 However, these crucial elements of research are sometimes overlooked or, when they are not, framed without the forethought and meticulous attention they need. Planning and careful consideration are needed when developing quantitative or qualitative research, particularly when conceptualizing research questions and hypotheses. 4

There is a continuing need to support researchers in the creation of innovative research questions and hypotheses, as well as for journal articles that carefully review these elements. 1 When research questions and hypotheses are not carefully thought of, unethical studies and poor outcomes usually ensue. Carefully formulated research questions and hypotheses define well-founded objectives, which in turn determine the appropriate design, course, and outcome of the study. This article then aims to discuss in detail the various aspects of crafting research questions and hypotheses, with the goal of guiding researchers as they develop their own. Examples from the authors and peer-reviewed scientific articles in the healthcare field are provided to illustrate key points.

DEFINITIONS AND RELATIONSHIP OF RESEARCH QUESTIONS AND HYPOTHESES

A research question is what a study aims to answer after data analysis and interpretation. The answer is written at length in the discussion section of the paper. Thus, the research question gives a preview of the different parts and variables of the study meant to address the problem posed in the research question. 1 An excellent research question clarifies the research writing while facilitating understanding of the research topic, objective, scope, and limitations of the study. 5

On the other hand, a research hypothesis is an educated statement of an expected outcome. This statement is based on background research and current knowledge. 8 , 9 The research hypothesis makes a specific prediction about a new phenomenon 10 or a formal statement on the expected relationship between an independent variable and a dependent variable. 3 , 11 It provides a tentative answer to the research question to be tested or explored. 4

Hypotheses employ reasoning to predict a theory-based outcome. 10 These can also be developed from theories by focusing on components of theories that have not yet been observed. 10 The validity of hypotheses is often based on the testability of the prediction made in a reproducible experiment. 8

Conversely, hypotheses can also be rephrased as research questions. Several hypotheses based on existing theories and knowledge may be needed to answer a research question. Developing ethical research questions and hypotheses creates a research design that has logical relationships among variables. These relationships serve as a solid foundation for the conduct of the study. 4 , 11 Haphazardly constructed research questions can result in poorly formulated hypotheses and improper study designs, leading to unreliable results. Thus, the formulations of relevant research questions and verifiable hypotheses are crucial when beginning research. 12

CHARACTERISTICS OF GOOD RESEARCH QUESTIONS AND HYPOTHESES

Excellent research questions are specific and focused. These integrate collective data and observations to confirm or refute the subsequent hypotheses. Well-constructed hypotheses are based on previous reports and verify the research context. These are realistic, in-depth, sufficiently complex, and reproducible. More importantly, these hypotheses can be addressed and tested. 13

There are several characteristics of well-developed hypotheses. Good hypotheses are 1) empirically testable 7 , 10 , 11 , 13 ; 2) backed by preliminary evidence 9 ; 3) testable by ethical research 7 , 9 ; 4) based on original ideas 9 ; 5) supported by evidence-based logical reasoning 10 ; and 6) predictive. 11 Good hypotheses can infer ethical and positive implications, indicating the presence of a relationship or effect relevant to the research theme. 7 , 11 These are initially developed from a general theory and branch into specific hypotheses by deductive reasoning. In the absence of a theory on which to base the hypotheses, inductive reasoning based on specific observations or findings forms more general hypotheses. 10

TYPES OF RESEARCH QUESTIONS AND HYPOTHESES

Research questions and hypotheses are developed according to the type of research, which can be broadly classified into quantitative and qualitative research. We provide a summary of the types of research questions and hypotheses under quantitative and qualitative research categories in Table 1.

Research questions in quantitative research

In quantitative research, research questions inquire about the relationships among variables being investigated and are usually framed at the start of the study. These are precise and typically linked to the subject population, dependent and independent variables, and research design. 1 Research questions may also attempt to describe the behavior of a population in relation to one or more variables, or describe the characteristics of variables to be measured (descriptive research questions). 1 , 5 , 14 These questions may also aim to discover differences between groups within the context of an outcome variable (comparative research questions), 1 , 5 , 14 or elucidate trends and interactions among variables (relationship research questions). 1 , 5 We provide examples of descriptive, comparative, and relationship research questions in quantitative research in Table 2.

Hypotheses in quantitative research

In quantitative research, hypotheses predict the expected relationships among variables. 15 Relationships among variables that can be predicted include 1) between a single dependent variable and a single independent variable (simple hypothesis) or 2) between two or more independent and dependent variables (complex hypothesis). 4 , 11 Hypotheses may also specify the expected direction to be followed and imply an intellectual commitment to a particular outcome (directional hypothesis). 4 On the other hand, hypotheses may not predict the exact direction and are used in the absence of a theory, or when findings contradict previous studies (non-directional hypothesis). 4 In addition, hypotheses can 1) define interdependency between variables (associative hypothesis), 4 2) propose an effect on the dependent variable from manipulation of the independent variable (causal hypothesis), 4 3) state that there is no relationship between two variables (null hypothesis), 4 , 11 , 15 4) replace the working hypothesis if it is rejected (alternative hypothesis), 15 5) explain the relationship of phenomena to possibly generate a theory (working hypothesis), 11 6) involve quantifiable variables that can be tested statistically (statistical hypothesis), 11 or 7) express a relationship whose interlinks can be verified logically (logical hypothesis). 11 We provide examples of simple, complex, directional, non-directional, associative, causal, null, alternative, working, statistical, and logical hypotheses in quantitative research, as well as the definition of quantitative hypothesis-testing research, in Table 3.
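The null, alternative, non-directional, and statistical hypotheses described above can be made concrete with a small worked example. The sketch below is illustrative only: the data values and group names are invented, not taken from the article, and only the Python standard library is used. It tests the null hypothesis of no difference between two group means with a permutation test.

```python
import random
import statistics

# Hypothetical example data (invented for illustration):
# systolic blood pressure (mm Hg) in a control and an intervention group.
control = [128, 131, 125, 134, 129, 132, 127, 130]
treated = [121, 124, 119, 126, 122, 125, 120, 123]

# Null hypothesis (H0): no difference in mean blood pressure between groups.
# Alternative hypothesis (H1): the group means differ (non-directional).
observed_diff = statistics.mean(control) - statistics.mean(treated)

# Permutation test: under H0 the group labels are exchangeable, so we
# repeatedly shuffle the pooled values and count how often a random
# relabelling produces a difference at least as extreme as the observed one.
pooled = control + treated
n = len(control)
random.seed(0)  # fixed seed so the sketch is reproducible
n_permutations = 10_000
extreme = 0
for _ in range(n_permutations):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:n]) - statistics.mean(pooled[n:])
    if abs(diff) >= abs(observed_diff):
        extreme += 1

p_value = extreme / n_permutations
print(f"observed difference: {observed_diff:.1f} mm Hg, p = {p_value:.4f}")
```

A small p value would lead to rejecting H0 in favour of the non-directional alternative; for a directional hypothesis, only differences in the predicted direction would be counted as extreme.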

Research questions in qualitative research

Unlike research questions in quantitative research, research questions in qualitative research are usually continuously reviewed and reformulated. A central question and associated subquestions are stated more often than hypotheses. 15 The central question broadly explores a complex set of factors surrounding the central phenomenon, aiming to present the varied perspectives of participants. 15

There are varied goals for which qualitative research questions are developed. These questions can function in several ways, such as to 1) identify and describe existing conditions (contextual research questions); 2) describe a phenomenon (descriptive research questions); 3) assess the effectiveness of existing methods, protocols, theories, or procedures (evaluation research questions); 4) examine a phenomenon or analyze the reasons or relationships between subjects or phenomena (explanatory research questions); or 5) focus on unknown aspects of a particular topic (exploratory research questions). 5 In addition, some qualitative research questions provide new ideas for the development of theories and actions (generative research questions) or advance specific ideologies of a position (ideological research questions). 1 Other qualitative research questions may build on a body of existing literature and become working guidelines (ethnographic research questions). Research questions may also be broadly stated without specific reference to the existing literature or a typology of questions (phenomenological research questions), may be directed towards generating a theory of some process (grounded theory questions), or may address a description of the case and the emerging themes (qualitative case study questions). 15 We provide examples of contextual, descriptive, evaluation, explanatory, exploratory, generative, ideological, ethnographic, phenomenological, grounded theory, and qualitative case study research questions in qualitative research in Table 4, and the definition of qualitative hypothesis-generating research in Table 5.

Qualitative studies usually pose at least one central research question and several subquestions starting with How or What. These research questions use exploratory verbs such as explore or describe. These also focus on one central phenomenon of interest, and may mention the participants and research site. 15

Hypotheses in qualitative research

Hypotheses in qualitative research are stated in the form of a clear statement concerning the problem to be investigated. Unlike in quantitative research where hypotheses are usually developed to be tested, qualitative research can lead to both hypothesis-testing and hypothesis-generating outcomes. 2 When studies require both quantitative and qualitative research questions, this suggests an integrative process between both research methods wherein a single mixed-methods research question can be developed. 1

FRAMEWORKS FOR DEVELOPING RESEARCH QUESTIONS AND HYPOTHESES

Research questions followed by hypotheses should be developed before the start of the study. 1 , 12 , 14 It is crucial to develop feasible research questions on a topic that is interesting to both the researcher and the scientific community. This can be achieved by a meticulous review of previous and current studies to establish a novel topic. Specific areas are subsequently focused on to generate ethical research questions. The relevance of the research questions is evaluated in terms of clarity of the resulting data, specificity of the methodology, objectivity of the outcome, depth of the research, and impact of the study. 1 , 5 These aspects constitute the FINER criteria (i.e., Feasible, Interesting, Novel, Ethical, and Relevant). 1 Clarity and effectiveness are achieved if research questions meet the FINER criteria. In addition to the FINER criteria, Ratan et al. described focus, complexity, novelty, feasibility, and measurability for evaluating the effectiveness of research questions. 14

The PICOT and PEO frameworks are also used when developing research questions. 1 The following elements are addressed in these frameworks. PICOT: P-population/patients/problem, I-intervention or indicator being studied, C-comparison group, O-outcome of interest, and T-timeframe of the study. PEO: P-population being studied, E-exposure to preexisting conditions, and O-outcome of interest. 1 Research questions are also considered good if they meet the “FINERMAPS” framework: Feasible, Interesting, Novel, Ethical, Relevant, Manageable, Appropriate, Potential value/publishable, and Systematic. 14
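As a concrete illustration of the PICOT elements, the sketch below assembles them into a structured quantitative research question. The class, the question template wording, and the example values are assumptions made for this illustration, not taken from the article.

```python
from dataclasses import dataclass

@dataclass
class PICOT:
    """Hypothetical container for the five PICOT elements."""
    population: str
    intervention: str
    comparison: str
    outcome: str
    timeframe: str

    def to_question(self) -> str:
        # Assumed template wording, for illustration only.
        return (f"In {self.population}, does {self.intervention}, "
                f"compared with {self.comparison}, affect {self.outcome} "
                f"over {self.timeframe}?")

# Invented example values:
q = PICOT(
    population="adults with type 2 diabetes",
    intervention="a structured exercise programme",
    comparison="usual care",
    outcome="glycated haemoglobin (HbA1c)",
    timeframe="12 months",
)
print(q.to_question())
```

Filling in each slot before writing the question makes it easy to check that no element (especially the comparison group or timeframe) has been left implicit.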

As we indicated earlier, research questions and hypotheses that are not carefully formulated result in unethical studies or poor outcomes. To illustrate this, we provide some examples of ambiguous research questions and hypotheses that result in unclear and weak research objectives in quantitative research (Table 6) 16 and qualitative research (Table 7), 17 and show how to transform these ambiguous research questions and hypotheses into clear and good statements.

a These statements were composed for comparison and illustrative purposes only.

b These statements are direct quotes from Higashihara and Horiuchi. 16

a This statement is a direct quote from Shimoda et al. 17

The other statements were composed for comparison and illustrative purposes only.

CONSTRUCTING RESEARCH QUESTIONS AND HYPOTHESES

To construct effective research questions and hypotheses, it is very important to 1) clarify the background and 2) identify the research problem at the outset of the research, within a specific timeframe. 9 Then, 3) review or conduct preliminary research to collect all available knowledge about the possible research questions by studying theories and previous studies. 18 Afterwards, 4) construct research questions to investigate the research problem. Identify variables to be assessed from the research questions 4 and make operational definitions of constructs from the research problem and questions. Thereafter, 5) construct specific deductive or inductive predictions in the form of hypotheses. 4 Finally, 6) state the study aims. This general flow for constructing effective research questions and hypotheses prior to conducting research is shown in Fig. 1.

[Fig. 1. General flow for constructing effective research questions and hypotheses prior to conducting research.]

Research questions are used more frequently in qualitative research than objectives or hypotheses. 3 These questions seek to discover, understand, explore or describe experiences by asking “What” or “How.” The questions are open-ended to elicit a description rather than to relate variables or compare groups. The questions are continually reviewed, reformulated, and changed during the qualitative study. 3 In quantitative research, by contrast, research questions are used more frequently in survey projects, while hypotheses are used in experiments, to compare variables and their relationships.

Hypotheses are constructed based on the variables identified and as an if-then statement, following the template, ‘If a specific action is taken, then a certain outcome is expected.’ At this stage, some ideas regarding expectations from the research to be conducted must be drawn. 18 Then, the variables to be manipulated (independent) and influenced (dependent) are defined. 4 Thereafter, the hypothesis is stated and refined, and reproducible data tailored to the hypothesis are identified, collected, and analyzed. 4 The hypotheses must be testable and specific, 18 and should describe the variables and their relationships, the specific group being studied, and the predicted research outcome. 18 Hypothesis construction involves a testable proposition to be deduced from theory, with independent and dependent variables to be separated and measured separately. 3 Therefore, good hypotheses must be based on good research questions constructed at the start of a study or trial. 12

In summary, research questions are constructed after establishing the background of the study. Hypotheses are then developed based on the research questions. Thus, it is crucial to have excellent research questions to generate superior hypotheses. In turn, these would determine the research objectives and the design of the study, and ultimately, the outcome of the research. 12 Algorithms for building research questions and hypotheses are shown in Fig. 2 for quantitative research and in Fig. 3 for qualitative research.

[Fig. 2. Algorithm for building research questions and hypotheses in quantitative research.]

EXAMPLES OF RESEARCH QUESTIONS FROM PUBLISHED ARTICLES

  • EXAMPLE 1. Descriptive research question (quantitative research)
  • - Presents research variables to be assessed (distinct phenotypes and subphenotypes)
  • “BACKGROUND: Since COVID-19 was identified, its clinical and biological heterogeneity has been recognized. Identifying COVID-19 phenotypes might help guide basic, clinical, and translational research efforts.
  • RESEARCH QUESTION: Does the clinical spectrum of patients with COVID-19 contain distinct phenotypes and subphenotypes? ” 19
  • EXAMPLE 2. Relationship research question (quantitative research)
  • - Shows interactions between dependent variable (static postural control) and independent variable (peripheral visual field loss)
  • “Background: Integration of visual, vestibular, and proprioceptive sensations contributes to postural control. People with peripheral visual field loss have serious postural instability. However, the directional specificity of postural stability and sensory reweighting caused by gradual peripheral visual field loss remain unclear.
  • Research question: What are the effects of peripheral visual field loss on static postural control ?” 20
  • EXAMPLE 3. Comparative research question (quantitative research)
  • - Clarifies the difference among groups with an outcome variable (patients enrolled in COMPERA with moderate PH or severe PH in COPD) and another group without the outcome variable (patients with idiopathic pulmonary arterial hypertension (IPAH))
  • “BACKGROUND: Pulmonary hypertension (PH) in COPD is a poorly investigated clinical condition.
  • RESEARCH QUESTION: Which factors determine the outcome of PH in COPD?
  • STUDY DESIGN AND METHODS: We analyzed the characteristics and outcome of patients enrolled in the Comparative, Prospective Registry of Newly Initiated Therapies for Pulmonary Hypertension (COMPERA) with moderate or severe PH in COPD as defined during the 6th PH World Symposium who received medical therapy for PH and compared them with patients with idiopathic pulmonary arterial hypertension (IPAH) .” 21
  • EXAMPLE 4. Exploratory research question (qualitative research)
  • - Explores areas that have not been fully investigated (perspectives of families and children who receive care in clinic-based child obesity treatment) to have a deeper understanding of the research problem
  • “Problem: Interventions for children with obesity lead to only modest improvements in BMI and long-term outcomes, and data are limited on the perspectives of families of children with obesity in clinic-based treatment. This scoping review seeks to answer the question: What is known about the perspectives of families and children who receive care in clinic-based child obesity treatment? This review aims to explore the scope of perspectives reported by families of children with obesity who have received individualized outpatient clinic-based obesity treatment.” 22
  • EXAMPLE 5. Relationship research question (quantitative research)
  • - Defines interactions between dependent variable (use of ankle strategies) and independent variable (changes in muscle tone)
  • “Background: To maintain an upright standing posture against external disturbances, the human body mainly employs two types of postural control strategies: “ankle strategy” and “hip strategy.” While it has been reported that the magnitude of the disturbance alters the use of postural control strategies, it has not been elucidated how the level of muscle tone, one of the crucial parameters of bodily function, determines the use of each strategy. We have previously confirmed using forward dynamics simulations of human musculoskeletal models that an increased muscle tone promotes the use of ankle strategies. The objective of the present study was to experimentally evaluate a hypothesis: an increased muscle tone promotes the use of ankle strategies. Research question: Do changes in the muscle tone affect the use of ankle strategies ?” 23

EXAMPLES OF HYPOTHESES IN PUBLISHED ARTICLES

  • EXAMPLE 1. Working hypothesis (quantitative research)
  • - A hypothesis that is initially accepted for further research to produce a feasible theory
  • “As fever may have benefit in shortening the duration of viral illness, it is plausible to hypothesize that the antipyretic efficacy of ibuprofen may be hindering the benefits of a fever response when taken during the early stages of COVID-19 illness .” 24
  • “In conclusion, it is plausible to hypothesize that the antipyretic efficacy of ibuprofen may be hindering the benefits of a fever response . The difference in perceived safety of these agents in COVID-19 illness could be related to the more potent efficacy to reduce fever with ibuprofen compared to acetaminophen. Compelling data on the benefit of fever warrant further research and review to determine when to treat or withhold ibuprofen for early stage fever for COVID-19 and other related viral illnesses .” 24
  • EXAMPLE 2. Exploratory hypothesis (qualitative research)
  • - Explores particular areas deeper to clarify subjective experience and develop a formal hypothesis potentially testable in a future quantitative approach
  • “We hypothesized that when thinking about a past experience of help-seeking, a self distancing prompt would cause increased help-seeking intentions and more favorable help-seeking outcome expectations .” 25
  • “Conclusion
  • Although a priori hypotheses were not supported, further research is warranted as results indicate the potential for using self-distancing approaches to increasing help-seeking among some people with depressive symptomatology.” 25
  • EXAMPLE 3. Hypothesis-generating research to establish a framework for hypothesis testing (qualitative research)
  • “We hypothesize that compassionate care is beneficial for patients (better outcomes), healthcare systems and payers (lower costs), and healthcare providers (lower burnout). ” 26
  • “Compassionomics is the branch of knowledge and scientific study of the effects of compassionate healthcare. Our main hypotheses are that compassionate healthcare is beneficial for (1) patients, by improving clinical outcomes, (2) healthcare systems and payers, by supporting financial sustainability, and (3) HCPs, by lowering burnout and promoting resilience and well-being. The purpose of this paper is to establish a scientific framework for testing the hypotheses above . If these hypotheses are confirmed through rigorous research, compassionomics will belong in the science of evidence-based medicine, with major implications for all healthcare domains.” 26
  • EXAMPLE 4. Statistical hypothesis (quantitative research)
  • - An assumption is made about the relationship among several population characteristics ( gender differences in sociodemographic and clinical characteristics of adults with ADHD ). Validity is tested by statistical experiment or analysis ( chi-square test, Student’s t-test, and logistic regression analysis)
  • “Our research investigated gender differences in sociodemographic and clinical characteristics of adults with ADHD in a Japanese clinical sample. Due to unique Japanese cultural ideals and expectations of women's behavior that are in opposition to ADHD symptoms, we hypothesized that women with ADHD experience more difficulties and present more dysfunctions than men . We tested the following hypotheses: first, women with ADHD have more comorbidities than men with ADHD; second, women with ADHD experience more social hardships than men, such as having less full-time employment and being more likely to be divorced.” 27
  • “Statistical Analysis
  • ( text omitted ) Between-gender comparisons were made using the chi-squared test for categorical variables and Student’s t-test for continuous variables…( text omitted ). A logistic regression analysis was performed for employment status, marital status, and comorbidity to evaluate the independent effects of gender on these dependent variables.” 27
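
The between-group comparisons described in Example 4 (a chi-squared test for categorical variables and Student's t-test for continuous ones) can be sketched in a few lines of Python. The data and variable names below are invented purely for illustration, and a real analysis would normally use a statistics package; this sketch only shows how the two test statistics relate to grouped data.

```python
import math
from statistics import mean, stdev

# Hypothetical counts for a 2x2 table (employment status by group).
# These numbers are illustrative only, not from the cited study.
women_employed, women_unemployed = 18, 32
men_employed, men_unemployed = 30, 20

def chi_square_2x2(a, b, c, d):
    """Chi-squared statistic for a 2x2 contingency table
    [[a, b], [c, d]], without continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def students_t(x, y):
    """Two-sample Student's t statistic assuming equal variances."""
    nx, ny = len(x), len(y)
    # Pooled variance across the two samples.
    sp2 = ((nx - 1) * stdev(x) ** 2 + (ny - 1) * stdev(y) ** 2) / (nx + ny - 2)
    return (mean(x) - mean(y)) / math.sqrt(sp2 * (1 / nx + 1 / ny))

chi2 = chi_square_2x2(women_employed, women_unemployed,
                      men_employed, men_unemployed)

# Hypothetical continuous scores for the same two groups.
scores_women = [24.0, 27.5, 22.0, 30.0, 26.5, 28.0]
scores_men = [20.0, 23.5, 19.0, 25.0, 21.5, 22.0]
t = students_t(scores_women, scores_men)

print(f"chi-squared = {chi2:.2f}, t = {t:.2f}")
```

With SciPy available, the same comparisons would typically be done with `scipy.stats.chi2_contingency` and `scipy.stats.ttest_ind`, which also return p values; the point here is that each reported statistic maps directly onto the grouped data described in the methods.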

EXAMPLES OF HYPOTHESIS AS WRITTEN IN PUBLISHED ARTICLES IN RELATION TO OTHER PARTS

  • EXAMPLE 1. Background, hypotheses, and aims are provided
  • “Pregnant women need skilled care during pregnancy and childbirth, but that skilled care is often delayed in some countries …( text omitted ). The focused antenatal care (FANC) model of WHO recommends that nurses provide information or counseling to all pregnant women …( text omitted ). Job aids are visual support materials that provide the right kind of information using graphics and words in a simple and yet effective manner. When nurses are not highly trained or have many work details to attend to, these job aids can serve as a content reminder for the nurses and can be used for educating their patients (Jennings, Yebadokpo, Affo, & Agbogbe, 2010) ( text omitted ). Importantly, additional evidence is needed to confirm how job aids can further improve the quality of ANC counseling by health workers in maternal care …( text omitted )” 28
  • “ This has led us to hypothesize that the quality of ANC counseling would be better if supported by job aids. Consequently, a better quality of ANC counseling is expected to produce higher levels of awareness concerning the danger signs of pregnancy and a more favorable impression of the caring behavior of nurses .” 28
  • “This study aimed to examine the differences in the responses of pregnant women to a job aid-supported intervention during ANC visit in terms of 1) their understanding of the danger signs of pregnancy and 2) their impression of the caring behaviors of nurses to pregnant women in rural Tanzania.” 28
  • EXAMPLE 2. Background, hypotheses, and aims are provided
  • “We conducted a two-arm randomized controlled trial (RCT) to evaluate and compare changes in salivary cortisol and oxytocin levels of first-time pregnant women between experimental and control groups. The women in the experimental group touched and held an infant for 30 min (experimental intervention protocol), whereas those in the control group watched a DVD movie of an infant (control intervention protocol). The primary outcome was salivary cortisol level and the secondary outcome was salivary oxytocin level.” 29
  • “ We hypothesize that at 30 min after touching and holding an infant, the salivary cortisol level will significantly decrease and the salivary oxytocin level will increase in the experimental group compared with the control group .” 29
  • EXAMPLE 3. Background, aim, and hypothesis are provided
  • “In countries where the maternal mortality ratio remains high, antenatal education to increase Birth Preparedness and Complication Readiness (BPCR) is considered one of the top priorities [1]. BPCR includes birth plans during the antenatal period, such as the birthplace, birth attendant, transportation, health facility for complications, expenses, and birth materials, as well as family coordination to achieve such birth plans. In Tanzania, although increasing, only about half of all pregnant women attend an antenatal clinic more than four times [4]. Moreover, the information provided during antenatal care (ANC) is insufficient. In the resource-poor settings, antenatal group education is a potential approach because of the limited time for individual counseling at antenatal clinics.” 30
  • “This study aimed to evaluate an antenatal group education program among pregnant women and their families with respect to birth-preparedness and maternal and infant outcomes in rural villages of Tanzania.” 30
  • “ The study hypothesis was if Tanzanian pregnant women and their families received a family-oriented antenatal group education, they would (1) have a higher level of BPCR, (2) attend antenatal clinic four or more times, (3) give birth in a health facility, (4) have less complications of women at birth, and (5) have less complications and deaths of infants than those who did not receive the education .” 30

Research questions and hypotheses are crucial components of any type of research, whether quantitative or qualitative, and should be developed at the very beginning of the study. Excellent research questions lead to superior hypotheses, which, like a compass, set the direction of research and can often determine the successful conduct of the study. Many research studies have floundered because the development of research questions and subsequent hypotheses was not given the thought and meticulous attention needed. Developing them is an iterative process that rests on extensive knowledge of the literature and an insightful grasp of the knowledge gap. Focused, concise, and specific research questions provide a strong foundation for constructing hypotheses, which serve as formal predictions about the research outcomes. Carefully constructed research questions and hypotheses define well-founded objectives that determine the design, course, and outcome of the study, and help avoid unethical studies and poor outcomes.

Disclosure: The authors have no potential conflicts of interest to disclose.

Author Contributions:

  • Conceptualization: Barroga E, Matanguihan GJ.
  • Methodology: Barroga E, Matanguihan GJ.
  • Writing - original draft: Barroga E, Matanguihan GJ.
  • Writing - review & editing: Barroga E, Matanguihan GJ.


16. Reporting quantitative results

Chapter outline.

  • Reporting quantitative results (8 minute read time)

Content warning: Brief discussion of violence against women.

16.1 Reporting quantitative results

Learning objectives.

Learners will be able to…

  • Execute a quantitative research report using key elements for accuracy and openness

So you’ve completed your quantitative analyses and are ready to report your results. We’re going to spend some time talking about what matters in quantitative research reports, but the very first thing to understand is this: openness with your data and analyses is key. You should never hide what you did to get to a particular conclusion and, if someone wanted to and could ethically access your data, they should be able to replicate more or less exactly what you did. While your quantitative report won’t have every single step you took to get to your conclusion, it should have plenty of detail so someone can get the picture.

Below, I’m going to take you through the key elements of a quantitative research report. This overview is pretty general and conceptual, and it will be helpful for you to look at existing scholarly articles that deal with quantitative research (like ones in your literature review) to see the structure applied. Also keep in mind that your instructor may want the sections broken out slightly differently; nonetheless, the content I outline below should be in your research report.

Introduction and literature review

These are what you’re working on building with your research proposal this semester. They should be included as part of your research report so that readers have enough information to evaluate your research for themselves. What’s here should be very similar to the introduction and literature review from your research proposal, where you described the literature relevant to the study you wanted to do. With your results in hand, though, you may find that you have to add information to the literature you wrote previously to help orient the reader of the report to important topics needed to understand the results of your study.

Methods

In this section, you should explicitly lay out your study design – for instance, if it was experimental, be specific about the type of experimental design. Discuss the type of sampling that you used, if that’s applicable to your project. You should also give a general description of your data, including the time period, any exclusions you made from the original data set, and the source – i.e., did you collect it yourself or was it secondary data? Next, talk about the specific statistical methods you used, like t-tests, chi-square tests, or regression analyses. For descriptive statistics, you can be relatively general – you don’t need to say “I looked at means and medians,” for instance. You need to provide enough information here that someone could replicate what you did.

In this section, you should also discuss how you operationalized your variables. What did you mean when you asked about educational attainment – did you ask for a grade number, or did you ask respondents to pick a range that you turned into a category? This is key information for readers to understand your research. Remember when you were looking for ways to operationalize your variables? Be the kind of author who provides enough information on operationalization so people can actually understand what you did.

Results

You’re going to run lots of different analyses before settling on what finally makes sense to report – positive or negative – for your study. In this section, provide tables describing your sample, including, but not limited to, sample size, frequencies of sample characteristics like race and gender, levels of measurement, appropriate measures of central tendency, standard deviations, and variances. Here you will also want to focus on the analyses you used to actually draw whatever conclusion you settled on, both descriptive and inferential (i.e., bivariate or multivariate).
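
As a rough sketch of the kind of descriptive summary this section calls for, the snippet below computes sample size, frequencies, and measures of central tendency and spread for a small sample, using only the Python standard library. All names and values are hypothetical, invented for illustration.

```python
from collections import Counter
from statistics import mean, median, stdev, variance

# Hypothetical sample: each record is (gender, age). Illustrative only.
sample = [("F", 34), ("M", 41), ("F", 29), ("F", 52),
          ("M", 37), ("F", 45), ("M", 30), ("F", 38)]

genders = [g for g, _ in sample]
ages = [a for _, a in sample]

# Sample size and frequencies of a categorical characteristic.
print(f"n = {len(sample)}")
for category, count in sorted(Counter(genders).items()):
    print(f"gender = {category}: {count} ({100 * count / len(sample):.0f}%)")

# Central tendency and spread of a continuous characteristic.
print(f"age: mean = {mean(ages):.1f}, median = {median(ages):.1f}, "
      f"SD = {stdev(ages):.1f}, variance = {variance(ages):.1f}")
```

In practice the same table is often produced with pandas (`DataFrame.describe`, `value_counts`), but whatever tool is used, the reported quantities – n, frequencies, central tendency, and spread – are what let readers judge the sample.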

The actual statistics you report depend entirely on the kind of statistical analysis you do. For instance, if you’re reporting on a logistic regression, it’s going to look a little different than reporting on an ANOVA. In the previous chapter, we provided links to open textbooks that detail how to conduct quantitative data analysis. You should look at these resources and consult with your research professor to help you determine what is expected in a report about the particular statistical method you used.

The important thing to remember here – as we mentioned above – is that you need to be totally transparent about your results, even and especially if they don’t support your hypothesis. There is value in a disproved hypothesis, too – you now know something about how the state of the world is not.

Discussion

In this section, you’re going to connect your statistical results back to your hypothesis and discuss whether your results support it or not. You are also going to talk about what the results mean for the larger field of study of which your research is a part, the implications of your findings if you’re evaluating some kind of intervention, and how your research relates to what is already out there in this field. When your research doesn’t pan out the way you expect, if you’re able to make some educated guesses as to why this might be (supported by literature if possible, but practice wisdom works too), share those as well.

Let’s take a minute to talk about what happens when your findings disprove your hypothesis or actually indicate something negative about the group you are studying. The discussion section is where you can contextualize “negative” findings. For example, say you conducted a study that indicated that a certain group is more likely to commit violent crime. Here, you have an opportunity to talk about why this might be the case outside of their membership in that group, and how membership in that group does not automatically mean someone will commit a violent crime. You can present mitigating factors, like a history of personal and community trauma. It’s extremely important to provide this relevant context so that your results are more difficult to use against a group you are studying in a way that doesn’t reflect your actual findings.

Limitations

In this section, you’re going to critique your own study. What are the advantages, disadvantages, and trade-offs of what you did to define and analyze your variables? Some questions you might consider include:  What limits the study’s applicability to the population at large? Were there trade-offs you had to make between rigor and available data? Did the statistical analyses you used mean that you could only get certain types of results? What would have made the study more widely applicable or more useful for a certain group? You should be thinking about this throughout the analysis process so you can properly contextualize your results.

In this section, you may also consider discussing any threats to internal validity that you identified and whether you think you can generalize your research. Finally, if you used any measurement tools that haven’t been validated yet, discuss how this could have affected your results.

Significance and conclusions

Finally, you want to use the conclusions section to bring it full circle for your reader – why did this research matter? Talk about how it contributed to knowledge around the topic and how might it be used to further practice. Identify and discuss ethical implications of your findings for social workers and social work research. Finally, make sure to talk about the next steps for you, other researchers, or policy-makers based on your research findings.

Key Takeaways

  • Your quantitative research report should provide the reader with transparent, replicable methods and put your research into the context of existing literature, real-world practice and social work ethics.
  • Think about the research project you are building now. What could a negative finding be, and how might you provide your reader with context to ensure that you are not harming your study population?


Graduate research methods in social work Copyright © 2020 by Matthew DeCarlo, Cory Cummings, Kate Agnelli is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.


  • Open access
  • Published: 29 March 2021

The PRISMA 2020 statement: an updated guideline for reporting systematic reviews

  • Matthew J. Page   ORCID: orcid.org/0000-0002-4242-7526 1 ,
  • Joanne E. McKenzie 1 ,
  • Patrick M. Bossuyt 2 ,
  • Isabelle Boutron 3 ,
  • Tammy C. Hoffmann 4 ,
  • Cynthia D. Mulrow 5 ,
  • Larissa Shamseer 6 ,
  • Jennifer M. Tetzlaff 7 ,
  • Elie A. Akl 8 ,
  • Sue E. Brennan 1 ,
  • Roger Chou 9 ,
  • Julie Glanville 10 ,
  • Jeremy M. Grimshaw 11 ,
  • Asbjørn Hróbjartsson 12 ,
  • Manoj M. Lalu 13 ,
  • Tianjing Li 14 ,
  • Elizabeth W. Loder 15 ,
  • Evan Mayo-Wilson 16 ,
  • Steve McDonald 1 ,
  • Luke A. McGuinness 17 ,
  • Lesley A. Stewart 18 ,
  • James Thomas 19 ,
  • Andrea C. Tricco 20 ,
  • Vivian A. Welch 21 ,
  • Penny Whiting 17 &
  • David Moher 22  

Systematic Reviews, volume 10, Article number: 89 (2021)


An Editorial to this article was published on 19 April 2021

The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement, published in 2009, was designed to help systematic reviewers transparently report why the review was done, what the authors did, and what they found. Over the past decade, advances in systematic review methodology and terminology have necessitated an update to the guideline. The PRISMA 2020 statement replaces the 2009 statement and includes new reporting guidance that reflects advances in methods to identify, select, appraise, and synthesise studies. The structure and presentation of the items have been modified to facilitate implementation. In this article, we present the PRISMA 2020 27-item checklist, an expanded checklist that details reporting recommendations for each item, the PRISMA 2020 abstract checklist, and the revised flow diagrams for original and updated reviews. To encourage wide dissemination, this article is freely accessible on the BMJ, PLOS Medicine, Journal of Clinical Epidemiology, and International Journal of Surgery journal websites.

Systematic reviews serve many critical roles. They can provide syntheses of the state of knowledge in a field, from which future research priorities can be identified; they can address questions that otherwise could not be answered by individual studies; they can identify problems in primary research that should be rectified in future studies; and they can generate or evaluate theories about how or why phenomena occur. Systematic reviews therefore generate various types of knowledge for different users of reviews (such as patients, healthcare providers, researchers, and policy makers) [ 1 , 2 ]. To ensure a systematic review is valuable to users, authors should prepare a transparent, complete, and accurate account of why the review was done, what they did (such as how studies were identified and selected) and what they found (such as characteristics of contributing studies and results of meta-analyses). Up-to-date reporting guidance facilitates authors achieving this [ 3 ].

The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement published in 2009 (hereafter referred to as PRISMA 2009) [ 4 , 5 , 6 , 7 , 8 , 9 , 10 ] is a reporting guideline designed to address poor reporting of systematic reviews [ 11 ]. The PRISMA 2009 statement comprised a checklist of 27 items recommended for reporting in systematic reviews and an “explanation and elaboration” paper [ 12 , 13 , 14 , 15 , 16 ] providing additional reporting guidance for each item, along with exemplars of reporting. The recommendations have been widely endorsed and adopted, as evidenced by its co-publication in multiple journals, citation in over 60,000 reports (Scopus, August 2020), endorsement from almost 200 journals and systematic review organisations, and adoption in various disciplines. Evidence from observational studies suggests that use of the PRISMA 2009 statement is associated with more complete reporting of systematic reviews [ 17 , 18 , 19 , 20 ], although more could be done to improve adherence to the guideline [ 21 ].

Many innovations in the conduct of systematic reviews have occurred since publication of the PRISMA 2009 statement. For example, technological advances have enabled the use of natural language processing and machine learning to identify relevant evidence [ 22 , 23 , 24 ], methods have been proposed to synthesise and present findings when meta-analysis is not possible or appropriate [ 25 , 26 , 27 ], and new methods have been developed to assess the risk of bias in results of included studies [ 28 , 29 ]. Evidence on sources of bias in systematic reviews has accrued, culminating in the development of new tools to appraise the conduct of systematic reviews [ 30 , 31 ]. Terminology used to describe particular review processes has also evolved, as in the shift from assessing “quality” to assessing “certainty” in the body of evidence [ 32 ]. In addition, the publishing landscape has transformed, with multiple avenues now available for registering and disseminating systematic review protocols [ 33 , 34 ], disseminating reports of systematic reviews, and sharing data and materials, such as preprint servers and publicly accessible repositories. Capturing these advances in the reporting of systematic reviews necessitated an update to the PRISMA 2009 statement.

Development of PRISMA 2020

A complete description of the methods used to develop PRISMA 2020 is available elsewhere [ 35 ]. We identified PRISMA 2009 items that were often reported incompletely by examining the results of studies investigating the transparency of reporting of published reviews [ 17 , 21 , 36 , 37 ]. We identified possible modifications to the PRISMA 2009 statement by reviewing 60 documents providing reporting guidance for systematic reviews (including reporting guidelines, handbooks, tools, and meta-research studies) [ 38 ]. These reviews of the literature were used to inform the content of a survey with suggested possible modifications to the 27 items in PRISMA 2009 and possible additional items. Respondents were asked whether they believed we should keep each PRISMA 2009 item as is, modify it, or remove it, and whether we should add each additional item. Systematic review methodologists and journal editors were invited to complete the online survey (110 of 220 invited responded). We discussed proposed content and wording of the PRISMA 2020 statement, as informed by the review and survey results, at a 21-member, two-day, in-person meeting in September 2018 in Edinburgh, Scotland. Throughout 2019 and 2020, we circulated an initial draft and five revisions of the checklist and explanation and elaboration paper to co-authors for feedback. In April 2020, we invited 22 systematic reviewers who had expressed interest in providing feedback on the PRISMA 2020 checklist to share their views (via an online survey) on the layout and terminology used in a preliminary version of the checklist. Feedback was received from 15 individuals and considered by the first author, and any revisions deemed necessary were incorporated before the final version was approved and endorsed by all co-authors.

The PRISMA 2020 statement

Scope of the guideline.

The PRISMA 2020 statement has been designed primarily for systematic reviews of studies that evaluate the effects of health interventions, irrespective of the design of the included studies. However, the checklist items are applicable to reports of systematic reviews evaluating other interventions (such as social or educational interventions), and many items are applicable to systematic reviews with objectives other than evaluating interventions (such as evaluating aetiology, prevalence, or prognosis). PRISMA 2020 is intended for use in systematic reviews that include synthesis (such as pairwise meta-analysis or other statistical synthesis methods) or do not include synthesis (for example, because only one eligible study is identified). The PRISMA 2020 items are relevant for mixed-methods systematic reviews (which include quantitative and qualitative studies), but reporting guidelines addressing the presentation and synthesis of qualitative data should also be consulted [ 39 , 40 ]. PRISMA 2020 can be used for original systematic reviews, updated systematic reviews, or continually updated (“living”) systematic reviews. However, for updated and living systematic reviews, there may be some additional considerations that need to be addressed. Where there is relevant content from other reporting guidelines, we reference these guidelines within the items in the explanation and elaboration paper [ 41 ] (such as PRISMA-Search [ 42 ] in items 6 and 7, Synthesis without meta-analysis (SWiM) reporting guideline [ 27 ] in item 13d). Box 1 includes a glossary of terms used throughout the PRISMA 2020 statement.

PRISMA 2020 is not intended to guide systematic review conduct, for which comprehensive resources are available [ 43 , 44 , 45 , 46 ]. However, familiarity with PRISMA 2020 is useful when planning and conducting systematic reviews to ensure that all recommended information is captured. PRISMA 2020 should not be used to assess the conduct or methodological quality of systematic reviews; other tools exist for this purpose [ 30 , 31 ]. Furthermore, PRISMA 2020 is not intended to inform the reporting of systematic review protocols, for which a separate statement is available (PRISMA for Protocols (PRISMA-P) 2015 statement [ 47 , 48 ]). Finally, extensions to the PRISMA 2009 statement have been developed to guide reporting of network meta-analyses [ 49 ], meta-analyses of individual participant data [ 50 ], systematic reviews of harms [ 51 ], systematic reviews of diagnostic test accuracy studies [ 52 ], and scoping reviews [ 53 ]; for these types of reviews we recommend authors report their review in accordance with the recommendations in PRISMA 2020 along with the guidance specific to the extension.

How to use PRISMA 2020

The PRISMA 2020 statement (including the checklists, explanation and elaboration, and flow diagram) replaces the PRISMA 2009 statement, which should no longer be used. Box 2 summarises noteworthy changes from the PRISMA 2009 statement. The PRISMA 2020 checklist includes seven sections with 27 items, some of which include sub-items (Table 1). A checklist for journal and conference abstracts for systematic reviews is included in PRISMA 2020. This abstract checklist is an update of the 2013 PRISMA for Abstracts statement [ 54 ], reflecting new and modified content in PRISMA 2020 (Table 2). A template PRISMA flow diagram is provided, which can be modified depending on whether the systematic review is original or updated (Fig. 1).

Fig. 1 PRISMA 2020 flow diagram template for systematic reviews. The new design is adapted from flow diagrams proposed by Boers [ 55 ], Mayo-Wilson et al. [ 56 ], and Stovold et al. [ 57 ]. The boxes in grey should only be completed if applicable; otherwise they should be removed from the flow diagram. Note that a “report” could be a journal article, preprint, conference abstract, study register entry, clinical study report, dissertation, unpublished manuscript, government report, or any other document providing relevant information.

We recommend authors refer to PRISMA 2020 early in the writing process, because prospective consideration of the items may help to ensure that all the items are addressed. To help keep track of which items have been reported, the PRISMA statement website ( http://www.prisma-statement.org/ ) includes fillable templates of the checklists to download and complete (also available in Additional file 1 ). We have also created a web application that allows users to complete the checklist via a user-friendly interface [ 58 ] (available at https://prisma.shinyapps.io/checklist/ and adapted from the Transparency Checklist app [ 59 ]). The completed checklist can be exported to Word or PDF. Editable templates of the flow diagram can also be downloaded from the PRISMA statement website.

We have prepared an updated explanation and elaboration paper, in which we explain why reporting of each item is recommended and present bullet points that detail the reporting recommendations (which we refer to as elements) [ 41 ]. The bullet-point structure is new to PRISMA 2020 and has been adopted to facilitate implementation of the guidance [ 60 , 61 ]. An expanded checklist, which comprises an abridged version of the elements presented in the explanation and elaboration paper, with references and some examples removed, is available in Additional file 2 . Consulting the explanation and elaboration paper is recommended if further clarity or information is required.

Journals and publishers might impose word and section limits, and limits on the number of tables and figures allowed in the main report. In such cases, if the relevant information for some items already appears in a publicly accessible review protocol, referring to the protocol may suffice. Alternatively, placing detailed descriptions of the methods used or additional results (such as for less critical outcomes) in supplementary files is recommended. Ideally, supplementary files should be deposited to a general-purpose or institutional open-access repository that provides free and permanent access to the material (such as Open Science Framework, Dryad, figshare). A reference or link to the additional information should be included in the main report. Finally, although PRISMA 2020 provides a template for where information might be located, the suggested location should not be seen as prescriptive; the guiding principle is to ensure the information is reported.

Use of PRISMA 2020 has the potential to benefit many stakeholders. Complete reporting allows readers to assess the appropriateness of the methods, and therefore the trustworthiness of the findings. Presenting and summarising characteristics of studies contributing to a synthesis allows healthcare providers and policy makers to evaluate the applicability of the findings to their setting. Describing the certainty in the body of evidence for an outcome and the implications of findings should help policy makers, managers, and other decision makers formulate appropriate recommendations for practice or policy. Complete reporting of all PRISMA 2020 items also facilitates replication and review updates, as well as inclusion of systematic reviews in overviews (of systematic reviews) and guidelines, so teams can leverage work that is already done and decrease research waste [ 36 , 62 , 63 ].

We updated the PRISMA 2009 statement by adapting the EQUATOR Network’s guidance for developing health research reporting guidelines [ 64 ]. We evaluated the reporting completeness of published systematic reviews [ 17 , 21 , 36 , 37 ], reviewed the items included in other documents providing guidance for systematic reviews [ 38 ], surveyed systematic review methodologists and journal editors for their views on how to revise the original PRISMA statement [ 35 ], discussed the findings at an in-person meeting, and prepared this document through an iterative process. Our recommendations are informed by the reviews and survey conducted before the in-person meeting, theoretical considerations about which items facilitate replication and help users assess the risk of bias and applicability of systematic reviews, and co-authors’ experience with authoring and using systematic reviews.

Various strategies to increase the use of reporting guidelines and improve reporting have been proposed. They include educators introducing reporting guidelines into graduate curricula to promote good reporting habits of early career scientists [ 65 ]; journal editors and regulators endorsing use of reporting guidelines [ 18 ]; peer reviewers evaluating adherence to reporting guidelines [ 61 , 66 ]; journals requiring authors to indicate where in their manuscript they have adhered to each reporting item [ 67 ]; and authors using online writing tools that prompt complete reporting at the writing stage [ 60 ]. Multi-pronged interventions, where more than one of these strategies is combined, may be more effective (such as completion of checklists coupled with editorial checks) [ 68 ]. However, of 31 interventions proposed to increase adherence to reporting guidelines, the effects of only 11 have been evaluated, mostly in observational studies at high risk of bias due to confounding [ 69 ]. It is therefore unclear which strategies should be used. Future research might explore barriers and facilitators to the use of PRISMA 2020 by authors, editors, and peer reviewers; design interventions that address the identified barriers; and evaluate those interventions using randomised trials. To inform possible revisions to the guideline, it would also be valuable to conduct think-aloud studies [ 70 ] to understand how systematic reviewers interpret the items, and reliability studies to identify items where interpretation varies.

We encourage readers to submit evidence that informs any of the recommendations in PRISMA 2020 (via the PRISMA statement website: http://www.prisma-statement.org/ ). To enhance accessibility of PRISMA 2020, several translations of the guideline are under way (see available translations at the PRISMA statement website). We encourage journal editors and publishers to raise awareness of PRISMA 2020 (for example, by referring to it in journal “Instructions to authors”), endorse its use, advise editors and peer reviewers to evaluate submitted systematic reviews against the PRISMA 2020 checklists, and amend journal policies to accommodate the new reporting recommendations. We recommend existing PRISMA extensions [ 47 , 49 , 50 , 51 , 52 , 53 , 71 , 72 ] be updated to reflect PRISMA 2020 and advise developers of new PRISMA extensions to use PRISMA 2020 as the foundation document.

We anticipate that the PRISMA 2020 statement will benefit authors, editors, and peer reviewers of systematic reviews, and different users of reviews, including guideline developers, policy makers, healthcare providers, patients, and other stakeholders. Ultimately, we hope that uptake of the guideline will lead to more transparent, complete, and accurate reporting of systematic reviews, thus facilitating evidence based decision making.

Box 1 Glossary of terms

Systematic review —A review that uses explicit, systematic methods to collate and synthesise findings of studies that address a clearly formulated question [ 43 ]

Statistical synthesis —The combination of quantitative results of two or more studies. This encompasses meta-analysis of effect estimates (described below) and other methods, such as combining P values, calculating the range and distribution of observed effects, and vote counting based on the direction of effect (see McKenzie and Brennan [ 25 ] for a description of each method)

Meta-analysis of effect estimates —A statistical technique used to synthesise results when study effect estimates and their variances are available, yielding a quantitative summary of results [ 25 ]
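To make the definition above concrete, a fixed-effect, inverse-variance meta-analysis can be sketched in a few lines. This is a minimal illustration, not part of the PRISMA guidance: the function name and the three studies' effect estimates and variances are hypothetical, and real syntheses would typically use dedicated software and consider random-effects models.

```python
import math

def inverse_variance_meta(estimates, variances):
    """Pool study effect estimates using fixed-effect, inverse-variance
    weighting. Returns the pooled estimate and its 95% confidence interval,
    i.e. a point estimate plus a measure of precision (a "result" in the
    glossary's terms)."""
    weights = [1.0 / v for v in variances]  # weight each study by 1/variance
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))      # standard error of the pooled estimate
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three hypothetical studies: mean differences with their variances
pooled, ci = inverse_variance_meta([0.5, 0.3, 0.7], [0.04, 0.09, 0.16])
```

A random-effects meta-analysis additionally incorporates an estimate of between-study variance; see McKenzie and Brennan [ 25 ] and the Cochrane handbook [ 43 ] for guidance on choosing among synthesis methods.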

Outcome —An event or measurement collected for participants in a study (such as quality of life, mortality)

Result —The combination of a point estimate (such as a mean difference, risk ratio, or proportion) and a measure of its precision (such as a confidence/credible interval) for a particular outcome

Report —A document (paper or electronic) supplying information about a particular study. It could be a journal article, preprint, conference abstract, study register entry, clinical study report, dissertation, unpublished manuscript, government report, or any other document providing relevant information

Record —The title or abstract (or both) of a report indexed in a database or website (such as a title or abstract for an article indexed in Medline). Records that refer to the same report (such as the same journal article) are “duplicates”; however, records that refer to reports that are merely similar (such as a similar abstract submitted to two different conferences) should be considered unique.

Study —An investigation, such as a clinical trial, that includes a defined group of participants and one or more interventions and outcomes. A “study” might have multiple reports. For example, reports could include the protocol, statistical analysis plan, baseline characteristics, results for the primary outcome, results for harms, results for secondary outcomes, and results for additional mediator and moderator analyses

Box 2 Noteworthy changes to the PRISMA 2009 statement

• Inclusion of the abstract reporting checklist within PRISMA 2020 (see item #2 and Table 2 ).

• Movement of the ‘Protocol and registration’ item from the start of the Methods section of the checklist to a new Other section, with addition of a sub-item recommending authors describe amendments to information provided at registration or in the protocol (see item #24a-24c).

• Modification of the ‘Search’ item to recommend authors present full search strategies for all databases, registers and websites searched, not just at least one database (see item #7).

• Modification of the ‘Study selection’ item in the Methods section to emphasise the reporting of how many reviewers screened each record and each report retrieved, whether they worked independently, and if applicable, details of automation tools used in the process (see item #8).

• Addition of a sub-item to the ‘Data items’ item recommending authors report how outcomes were defined, which results were sought, and methods for selecting a subset of results from included studies (see item #10a).

• Splitting of the ‘Synthesis of results’ item in the Methods section into six sub-items recommending authors describe: the processes used to decide which studies were eligible for each synthesis; any methods required to prepare the data for synthesis; any methods used to tabulate or visually display results of individual studies and syntheses; any methods used to synthesise results; any methods used to explore possible causes of heterogeneity among study results (such as subgroup analysis, meta-regression); and any sensitivity analyses used to assess robustness of the synthesised results (see item #13a-13f).

• Addition of a sub-item to the ‘Study selection’ item in the Results section recommending authors cite studies that might appear to meet the inclusion criteria, but which were excluded, and explain why they were excluded (see item #16b).

• Splitting of the ‘Synthesis of results’ item in the Results section into four sub-items recommending authors: briefly summarise the characteristics and risk of bias among studies contributing to the synthesis; present results of all statistical syntheses conducted; present results of any investigations of possible causes of heterogeneity among study results; and present results of any sensitivity analyses (see item #20a-20d).

• Addition of new items recommending authors report methods for and results of an assessment of certainty (or confidence) in the body of evidence for an outcome (see items #15 and #22).

• Addition of a new item recommending authors declare any competing interests (see item #26).

• Addition of a new item recommending authors indicate whether data, analytic code and other materials used in the review are publicly available and if so, where they can be found (see item #27).

References

Gurevitch J, Koricheva J, Nakagawa S, Stewart G. Meta-analysis and the science of research synthesis. Nature. 2018;555:175–82. https://doi.org/10.1038/nature25753 .

Gough D, Thomas J, Oliver S. Clarifying differences between reviews within evidence ecosystems. Syst Rev. 2019;8:170. https://doi.org/10.1186/s13643-019-1089-2 .

Moher D. Reporting guidelines: doing better for readers. BMC Med. 2018;16:233. https://doi.org/10.1186/s12916-018-1226-0 .

Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151:264–9, W64. https://doi.org/10.7326/0003-4819-151-4-200908180-00135 .

Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ. 2009;339:b2535. https://doi.org/10.1136/bmj.b2535 .

Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6:e1000097. https://doi.org/10.1371/journal.pmed.1000097 .

Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. J Clin Epidemiol. 2009;62:1006–12. https://doi.org/10.1016/j.jclinepi.2009.06.005 .

Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Int J Surg. 2010;8:336–41. https://doi.org/10.1016/j.ijsu.2010.02.007 .

Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Open Med. 2009;3:e123–30.

Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Reprint--preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Phys Ther. 2009;89:873–80. https://doi.org/10.1093/ptj/89.9.873 .

Moher D, Tetzlaff J, Tricco AC, Sampson M, Altman DG. Epidemiology and reporting characteristics of systematic reviews. PLoS Med. 2007;4:e78. https://doi.org/10.1371/journal.pmed.0040078 .

Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. J Clin Epidemiol. 2009;62:e1–34. https://doi.org/10.1016/j.jclinepi.2009.06.006 .

Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: explanation and elaboration. BMJ. 2009;339:b2700. https://doi.org/10.1136/bmj.b2700 .

Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ann Intern Med. 2009;151:W65–94. https://doi.org/10.7326/0003-4819-151-4-200908180-00136 .

Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6:e1000100. https://doi.org/10.1371/journal.pmed.1000100 .

Page MJ, Shamseer L, Altman DG, et al. Epidemiology and reporting characteristics of systematic reviews of biomedical research: a cross-sectional study. PLoS Med. 2016;13:e1002028. https://doi.org/10.1371/journal.pmed.1002028 .

Panic N, Leoncini E, de Belvis G, Ricciardi W, Boccia S. Evaluation of the endorsement of the preferred reporting items for systematic reviews and meta-analysis (PRISMA) statement on the quality of published systematic review and meta-analyses. PLoS One. 2013;8:e83138. https://doi.org/10.1371/journal.pone.0083138 .

Agha RA, Fowler AJ, Limb C, et al. Impact of the mandatory implementation of reporting guidelines on reporting quality in a surgical journal: a before and after study. Int J Surg. 2016;30:169–72. https://doi.org/10.1016/j.ijsu.2016.04.032 .

Leclercq V, Beaudart C, Ajamieh S, Rabenda V, Tirelli E, Bruyère O. Meta-analyses indexed in PsycINFO had a better completeness of reporting when they mention PRISMA. J Clin Epidemiol. 2019;115:46–54. https://doi.org/10.1016/j.jclinepi.2019.06.014 .

Page MJ, Moher D. Evaluations of the uptake and impact of the preferred reporting items for systematic reviews and meta-analyses (PRISMA) statement and extensions: a scoping review. Syst Rev. 2017;6:263. https://doi.org/10.1186/s13643-017-0663-8 .

O’Mara-Eves A, Thomas J, McNaught J, Miwa M, Ananiadou S. Using text mining for study identification in systematic reviews: a systematic review of current approaches. Syst Rev. 2015;4:5. https://doi.org/10.1186/2046-4053-4-5 .

Marshall IJ, Noel-Storr A, Kuiper J, Thomas J, Wallace BC. Machine learning for identifying randomized controlled trials: an evaluation and practitioner’s guide. Res Synth Methods. 2018;9:602–14. https://doi.org/10.1002/jrsm.1287 .

Marshall IJ, Wallace BC. Toward systematic review automation: a practical guide to using machine learning tools in research synthesis. Syst Rev. 2019;8:163. https://doi.org/10.1186/s13643-019-1074-9 .

McKenzie JE, Brennan SE. Synthesizing and presenting findings using other methods. In: Higgins JPT, Thomas J, Chandler J, et al., editors. Cochrane handbook for systematic reviews of interventions. London: Cochrane; 2019. https://doi.org/10.1002/9781119536604.ch12 .

Higgins JPT, López-López JA, Becker BJ, et al. Synthesising quantitative evidence in systematic reviews of complex health interventions. BMJ Glob Health. 2019;4(Suppl 1):e000858. https://doi.org/10.1136/bmjgh-2018-000858 .

Campbell M, McKenzie JE, Sowden A, et al. Synthesis without meta-analysis (SWiM) in systematic reviews: reporting guideline. BMJ. 2020;368:l6890. https://doi.org/10.1136/bmj.l6890 .

Sterne JAC, Savović J, Page MJ, et al. RoB 2: a revised tool for assessing risk of bias in randomised trials. BMJ. 2019;366:l4898. https://doi.org/10.1136/bmj.l4898 .

Sterne JA, Hernán MA, Reeves BC, et al. ROBINS-I: a tool for assessing risk of bias in non-randomised studies of interventions. BMJ. 2016;355:i4919. https://doi.org/10.1136/bmj.i4919 .

Whiting P, Savović J, Higgins JP, ROBIS group, et al. ROBIS: a new tool to assess risk of bias in systematic reviews was developed. J Clin Epidemiol. 2016;69:225–34. https://doi.org/10.1016/j.jclinepi.2015.06.005 .

Shea BJ, Reeves BC, Wells G, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ. 2017;358:j4008. https://doi.org/10.1136/bmj.j4008 .

Hultcrantz M, Rind D, Akl EA, et al. The GRADE working group clarifies the construct of certainty of evidence. J Clin Epidemiol. 2017;87:4–13. https://doi.org/10.1016/j.jclinepi.2017.05.006 .

Booth A, Clarke M, Dooley G, et al. The nuts and bolts of PROSPERO: an international prospective register of systematic reviews. Syst Rev. 2012;1:2. https://doi.org/10.1186/2046-4053-1-2 .

Moher D, Stewart L, Shekelle P. Establishing a new journal for systematic review products. Syst Rev. 2012;1:1. https://doi.org/10.1186/2046-4053-1-1 .

Page MJ, McKenzie JE, Bossuyt PM, et al. Updating guidance for reporting systematic reviews: development of the PRISMA 2020 statement. J Clin Epidemiol 2021;134:103–112. https://doi.org/10.1016/j.jclinepi.2021.02.003 .

Page MJ, Altman DG, Shamseer L, et al. Reproducible research practices are underused in systematic reviews of biomedical interventions. J Clin Epidemiol. 2018;94:8–18. https://doi.org/10.1016/j.jclinepi.2017.10.017 .

Page MJ, Altman DG, McKenzie JE, et al. Flaws in the application and interpretation of statistical analyses in systematic reviews of therapeutic interventions were common: a cross-sectional analysis. J Clin Epidemiol. 2018;95:7–18. https://doi.org/10.1016/j.jclinepi.2017.11.022 .

Page MJ, McKenzie JE, Bossuyt PM, et al. Mapping of reporting guidance for systematic reviews and meta-analyses generated a comprehensive item bank for future reporting guidelines. J Clin Epidemiol. 2020;118:60–8. https://doi.org/10.1016/j.jclinepi.2019.11.010 .

Tong A, Flemming K, McInnes E, Oliver S, Craig J. Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Med Res Methodol. 2012;12:181. https://doi.org/10.1186/1471-2288-12-181 .

France EF, Cunningham M, Ring N, et al. Improving reporting of meta-ethnography: the eMERGe reporting guidance. BMC Med Res Methodol. 2019;19:25. https://doi.org/10.1186/s12874-018-0600-0 .

Page MJ, Moher D, Bossuyt PM, et al. PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews. BMJ. 2021;372:n160. https://doi.org/10.1136/bmj.n160 .

Rethlefsen ML, Kirtley S, Waffenschmidt S, et al.; PRISMA-S Group. PRISMA-S: an extension to the PRISMA statement for reporting literature searches in systematic reviews. Syst Rev. 2021;10:39. https://doi.org/10.1186/s13643-020-01542-z .

Higgins JPT, Thomas J, Chandler J, et al. Cochrane handbook for systematic reviews of interventions: version 6.0. London: Cochrane; 2019. Available from https://training.cochrane.org/handbook

Dekkers OM, Vandenbroucke JP, Cevallos M, Renehan AG, Altman DG, Egger M. COSMOS-E: guidance on conducting systematic reviews and meta-analyses of observational studies of etiology. PLoS Med. 2019;16:e1002742. https://doi.org/10.1371/journal.pmed.1002742 .

Cooper H, Hedges LV, Valentine JC, editors. The handbook of research synthesis and meta-analysis. New York: Russell Sage Foundation; 2019.

IOM (Institute of Medicine). Finding what works in health care: standards for systematic reviews. Washington, D.C.: The National Academies Press; 2011.

Moher D, Shamseer L, Clarke M, PRISMA-P Group, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4:1. https://doi.org/10.1186/2046-4053-4-1 .

Shamseer L, Moher D, Clarke M, PRISMA-P Group, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ. 2015;350:g7647. https://doi.org/10.1136/bmj.g7647 .

Hutton B, Salanti G, Caldwell DM, et al. The PRISMA extension statement for reporting of systematic reviews incorporating network meta-analyses of health care interventions: checklist and explanations. Ann Intern Med. 2015;162:777–84. https://doi.org/10.7326/M14-2385 .

Stewart LA, Clarke M, Rovers M, PRISMA-IPD Development Group, et al. Preferred reporting items for systematic review and meta-analyses of individual participant data: the PRISMA-IPD statement. JAMA. 2015;313:1657–65. https://doi.org/10.1001/jama.2015.3656 .

Zorzela L, Loke YK, Ioannidis JP, et al.; PRISMA Harms Group. PRISMA harms checklist: improving harms reporting in systematic reviews. BMJ. 2016;352:i157. https://doi.org/10.1136/bmj.i157 .

McInnes MDF, Moher D, Thombs BD, the PRISMA-DTA Group, et al. Preferred reporting items for a systematic review and meta-analysis of diagnostic test accuracy studies: the PRISMA-DTA statement. JAMA. 2018;319:388–96. https://doi.org/10.1001/jama.2017.19163 .

Tricco AC, Lillie E, Zarin W, et al. PRISMA extension for scoping reviews (PRISMA-SCR): checklist and explanation. Ann Intern Med. 2018;169:467–73. https://doi.org/10.7326/M18-0850 .

Beller EM, Glasziou PP, Altman DG, et al.; PRISMA for Abstracts Group. PRISMA for Abstracts: reporting systematic reviews in journal and conference abstracts. PLoS Med. 2013;10:e1001419. https://doi.org/10.1371/journal.pmed.1001419 .

Boers M. Graphics and statistics for cardiology: designing effective tables for presentation and publication. Heart. 2018;104:192–200. https://doi.org/10.1136/heartjnl-2017-311581 .

Mayo-Wilson E, Li T, Fusco N, Dickersin K, MUDS investigators. Practical guidance for using multiple data sources in systematic reviews and meta-analyses (with examples from the MUDS study). Res Synth Methods. 2018;9:2–12. https://doi.org/10.1002/jrsm.1277 .

Stovold E, Beecher D, Foxlee R, Noel-Storr A. Study flow diagrams in Cochrane systematic review updates: an adapted PRISMA flow diagram. Syst Rev. 2014;3:54. https://doi.org/10.1186/2046-4053-3-54 .

McGuinness LA. mcguinlu/PRISMA-Checklist: Initial release for manuscript submission (Version v1.0.0). Geneva: Zenodo; 2020. https://doi.org/10.5281/zenodo.3994319 .

Aczel B, Szaszi B, Sarafoglou A, et al. A consensus-based transparency checklist. Nat Hum Behav. 2020;4:4–6. https://doi.org/10.1038/s41562-019-0772-6 .

Barnes C, Boutron I, Giraudeau B, Porcher R, Altman DG, Ravaud P. Impact of an online writing aid tool for writing a randomized trial report: the COBWEB (Consort-based WEB tool) randomized controlled trial. BMC Med. 2015;13:221. https://doi.org/10.1186/s12916-015-0460-y .

Chauvin A, Ravaud P, Moher D, et al. Accuracy in detecting inadequate research reporting by early career peer reviewers using an online CONSORT-based peer-review tool (COBPeer) versus the usual peer-review process: a cross-sectional diagnostic study. BMC Med. 2019;17:205. https://doi.org/10.1186/s12916-019-1436-0 .

Wayant C, Page MJ, Vassar M. Evaluation of reproducible research practices in oncology systematic reviews with meta-analyses referenced by national comprehensive cancer network guidelines. JAMA Oncol. 2019;5:1550–5. https://doi.org/10.1001/jamaoncol.2019.2564 .

McKenzie JE, Brennan SE. Overviews of systematic reviews: great promise, greater challenge. Syst Rev. 2017;6:185. https://doi.org/10.1186/s13643-017-0582-8 .

Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med. 2010;7:e1000217. https://doi.org/10.1371/journal.pmed.1000217 .

Simera I, Moher D, Hirst A, Hoey J, Schulz KF, Altman DG. Transparent and accurate reporting increases reliability, utility, and impact of your research: reporting guidelines and the EQUATOR Network. BMC Med. 2010;8:24. https://doi.org/10.1186/1741-7015-8-24 .

Speich B, Schroter S, Briel M, et al. Impact of a short version of the CONSORT checklist for peer reviewers to improve the reporting of randomised controlled trials published in biomedical journals: study protocol for a randomised controlled trial. BMJ Open. 2020;10:e035114. https://doi.org/10.1136/bmjopen-2019-035114 .

Stevens A, Shamseer L, Weinstein E, et al. Relation of completeness of reporting of health research to journals’ endorsement of reporting guidelines: systematic review. BMJ. 2014;348:g3804. https://doi.org/10.1136/bmj.g3804 .

Hair K, Macleod MR, Sena ES, IICARus Collaboration. A randomised controlled trial of an Intervention to Improve Compliance with the ARRIVE guidelines (IICARus). Res Integr Peer Rev. 2019;4:12. https://doi.org/10.1186/s41073-019-0069-3 .

Blanco D, Altman D, Moher D, Boutron I, Kirkham JJ, Cobo E. Scoping review on interventions to improve adherence to reporting guidelines in health research. BMJ Open. 2019;9:e026589. https://doi.org/10.1136/bmjopen-2018-026589 .

Charters E. The use of think-aloud methods in qualitative research: an introduction to think-aloud methods. Brock Educ J. 2003;12:68–82. https://doi.org/10.26522/brocked.v12i2.38 .

Welch V, Petticrew M, Tugwell P, PRISMA-Equity Bellagio group, et al. PRISMA-equity 2012 extension: reporting guidelines for systematic reviews with a focus on health equity. PLoS Med. 2012;9:e1001333. https://doi.org/10.1371/journal.pmed.1001333 .

Wang X, Chen Y, Liu Y, et al. Reporting items for systematic reviews and meta-analyses of acupuncture: the PRISMA for acupuncture checklist. BMC Complement Altern Med. 2019;19:208. https://doi.org/10.1186/s12906-019-2624-3 .

Acknowledgements

We dedicate this paper to the late Douglas G Altman and Alessandro Liberati, whose contributions were fundamental to the development and implementation of the original PRISMA statement.

We thank the following contributors who completed the survey to inform discussions at the development meeting: Xavier Armoiry, Edoardo Aromataris, Ana Patricia Ayala, Ethan M Balk, Virginia Barbour, Elaine Beller, Jesse A Berlin, Lisa Bero, Zhao-Xiang Bian, Jean Joel Bigna, Ferrán Catalá-López, Anna Chaimani, Mike Clarke, Tammy Clifford, Ioana A Cristea, Miranda Cumpston, Sofia Dias, Corinna Dressler, Ivan D Florez, Joel J Gagnier, Chantelle Garritty, Long Ge, Davina Ghersi, Sean Grant, Gordon Guyatt, Neal R Haddaway, Julian PT Higgins, Sally Hopewell, Brian Hutton, Jamie J Kirkham, Jos Kleijnen, Julia Koricheva, Joey SW Kwong, Toby J Lasserson, Julia H Littell, Yoon K Loke, Malcolm R Macleod, Chris G Maher, Ana Marušic, Dimitris Mavridis, Jessie McGowan, Matthew DF McInnes, Philippa Middleton, Karel G Moons, Zachary Munn, Jane Noyes, Barbara Nußbaumer-Streit, Donald L Patrick, Tatiana Pereira-Cenci, Ba′ Pham, Bob Phillips, Dawid Pieper, Michelle Pollock, Daniel S Quintana, Drummond Rennie, Melissa L Rethlefsen, Hannah R Rothstein, Maroeska M Rovers, Rebecca Ryan, Georgia Salanti, Ian J Saldanha, Margaret Sampson, Nancy Santesso, Rafael Sarkis-Onofre, Jelena Savović, Christopher H Schmid, Kenneth F Schulz, Guido Schwarzer, Beverley J Shea, Paul G Shekelle, Farhad Shokraneh, Mark Simmonds, Nicole Skoetz, Sharon E Straus, Anneliese Synnot, Emily E Tanner-Smith, Brett D Thombs, Hilary Thomson, Alexander Tsertsvadze, Peter Tugwell, Tari Turner, Lesley Uttley, Jeffrey C Valentine, Matt Vassar, Areti Angeliki Veroniki, Meera Viswanathan, Cole Wayant, Paul Whaley, and Kehu Yang. We thank the following contributors who provided feedback on a preliminary version of the PRISMA 2020 checklist: Jo Abbott, Fionn Büttner, Patricia Correia-Santos, Victoria Freeman, Emily A Hennessy, Rakibul Islam, Amalia (Emily) Karahalios, Kasper Krommes, Andreas Lundh, Dafne Port Nascimento, Davina Robson, Catherine Schenck-Yglesias, Mary M Scott, Sarah Tanveer and Pavel Zhelnov. 
We thank Abigail H Goben, Melissa L Rethlefsen, Tanja Rombey, Anna Scott, and Farhad Shokraneh for their helpful comments on the preprints of the PRISMA 2020 papers. We thank Edoardo Aromataris, Stephanie Chang, Toby Lasserson and David Schriger for their helpful peer review comments on the PRISMA 2020 papers.

Provenance and peer review

Not commissioned; externally peer reviewed.

Patient and public involvement

Patients and the public were not involved in this methodological research. We plan to disseminate the research widely, including to community participants in evidence synthesis organisations.

Funding

There was no direct funding for this research. MJP is supported by an Australian Research Council Discovery Early Career Researcher Award (DE200101618) and was previously supported by an Australian National Health and Medical Research Council (NHMRC) Early Career Fellowship (1088535) during the conduct of this research. JEM is supported by an Australian NHMRC Career Development Fellowship (1143429). TCH is supported by an Australian NHMRC Senior Research Fellowship (1154607). JMT is supported by Evidence Partners Inc. JMG is supported by a Tier 1 Canada Research Chair in Health Knowledge Transfer and Uptake. MML is supported by The Ottawa Hospital Anaesthesia Alternate Funds Association and a Faculty of Medicine Junior Research Chair. TL is supported by funding from the National Eye Institute (UG1EY020522), National Institutes of Health, United States. LAM is supported by a National Institute for Health Research Doctoral Research Fellowship (DRF-2018-11-ST2–048). ACT is supported by a Tier 2 Canada Research Chair in Knowledge Synthesis. DM is supported in part by a University Research Chair, University of Ottawa. The funders had no role in considering the study design or in the collection, analysis, interpretation of data, writing of the report, or decision to submit the article for publication.

Author information

Authors and affiliations.

School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia

Matthew J. Page, Joanne E. McKenzie, Sue E. Brennan & Steve McDonald

Department of Clinical Epidemiology, Biostatistics and Bioinformatics, Amsterdam University Medical Centres, University of Amsterdam, Amsterdam, Netherlands

Patrick M. Bossuyt

Université de Paris, Centre of Epidemiology and Statistics (CRESS), Inserm, F 75004, Paris, France

Isabelle Boutron

Institute for Evidence-Based Healthcare, Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Australia

Tammy C. Hoffmann

Annals of Internal Medicine, University of Texas Health Science Center at San Antonio, San Antonio, Texas, USA

Cynthia D. Mulrow

Knowledge Translation Program, Li Ka Shing Knowledge Institute, Toronto, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada

Larissa Shamseer

Evidence Partners, Ottawa, Canada

Jennifer M. Tetzlaff

Clinical Research Institute, American University of Beirut, Beirut, Lebanon; Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, Ontario, Canada

Elie A. Akl

Department of Medical Informatics and Clinical Epidemiology, Oregon Health & Science University, Portland, OR, USA

York Health Economics Consortium (YHEC Ltd), University of York, York, UK

Julie Glanville

Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada; School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada; Department of Medicine, University of Ottawa, Ottawa, Canada

Jeremy M. Grimshaw

Centre for Evidence-Based Medicine Odense (CEBMO) and Cochrane Denmark, Department of Clinical Research, University of Southern Denmark, JB Winsløwsvej 9b, 3rd Floor, 5000 Odense, Denmark; Open Patient data Exploratory Network (OPEN), Odense University Hospital, Odense, Denmark

Asbjørn Hróbjartsson

Department of Anesthesiology and Pain Medicine, The Ottawa Hospital, Ottawa, Canada; Clinical Epidemiology Program, Blueprint Translational Research Group, Ottawa Hospital Research Institute, Ottawa, Canada; Regenerative Medicine Program, Ottawa Hospital Research Institute, Ottawa, Canada

Manoj M. Lalu

Department of Ophthalmology, School of Medicine, University of Colorado Denver, Denver, Colorado, United States; Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland, USA

Tianjing Li

Division of Headache, Department of Neurology, Brigham and Women’s Hospital, Harvard Medical School, Boston, Massachusetts, USA; Head of Research, The BMJ, London, UK

Elizabeth W. Loder

Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, Bloomington, Indiana, USA

Evan Mayo-Wilson

Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK

Luke A. McGuinness & Penny Whiting

Centre for Reviews and Dissemination, University of York, York, UK

Lesley A. Stewart

EPPI-Centre, UCL Social Research Institute, University College London, London, UK

James Thomas

Li Ka Shing Knowledge Institute of St. Michael’s Hospital, Unity Health Toronto, Toronto, Canada; Epidemiology Division of the Dalla Lana School of Public Health and the Institute of Health Management, Policy, and Evaluation, University of Toronto, Toronto, Canada; Queen’s Collaboration for Health Care Quality Joanna Briggs Institute Centre of Excellence, Queen’s University, Kingston, Canada

Andrea C. Tricco

Methods Centre, Bruyère Research Institute, Ottawa, Ontario, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada

Vivian A. Welch

Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada

David Moher


Contributions

JEM and DM are joint senior authors. MJP, JEM, PMB, IB, TCH, CDM, LS, and DM conceived this paper and designed the literature review and survey conducted to inform the guideline content. MJP conducted the literature review, administered the survey and analysed the data for both. MJP prepared all materials for the development meeting. MJP and JEM presented proposals at the development meeting. All authors except for TCH, JMT, EAA, SEB, and LAM attended the development meeting. MJP and JEM took and consolidated notes from the development meeting. MJP and JEM led the drafting and editing of the article. JEM, PMB, IB, TCH, LS, JMT, EAA, SEB, RC, JG, AH, TL, EMW, SM, LAM, LAS, JT, ACT, PW, and DM drafted particular sections of the article. All authors were involved in revising the article critically for important intellectual content. All authors approved the final version of the article. MJP is the guarantor of this work. The corresponding author attests that all listed authors meet authorship criteria and that no others meeting the criteria have been omitted.

Corresponding author

Correspondence to Matthew J. Page .

Ethics declarations

Competing interests.

All authors have completed the ICMJE uniform disclosure form at http://www.icmje.org/conflicts-of-interest/ and declare: EL is head of research for the BMJ ; MJP is an editorial board member for PLOS Medicine ; ACT is an associate editor and MJP, TL, EMW, and DM are editorial board members for the Journal of Clinical Epidemiology ; DM and LAS were editors in chief, LS, JMT, and ACT are associate editors, and JG is an editorial board member for Systematic Reviews . None of these authors were involved in the peer review process or decision to publish. TCH has received personal fees from Elsevier outside the submitted work. EMW has received personal fees from the American Journal for Public Health , for which he is the editor for systematic reviews. VW is editor in chief of the Campbell Collaboration, which produces systematic reviews, and co-convenor of the Campbell and Cochrane equity methods group. DM is chair of the EQUATOR Network, IB is adjunct director of the French EQUATOR Centre and TCH is co-director of the Australasian EQUATOR Centre, which advocates for the use of reporting guidelines to improve the quality of reporting in research articles. JMT received salary from Evidence Partners, creator of DistillerSR software for systematic reviews; Evidence Partners was not involved in the design or outcomes of the statement, and the views expressed solely represent those of the author.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

PRISMA 2020 checklist.

Additional file 2.

PRISMA 2020 expanded checklist.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Page, M.J., McKenzie, J.E., Bossuyt, P.M. et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Syst Rev 10 , 89 (2021). https://doi.org/10.1186/s13643-021-01626-4


Accepted : 04 January 2021

Published : 29 March 2021

DOI : https://doi.org/10.1186/s13643-021-01626-4


Systematic Reviews

ISSN: 2046-4053



EQUATOR Network

Enhancing the QUAlity and Transparency Of health Research


The EQUATOR Library lists the following 19 reporting guidelines, with key reporting guidelines displayed first.

  • Improving the Reporting of Primary Care Research: Consensus Reporting Items for Studies in Primary Care-the CRISP Statement
  • Systematic Development of Standards for Mixed Methods Reporting in Rehabilitation Health Sciences Research
  • Initial Standardized Framework for Reporting Social Media Analytics in Emergency Care Research
  • CONFERD-HP : recommendations for reporting COmpeteNcy FramEwoRk Development in health professions
  • Development of the ASSESS tool: a comprehenSive tool to Support rEporting and critical appraiSal of qualitative, quantitative, and mixed methods implementation reSearch outcomes
  • Guiding document analyses in health professions education research
  • Social Accountability Reporting for Research (SAR 4Research): checklist to strengthen reporting on studies on social accountability in the literature
  • Application of Mixed Methods in Health Services Management Research: A Systematic Review
  • Six practical recommendations for improved implementation outcomes reporting
  • Ensuring best practice in genomics education and evaluation: reporting item standards for education and its evaluation in genomics (RISE 2 Genomics)
  • Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report
  • Consolidated criteria for strengthening reporting of health research involving indigenous peoples: the CONSIDER statement
  • Standards for Reporting Implementation Studies (StaRI) Statement
  • RAMESES II reporting standards for realist evaluations
  • Developing a methodological framework for organisational case studies: a rapid review and consensus development process
  • A checklist to improve reporting of group-based behaviour-change interventions
  • Evaluating complex interventions in end of life care: the MORECare statement on good practice generated by a synthesis of transparent expert consultations and systematic reviews
  • The quality of mixed methods studies in health services research
  • Guidelines for conducting and reporting mixed research in the field of counseling and beyond


Amsterdam Public Health Research Lifecycle: Writing & Publication

Reporting guidelines


The aim is to report your study (results) correctly according to international guidelines.

Requirements

International standards, guidelines or statements should be followed when reporting your study (results) in a scientific article.

Responsibilities

  • Executing researcher: check whether international reporting standards, guidelines, or statements exist for your type of study and use them when reporting your study (results) in a scientific article.
  • Project leader: ensure that international standards, guidelines, or statements are followed when reporting study results.
  • Research assistant: not applicable.

When you are going to report your study (results) in an article for submission to an international scientific journal, it is important to follow international guidelines. Below you will find standards, guidelines, and checklists for the most common study types, including RCTs, systematic reviews, meta-analyses, diagnostic studies, observational studies, qualitative research, quasi-experimental designs, and economic evaluations. Also check whether the scientific journal in which you intend to publish your research requires the use of international reporting guidelines. The guidelines are listed 1) by study type and 2) by research field, in alphabetical order.

Based on study type

  • Name: CONSORT
  • Study type: Randomized Controlled Trials (including several extensions for particular types of trials for example cluster randomized controlled trials)
  • Link: http://www.consort-statement.org/ ; http://www.consort-statement.org/downloads/extensions  
  • Name: Consolidated criteria for Reporting Qualitative Research
  • Study type: Qualitative research (interviews and focus groups)
  • Link: Checklist and additional information.
  • Name: Genetic Risk Prediction Studies (GRIPS)
  • Study type: Diagnostic and prognostic studies, observational studies
  • Link:  Strengthening the Reporting of Genetic RIsk Prediction Studies: The GRIPS Statement
  • Strengthening the Reporting of Genetic RIsk Prediction Studies (GRIPS): Explanation and Elaboration
  • Name: CAse REport Guidelines (CARE)
  • Study type: Case reports
  • Link: 2013 CARE checklist
  • Name: Guidelines for Meta-Analyses and Systematic Reviews of Observational Studies
  • Study type: Meta-analyses of observational studies
  • Link: Checklist
  • Name: Preferred Reporting Items for Systematic reviews and Meta-Analyses
  • Study type: Systematic Reviews and Meta-analyses (including extensions for example for developing the review protocol)
  • Link: http://www.prisma-statement.org/  
  • Name: QUality Of Reporting Of Meta-analyses
  • Study type: Meta-analyses of RCTs
  • Link: Checklist and additional information
  • Extra: Consists of a checklist and a flow diagram.
  • Name: STAndards for Reporting of Diagnostic accuracy studies
  • Study type: Diagnostic accuracy studies
  • Link: to checklist and additional information, also check  http://www.equator-network.org/reporting-guidelines/stard/
  • Name: Strengthening the Reporting of Observational Studies in Epidemiology
  • Study type: Observational study (cohort, case-control, cross-sectional study designs)
  • Link: http://www.strobe-statement.org/  
  • Name: Transparent Reporting of Evaluations with Nonrandomized Designs
  • Study type: Evaluation studies that use nonrandomized designs
  • Link:  Checklist  and  additional information.
  • Name: Consolidated Health Economic Evaluation Reporting Standards
  • Study type: Economic evaluations of health interventions
  • Link: CHEERS website, additional information, and checklist; see ISPOR - CHEERS
  • For specific types of economic evaluations, check http://www.equator-network.org

Based on research field

  • Name of organisation: American Psychological Association
  • Research field: Psychology
  • Link:  Manual  (not free of charge) and  additional information
  • Extra: The manual provides 1) standards for all journal articles, 2) more specific standards for reports of studies with experimental manipulations or evaluations of interventions using research designs involving random or non-random assignment and 3) standards for articles reporting meta-analyses.
  • The Journal Article Reporting Standards for quantitative research (JARS-Quant) reports guidelines for manuscripts that report primary quantitative research, experimental designs, non-experimental designs, special designs, analytic methods, and meta-analyses.
  • Name of organisation: American Educational Research Association
  • Research field: Education research grounded in the empirical traditions of social sciences.
  • Link: http://www.sagepub.com/upm-data/13127_Standards_from_AERA.pdf

For more information please also see: The EQUATOR Network 

Please also check for yourself whether any updates, extensions, etc. have been published.


Using reporting guidelines in sports and exercise medicine research: why and how to raise the bar?

  • http://orcid.org/0000-0003-2961-9328 David Blanco 1 ,
  • http://orcid.org/0000-0002-6829-2201 Aïda Cadellans-Arróniz 1 ,
  • http://orcid.org/0000-0001-8836-9109 Márcio Vinícius Fagundes Donadio 1 , 2 ,
  • http://orcid.org/0000-0001-5261-1573 Melissa K Sharp 3 ,
  • http://orcid.org/0000-0002-1775-8331 Martí Casals 4 , 5 , 6 ,
  • http://orcid.org/0000-0003-1969-3612 Pascal Edouard 7 , 8
  • 1 Department of Physiotherapy , Universitat Internacional de Catalunya , Barcelona , Spain
  • 2 Pontificia Universidade Catolica do Rio Grande do Sul , Porto Alegre , Brazil
  • 3 Department of Public Health and Epidemiology , RCSI University of Medicine and Health Sciences , Dublin , Ireland
  • 4 National Institute of Physical Education of Catalonia (INEFC), University of Barcelona , Barcelona , Spain
  • 5 Sport and Physical Activity Studies Centre (CEEAF), Faculty of Medicine, University of Vic-Central University of Catalonia (UVic-UCC) , Barcelona , Spain
  • 6 Sport Performance Analysis Research Group, University of Vic-Central University of Catalonia (UVic-UCC) , Barcelona , Spain
  • 7 Inter-university Laboratory of Human Movement Biology (EA 7424) , Université Jean Monnet, Lyon 1, Université Savoie Mont-Blanc , Saint-Etienne , France
  • 8 Department of Clinical and Exercise Physiology, Sports Medicine Unit , University Hospital of Saint-Etienne, Faculty of Medicine , Saint-Etienne , France
  • Correspondence to Dr David Blanco, Physiotherapy Department, Universitat Internacional de Catalunya, Barcelona, Spain; dblanco{at}uic.es

https://doi.org/10.1136/bjsports-2024-108101


  • Randomized Controlled Trial
  • Sports medicine

Have you already heard about the Consolidated Standards of Reporting Trials (CONSORT), Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), or Consensus on Exercise Reporting Template (CERT)? Are you using these reporting guidelines (RGs)? And if so, how? These, and other guidelines, should be used when submitting research manuscripts to most journals in the field of sports and exercise medicine. But why are they so important?

This editorial has two goals: (1) to illustrate how reporting quality differs from methodological quality and why complete reporting is key to maximise the clinical impact of research and (2) to be a call to action for journal editors, peer reviewers and authors to effectively use RGs to improve reporting according to the needs of the sports and exercise medicine community.

Reporting quality versus methodological quality

Let us start by looking at two concepts that are often confused or used interchangeably: reporting quality (or completeness of reporting) and methodological quality. While the former refers to how well the methods and findings are reported in a manuscript, the latter addresses how well the research has been designed and conducted. 1 These two concepts are closely linked. Complete reporting allows readers to ascertain the methodological quality of research and, therefore, to know to what extent the results may be biased ( table 1 ). When there is clinical uncertainty, clinicians and researchers in the sports and exercise medicine field look to the published evidence base for answers. Complete reporting allows for the creation of stronger recommendations for healthcare, and for researchers to clearly identify biases and address gaps in the evidence base. Conversely, incomplete reporting creates uncertainty and can often be a warning sign for a poorly designed or poorly conducted study. Poor reporting and low methodological quality, together with the choice of clinically irrelevant research questions and the failure to publish research, have been identified as the main causes of research waste. 2 Although there have been improvements in those four aspects since 2009, when research waste was estimated to be around 85%, there is still much progress to be made. 3 The consequences of (in)complete reporting and (in)adequate methodology are summarised in table 1 .


Table 1 Consequences of (in)complete reporting and (in)adequate methodology

RGs: a powerful tool to improve reporting quality

RGs have been shown to improve reporting quality and can even increase the number of article citations. 5 However, in the sports and exercise medicine field, like many others, the reporting quality of research is often poor. 6 7 There are several possible explanations for this. First, there is little researcher training or awareness of RGs. For example, one in five authors of observational studies had never heard of STROBE, and another one in five had heard of it but never used it. 8 Researchers have also claimed that RGs increase the burden imposed on them and that the benefits of using them are unclear. 9 This lack of awareness and understanding of RGs is reflected in the commonly documented misuse of RGs as methodological guidelines, with authors often claiming that ‘this trial was done in accordance with CONSORT’. 1 RGs do not provide guidance on how to conduct a study but on how to report its methods and findings. Lastly, the use of RGs is not usually enforced or monitored by other actors involved in the research process (journals, funders, universities, ethics boards). 10

A call for action: proposals to effectively use RGs to improve reporting quality

Next, we present evidence-based proposals of what journal editors, peer reviewers and authors should and should not do to improve reporting quality. Table 2 summarises these proposals.

Table 2 Summary of proposals of what journal editors, peer reviewers and authors should and should not do to improve reporting quality

At a journal editor level, one common way to promote the use of RGs is to require authors to provide a completed RG checklist that indicates the page number where each item has been reported. However, this does not seem to improve the reporting quality of published articles, as editors and reviewers usually overlook these checklists. 11 For this reason, we encourage journal editors to involve trained editorial staff in the checklist assessment or to include an RG expert in the peer review process to check for completeness and provide recommendations to authors for improvement, which has been shown to be effective. 12 This approach has been routinely implemented by a general medical journal 13 but, to our knowledge, not by any journal in the sports and exercise medicine field.

At a peer reviewer level, asking standard reviewers to check specific RG items has been shown not to improve reporting. 14 As reviewers are usually experts who publish in a similar field as the authors, they are not necessarily more knowledgeable about RGs than the authors themselves. 14 Consequently, we suggest that reviewers use RGs as a first check, but that they do not focus all of their review on simply checking RG requirements. This should be enforced by editors in their instructions to reviewers.

At an author level, RGs must be considered early in the manuscript writing stage. Some effective ways of doing this could be using online writing aid tools that integrate RGs 15 or using a structured format when preparing manuscripts. 16 We strongly suggest the inclusion of subheadings within the traditional Introduction, Methods, Results and Discussion format that correspond to RG items (or groups of items). In this way, the structure of the manuscript itself would favour the inclusion of all necessary information. This strategy is starting to be implemented in the medical field 13 and, due to its ease of implementation and low cost, should also be considered by authors in the field of sports and exercise medicine.

Conclusions

Raising the bar of reporting quality is essential to maximise the impact of sports and exercise medicine research. We encourage journal editors, peer reviewers and authors to take responsibility for this.

Ethics statements

Patient consent for publication.

Not applicable.

  • Chalmers I ,
  • Glasziou P ,
  • EQUATOR Network
  • Selva-O’Callaghan A , et al
  • Hansford HJ ,
  • Wewege MA ,
  • Cashin AG , et al
  • Mbuagbaw L ,
  • Kosa D , et al
  • Bertizzolo L ,
  • Rius R , et al
  • Hansford HJ
  • Moher D , et al
  • Macleod MR ,
  • Sena ES , et al
  • Schroter S ,
  • Aldcroft A , et al
  • Qureshi R ,
  • Schönenberger CM , et al
  • Boutron I ,
  • Giraudeau B , et al
  • Donadio MVF ,
  • Cadellans-Arróniz A

X @david_blanco91, @sharpmelk, @CasalsTMarti, @PascalEdouard42

Contributors DB and PE conceptualised this work. DB wrote the original draft. AC-A, MVFD, MKS, MC and PE reviewed and edited the manuscript. All authors understand that they are accountable for all aspects of the work and ensure the accuracy or integrity of this manuscript. DB and PE are the guarantors.

Funding DB was funded by the Ministerio de Ciencia e Innovación (Spain) (PID2019-104830RB-I00/ DOI (AEI): 10.13039/501100011033).

Competing interests MC and PE are associate editors for the British Journal of Sports Medicine. PE is an associate editor for the BMJ Open Sports and Exercise Medicine.

Provenance and peer review Not commissioned; externally peer reviewed.

Equity, diversity and inclusion statement The research team consists of five senior researchers (two women and three men) from a variety of disciplines (methodology, statistics, sports medicine, sports physiotherapy), who work in three countries in Europe (Spain, Ireland, and France).


Postdoctoral Compensation Guidelines

Every year, the Office of Research reviews the minimum salary and stipend levels for Boston University postdoctoral scholars (postdocs), per our Postdoctoral Scholars Policy. As part of this process, we compare our practices to our peers in the Association of American Universities as well as regional benchmarks to ensure fair and competitive compensation for all the University’s postdocs.

Last year, BU and many of our local peers chose to depart from the National Institutes of Health (NIH) Ruth L. Kirschstein National Research Service Award minimum stipend level as a minimum salary for postdoc compensation. BU will continue to proactively provide a supportive and competitive training environment for our postdocs.

Our new Postdoctoral Scholars Policy, effective April 16, 2024, allows us to establish salary and stipend structures based on postdoctoral years of experience, in addition to setting minimum levels.

Effective January 1, 2025, we will be increasing the salary and stipend minimum for all current BU postdocs as follows:

  • 0-2 years of BU experience: $67,500
  • More than 2 years of BU experience: $70,000

We are establishing a minimum of $70,000 for more experienced postdocs in recognition of recommendations from the NIH Advisory Committee to the Director on Re-envisioning NIH-Supported Postdoctoral Training.

To reach these salary and stipend levels, we recommend the following actions:

  • Starting immediately, all new proposals submitted through Sponsored Programs will require postdoc salaries to be budgeted at the new institutional minimums of either $67,500 or $70,000, as appropriate.
  • All new and current postdoc appointments as of January 1, 2025, must reflect the new salary and stipend minimums of either $67,500 or $70,000, as appropriate to the postdoc’s year(s) of service.

We recognize this is a significant increase that may be challenging to implement. Nevertheless, the success of our mission depends on recruiting, retaining, and supporting the most talented postdocs, which requires elevating minimum salaries to a level commensurate with nationwide recommendations. We expect existing funding sources to cover salary or stipend increases.

There may be some grants (e.g., individual postdoc fellowships, training grants) for which re-budgeting won’t be possible. If you have questions or require support, contact Sarah Hokanson, Assistant Vice President and Assistant Provost for Research Development and PhD & Postdoctoral Affairs, at [email protected] .
