Research and Reporting Methods
4 September 2018

PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation

Publication: Annals of Internal Medicine
Volume 169, Number 7

Abstract

Scoping reviews, a type of knowledge synthesis, follow a systematic approach to map evidence on a topic and identify main concepts, theories, sources, and knowledge gaps. Although more scoping reviews are being done, their methodological and reporting quality need improvement. This document presents the PRISMA-ScR (Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews) checklist and explanation. The checklist was developed by a 24-member expert panel and 2 research leads following published guidance from the EQUATOR (Enhancing the QUAlity and Transparency Of health Research) Network. The final checklist contains 20 essential reporting items and 2 optional items. The authors provide a rationale and an example of good reporting for each item. The intent of the PRISMA-ScR is to help readers (including researchers, publishers, commissioners, policymakers, health care providers, guideline developers, and patients or consumers) develop a greater understanding of relevant terminology, core concepts, and key items to report for scoping reviews.
Scoping reviews can be conducted to meet various objectives. They may examine the extent (that is, size), range (variety), and nature (characteristics) of the evidence on a topic or question; determine the value of undertaking a systematic review; summarize findings from a body of knowledge that is heterogeneous in methods or discipline; or identify gaps in the literature to aid the planning and commissioning of future research (1, 2). A recent scoping review by members of our team suggested that although the number of scoping reviews in the literature is increasing steadily, their methodological and reporting quality needs to improve to ensure complete and transparent reporting (1). Results from a survey on scoping review terminology, definitions, and methods showed a lack of consensus on how to conduct and report scoping reviews (3).
The Joanna Briggs Institute (JBI) published a guidance document for the conduct of scoping reviews (4) (updated in 2017 [5]) based on earlier work by Arksey and O'Malley (6) and Levac and colleagues (7). However, a reporting guideline for scoping reviews currently does not exist.
Reporting guidelines outline a minimum set of items to include in research reports and have been shown to increase methodological transparency and uptake of research findings (8, 9). Although a reporting guideline exists for systematic reviews—the PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) statement (10)—scoping reviews serve a different purpose (11). Systematic reviews are useful for answering clearly defined questions (for example, “Does this intervention improve specified outcomes when compared with a given comparator in this population?”), whereas scoping reviews are useful for answering much broader questions (such as “What is the nature of the evidence for this intervention?” or “What is known about this concept?”). Given the difference in objectives, and therefore in the methodological approach (such as presence vs. absence of a risk-of-bias assessment or meta-analysis), scoping reviews should have different essential reporting items from systematic reviews. Consequently, some PRISMA items may not be appropriate, whereas other important considerations may be missing (12–14). It was decided that a PRISMA extension for scoping reviews was needed to provide reporting guidance for this specific type of knowledge synthesis. This extension is also intended to apply to evidence maps (15, 16), which share similarities with scoping reviews and involve a systematic search of a body of literature to identify knowledge gaps, with a visual representation of results (such as a figure or graph).

Methods

The PRISMA-ScR (PRISMA extension for Scoping Reviews) was developed according to published guidance by the EQUATOR (Enhancing the QUAlity and Transparency Of health Research) Network for the development of reporting guidelines (9). The St. Michael's Hospital Research Ethics Board granted research ethics approval for this study on 15 August 2016.

Protocol, Advisory Board, and Expert Panel

Our protocol was drafted by the research team and revised as necessary by the advisory board before being listed as a reporting guideline on the EQUATOR (17) and PRISMA (18) Web sites. The research team included 2 leads (A.C.T. and S.E.S.) and 2 research coordinators (E.L. and W.Z.), none of whom participated in the scoring exercises, and a 4-member advisory board (K.K.O., H.C., D.L., and D.M.) with extensive experience doing scoping reviews or developing reporting guidelines. We aimed to form an expert panel of approximately 30 members that would be representative in terms of geography, stakeholder type, and research experience, including persons with experience in the conduct, dissemination, or uptake of scoping reviews.

Survey Development and Round 1 of Delphi

The initial step in developing the Delphi survey via Qualtrics (an online survey platform) (19) involved identifying potential modifications to the original 27-item PRISMA checklist. The modifications were based on a research program carried out by members of the advisory board to better understand scoping review practices (1, 3, 20) and included a broader research question and literature search strategy, optional risk-of-bias assessment and consultation exercise (whereby relevant stakeholders contribute to the work, as described by Arksey and O'Malley [6]), and a qualitative analysis. For round 1 of scoring, we prepared a draft of the PRISMA-ScR (Supplement) and asked expert panel members to rate their agreement with each of the proposed reporting items using a 7-point Likert scale (1 = “entirely disagree,” 2 = “mostly disagree,” 3 = “somewhat disagree,” 4 = “neutral,” 5 = “somewhat agree,” 6 = “mostly agree,” and 7 = “entirely agree”). Each survey item included an optional text box where respondents could provide comments. The research team calibrated the survey for content and clarity before administering it and sent biweekly reminders to optimize participation.

Survey Analysis

To be conservative, a threshold of 85% agreement was established a priori for each of the reporting items to indicate consensus among the expert panel. This rule required that at least 85% of the panel mostly or entirely agreed (values of 6 or 7 on the Likert scale) with the inclusion of the item in the PRISMA-ScR. If agreement was less than 85%, the item was considered discrepant. This standard was used for all 3 rounds of scoring to inform the final checklist. For ease and consistency with how the survey questions were worded, we did not include a provision for agreement on exclusion (that is, 85% of answers corresponding to values of 1 or 2 on the Likert scale). All comments were summarized to help explain the scorings and identify any issues. For the analysis, the results were stratified by group (in-person meeting vs. online, hereafter e-Delphi) because discrepant items could differ between groups.
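The consensus rule described above is a simple proportion test. As a minimal illustration (the scores below are hypothetical, not actual panel data), an item reaches consensus when at least 85% of panelists rate it 6 ("mostly agree") or 7 ("entirely agree") on the 7-point Likert scale:

```python
def reaches_consensus(scores, threshold=0.85):
    """Return True if at least `threshold` of Likert scores are 6 or 7.

    `scores` is a list of integer ratings on the 1-7 scale; an empty
    list cannot demonstrate consensus.
    """
    if not scores:
        return False
    agree = sum(1 for s in scores if s >= 6)  # count "mostly/entirely agree"
    return agree / len(scores) >= threshold

# Hypothetical example: 14 of 16 panelists agree (87.5%), so the
# item meets the 85% threshold; otherwise it would be discrepant.
print(reaches_consensus([7, 6, 7, 6, 6, 7, 7, 6, 6, 7, 7, 6, 5, 4, 6, 7]))
```

Note that, as stated above, the rule is one-sided: falling below the threshold marks an item as discrepant rather than triggering a separate test for agreement on exclusion.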

In-Person Group (Round 2 of Delphi)

The Chatham House rule (21) was established at the beginning of the meeting, whereby participants were free to use the information shared during the discussion but were not permitted to reveal the identity or affiliation of any speaker. Expert panel members were given their individual results; the overall group distribution, median, and interquartile range; a summary of the JBI methodological guidance (4); and preliminary feedback from the e-Delphi group. These data were used to generate and inform the discussion about each discrepant item from round 1. Two researchers (A.C.T. and S.E.S.) facilitated the discussion using a modified nominal group technique (22) to reach consensus. Panel members were subsequently asked to rescore the discrepant items using sli.do (23), a live audience-response system, in a format that resembled the round 1 survey. For items that failed to meet the threshold for consensus, working groups were assembled. The meeting was audio-recorded and transcribed using TranscribeMe (24), and 3 note-takers independently documented the main discussion points. The transcript was annotated to complement a master summary of the discussion points, which was compiled using the 3 note-takers' files.

E-Delphi Group (Round 2 of Delphi)

Those who could not attend the in-person meeting participated via an online discussion exercise using Conceptboard (25), a visual collaboration platform that allows users to provide feedback on “whiteboards” in real time. The discrepant items from round 1 were presented as a single whiteboard, and questions (for example, “After reviewing your survey results with respect to this item, please share why you rated this item the way you did”) were assigned to participants as tasks to facilitate the discussion. E-Delphi panel members received the same materials as in-person participants and were encouraged to respond to others' comments and interact through a chat feature. The second round of scoring was done in Qualtrics using a similar format as in round 1. A summary of the Conceptboard discussion, as well as the annotated meeting transcript and master summary document, was shared so that participants could learn about the perspectives of the in-person group before rescoring.

Working Groups and Round 3 of Delphi

To enable panel-wide dialogue and refine the checklist items before the final round of scoring, working groups were created and collaborated by teleconference and e-mail. Their task was to discuss the discrepant items in terms of the key issues and considerations (relating to both concepts and wording) that had been raised in earlier stages across both groups. To harmonize the data from the 2 groups, a third round of scoring was administered using Qualtrics (19). In this step, suggested modifications (in terms of both concepts and wording) from all previous stages were incorporated into the items that had failed to reach consensus in the first 2 rounds across both groups, and the full panel scored this updated list.

Interactive Workshop (Testing)

A workshop led by the lead investigator (A.C.T.) and facilitated by members of the advisory board and expert panel (S.E.S., C.M.G., C.G., T.H., M.T.M., and M.D.J.P.) was held as part of the Global Evidence Summit in Cape Town, South Africa, in September 2017. Participants (researchers, scientists, policymakers, managers, and students) tested the checklist by applying the PRISMA-ScR to a scoping review on a health-related topic (26).

Role of the Funding Source

This work was supported by a grant from the Canadian Institutes of Health Research. The funding source had no role in designing the study; collecting, analyzing, or interpreting the data; writing the manuscript; or deciding to submit it for publication.

Results

Expert Panel

A total of 37 persons were invited to participate, of whom 31 completed round 1 and 24 completed all 3 rounds of scoring. The Figure presents results of the modified Delphi, including the number of items that met agreement at each stage.
Figure. Methods flow chart. PRISMA = Preferred Reporting Items for Systematic reviews and Meta-Analyses; PRISMA-ScR = PRISMA extension for Scoping Reviews.

Round 1 of Delphi

For the in-person group, which involved 16 participants, 9 of 27 items reached agreement. For the discrepant items, agreement ranged from 56% for item 15 (risk of bias) to 81% for items 3 (rationale), 16 (additional analyses), 20 (results of individual sources), and 23 (additional analyses). For the e-Delphi group, which involved 15 participants, 8 of 27 items met the 85% agreement threshold. For the discrepant items, agreement ranged from 40% for item 12 (risk of bias) to 80% for items 3 (rationale), 25 (limitations), and 26 (conclusions).

In-Person Meeting and Round 2 of Delphi

The 16 panel members who attended the in-person meeting in Toronto on 29 November 2016 were largely from North America, although a few were from Australia, Lebanon, or the United Kingdom. Of the 18 discrepant items from round 1, the panel decided to rescore 11 after facilitated discussion. All except item 7 (information sources) reached the 85% threshold for agreement in the rescoring exercise. For the remaining 7 of the 18 discrepant items, the group believed that notable changes were required, which formed the basis of action by the working groups.

E-Delphi Discussion and Round 2 of Delphi

Fifteen panel members, from countries that included Canada, the United Kingdom, Switzerland, Norway, and South Africa, were invited to participate in the online discussion exercise. One dropped out, and 50% of the remainder (7 of 14 panelists) participated in at least 1 discussion on Conceptboard (25). Eleven participants completed the second scoring exercise of the 19 discrepant items, in which 5 items reached 85% agreement.

Working Groups and Round 3 of Delphi

The 6 working groups (with 1 call per group) ranged in size from 3 to 8 participants, with an average of 5 per group. Round 3 of the Delphi did not include the 11 items that reached consensus during round 1 or 2 across both the in-person and e-Delphi groups. The survey focused on the remaining 16 items that failed to reach consensus across both groups to ensure that one group's decisions did not take precedence over the other's.
A total of 27 persons were invited to participate in round 3 of the Delphi—16 from the in-person group and 11 from the e-Delphi group. Overall, 24 of 27 participants completed the final round of scoring, and 3 withdrew (2 from the in-person group and 1 from the e-Delphi). Two of the 16 applicable items—10 (data collection process) and 15 (risk of bias across studies)—failed to meet the 85% agreement threshold. Item 15 was subsequently removed from the checklist (along with its companion item, 22), whereas item 10 was retained but revised to exclude the optional consultation exercise described by Arksey and O'Malley (6) and Levac and colleagues (7), which was the source of the disagreement. Participants decided that the consultation exercise could be considered a knowledge translation activity, which could be done for any type of knowledge synthesis.

Interactive Workshop (Testing)

A total of 30 participants attended an interactive workshop at the Global Evidence Summit in September 2017 in Cape Town, South Africa, where minor revisions were suggested for wording of the items.

PRISMA-ScR Checklist

The Table presents the final checklist, which includes 20 essential items plus 2 optional items. It consists of 10 items that reached agreement in rounds 1 and 2 (items 1, 3, 5, 6, 8, 9, 17, 25, 26, and 27); 9 that were agreed on in round 3 (items 2, 4, 7, 11, 14, 18, 20, 21, and 24); and 1 item (item 10) that was modified for inclusion after the final round. Five items from the original PRISMA checklist were deemed not relevant: item 13 (summary measures), which reached agreement as not applicable to scoping reviews in round 1; items 15 (risk of bias across studies) and 22 (its companion results item), which were excluded after round 3; and items 16 (additional analyses) and 23 (its companion results item), which reached agreement as not applicable to scoping reviews in round 3. The Figure illustrates this process. In addition, because scoping reviews can include many types of evidence (such as documents, blogs, Web sites, studies, interviews, and opinions) and do not examine the risk of bias of included sources, items 12 (risk of bias in individual studies) and 19 (risk of bias within study results) from the original PRISMA are treated as optional in the PRISMA-ScR.
Table. PRISMA-ScR Checklist

PRISMA-ScR Explanation and Elaboration

The Appendix elaborates on the PRISMA-ScR checklist items. It defines each item and gives examples of good reporting from existing scoping reviews to provide authors with additional guidance on how to use the PRISMA-ScR.

Discussion

The PRISMA-ScR is intended to provide guidance on the reporting of scoping reviews. To develop this PRISMA extension, the original PRISMA statement was adapted and the following revisions were made: 5 items were removed (because they were deemed not relevant to scoping reviews), 2 items were deemed optional, and the wording was modified for all items. This reporting guideline is consistent with the JBI guidance for scoping reviews, which highlights the importance of methodological rigor in the conduct of scoping reviews. We hope that the PRISMA-ScR will improve the reporting of scoping reviews and increase their relevance for decision making; future evaluation of adherence to the guideline will be critical to measuring its impact.
The PRISMA-ScR will be housed on the EQUATOR Network's Web site and the Knowledge Translation Program Web site of St. Michael's Hospital (27). To promote its uptake, a 1-minute YouTube video will be created outlining how to operationalize each item, webinars for organizations that do scoping reviews will be offered, and a 1-page tip sheet for each item will be created. In the future, an automated e-mail system may be considered, whereby authors would be sent the PRISMA-ScR upon registering their scoping review, as well as an online tool similar to Penelope, which verifies manuscripts for completeness and provides feedback to authors (28). The PRISMA-ScR will be shared widely within our networks, including the Alliance for Health Policy and Systems Research, the World Health Organization (29), and the Global Evidence Synthesis Initiative (30). Finally, ongoing feedback and suggestions to improve uptake of the PRISMA-ScR will be collected via an online form on the Web site for the Knowledge Translation Program of St. Michael's Hospital (27).

Appendix: PRISMA Extension for Scoping Reviews (PRISMA-ScR): Explanation and Elaboration

This explanation and elaboration document presents key examples from the literature with explanations for best practices for reporting each PRISMA-ScR item.

Title and Abstract

Item 1: Title

Identify the report as a scoping review.
Example
Screening of cognitive impairment in the dialysis population: a scoping review. (31)
Explanation and Elaboration
To easily identify the article as a scoping review, the title should include the term “scoping review.” The JBI guidance states that the title should reflect the key elements that inform the eligibility criteria of the scoping review, such as the population, concept, and context (4, 5).

Item 2: Structured Summary

Provide a structured summary that includes (as applicable) background, objectives, eligibility criteria, sources of evidence, charting methods, results, and conclusions that relate to the review questions and objectives.
Example
Background. Among circumpolar populations, recent research has documented a significant increase in risk factors which are commonly associated with chronic disease, notably obesity.
Objective. The present study undertakes a scoping review of research on obesity in the circumpolar Inuit to determine the extent obesity research has been undertaken, how well all subpopulations and geographic areas are represented, the methodologies used and whether they are sufficient in describing risk factors, and the prevalence and health outcomes associated with obesity.
Design. Online databases were used to identify papers published 1992–2011, from which we selected 38 publications from Canada, the United States, and Greenland that used obesity as a primary or secondary outcome variable in 30 or more non-pregnant Inuit…participants aged 2 years or older.
Results. The majority of publications (92%) reported cross-sectional studies while 8% examined retrospective cohorts. All but one of the studies collected measured data. Overall 84% of the publications examined obesity in adults. Those examining obesity in children focused on early childhood or adolescence. While most (66%) reported 1 or more anthropometric indices, none incorporated direct measures of adiposity. Evaluated using a customized quality assessment instrument, 26% of studies achieved an “A” quality ranking, while 18 and 39% achieved quality rankings of “B” and “C”, respectively.
Conclusions. While the quality of studies is generally high, research on obesity among Inuit would benefit from careful selection of methods and reference standards, direct measures of adiposity in adults and children, studies of preadolescent children, and prospective cohort studies linking early childhood exposures with obesity outcomes throughout childhood and adolescence. (32)
Explanation and Elaboration
The structured summary should concisely describe the aims, methods, findings, and conclusions of the scoping review so that they can be easily identified by knowledge users (such as policymakers, health care providers, health care managers, and patients or consumers), funding agencies, and researchers (33–35). The summary will be the only information available to some readers, who will rely on it to decide whether to read the full text, so details should be clearly reported. It is also useful for the purposes of literature searching and retrieval (33–35).
The summary elements are listed “as applicable” to indicate that authors should include only those that are relevant to their scoping review and the journal requirements, such as the background or context of the issue under study, objectives or purpose of the review, eligibility criteria, selection process for sources of evidence, eligible sources of evidence, data charting methods, main findings or results, and conclusions as they relate to the review questions and objectives. Where applicable, authors should include information about other elements that are not listed (such as funding and registration number). In contrast with the original PRISMA structured summary item, “synthesis methods” has been replaced with “charting methods” (the more appropriate terminology and approach for scoping reviews) and “limitations” is omitted. Although the limitations associated with the conduct of the scoping review should be outlined, risk of bias and methodological quality are generally not appraised.

Introduction

Item 3: Rationale

Describe the rationale for the review in the context of what is already known. Explain why the review questions or objectives lend themselves to a scoping review approach.
Example
The support of the social environment is equally important: parents, peers, teachers, community-members, and friends. Parents, in particular, greatly influence participation at school, at home and in the community. They undertake many actions to improve their children's participation in daily life. Understanding the actions of parents and also their challenges and needs will contribute to how society can support these parents and thereby enable the participation of children with physical disabilities. Pediatric rehabilitation, aiming for optimal participation, could benefit from this understanding to improve Family-centered services (FCS). In FCS, the family is seen as an expert on the child's abilities and needs, and professionals work in partnership with the family. Pediatric rehabilitation considers FCS as a way to increase participation of children with a physical disability in daily life.
However, it is unclear what kind of information is available in literature about what parents live through, do, and what kind of problems and needs they have in supporting their child's participation? For these reasons, a scoping review was conducted in order to systematically map the research done in this area, as well as to identify any existing gaps in knowledge. (36)
Explanation and Elaboration
The background of a scoping review should be comprehensive and cover the main topic elements, important definitions, and existing knowledge in the field (4, 5). When reporting the rationale, researchers should situate their work appropriately (that is, in the context of what is already known on the topic or research question) and clearly explain why they chose the scoping review method given the many types of knowledge synthesis available (11).

Item 4: Objectives

Provide an explicit statement of the questions and objectives being addressed with reference to their key elements (for example, population or participants, concepts, and context) or other relevant key elements used to conceptualize the review questions or objectives.
Example
…a scoping review was conducted in order to systematically map the research done in this area, as well as to identify any existing gaps in knowledge.…The following research question was formulated: What is known from the literature about parents' action, challenges, and needs while enabling participation of their children with a physical disability? (36)
Explanation and Elaboration
Authors should include a clear and explicit statement of the overall objectives and research questions that they will address in their scoping review. These should be articulated in terms of the key elements, which relate to the review's eligibility criteria. The language of this item (“other relevant key elements used to conceptualize the review questions…”) aims to be inclusive of the many approaches authors can use to develop research questions, including (but not limited to) the PICO (population, intervention, comparison, outcome) (37), SPICE (setting, population/perspective, intervention, comparison, evaluation) (38), or PCC (population, concept, context) (5, 39) frameworks. Regardless of how the objectives and questions were conceptualized, the main components guiding the inquiry should be clearly stated (6, 7).

Methods

Item 5: Protocol and Registration

Indicate whether a review protocol exists; state if and where it can be accessed (for example, a Web address); and if available, provide registration information, including the registration number.
Example
Our protocol was drafted using the Preferred Reporting Items for Systematic Reviews and Meta-analysis Protocols (PRISMA-P…), which was revised by the research team and members of Health Canada, and was disseminated through our programme's Twitter account (@KTCanada) and newsletter to solicit additional feedback. The final protocol was registered prospectively with the Open Science Framework on 6 September 2016 (https://osf.io/kv9hu/). (40)
Explanation and Elaboration
A protocol should be developed a priori, and it is important to include information about the protocol in the scoping review (4, 5). To ensure transparency and reduce duplication of work, the protocol should ideally be registered (for example, with the Open Science Framework [41]), and authors may wish to publish it in a journal (such as Systematic Reviews [42], JBI Database of Systematic Reviews and Implementation Reports [43], or BMJ Open [44]). If the protocol is not publicly available, details about how to access it (for example, on request from the corresponding author) should be provided. If the scoping review is an update of an existing review, the original scoping review should be cited.

Item 6: Eligibility Criteria

Specify characteristics of the sources of evidence used as eligibility criteria (for example, years considered, language, and publication status), and provide a rationale.
Example
…to be included in the review, papers needed to measure or focus on specific dimensions of treatment burden, developed in the conceptual framework (e.g. financial, medication, administrative, lifestyle, healthcare and time/travel). Peer-reviewed journal papers were included if they were: published between the period of 2000–2016, written in English, involved human participants and described a measure for burden of treatment, e.g. including single measurements, measuring and/or incorporating one or two dimensions of burden of treatment. Quantitative, qualitative and mixed-method studies were included in order to consider different aspects of measuring treatment burden. Papers were excluded if they did not fit into the conceptual framework of the study, focused on a communicable chronic condition, for example human immunodeficiency virus infection and acquired immune deficiency syndrome (HIV/AIDS) or substance abuse. Papers talking about carer burden, in addition to patient burden of treatment, were also included. (45)
Explanation and Elaboration
Inclusion criteria should be provided to allow the reader to understand the types of evidence sources included in the review. The rationale for each inclusion criterion should be clearly described. If limits were applied by year, language, publication status, or other characteristics, authors should specify them and provide a rationale for each.

Item 7: Information Sources

Describe all information sources in the search (for example, databases with dates of coverage and contact with authors to identify additional sources), as well as the date the most recent search was executed.
Example
To identify potentially relevant documents, the following bibliographic databases were searched from 2004 to June 2015: MEDLINE, EMBASE, LexisNexis Academic, the Legal Scholarship Network, Justis, LegalTrac, QuickLaw, and HeinOnline. The search strategies were drafted by an experienced librarian [name] and further refined through team discussion. The final search strategy for MEDLINE can be found in Additional file 3. The final search results were exported into EndNote, and duplicates were removed by a library technician. The electronic database search was supplemented by searching the Canadian Medical Protective Association website (https://www.cmpa-acpm.ca/en) and scanning relevant reviews. (46)
Explanation and Elaboration
A comprehensive literature search should be done for a scoping review, and it may include both published and difficult to locate or unpublished (sometimes called “gray”) literature (4, 5). Not all scoping reviews will include gray literature (depending on the specific research question and objectives), but if done, the information sources should be reported. The date of the most recent literature search is important to include because it allows the reader to judge how current the scoping review is. Details should be provided if the search was supplemented through various approaches, such as contacting authors to identify additional relevant material, hand-searching key journals, and scanning reference lists of included or relevant sources of evidence.

Item 8: Search

Present the full electronic search strategy for at least 1 database, including any limits used, such that it could be repeated.
Example
The final search strategy for MEDLINE can be found in Additional file 3.…
…Medline Search Strategy (Literature Search performed: June 15, 2015)
1. Obstetrics/
2. “Obstetrics and Gynecology Department, Hospital”/
3. exp Obstetric Surgical Procedures/
4. obstetric$.tw,hw.
5. exp Obstetric Labor Complications/
6. exp “Dilatation and Curettage”/
7. exp Hysterectomy/
8. Sterilization, Tubal/
9. Salpingostomy/
10. exp Pregnancy Complications/
11. cerebral palsy/
12. Asphyxia Neonatorum/
13. (abortion$ or cervical cerclage or colpotomy or culdoscop$ or fetoscop$ or hysteroscop$ or hysterotomy).tw.
14. (paracervical block$ or obstetric$ anesthe$ or obstetric$ anaesthe$).tw.
15. (Cesarean or Episiotom$ or obstetric$ abstraction$ or fetal version).tw.
16. ((induc$ or augmentation or premature or pre-term or preterm or obstructed) adj (labour or labor)).tw.
17. (Abruptio Placentae or breech or Cephalopelvic Disproportion or premature rupture of fetal membrane$ or prom or fetal membranes premature rupture or Dystocia or Uterine Inertia or Chorioamnionitis or Placenta Accreta or Placenta Previa or Postpartum Hemorrhage or Uterine Inversion or Uterine Rupture or Vasa Previa).tw.
18. (Fetal Death or Fetal Resorption or Stillbirth or perinatal death or peri-natal death or Maternal Death or Birth Injuri$ or obstetric$ paralys$).tw.
19. (pre-eclampsia or dilatation or Curettage or Vacuum aspiration).tw.
20. (asphyxia neonatorum or cerebral palsy or birth asphyxia or fetal pulmonary embolism or dystocia).tw.
21. exp Dystocia/ or exp Pregnancy Complications, Cardiovascular/
22. or/1-21
23. exp Medical Errors/
24. ae.fs.
25. (error$ or advers$ or mistake$ or negligence).tw.
26. or/23-25
27. 22 and 26
28. exp Malpractice/
29. Expert Testimony/
30. (reforms or tort reform$ or damage award limit$ or lawsuit$ or immunity provision$).tw.
31. (immunity provision$ or immunity clause$ or fault compensation or Malpractice or expert witness$).tw.
32. (statutes adj2 limitations).tw.
33. lj.fs.
34. exp Jurisprudence/
35. or/28-34
36. 27 and 35
37. Limit 36 to yr = 2004-current
38. Limit 37 to English (46)
Explanation and Elaboration
The literature search strategy should be reported in a manner that allows easy replication by others and should be presented in its entirety in the text, a table, or an appendix. Additional details to report include the person who did the literature search (for example, an experienced librarian or information specialist) and whether it was peer-reviewed by another librarian using the Peer Review of Electronic Search Strategies (PRESS) checklist, a set of recommendations for librarians and other information specialists to use when evaluating electronic search strategies (47). The full search strategy should be provided for at least 1 electronic database, and if gray literature was searched as part of the scoping review, a detailed account of the approach should be documented. For example, “Grey Matters: a practical tool for searching health-related grey literature” (created by the Canadian Agency for Drugs and Technologies in Health) outlines an approach for searching gray literature (48). Any search limitations, such as language, date of publication, and study design filters, should be clearly documented with a rationale provided.

Item 9: Selection of Sources of Evidence

State the process for selecting sources of evidence (that is, screening and eligibility) included in the scoping review.
Example
To increase consistency among reviewers, all reviewers screened the same 50 publications, discussed the results and amended the screening and data extraction manual before beginning screening for this review. Nine reviewers working in pairs sequentially evaluated the titles, abstracts and then full text of all publications identified by our searches for potentially relevant publications.… We resolved disagreements on study selection and data extraction by consensus and discussion with other reviewers if needed. (49)
Explanation and Elaboration
A narrative description of the selection process for included sources of evidence should be provided. When reporting this item, authors should include information about the process for developing the form that was used to guide the selection of sources of evidence (that is, how the items were selected and which software was used), calibration exercises or pilot-testing (testing the form among some or all team members to refine it and ensure that all relevant data were captured), full screening process (how many reviewers participated and whether they screened independently and compared answers or 1 or more researchers screened and 1 or more researchers verified the screening for accuracy), and how inconsistencies or disagreements were resolved (for example, by involvement of a third party). Calibration exercise details should include the number of persons who tested the form (using x number of citations and full-text articles), the process for resolving inconsistencies, and key changes that were made and why. The Box shows further details on the calibration exercise. If applicable, all processes for obtaining and confirming data from investigators should be described.
Box. Implementation of calibration exercises (items 9 and 10).

Item 10: Data Charting Process

Describe the methods of charting data from the included sources of evidence (for example, calibrated forms or forms that have been tested by the team before their use, and whether data charting was done independently or in duplicate) and any processes for obtaining and confirming data from investigators.
Examples
A data-charting form was jointly developed by two reviewers to determine which variables to extract. The two reviewers independently charted the data, discussed the results and continuously updated the data-charting form in an iterative process. (50)
Data from eligible studies were [charted] using a standardized data abstraction tool designed for this study. The tool captured the relevant information on key study characteristics and detailed information on all metrics used to estimate/describe [child] growth based on at least two data points per child/group (even though our tool can accommodate metrics based on cross-sectional analyses) anywhere in the article, including metrics that were mentioned in the narrative yet for which results were not shown.…
Two reviewers independently [charted] data from each eligible article. Any disagreements were resolved through discussion between the two reviewers or further adjudication by a third reviewer. Data [charting] was implemented using REDCap, a customizable informatics systems-based web software. (51)
Explanation and Elaboration
In the frameworks by Arksey and O'Malley (6) and Levac and colleagues (7), as well as the JBI guidance (4, 5), the process of data extraction in a scoping review is called “data charting” and involves the use of a clear and comprehensive data charting form to extract the relevant information from the included sources of evidence. When reporting this step, authors should include information about the process for developing the charting form (that is, how the items were selected and which software was used), calibration (testing the form among some or all team members to refine it and ensure that all relevant data were captured), full data charting process (how many reviewers participated and whether they charted independently and compared answers or 1 or more researchers charted and 1 or more researchers verified the data for accuracy), and how inconsistencies or disagreements were resolved (for example, through discussion or involvement of a third party). Calibration exercise details should include the number of persons who tested the form (using x number of included sources), as well as the process for resolving inconsistencies, and key changes that were made and why (Box). If the charting process was iterative (that is, the form was continually updated), authors should describe the main revisions with a rationale, to increase transparency of reporting. If applicable, the processes used for obtaining and confirming data from investigators of the included sources of evidence should be described.

Item 11: Data Items

List and define all variables for which data were sought and any assumptions and simplifications made.
Example
We abstracted data on article characteristics (e.g., country of origin, funder), engagement characteristics and contextual factors (e.g., type of knowledge user, country income level, type of engagement activity, frequency and intensity of engagement, use of a framework to inform the intervention), barriers and facilitators to engagement, and results of any formal assessment of engagement (e.g., attitudes, beliefs, knowledge, benefits, unintended consequences). (52)
Explanation and Elaboration
The specific data items (whether qualitative or quantitative) collected for the scoping review will vary according to the review's focus. If any items involve interpretation, this should be reported. The final version of the charting form, including clear definitions of each item, should be included (if possible) in the scoping review as an appendix or supplementary file.

Item 12 (Optional): Critical Appraisal of Individual Sources of Evidence

If done, provide a rationale for conducting a critical appraisal of included sources of evidence; describe the methods used and how this information was used in any data synthesis (if appropriate).
Example
…an in-depth assessment of the conduct of the knowledge synthesis approaches underlying the NMA [network meta-analysis] is lacking. As such, we aimed to explore the characteristics and methodological quality of knowledge synthesis approaches of NMAs. We also aimed to assess the statistical methods applied using the Analysis subdomain of the ISPOR checklist.…
The quality of the knowledge synthesis methods was appraised using the AMSTAR tool. The AMSTAR tool was created and validated to assess the methodological quality of systematic reviews of RCTs. The tool measures overall quality, where a score of 8 or higher is considered high quality, 4 to 7 is moderate quality, and 0 to 3 is low quality. Information for quality assessment was incorporated into the data extraction form, which was pilot-tested on a random sample of seven included articles that ranged from low to high quality.
To appraise the validity of the analytical methods applied, we used the 6-item Analysis subdomain of the ISPOR checklist for NMAs. To ensure high inter-rater agreement, a workshop on the tool was held with the team and two pilot-tests were conducted on a random sample of seven included NMAs. Each pilot-test consisted of a facilitated team meeting for feedback and discussion on discrepant items. Upon completion of the pilot-tests, pairs of reviewers (A.A.V., W.Z., J.A., S.S., P.R., C.D., J.E.) independently assessed the first 215 included articles. The remaining 241 included articles were assessed by one reviewer (M.P.) and verified by a second reviewer (A.V., S.S.). All discrepancies were resolved by a third reviewer (W.Z., A.A.V.). ISPOR items that were not applicable to open loop networks (related terms include without a closed-loop, star-shaped network, and tree-shaped network) were scored as “not applicable.” Items related to heterogeneity were also not applicable to NMAs that used a fixed-effect model and provided a rationale for selecting this model. (53)
Explanation and Elaboration
For both this item and item 19, we adopted the term “critical appraisal” (instead of “risk of bias”) to include and acknowledge the various sources of evidence that may be used in a scoping review (such as quantitative and qualitative research, policy documents, and expert opinions). A key difference between scoping reviews and systematic reviews is that the former are generally conducted to provide an overview of the existing evidence regardless of methodological quality or risk of bias (4, 5). Therefore, the included sources of evidence are typically not critically appraised for scoping reviews. When individual sources of evidence are assessed for methodological quality or risk of bias, authors must provide a clear explanation of how the appraisal aligns with the review objectives (that is, a rationale), along with a description of the methodological approach (such as tools used and process followed, including number of reviewers, calibration, and so forth) and how the findings were used. This is an optional step, so authors are expected to report the rationale and methods only if an appraisal was done.

Item 13 (Not Applicable): Summary Measures

This item from the original PRISMA is not applicable for scoping reviews because a meta-analysis is not done (that is, summary measures are not relevant).

Item 14: Synthesis of Results

Describe the methods of handling and summarizing the data that were charted.
Example
We grouped the studies by the types of behavior they analyzed, and summarized the type of settings, populations and study designs for each group, along with the measures used and broad findings. Where we identified a systematic review, we counted the number of studies included in the review that potentially met our inclusion criteria and noted how many studies had been missed by our search. (54)
Explanation and Elaboration
The aim of the synthesis is to present the range of evidence that was identified to answer the review question or meet the objectives of the scoping review (4, 5). Authors should clearly describe how the evidence will be presented: in a narrative format, table, or visual representation, including a map or diagram.

Item 15 (Not Applicable): Risk of Bias Across Studies

This item from the original PRISMA is not applicable for scoping reviews because the scoping review method is not intended to be used to critically appraise (or appraise the risk of bias of) a cumulative body of evidence.

Item 16 (Not Applicable): Additional Analyses

This item from the original PRISMA is not applicable for scoping reviews because additional analyses, including sensitivity or subgroup analyses and meta-regression, are not done.

Results

Item 17: Selection of Sources of Evidence

Give numbers of sources of evidence screened, assessed for eligibility, and included in the review, with reasons for exclusions at each stage, ideally using a flow diagram.
Example
After duplicates were removed, a total of 883 citations were identified from searches of electronic databases and review article references. Based on the title and the abstract, 699 were excluded, with 184 full text articles to be retrieved and assessed for eligibility. Of these, 144 were excluded for the following reasons: 23 did not directly quantify the effects of climate change, 53 did not directly quantify effects on human health, and 67 were not considered to be original quantitative research (e.g., review articles, commentaries). We excluded 1 study because we were unable to retrieve it. The remaining 40 studies were considered eligible for this review. (55)
Explanation and Elaboration
The results of the literature search should be reported, including numbers of citations screened, duplicates removed, and full-text documents screened. Consistent with the original PRISMA statement (10), we recommend including a flow diagram that details the reasons for exclusion at the full-text level of screening at a minimum (Appendix Figure 1).
Appendix Figure 1. Example of item 17, selection of sources of evidence.
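The counts reported at each stage of a flow diagram should be internally consistent. The following is a minimal sketch (not part of the PRISMA-ScR guidance; the function name and numbers are illustrative) of tallying screening counts the way the example above reports them:

```python
# Minimal sketch: tallying counts for a PRISMA-style flow diagram and
# checking that they are internally consistent before reporting.
# All numbers below are illustrative only.

def flow_counts(identified, duplicates, title_abstract_excluded, fulltext_exclusions):
    """Return the counts usually reported at each screening stage.

    fulltext_exclusions: dict mapping an exclusion reason to a count.
    """
    screened = identified - duplicates
    fulltext_assessed = screened - title_abstract_excluded
    excluded_fulltext = sum(fulltext_exclusions.values())
    included = fulltext_assessed - excluded_fulltext
    return {
        "records screened": screened,
        "full text assessed": fulltext_assessed,
        "excluded at full text": excluded_fulltext,
        "sources included": included,
    }

counts = flow_counts(
    identified=903,            # hypothetical: records from all sources
    duplicates=20,
    title_abstract_excluded=699,
    fulltext_exclusions={
        "did not quantify effects of climate change": 23,
        "did not quantify effects on human health": 53,
        "not original quantitative research": 67,
        "could not be retrieved": 1,
    },
)
print(counts["sources included"])  # 40
```

Because each stage is derived from the previous one, any reason-for-exclusion counts that do not sum to the reported total surface immediately as a mismatch in the final figure.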

Item 18: Characteristics of Sources of Evidence

For each source of evidence, present characteristics for which data were charted and provide the citations.
Example
The modules [of the e-recovery interventions] are described in Table 2, together with a description of aim, target group and setting for each intervention.…
The studies' place of origin, aims, design, methods, measures and outcomes, and main findings related to each intervention are presented in Table 3 Study characteristics. The number of studies available per interventions varied from one to six. (56)
Explanation and Elaboration
The characteristics of interest of each source of evidence should be presented, along with their references. An overall summary can be reported in the text, with characteristics for the individual sources of evidence provided in tables and appendixes, as appropriate.

Item 19 (Optional): Critical Appraisal Within Sources of Evidence

If done, present data on critical appraisal of included sources of evidence (see item 12).
Example
Appendix Figure 2. Example of item 19, critical appraisal within sources of evidence. From the same example presented in item 12. Adapted from reference 53. AMSTAR = A Measurement Tool to Assess Systematic Reviews.
Explanation and Elaboration
As explained in item 12, although critical appraisal of individual sources of evidence falls outside the realm of scoping review methodology (4, 5), it may be done if relevant to the scoping review objectives. If done, authors must report the data (that is, the critical appraisal findings) for each included source of evidence in a manner that corresponds to the approach described in the methods (see item 12). Because this step is optional, authors are expected to report the results only if an appraisal was done.

Item 20: Results of Individual Sources of Evidence

For each included source of evidence, present the relevant data that were charted that relate to the review questions and objectives.
Example
Appendix Figure 3. Example of item 20, results of individual sources of evidence. Scoping review included articles. Adapted from reference 57.
Explanation and Elaboration
As noted in item 11, the specific data items that were charted will vary according to the specific questions and objectives of the scoping review and may not necessarily include the outcome results of the included sources of evidence. Depending on the number of sources included in the scoping review, the relevant data from each source can be provided in an appendix or supplementary file.

Item 21: Synthesis of Results

Summarize or present the charting results as they relate to the review questions and objectives.
Example
Active Travel and Physical Activity
Ninety-two studies examined associations between active travel and physical activity [references]. The majority were from the UK (n = 24) and USA (n = 19), followed by Australia (n = 12), Canada (n = 7), Denmark (n = 6) and New Zealand (n = 5). Other countries with less than five studies included: Norway, Netherlands, Belgium, Switzerland, Spain, Portugal, Estonia, Germany, Sweden, and Ireland. The majority were conducted among children (70%, n = 64), including 8 studies that included only children under 10 years old. Only two studies reported analyses of only adults over 65 [references].
The vast majority of studies used cross-sectional analyses—only six studies out of 92 (6.5%) reported results from longitudinal or pre/post analysis to examine associations between active travel and physical activity [references]. Just over half of the studies (n = 48) used objective measures (e.g., accelerometer, pedometer) to assess physical activity.
Overall, most studies (n = 72; 78%, representing 75% of the children's and 86% of the adults' studies) reported a positive association between active travel and physical activity; however many of these (n = 32) reported mixed results overall (e.g., when using more than one measure of physical activity, or in sub-analyses such as for gender). Of the 20 studies that reported no association, 12 used objective measures to assess physical activity. The average numbers of participants in these 20 studies were much lower than in studies which did report an association. This may be indicative of insufficient power to find associations. (54)
Explanation and Elaboration
Results may be presented as a “map” of the data in the form of a diagram or table (Appendix Figure 4) or in a descriptive format, whichever aligns best with the review's objectives (4, 5).
Appendix Figure 4. Example of item 21, synthesis of results. Description of included studies in the scoping review of bivariate analyses of health and environmental behaviors. Adapted from reference 54. AT/PA = active transportation and physical activity.
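Summary tabulations like those in the example above (counts of studies by country, percentage conducted among children) can be produced directly from the charted data. A minimal sketch, with hypothetical records and field names:

```python
# Minimal sketch (not from the PRISMA-ScR guidance): summarizing charted
# data for a synthesis like the one in the example above. Each record is
# one charted source of evidence; the field names are hypothetical.
from collections import Counter

charted = [
    {"country": "UK", "population": "children", "design": "cross-sectional"},
    {"country": "USA", "population": "adults", "design": "longitudinal"},
    {"country": "UK", "population": "children", "design": "cross-sectional"},
    {"country": "Canada", "population": "children", "design": "cross-sectional"},
]

by_country = Counter(s["country"] for s in charted)
n = len(charted)
pct_children = 100 * sum(s["population"] == "children" for s in charted) / n

print(by_country.most_common())  # countries ranked by number of studies
print(f"{pct_children:.0f}% of studies were conducted among children")
```

Deriving the narrative counts from the charting form in this way keeps the synthesis reproducible: rerunning the tabulation after any revision to the charted data updates every reported count and percentage consistently.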

Item 22 (Not Applicable): Risk of Bias Across Studies

This item is not applicable for scoping reviews. See explanation for item 15.

Item 23 (Not Applicable): Additional Analyses

This item is not applicable for scoping reviews. See explanation for item 16.

Discussion and Funding

Item 24: Summary of Evidence

Summarize the main results (including an overview of concepts, themes, and types of evidence available), link to the review questions and objectives, and consider the relevance to key groups.
Example
In this scoping review we identified 88 primary studies addressing dissemination and implementation research across various settings of dementia care published between 1998 and 2015. Our findings indicate a paucity of research focusing specifically on dissemination of knowledge within dementia care and a limited number of studies on implementation in this area. We also found that training and educating professionals, developing stakeholder interrelationships, and using evaluative and iterative strategies are frequently employed to introduce and promote change in practice. However, although important and feasible, these strategies only partly address what is repeatedly highlighted in the evidence base: that organisational factors are reported as the main barrier to implementation of knowledge within dementia care. Moreover, included studies clearly support an increased effort to improve the quality of dementia care provided in residential settings in the last decade. (26)
Explanation and Elaboration
The main findings should be summarized and linked to the review questions and objectives (6, 7). The data charting results can be elaborated for or tailored to each relevant knowledge user group, such as policymakers, health care providers, and patients or consumers.

Item 25: Limitations

Discuss the limitations of the scoping review process.
Example
Our scoping review has some limitations. To make our review more feasible, we were only able to include a random sample of rapid reviews from websites of rapid review producers. Further adding to this issue is that many rapid reviews contain proprietary information and are not publicly available. As such, our results are only likely generalizable to rapid reviews that are publicly available. Furthermore, this scoping review was an enormous undertaking and our results are only up to date as of May 2013. (58)
Explanation and Elaboration
Because a critical appraisal is optional for scoping reviews, reporting of this item should focus on limitations of the scoping review process (vs. limitations of the included sources of evidence). Any deviations from guidance (for example, the JBI methods guidance [4, 5]) or the scoping review protocol should be noted, along with a rationale and a reflection on the potential effect on the results.

Item 26: Conclusions

Provide a general interpretation of the results with respect to the review questions and objectives, as well as potential implications or next steps.
Example
The lack of evidence to support physiotherapy interventions for this population appears to pose a challenge to physiotherapists. The aim of this scoping review was to identify gaps in the literature which may guide a future systematic review. However, the lack of evidence found means that undertaking a systematic review is not appropriate or necessary.
Evidence is insufficient to guide the nature of the physiotherapy intervention. There is also limited evidence to describe the experiences of patients, next of kin, or physiotherapists working with this population. The consideration of the attitudes towards an intervention could be considered a vital component of a complex intervention and it is suggested that they should be an integral part of the implementation of that intervention. This advocates high quality research being needed to determine what physiotherapy techniques may be of benefit for this population and to help guide physiotherapists as how to deliver this. (59)
Explanation and Elaboration
The charting results should be discussed in relation to current literature, practice, and policy (4, 5). The potential implications of the scoping review should be discussed. Recommendations for future research, including a more focused systematic review that builds from the scoping review results, can be mentioned if appropriate. Most scoping reviews aim to summarize what has been done previously and carry out data charting and do not perform a formal appraisal or synthesis (1). As such, recommendations for practice and policy will not be relevant for most scoping reviews (4, 5). The interpretation of results should link to the review questions and objectives, as initially specified.

Item 27: Funding

Describe sources of funding for the included sources of evidence, as well as sources of funding for the scoping review. Describe the role of the funders of the scoping review.
Example
This paper was funded by Stichting Innovatie Alliantie (PRO-3-36) (http://www.regieorgaan-sia.nl) and Zuyd University of Applied Sciences. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. (50)
Explanation and Elaboration
The source of funding for the included sources of evidence should be documented, along with the source of funding for the scoping review. In addition, the role of the scoping review's funding organization should be described. Many journals require the grant or contract number for the source of funding, which can be provided if applicable.

Supplemental Material

Supplement. PRISMA-ScR Round 1 Survey (With Information Sheet)

References

1.
Tricco AC, Lillie E, Zarin W, O'Brien K, Colquhoun H, Kastner M, et al. A scoping review on the conduct and reporting of scoping reviews. BMC Med Res Methodol. 2016;16:15. [PMID: 26857112]  doi: 10.1186/s12874-016-0116-4
2.
Canadian Institutes of Health Research. A guide to knowledge synthesis: a knowledge synthesis chapter. 2010. Accessed at www.cihr-irsc.gc.ca/e/41382.html on 10 January 2018.
3.
Colquhoun HL, Levac D, O'Brien KK, Straus S, Tricco AC, Perrier L, et al. Scoping reviews: time for clarity in definition, methods, and reporting. J Clin Epidemiol. 2014;67:1291-4. [PMID: 25034198]  doi: 10.1016/j.jclinepi.2014.03.013
4.
Peters MD, Godfrey CM, Khalil H, McInerney P, Parker D, Soares CB. Guidance for conducting systematic scoping reviews. Int J Evid Based Healthc. 2015;13:141-6. [PMID: 26134548]  doi: 10.1097/XEB.0000000000000050
5.
Peters MDJ, Godfrey C, McInerney P, Baldini Soares C, Khalil H, Parker D. Scoping reviews. In: Aromataris E, Munn Z, eds. Joanna Briggs Institute Reviewer's Manual. Adelaide, Australia: Joanna Briggs Institute; 2017.
6.
Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8:19-32.
7.
Levac D, Colquhoun H, O'Brien KK. Scoping studies: advancing the methodology. Implement Sci. 2010;5:69. [PMID: 20854677]  doi: 10.1186/1748-5908-5-69
8.
Altman DG, Simera I. Using reporting guidelines effectively to ensure good reporting of health research. In: Moher D, Altman D, Schulz K, Simera I, Wager E, eds. Guidelines for Reporting Health Research: A User's Manual. Hoboken, NJ: J Wiley; 2014:32-40.
9.
Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med. 2010;7:e1000217. [PMID: 20169112]  doi: 10.1371/journal.pmed.1000217
10.
Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151:264-9. [PMID: 19622511]  doi: 10.7326/0003-4819-151-4-200908180-00135
11.
Tricco AC, Zarin W, Ghassemi M, Nincic V, Lillie E, Page MJ, et al. Same family, different species: methodological conduct and quality varies according to purpose for five types of knowledge synthesis. J Clin Epidemiol. 2018;96:133-42. [PMID: 29103958]  doi: 10.1016/j.jclinepi.2017.10.014
12.
McInnes MD, Bossuyt PM. Pitfalls of systematic reviews and meta-analyses in imaging research. Radiology. 2015;277:13-21. [PMID: 26402491]  doi: 10.1148/radiol.2015142779
13.
Macaskill P, Gatsonis C, Deeks JJ, Harbord RM, Takwoingi Y. Analysing and presenting results. In: Deeks JJ, Bossuyt PM, Gatsonis C, eds. Cochrane Handbook for Systematic Reviews of Diagnostic Test Accuracy. Version 1.0. London: The Cochrane Collaboration; 2010. Accessed at https://methods.cochrane.org/sites/methods.cochrane.org.sdt/files/public/uploads/Chapter%2010%20-%20Version%201.0.pdf on 3 August 2018.
14.
Whiting PF, Rutjes AW, Westwood ME, Mallett S, Deeks JJ, Reitsma JB; QUADAS-2 Group. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011;155:529-36. [PMID: 22007046]  doi: 10.7326/0003-4819-155-8-201110180-00009
15.
Schmucker C, Motschall E, Antes G, Meerpohl JJ. [Methods of evidence mapping. A systematic review]. Bundesgesundheitsblatt Gesundheitsforschung Gesundheitsschutz. 2013;56:1390-7. [PMID: 23978984]  doi: 10.1007/s00103-013-1818-y
16.
Miake-Lye IM, Hempel S, Shanman R, Shekelle PG. What is an evidence map? A systematic review of published evidence maps and their definitions, methods, and products. Syst Rev. 2016;5:28. [PMID: 26864942]  doi: 10.1186/s13643-016-0204-x
17.
EQUATOR Network. Preferred Reporting Items for Systematic Reviews and Meta-Analysis extension for Scoping Reviews (PRISMA-ScR). 2017. Accessed at www.equator-network.org/library/reporting-guidelines-under-development/#55 on 10 January 2018.
18.
PRISMA. Extensions in development: Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). 2015. Accessed at www.prisma-statement.org/Extensions/InDevelopment.aspx on 10 January 2018.
19.
Qualtrics. 2018. Accessed at www.qualtrics.com/uk on 10 January 2018.
20.
O'Brien KK, Colquhoun H, Levac D, Baxter L, Tricco AC, Straus S, et al. Advancing scoping study methodology: a web-based survey and consultation of perceptions on terminology, definition and methodological steps. BMC Health Serv Res. 2016;16:305. [PMID: 27461419]  doi: 10.1186/s12913-016-1579-z
21.
The Royal Institute of International Affairs. Chatham House Rule. 2018. Accessed at www.chathamhouse.org/chatham-house-rule on 14 June 2018.
22.
Jones J, Hunter D. Consensus methods for medical and health services research. BMJ. 1995;311:376-80. [PMID: 7640549]
23.
Sli.do. 2012. Accessed at www.sli.do on 10 January 2018.
24.
TranscribeMe. 2018. Accessed at https://transcribeme.com on 27 February 2018.
25.
Conceptboard. 2018. Accessed at https://conceptboard.com on 14 June 2018.
26.
Lourida I, Abbott RA, Rogers M, Lang IA, Stein K, Kent B, et al. Dissemination and implementation research in dementia care: a systematic scoping review and evidence map. BMC Geriatr. 2017;17:147. [PMID: 28709402]  doi: 10.1186/s12877-017-0528-y
27.
Knowledge Translation Program. 2016. Accessed at https://knowledgetranslation.net on 10 January 2018.
28.
Penelope. 2017. Accessed at www.penelope.ai on 28 February 2018.
29.
World Health Organization. Alliance for Health Policy and Systems Research. 2018. Accessed at www.who.int/alliance-hpsr/en on 27 February 2018.
30.
Global Evidence Synthesis Initiative. 2016. Accessed at www.gesiinitiative.com on 27 February 2018.
31.
San A, Hiremagalur B, Muircroft W, Grealish L. Screening of cognitive impairment in the dialysis population: a scoping review. Dement Geriatr Cogn Disord. 2017;44:182-95. [PMID: 28869959]  doi: 10.1159/000479679
32.
Galloway T, Blackett H, Chatwood S, Jeppesen C, Kandola K, Linton J, et al. Obesity studies in the circumpolar Inuit: a scoping review. Int J Circumpolar Health. 2012;71:18698. [PMID: 22765938]  doi: 10.3402/ijch.v71i0.18698
33.
Beller EM, Glasziou PP, Altman DG, Hopewell S, Bastian H, Chalmers I, et al; PRISMA for Abstracts Group. PRISMA for Abstracts: reporting systematic reviews in journal and conference abstracts. PLoS Med. 2013;10:e1001419. [PMID: 23585737]  doi: 10.1371/journal.pmed.1001419
34.
Hopewell S, Clarke M, Moher D, Wager E, Middleton P, Altman DG, et al; CONSORT Group. CONSORT for reporting randomized controlled trials in journal and conference abstracts: explanation and elaboration. PLoS Med. 2008;5:e20. [PMID: 18215107] doi: 10.1371/journal.pmed.0050020
35.
Haynes RB, Mulrow CD, Huth EJ, Altman DG, Gardner MJ. More informative abstracts revisited. Ann Intern Med. 1990;113:69-76. [PMID: 2190518]
36.
Piškur B, Beurskens AJ, Jongmans MJ, Ketelaar M, Norton M, Frings CA, et al. Parents' actions, challenges, and needs while enabling participation of children with a physical disability: a scoping review. BMC Pediatr. 2012;12:177. [PMID: 23137074] doi: 10.1186/1471-2431-12-177
37.
Richardson WS, Wilson MC, Nishikawa J, Hayward RS. The well-built clinical question: a key to evidence-based decisions [Editorial]. ACP J Club. 1995;123:A12-3. [PMID: 7582737]
38.
Booth A. Clear and present questions: formulating questions for evidence based practice. Library Hi Tech. 2006;24:355-68.
39.
Joanna Briggs Institute. The Joanna Briggs Institute Reviewers' Manual 2015: Methodology for JBI Scoping Reviews. Adelaide, Australia: Joanna Briggs Institute; 2015. Accessed at http://joannabriggs.org/assets/docs/sumari/Reviewers-Manual_Methodology-for-JBI-Scoping-Reviews_2015_v2.pdf on 3 August 2018.
40.
Tricco AC, Zarin W, Lillie E, Pham B, Straus SE. Utility of social media and crowd-sourced data for pharmacovigilance: a scoping review protocol. BMJ Open. 2017;7:e013474. [PMID: 28104709] doi: 10.1136/bmjopen-2016-013474
41.
Open Science Framework. 2011. Accessed at https://osf.io on 10 January 2018.
42.
BioMed Central. Systematic Reviews. 2018. Accessed at https://systematicreviewsjournal.biomedcentral.com on 10 January 2018.
43.
JBI Database of Systematic Reviews and Implementation Reports. 2018. Accessed at http://journals.lww.com/jbisrir/pages/default.aspx on 10 January 2018.
44.
BMJ Open. Accessed at http://bmjopen.bmj.com on 1 March 2018.
45.
Sav A, Salehi A, Mair FS, McMillan SS. Measuring the burden of treatment for chronic disease: implications of a scoping review of the literature. BMC Med Res Methodol. 2017;17:140. [PMID: 28899342] doi: 10.1186/s12874-017-0411-8
46.
Cardoso R, Zarin W, Nincic V, Barber SL, Gulmezoglu AM, Wilson C, et al. Evaluative reports on medical malpractice policies in obstetrics: a rapid scoping review. Syst Rev. 2017;6:181. [PMID: 28874176] doi: 10.1186/s13643-017-0569-5
47.
McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS peer review of electronic search strategies: 2015 guideline statement. J Clin Epidemiol. 2016;75:40-6. [PMID: 27005575] doi: 10.1016/j.jclinepi.2016.01.021
48.
Canadian Agency for Drugs and Technologies in Health. Grey Matters: a practical tool for searching health-related grey literature. 2015. Accessed at https://cadth.ca/resources/finding-evidence/grey-matters on 10 January 2018.
49.
Duffett M, Choong K, Hartling L, Menon K, Thabane L, Cook DJ. Randomized controlled trials in pediatric critical care: a scoping review. Crit Care. 2013;17:R256. [PMID: 24168782] doi: 10.1186/cc13083
50.
Lenzen SA, Daniëls R, van Bokhoven MA, van der Weijden T, Beurskens A. Disentangling self-management goal setting and action planning: a scoping review. PLoS One. 2017;12:e0188822. [PMID: 29176800] doi: 10.1371/journal.pone.0188822
51.
Leung M, Perumal N, Mesfin E, Krishna A, Yang S, Johnson W, et al. Metrics of early childhood growth in recent epidemiological research: a scoping review. PLoS One. 2018;13:e0194565. [PMID: 29558499] doi: 10.1371/journal.pone.0194565
52.
Tricco AC, Zarin W, Rios P, Nincic V, Khan PA, Ghassemi M, et al. Engaging policy-makers, health system managers, and policy analysts in the knowledge synthesis process: a scoping review. Implement Sci. 2018;13:31. [PMID: 29433543] doi: 10.1186/s13012-018-0717-x
53.
Zarin W, Veroniki AA, Nincic V, Vafaei A, Reynen E, Motiwala SS, et al. Characteristics and knowledge synthesis approach for 456 network meta-analyses: a scoping review. BMC Med. 2017;15:3. [PMID: 28052774] doi: 10.1186/s12916-016-0764-6
54.
Hutchinson J, Prady SL, Smith MA, White PC, Graham HM. A scoping review of observational studies examining relationships between environmental behaviors and health behaviors. Int J Environ Res Public Health. 2015;12:4833-58. [PMID: 25950651] doi: 10.3390/ijerph120504833
55.
Hosking J, Campbell-Lendrum D. How well does climate change and human health research match the demands of policymakers? A scoping review. Environ Health Perspect. 2012;120:1076-82. [PMID: 22504669] doi: 10.1289/ehp.1104093
56.
Strand M, Gammon D, Ruland CM. Transitions from biomedical to recovery-oriented practices in mental health: a scoping review to explore the role of Internet-based interventions. BMC Health Serv Res. 2017;17:257. [PMID: 28388907] doi: 10.1186/s12913-017-2176-5
57.
Constand MK, MacDermid JC, Dal Bello-Haas V, Law M. Scoping review of patient-centered care approaches in healthcare. BMC Health Serv Res. 2014;14:271. [PMID: 24947822] doi: 10.1186/1472-6963-14-271
58.
Tricco AC, Antony J, Zarin W, Strifler L, Ghassemi M, Ivory J, et al. A scoping review of rapid review methods. BMC Med. 2015;13:224. [PMID: 26377409] doi: 10.1186/s12916-015-0465-6
59.
Hall AJ, Lang IA, Endacott R, Hall A, Goodwin VA. Physiotherapy interventions for people with dementia and a hip fracture: a scoping review of the literature. Physiotherapy. 2017;103:361-8. [PMID: 28843451] doi: 10.1016/j.physio.2017.01.001

Information & Authors

Information

Published In

Annals of Internal Medicine
Volume 169, Number 7, 2 October 2018
Pages: 467 - 473

History

Published online: 4 September 2018
Published in issue: 2 October 2018

Authors

Affiliations

Andrea C. Tricco, PhD, MSc
St. Michael's Hospital and University of Toronto, Toronto, Ontario, Canada (A.C.T., S.E.S.)
Erin Lillie, MSc
St. Michael's Hospital, Toronto, Ontario, Canada (E.L., W.Z.)
Wasifa Zarin, MPH
St. Michael's Hospital, Toronto, Ontario, Canada (E.L., W.Z.)
Kelly K. O'Brien, PhD, BScPT
University of Toronto, Toronto, Ontario, Canada (K.K.O., H.C.)
Heather Colquhoun, PhD
University of Toronto, Toronto, Ontario, Canada (K.K.O., H.C.)
Danielle Levac, PhD, MSc, BScPT
Northeastern University, Boston, Massachusetts (D.L.)
David Moher, PhD, MSc
Ottawa Hospital Research Institute, Ottawa, Ontario, Canada (D.M., C.G.)
Micah D.J. Peters, PhD, MA(Q)
University of South Australia and University of Adelaide, Adelaide, South Australia, Australia (M.D.P.)
Tanya Horsley, PhD
Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada (T.H.)
Laura Weeks, PhD
Canadian Agency for Drugs and Technologies in Health, Ottawa, Ontario, Canada (L.W., T.C.)
Susanne Hempel, PhD
RAND Corporation, Santa Monica, California (S.H.)
Elie A. Akl, MD, PhD, MPH
American University of Beirut, Beirut, Lebanon (E.A.A.)
Christine Chang, MD, MPH
Agency for Healthcare Research and Quality, Rockville, Maryland (C.C.)
Jessie McGowan, PhD
University of Ottawa, Ottawa, Ontario, Canada (J.M.)
Lesley Stewart, PhD, MSc
University of York, York, United Kingdom (L.S.)
Lisa Hartling, PhD, MSc, BScPT
University of Alberta, Edmonton, Alberta, Canada (L.H.)
Adrian Aldcroft, BA(Hons), BEd
BMJ Open, London, United Kingdom (A.A.)
Michael G. Wilson, PhD
McMaster University, Hamilton, Ontario, Canada (M.G.W.)
Chantelle Garritty, MSc
Ottawa Hospital Research Institute, Ottawa, Ontario, Canada (D.M., C.G.)
Simon Lewin, PhD
Norwegian Institute of Public Health, Oslo, Norway, and South African Medical Research Council, Cape Town, South Africa (S.L.)
Christina M. Godfrey, PhD, RN
Queen's University, Kingston, Ontario, Canada (C.M.G.)
Marilyn T. Macdonald, PhD, MSN
Dalhousie University, Halifax, Nova Scotia, Canada (M.T.M.)
Etienne V. Langlois, PhD
World Health Organization, Geneva, Switzerland (E.V.L., Ö.T.)
Karla Soares-Weiser, MD, PhD
Cochrane, London, United Kingdom (K.S.)
Jo Moriarty, MA
King's College London, London, United Kingdom (J.M.)
Tammy Clifford, PhD, MSc
Canadian Agency for Drugs and Technologies in Health, Ottawa, Ontario, Canada (L.W., T.C.)
Özge Tunçalp, MD, PhD, MPH
World Health Organization, Geneva, Switzerland (E.V.L., Ö.T.)
Sharon E. Straus, MD, MSc
St. Michael's Hospital and University of Toronto, Toronto, Ontario, Canada (A.C.T., S.E.S.)
Note: Dr. Tricco affirms that the manuscript is an honest, accurate, and transparent account of the study being reported; that no important aspects of the study have been omitted; and that any discrepancies from the study as planned (and, if relevant, registered) have been explained.
Acknowledgment: The authors thank Susan Le for supporting the coordination of the project and formatting the manuscript; Anna Lambrinos and Dr. Mai Pham for participating in round 1 of scoring and attending the in-person meeting; Dr. Lisa O'Malley for participating in round 1 of scoring and the e-Delphi round 2 of scoring; Dr. Peter Griffiths and Dr. Charles Shey Wiysonge for participating in round 1 of scoring and providing feedback on Conceptboard; Dr. Jill Manthorpe and Dr. Mary Ann McColl for participating in round 1 of scoring; Assem M. Khamis for assisting with the identification of examples for the Explanation and Elaboration document; and Melissa Chen, Jessica Comilang, and Meghan Storey for providing administrative support for the in-person meeting.
Financial Support: By Knowledge Synthesis grant KRS 144046 from the Canadian Institutes of Health Research. Dr. Tricco is funded by a Tier 2 Canada Research Chair in Knowledge Synthesis. Dr. O'Brien was supported by a Canadian Institutes of Health Research New Investigator Award. Dr. Straus is funded by a Tier 1 Canada Research Chair in Knowledge Translation.
Disclosures: Mr. Aldcroft reports that he is the editor of BMJ Open. Dr. Lewin reports that he is the joint coordinating editor for the Cochrane Effective Practice and Organisation of Care (EPOC) Group. Dr. Straus reports that she is an associate editor of ACP Journal Club. Authors not named here have disclosed no conflicts of interest. Disclosures can also be viewed at www.acponline.org/authors/icmje/ConflictOfInterestForms.do?msNum=M18-0850.
Reproducible Research Statement: Study protocol: Available at the EQUATOR and PRISMA Web sites (www.equator-network.org/library/reporting-guidelines-under-development/#55 and www.prisma-statement.org/Extensions/InDevelopment.aspx). Statistical code: Not applicable. Data set: Available from Dr. Tricco (e-mail, [email protected]).
Corresponding Author: Andrea C. Tricco, PhD, MSc, Knowledge Translation Program, Li Ka Shing Knowledge Institute, St. Michael's Hospital, 209 Victoria Street, East Building, Toronto, Ontario M5B 1W8, Canada; e-mail, [email protected].
Current Author Addresses: Drs. Tricco and Straus, Ms. Lillie, and Ms. Zarin: Knowledge Translation Program, Li Ka Shing Knowledge Institute, St. Michael's Hospital, 209 Victoria Street, East Building, Toronto, Ontario M5B 1T8, Canada.
Dr. O'Brien: Department of Physical Therapy, University of Toronto, 160-500 University Avenue, Toronto, Ontario M5G 1V7, Canada.
Dr. Colquhoun: Department of Occupational Science and Occupational Therapy, University of Toronto, 160-500 University Avenue, Toronto, Ontario M5G 1V7, Canada.
Dr. Levac: Department of Physical Therapy, Movement and Rehabilitation Science, Bouvé College of Health Sciences, Northeastern University, 360 Huntington Avenue, Boston, MA 02115.
Dr. Moher and Ms. Garritty: Centre for Journalology, Ottawa Hospital Research Institute, The Ottawa Hospital, 501 Smyth Road, PO Box 201B, Ottawa, Ontario K1H 8L6, Canada.
Dr. Peters: Rosemary Bryant AO Research Centre, Sansom Institute for Health Research, Division of Health Sciences, University of South Australia, Adelaide, South Australia 5000, Australia.
Dr. Horsley: The Royal College of Physicians and Surgeons, 774 Echo Drive, Ottawa, Ontario K1S 5N8, Canada.
Drs. Weeks and Clifford: CADTH (Canadian Agency for Drugs and Technologies in Health), 865 Carling Avenue, Suite 600, Ottawa, Ontario K1S 5S8, Canada.
Dr. Hempel: RAND Corporation, 1776 Main Street, Santa Monica, CA 90401.
Dr. Akl: Department of Internal Medicine, Faculty of Medicine, Gefinor Center, Block B, 4th Floor, American University of Beirut, Riad El-Solh, Beirut, Lebanon.
Dr. Chang: Agency for Healthcare Research and Quality, 5600 Fishers Lane, Rockville, MD 20857.
Dr. McGowan: Department of Medicine, University of Ottawa, Roger Guindon Hall, 451 Smyth Road, Ottawa, Ontario K1H 8M5, Canada.
Dr. Stewart: Centre for Reviews and Dissemination, University of York, Heslington, York YO10 5DD, United Kingdom.
Dr. Hartling: Department of Pediatrics, Faculty of Medicine and Dentistry, University of Alberta, 11405-87 Avenue, Edmonton, Alberta T6G 1C9, Canada.
Mr. Aldcroft: BMJ Open Editorial Office, BMA House, Tavistock Square, London WC1H 9JR, United Kingdom.
Dr. Wilson: Department of Health Research Methods, Evidence, and Impact, McMaster University, 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada.
Dr. Lewin: Norwegian Institute of Public Health, PO Box 4404 Nydalen, N-0403 Oslo, Norway.
Dr. Godfrey: Queen's Collaboration for Health Care Quality: A JBI Centre of Excellence, Queen's University School of Nursing, 992 University Avenue, Barrie Street, Kingston, Ontario K7L 3N6, Canada.
Dr. Macdonald: School of Nursing, Dalhousie University, PO Box 15000, 5869 University Avenue, Halifax, Nova Scotia B3H 4R2, Canada.
Dr. Langlois: Alliance for Health Policy and Systems Research, World Health Organization, Avenue Appia 20, 1211 Geneva, Switzerland.
Dr. Soares-Weiser: Cochrane Editorial Unit, Cochrane, St. Albans House, 57-59 Haymarket, London SW1Y 4QX, United Kingdom.
Ms. Moriarty: Social Care Workforce Research Unit, King's College London, Strand, London WC2R 2LS, United Kingdom.
Dr. Tunçalp: Department of Reproductive Health and Research, World Health Organization, Avenue Appia 20, 1211 Geneva, Switzerland.
Author Contributions: Conception and design: A.C. Tricco, W. Zarin, K.K. O'Brien, D. Moher, M.D.J. Peters, L. Stewart, S.E. Straus.
Analysis and interpretation of the data: A.C. Tricco, E. Lillie, W. Zarin, K.K. O'Brien, H. Colquhoun, D. Moher, M.D.J. Peters, T. Horsley, S. Hempel, E.A. Akl, C. Chang, J. McGowan, L. Stewart, L. Hartling, A. Aldcroft, M.G. Wilson, C. Garritty, S. Lewin, C.M. Godfrey, M.T. Macdonald, K. Soares-Weiser, Ö. Tunçalp, S.E. Straus.
Drafting of the article: A.C. Tricco, E. Lillie, W. Zarin, K.K. O'Brien, D. Levac, D. Moher, M.D.J. Peters, T. Horsley, C. Chang, J. McGowan, A. Aldcroft, C. Garritty.
Critical revision of the article for important intellectual content: A.C. Tricco, E. Lillie, W. Zarin, K.K. O'Brien, D. Moher, M.D.J. Peters, T. Horsley, L. Weeks, S. Hempel, E.A. Akl, J. McGowan, L. Stewart, L. Hartling, A. Aldcroft, M.G. Wilson, C. Garritty, S. Lewin, C.M. Godfrey, E.V. Langlois, T. Clifford, Ö. Tunçalp, S.E. Straus.
Final approval of the article: A.C. Tricco, E. Lillie, W. Zarin, K.K. O'Brien, H. Colquhoun, D. Levac, D. Moher, M.D.J. Peters, T. Horsley, L. Weeks, S. Hempel, E.A. Akl, C. Chang, J. McGowan, L. Stewart, L. Hartling, A. Aldcroft, M.G. Wilson, C. Garritty, S. Lewin, C.M. Godfrey, M.T. Macdonald, E.V. Langlois, K. Soares-Weiser, J. Moriarty, T. Clifford, Ö. Tunçalp, S.E. Straus.
Obtaining of funding: A.C. Tricco, K.K. O'Brien, S.E. Straus.
Administrative, technical, or logistic support: E. Lillie, W. Zarin, S.E. Straus.
Collection and assembly of data: E. Lillie, W. Zarin, K.K. O'Brien, H. Colquhoun, T. Horsley, L. Weeks, S. Hempel, E.A. Akl, L. Hartling, A. Aldcroft, C. Garritty, M.T. Macdonald, K. Soares-Weiser, T. Clifford, Ö. Tunçalp.
This article was published at Annals.org on 4 September 2018.

Citation: Andrea C. Tricco, Erin Lillie, Wasifa Zarin, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann Intern Med. 2018;169:467-473. [Epub 4 September 2018]. doi:10.7326/M18-0850