
Simulation curricular content in postgraduate emergency medicine: A multicentre Delphi study

Published online by Cambridge University Press: 14 May 2019

Nicole Kester-Greene*
Affiliation:
Division of Emergency Medicine, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Ontario, Canada
Andrew K Hall
Affiliation:
Department of Emergency Medicine, Kingston Health Sciences Center, Queen's University, Kingston, Ontario, Canada
Catharine M Walsh
Affiliation:
Division of Gastroenterology, Hepatology and Nutrition, the Research and Learning Institutes, Hospital for Sick Children, Department of Paediatrics, University of Toronto Wilson Centre, University of Toronto, Toronto, Ontario, Canada
Correspondence to: Dr. Nicole Kester-Greene, Sunnybrook Health Sciences Centre, Department of Emergency Services, 2475 Bayview Ave, Toronto, ON M4N 3M5; Email: nicole.kestergreene@sunnybrook.ca

Abstract

Objectives

There is increasing evidence to support integration of simulation into medical training; however, no national emergency medicine (EM) simulation curriculum exists. Using Delphi methodology, we aimed to identify and establish content validity for adult EM curricular content best suited for simulation-based training, to inform national postgraduate EM training.

Methods

A national panel of experts in EM simulation iteratively rated potential curricular topics, on a 4-point scale, to determine those best suited for simulation-based training. After each round, responses were analyzed. Topics scoring <2/4 were removed, and the remaining topics were resent to the panel for further ratings until consensus was achieved, defined as Cronbach α ≥ 0.95. At the conclusion of the Delphi process, topics rated ≥3.5/4 were considered "core" curricular topics, while those rated 3.0–3.5 were considered "extended" curricular topics.

Results

Forty-five experts from 13 Canadian centres participated. Two hundred eighty potential curricular topics, in 29 domains, were generated from a systematic literature review, relevant educational documents, and Delphi panellists. Three rounds of surveys were completed before consensus was achieved, with response rates ranging from 93% to 100%. Twenty-eight topics, in eight domains, reached consensus as "core" curricular topics. Thirty-five additional topics, in 14 domains, reached consensus as "extended" curricular topics.

Conclusions

Delphi methodology allowed for achievement of expert consensus and content validation of EM curricular content best suited for simulation-based training. These results provide a foundation for improved integration of simulation into postgraduate EM training and can be used to inform a national simulation curriculum to supplement clinical training and optimize learning.


Original Research
Copyright © Canadian Association of Emergency Physicians 2019

CLINICIAN'S CAPSULE

What is known about the topic? Simulation is increasingly being integrated into postgraduate emergency medicine (EM) training, but no national curriculum or consensus on priority topics exists.

What did this study ask? What adult EM curricular content is best suited for simulation-based training, to inform a national postgraduate EM simulation curriculum?

What did this study find? A national Delphi panel identified 28 core curricular topics, categorized into eight domains, which are best suited for simulation-based training.

Why does this study matter to clinicians? Standardized curricular content provides the foundation for improved integration of simulation into postgraduate programs to supplement clinical training and optimize learning.

INTRODUCTION

The medical education community strongly supports the incorporation of simulation-based education into postgraduate training programs, as it allows trainees to practice skills in the safety of a controlled environment without risk of patient harm and enables receipt of immediate, performance-enhancing feedback.1,2 Deliberate practice, as defined by Ericsson,3,4 involves focused, repetitive performance of a skill, coupled with the receipt of informative feedback and a primary goal of improving performance toward a mastery standard. Simulation-based education with deliberate practice has been shown to be superior to traditional clinical education for acquiring specific procedural and resuscitation skills.5 Furthermore, a growing body of literature demonstrates that simulation-based education improves patient outcomes.6,7 In Canada, there has been a national call for simulation-based education to be integrated more thoughtfully into existing postgraduate curricula.8

Emergency medicine (EM) training programs have embraced simulation with enthusiasm, and the extant literature supports structured integration of simulation-based education into EM training.9–13 A recent survey demonstrated that all Canadian EM postgraduate training programs use simulation-based education; however, the content, structure, quantity, and timing of simulation-based activities vary widely because of differences in funding, resources, barriers, and clinical contexts.14 Further clarity is needed on how best to harness simulation to educate EM residents. Currently, there is no standardized national EM simulation curriculum in Canada or the United States. Guidance on, and standardization of, simulation curricular content would provide a framework on which teachers and educators in postgraduate EM programs could build a robust simulation curriculum.

In the transition to competency-based medical education (CBME) in the form of Competence by Design, as directed by the Royal College of Physicians and Surgeons of Canada (RCPSC),15 simulation will likely play an increasingly important role.16 EM has a broad and inclusive set of objectives of training17 that includes many critical but rare clinical situations. If systematic training and assessment in a competency-based model require exposure to these rare events, simulation-based education will need to be harnessed as a tool for both training and assessment. Cook et al.1 eloquently describe the "value proposition" of simulation, outlining the importance of balancing outcomes and cost. Because simulation demands substantial resources, and given existing resource limitations in postgraduate training, it is important to prioritize key experiences. Defining the EM curricular content best suited for simulation-based training would provide the foundation for improved integration of this educational technique into postgraduate programs. Using Delphi methodology, we sought to determine the content of a simulation curriculum in adult EM that could be used to supplement clinical training and assessment and to optimize learning.

METHODS

Study design

The present study used the Delphi technique to achieve consensus among a panel of experts on the adult EM curricular content best suited for simulation-based training. For this study, simulation was defined broadly as an instructional methodology that engages learners in lifelike experiences of varying fidelity, designed to mimic real clinical encounters.5 The Delphi method is a research technique that draws on the "pooled intelligence" of an expert panel to achieve consensus on a specific topic through iterative rounds of questionnaires with controlled feedback.18–20 Through the provision of expert professional judgment, the Delphi methodology can be used to generate content-related validity evidence for curricular content.21 This study was approved by the Sunnybrook Health Sciences Centre institutional review board.

Delphi panel recruitment and sample

A Delphi group of Canadian experts in EM simulation was established. Purposive sampling was used to ensure that appropriate experts were invited to participate. First, we emailed the current program directors of all 14 RCPSC-accredited EM residency training programs in Canada to identify one or more experts in simulation-related education at their site. Second, some panellists were identified as national experts, as evidenced by their roles as opinion leaders in simulation and/or EM education within organizations such as the RCPSC and the Canadian Association of Emergency Physicians, and/or their positions on national EM-related simulation and/or education committees. To increase the reliability of the Delphi group's composite judgment, our goal was to include a sample of at least 30 experts with broad expertise reflective of current knowledge and perceptions related to EM training and simulation.21,22 Sixty-six prospective panellists were sent email invitations explaining the study purpose and methodology. Membership of the Delphi panel was kept confidential.

Item generation and elimination of data redundancy

Potential curricular topics were generated from: 1) a systematic literature review; 2) a review of relevant educational documents (e.g., from the RCPSC, the Accreditation Council for Graduate Medical Education, and the American Board of Emergency Medicine); and 3) input from Delphi panel members. Published literature on EM curricular content was systematically reviewed. Health professions databases, including MEDLINE, EMBASE, and Web of Science, were searched from inception until May 2015 (see Appendix 1 for the search strategy). Articles pertaining to EM training and/or simulation were reviewed independently by two authors (NKG and CMW), and potential adult EM curricular topics were extracted and integrated into the initial item pool. Pediatric and neonatal curricular topics were not included, as they have been examined previously.24 The list of potential topics was then reduced by combining redundant items. This process was completed by three authors (NKG, AH, and CMW) with clinical and pedagogical expertise in EM and/or simulation. Potential topics were reviewed independently for redundancy, and the research team met on two occasions to compare topics and establish consensus. Additionally, during round one of the Delphi process, panellists were asked to comment on the list of potential curricular topics and to identify any additional topics they felt should be included in a simulation-based EM training curriculum.

Item reduction

Iterative rounds of Delphi surveys were used to reduce the item pool further. To maximize response rates, the surveys were designed and distributed in line with the principles of Dillman's tailored design method: respondent-friendly surveys using clear, easy-to-understand language; personalized correspondence; and up to four email contacts per Delphi round.23 During each round, panellists were provided with a link to the online survey.

During the first Delphi round, participants individually rated the suitability of simulation as an educational tool for teaching each potential curricular topic. Topics were rated on a 4-point ordinal scale, from 1 ("best taught using methods other than simulation") to 4 ("definitely should be taught using simulation"), used in a previous study to determine pediatric EM simulation curricular content.24 Participants were given the opportunity to comment on the wording of potential topics, provide reasons for their choices, and/or suggest additional topic areas.

During subsequent rounds, panellists re-rated the remaining curricular topics using the same 4-point ordinal scale, informed by the group mean rating and standard deviation (SD) for each topic from the preceding round. Once again, panellists were given the opportunity to provide open-ended comments. The Delphi process continued until consensus among the expert panel was achieved, using the criteria described below.
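To make the controlled-feedback step concrete, the following sketch (illustrative only; the study's analyses were performed in SPSS, and the topic names and ratings here are hypothetical) computes the per-topic group mean and SD that would be fed back to panellists between rounds:

```python
import numpy as np

# Hypothetical round data: rows are curricular topics, columns are
# panellists, rated on the 4-point ordinal scale (np.nan = missing rating).
topics = ["Cardiac arrest", "Anaphylaxis", "Chest tube insertion"]
ratings = np.array([
    [4, 4, 3],
    [3, 4, np.nan],
    [4, 3, 4],
], dtype=float)

# Group mean rating and SD per topic, ignoring missing ratings; these
# summary statistics accompany each retained topic when it is re-rated.
means = np.nanmean(ratings, axis=1)
sds = np.nanstd(ratings, axis=1, ddof=1)

for topic, m, s in zip(topics, means, sds):
    print(f"{topic}: mean {m:.2f}, SD {s:.2f}")
```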

Data analysis and determining consensus

All statistical analyses were performed using SPSS, version 20.0 (IBM Corp). After each Delphi round, the mean rating ± SD and the proportion of panellists rating an item within each category (1 to 4) were calculated. Panellists' ratings were given equal weight. Three authors (NKG, AH, and CMW), blinded to the sources of the data, reviewed panellists' ratings and qualitative comments. Using a priori criteria, a curricular topic was eliminated from subsequent Delphi rounds if its mean rating was <2. Topics were combined and/or their wording modified based on comments from the expert panel, and the updated survey was redistributed for rating. The Delphi process continued in an iterative fashion until consensus was achieved. Consensus, a condition of homogeneity or consistency of opinion among the expert panellists, was quantified using Cronbach α.19,25,26 A Cronbach α of ≥0.95 was predefined as the minimal threshold for consensus.25 It was predetermined that, once consensus was achieved, topics rated ≥3.5 would be considered "core" curricular topics, while those rated 3.0–3.5 would be considered "extended" curricular topics.
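For readers who wish to reproduce the consensus check, Cronbach α for a topics-by-panellists rating matrix can be computed directly. The Python sketch below is a minimal illustration on hypothetical data: it assumes the common convention of treating panellists as the "items" whose consistency is assessed (the exact computation is not specified in the paper) and applies the a priori elimination and classification thresholds described above.

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for a (topics x panellists) matrix, treating
    panellists as the "items" whose internal consistency is assessed."""
    k = ratings.shape[1]                         # number of panellists
    rater_vars = ratings.var(axis=0, ddof=1)     # variance of each panellist's ratings
    total_var = ratings.sum(axis=1).var(ddof=1)  # variance of per-topic rating sums
    return (k / (k - 1)) * (1 - rater_vars.sum() / total_var)

def classify(mean_rating: float) -> str:
    """A priori thresholds: >=3.5 core; 3.0-3.5 extended; <2 eliminated
    between rounds; anything else excluded at the final round."""
    if mean_rating >= 3.5:
        return "core"
    if mean_rating >= 3.0:
        return "extended"
    if mean_rating < 2.0:
        return "eliminated"
    return "excluded at final round"

# Hypothetical complete-case ratings for three topics and four panellists.
ratings = np.array([
    [4, 4, 3, 4],
    [3, 3, 4, 3],
    [2, 1, 2, 2],
], dtype=float)

alpha = cronbach_alpha(ratings)
print(f"Cronbach alpha = {alpha:.2f} (consensus when >= 0.95)")
for mean in ratings.mean(axis=1):
    print(f"mean rating {mean:.2f} -> {classify(mean)}")
```

Under this convention, α approaches 1 as panellists' rating patterns converge, which is why it can serve as a stopping rule for the iterative rounds.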

RESULTS

Forty-five experts from 13 of the 14 academic RCPSC-accredited EM training sites across Canada agreed to participate. Delphi panel member demographics are outlined in Table 1. Of the participating experts, 45 (100%) completed the first Delphi survey, 42 (93.3%) completed the second, and 43 (95.6%) completed the third. Across all three iterations of the Delphi process, 0.30% of items had missing ratings.

Table 1. Profile of the panel of experts (N = 45) participating in the Delphi consensus process

*Panellists were instructed to select all non-clinical roles that applied.

The systematic literature review identified 2,279 potential citations (981 MEDLINE, 911 EMBASE, and 387 Web of Science), of which 1,533 were duplicates. The titles and abstracts of the remaining 746 citations were reviewed, and 199 relevant articles were identified. From these articles, together with relevant educational documents, 2,515 potential adult EM curricular topic areas were generated. No additional topics were suggested by the Delphi panel. By combining like items, the research team reduced these to 280 non-redundant curricular topics in 29 curricular content domains (Appendix 2).

The flow of topics through each round of the Delphi process is outlined in Figure 1. Items with a mean rating of ≥2 were retained after each round, and the survey was updated and distributed for subsequent voting. Consensus was achieved after three rounds (Cronbach α = 0.97). Thirteen curricular content domains (dentistry; dermatology; ear, nose, and throat; endocrine; hematology and oncology; musculoskeletal; nephrology; ophthalmology; psychosocial; transplant medicine; minor procedures; emergency ultrasound; and urologic) were eliminated completely during the Delphi process. After the third round, 108 additional topics that did not reach the inclusion threshold (mean rating <3) were eliminated. In total, 63 curricular topics in 16 domains reached consensus for inclusion as adult EM curricular content best suited for simulation-based training: 28 topics in eight domains were considered "core" curricular topics (Table 2), and 35 topics in 14 domains were considered "extended" curricular topics (Table 3 and Appendix 2).

Figure 1. Overview of the Delphi process to identify curricular topics for a national simulation curriculum.

Table 2. Core curricular topics (N = 28) for simulation-based training

Table 3. Extended curricular topics (N = 35) for simulation-based training

DISCUSSION

With the implementation of Competence by Design, Canadian postgraduate EM training programs are undergoing significant transformation. Simultaneously, there has been rapid but variable uptake of simulation-based education,14 and many programs are grappling with how best to integrate simulation into the required training experiences and assessment processes of their revised competency-based programs.17,27 This Delphi process, which included 45 experts representing almost all RCPSC-accredited EM training programs in Canada, sought to determine the priority content for a simulation curriculum. Twenty-eight "core" topics and 35 "extended" topics were identified as curricular content best suited for simulation-based training. Given that resources are limited and that there is an ever-increasing requirement for efficiency within postgraduate training programs,28 this study is intended to guide EM teachers and educators on which topics should be prioritized for simulation-based education. These results can be used to inform a national simulation curriculum to supplement clinical training and optimize learning.

Curricular topics identified as "core," or considered of high importance for simulation-based education, were primarily focused within the areas of crisis resource management, resuscitation, trauma, and procedures. The non-medical expert competencies of leadership, team skills, and effective communication and collaboration with an interprofessional team also achieved consensus as "core" topic areas. Our results are consistent with a recent study by Bank et al.,24 who sought to identify content for a national simulation-based curriculum for pediatric EM training. The 48 "key" topics achieving consensus for pediatric EM were similarly grouped into the categories of resuscitation, trauma, procedures, and crisis resource management skills, including leadership, teamwork, and communication. The similarity of these findings reinforces that the "core" topics we identified are paramount to any EM simulation curriculum. It is important to note, however, that the resulting list of "core" topics is fairly broad in scope and may be subject to varied interpretation by simulation educators. For example, the topic "altered level of consciousness and coma" may relate to a number of different clinical presentations, including septic shock, head trauma, or hepatic encephalopathy. Interpreting these results may therefore present a challenge to educators charged with developing a simulation curriculum relevant to the specific needs of their own postgraduate program.

Our results are also consistent with existing literature outlining potential content for simulation training and assessment in adult EM, which has emphasized resuscitation-focused content while also incorporating key non-medical expert roles.11,12,29 In a recent national survey, EM residents reported experiencing simulation-based education in these content areas and, in fact, perceived the amount of time dedicated to rarely encountered resuscitation scenarios, such as obstetrical emergencies and pediatric resuscitations, to be inadequate.14 Motivating factors for determining content best suited to simulation-based education revolve around the desire to train for high-stakes situations that are rarely encountered and the need to practice more frequently encountered clinical scenarios that expose patients to risk when performed by novices.30 Patient safety has been described as the principal motivator for the adoption of simulation-based education.1 In an era when patient safety and quality of care have become a significant focus of medical practice, simulation offers an ideal training environment, as individuals can engage in sustained, deliberate practice and learn from errors without risk of patient harm.31

The panel also identified 35 "extended" curricular topics that are considered important in EM training and merit consideration for inclusion in a national curriculum based on local resources and perceived needs. These topics tended to reflect more commonly encountered high-acuity clinical scenarios, such as anaphylaxis and gastrointestinal bleeding. The inclusion of "extended" topics enables programs to customize simulation-based education to their educational needs and local context, which are shaped by variables such as program size, clinical case mix, cultural and socioeconomic demographics of presenting patients, available simulation resources, and funding. Interestingly, despite a previous study in which trainees reported inadequate exposure to obstetrical emergencies,14 topics in this domain reached consensus as "extended" rather than "core" topics. We suspect that the Delphi panellists considered the specialized manikins required for obstetrical simulation training and were, therefore, hesitant to list these topics as "core."

The exclusion of some topics was surprising, including acute coronary syndrome, environmental injuries (e.g., burns), biologic agents of warfare, maxillofacial trauma, cervical and thoracolumbar spine injuries, many specific toxic exposures, and the approach to the violent and agitated patient. Many of these topics reflect infrequently experienced high-acuity scenarios and are, therefore, considered well suited for simulation-based education.30 Additionally, there is literature examining simulation-based training for some of these topics, such as de-escalation techniques and disaster management.32–34 Their exclusion here thus contrasts with prior simulation studies. Possible explanations for this difference include the challenge of maintaining physical fidelity in the simulation environment and a lack of local expertise and/or resources.

The Delphi methodology is considered one of the most suitable methods for achieving informed group consensus on a complex topic. Through rigorous item generation and the provision of expert professional judgment, it provides evidence of content validity for the proposed curricular topics and helps to enhance the generalizability of the results.35 The Delphi method, however, is not without limitations. First, there are no formal guidelines regarding panel selection or size. To minimize bias, we defined selection criteria a priori to ensure that the panellists had broad expertise related to EM training and simulation. We were limited, however, in that all panellists were Canadian and one RCPSC-accredited EM training program was not represented. Canadian College of Family Physicians (CCFP) EM training programs were also not specifically represented, although there is significant overlap between RCPSC and CCFP programs at many participating sites, with shared training experiences and resources. As such, the study findings may not be generalizable to these programs. Eight panellists, all of whom were program directors, did not self-identify as simulation educators. While program directors provide a unique perspective on the overarching EM curriculum, their knowledge of simulation-based education may be limited. Additionally, 69% of the Delphi panellists were male. Although this reflects the current demographics of Canadian EM physicians (69.4% male in 2018),36 it may introduce some bias. We aimed to include at least 30 experts, as the reliability of a panel's aggregate judgment improves minimally beyond 30 participants.37 Additionally, our response rate was high across rounds. To increase methodological rigour, we followed recently proposed standardized methodological criteria for Delphi studies, including a clear objective and a priori determination of criteria for: 1) selecting Delphi panellists; 2) dropping and/or combining items; and 3) consensus.26 Finally, while this study attempted to define key content for simulation-based education in EM, it has not articulated how best to integrate these topics functionally into a competency-based curriculum to optimize training and assessment. Moving forward, it will be important to engage key stakeholders from across Canada to create a national simulation curriculum and an accompanying assessment plan based on the proposed topics, to ensure transferability across EM training programs. Formation of a national working group would also help to maximize buy-in and facilitate sharing of scenarios, assessment tools, and other materials.

CONCLUSIONS

A national Delphi study enabled us to define the EM curricular content best suited for simulation-based education. This provides a foundation for improved integration of simulation into postgraduate EM training programs to supplement clinical training and assessment and to optimize learning. With the implementation of CBME and acknowledgement of the substantial resources required for simulation-based education, it will be important to prioritize these key topics when developing and revising a national EM simulation curriculum.

Supplementary material

The supplementary material for this article (Appendix 1: literature search strategy; Appendix 2: curricular topics) can be found at https://doi.org/10.1017/cem.2019.348.

Acknowledgements

Study concept and design: Kester-Greene, Walsh; acquisition, analysis, or interpretation of data: Kester-Greene, Hall, and Walsh; critical revision of the manuscript for important intellectual content: Kester-Greene, Hall, and Walsh; and final manuscript approval: Kester-Greene, Hall, and Walsh.

Competing interests

None. CMW holds a Career Development Award from the Canadian Child Health Clinician Scientist Program. No funding organization had any role in the design and conduct of the study; the collection, management, analysis, and interpretation of the data; or the preparation, review, or approval of the manuscript.


REFERENCES

1. Cook DA, Andersen DK, Combes JR, Feldman DL, Sachdeva AK. The value proposition of simulation-based education. Surgery 2018;163(4):944–9.
2. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ 2014;48(4):375–85.
3. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med 2004;79(10 Suppl):S70–81.
4. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev 1993;100(3):363–406.
5. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med 2011;86(6):706–11.
6. Zendejas B, Brydges R, Wang AT, Cook DA. Patient outcomes in simulation-based medical education: a systematic review. J Gen Intern Med 2013;28(8):1078–89.
7. Brydges R, Hatala R, Zendejas B, Erwin PJ, Cook DA. Linking simulation-based educational assessments and patient-related outcomes: a systematic review and meta-analysis. Acad Med 2015;90(2):246–56.
8. Leblanc VR, Bould MD, Sharma B. Simulation in Postgraduate Medical Education. Members of the FMEC PG Consortium; 2011.
9. Ilgen JS, Sherbino J, Cook DA. Technology-enhanced simulation in emergency medicine: a systematic review and meta-analysis. Acad Emerg Med 2013;20(2):117–27.
10. McLaughlin S, Fitch MT, Goyal DG, et al. Simulation in graduate medical education 2008: a review for emergency medicine. Acad Emerg Med 2008;15(11):1117–29.
11. Dagnone JD, McGraw R, Howes D, et al. How we developed a comprehensive resuscitation-based simulation curriculum in emergency medicine. Med Teach 2016;38(1):30–5.
12. Binstadt ES, Walls RM, White BA, et al. A comprehensive medical simulation education curriculum for emergency medicine residents. Ann Emerg Med 2007;49(4):495–504.
13. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Technology-enhanced simulation to assess health professionals: a systematic review of validity evidence, research methods, and reporting quality. Acad Med 2013;88(6):872–83.
14. Russell E, Hall AK, Hagel C, et al. Simulation in Canadian postgraduate emergency medicine training: a national survey. CJEM 2018;20(1):132–41.
15. Frank J, Snell L, Sherbino J, eds. CanMEDS 2015 Physician Competency Framework; 2015.
16. Boursicot K, Etheridge L, Setna Z, et al. Performance in assessment: consensus statement and recommendations from the Ottawa conference. Med Teach 2011;33(5):370–83.
17. Royal College of Physicians and Surgeons of Canada. Emergency Medicine Competencies; 2018.
18. de Villiers MR, de Villiers PJ, Kent AP. The Delphi technique in health sciences education research. Med Teach 2005;27(7):639–43.
19. Graham B, Regehr G, Wright JG. Delphi as a method to establish consensus for diagnostic criteria. J Clin Epidemiol 2003;56(12):1150–6.
20. Murry JW, Hammons JO. Delphi: a versatile methodology for conducting qualitative research. Rev High Educ 1995;18(4):423–36.
21. Delbecq AL, Van De Ven AH, Gustafson H. Group Techniques for Program Planning. Glenview, IL: Scott, Foresman and Company; 1975.
22. Murphy MK, Black NA, Lamping DL, et al. Consensus development methods, and their use in clinical guideline development. Health Technol Assess 1998;2(3):i–iv.
23. Dillman D. Mail and Internet Surveys: The Tailored Design Method. 2nd ed. New York: John Wiley and Sons; 2007.
24. Bank I, Cheng A, McLeod P, Bhanji F. Determining content for a simulation-based curriculum in pediatric emergency medicine: results from a national Delphi process. CJEM 2015;17(6):662–9.
25. Bland JM, Altman DG. Cronbach's alpha. BMJ 1997;314(7080):572.
26. Diamond IR, Grant RC, Feldman BM, et al. Defining consensus: a systematic review recommends methodologic criteria for reporting of Delphi studies. J Clin Epidemiol 2014;67(4):401–9.
27. Royal College of Physicians and Surgeons of Canada. Emergency Medicine Required Training Experience. Ottawa: Royal College of Physicians and Surgeons of Canada; 2018.
28. Nousiainen MT, Caverzagie KJ, Ferguson PC, Frank JR; ICBME Collaborators. Implementing competency-based medical education: what changes in curricular structure and processes are needed? Med Teach 2017;39(6):594–8.
29. Hart D, Bond W, Siegelman J, et al. Simulation for assessment of milestones in emergency medicine residents. Acad Emerg Med 2018;25(2):205–20.
30. Ziv A, Wolpe PR, Small SD, Glick S. Simulation-based medical education: an ethical imperative. Acad Med 2003;78(8):783–8.
31. Rodriguez-Paz JM, Kennedy M, Salas E, et al. Beyond "see one, do one, teach one": toward a different training paradigm. Postgrad Med J 2009;85(1003):244–9.
32. Wong AH, Auerbach MA, Ruppel H, et al. Addressing dual patient and staff safety through a team-based standardized patient simulation for agitation management in the emergency department. Simul Healthc 2018;13(3):154–62.
33. Wong AH, Wing L, Weiss B, Gang M. Coordinating a team response to behavioral emergencies in the emergency department: a simulation-enhanced interprofessional curriculum. West J Emerg Med 2015;16(6):859–65.
34. Franc JM, Nichols D, Dong SL. Increasing emergency medicine residents' confidence in disaster management: use of an emergency department simulator and an expedited curriculum. Prehosp Disaster Med 2012;27(1):31–5.
35. Goodman CM. The Delphi technique: a critique. J Adv Nurs 1987;12(6):729–34.
36. Canadian Medical Association. Number and Percent Distribution of Physicians by Specialty and Sex, Canada 2018. Toronto: Canadian Medical Association; 2019.
37. Delbecq AL, Van De Ven AH, Gustafson H. Group Techniques for Program Planning. Glenview, IL: Scott, Foresman and Company; 1975.
