
Outcomes in the age of competency-based medical education: Recommendations for emergency medicine training in Canada from the 2019 symposium of academic emergency physicians

Published online by Cambridge University Press:  25 March 2020

Teresa M. Chan*
Affiliation:
Division of Emergency Medicine, Department of Medicine, McMaster University; Program for Faculty Development, Faculty of Health Sciences, McMaster University; McMaster Program for Education Research, Innovation, and Theory, Hamilton, ON
Quinten S. Paterson
Affiliation:
Department of Emergency Medicine, University of Saskatchewan
Andrew K. Hall
Affiliation:
Department of Emergency Medicine, Queen's University, Kingston, ON; Royal College of Physicians and Surgeons of Canada
Fareen Zaver
Affiliation:
Department of Emergency Medicine, University of Calgary, Calgary, AB
Robert A. Woods
Affiliation:
Department of Emergency Medicine, University of Saskatchewan, Saskatoon, SK
Stanley J. Hamstra
Affiliation:
Milestones Research and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, IL; Faculty of Education, University of Ottawa, Ottawa, ON; Department of Medical Education, Feinberg School of Medicine, Northwestern University, Chicago, IL
Alexandra Stefan
Affiliation:
Division of Emergency Medicine, Department of Medicine, University of Toronto, Toronto, ON
Daniel K. Ting
Affiliation:
Department of Emergency Medicine, University of British Columbia, Vancouver, BC
Brent Thoma
Affiliation:
Department of Emergency Medicine, University of Saskatchewan, Saskatoon, SK
*
Correspondence to: Dr. Teresa M. Chan, 237 Barton St. E., Hamilton General Hospital, McMaster Clinics Room 255, Hamilton, ON, Canada L8L 2X2; Email: teresa.chan@medportal.ca

Abstract

Objectives

The national implementation of competency-based medical education (CBME) has prompted an increased interest in identifying and tracking clinical and educational outcomes for emergency medicine training programs. For the 2019 Canadian Association of Emergency Physicians (CAEP) Academic Symposium, we developed recommendations for measuring outcomes in emergency medicine training in the context of CBME to assist educational leaders and systems designers in program evaluation.

Methods

We conducted a three-phase study to generate educational and clinical outcomes for emergency medicine (EM) education in Canada. First, we elicited expert and community perspectives on the best educational and clinical outcomes through a structured consultation process using a targeted online survey. We then qualitatively analyzed these responses to generate a list of suggested outcomes. Last, we presented these outcomes to a diverse assembly of educators, trainees, and clinicians at the CAEP Academic Symposium for feedback and endorsement through a voting process.

Conclusion

Academic Symposium attendees endorsed the measurement and linkage of CBME educational and clinical outcomes. Twenty-five outcomes (15 educational, 10 clinical) were derived from the qualitative analysis of the survey results and the most important short- and long-term outcomes (both educational and clinical) were identified. These outcomes can be used to help measure the impact of CBME on the practice of emergency medicine in Canada to ensure that it meets both trainee and patient needs.

Résumé

Objectif

La mise sur pied de la formation médicale fondée sur les compétences (FMFC) à l’échelle nationale a eu pour effet de susciter un intérêt accru pour l’établissement et le suivi de résultats cliniques et éducationnels dans les programmes de formation en médecine d'urgence. Aussi, avons-nous élaboré, en vue du symposium 2019 de la section des affaires universitaires de l'Association canadienne des médecins d'urgence (ACMU), des recommandations sur la mesure des résultats en ce qui concerne la formation en médecine d'urgence dans le contexte de la FMFC, afin de faciliter la tâche des leadeurs éducationnels et des concepteurs de systèmes dans l’évaluation des programmes.

Méthode

Nous avons réalisé une étude en trois phases afin d’établir des résultats cliniques et éducationnels relatifs à la formation en médecine d'urgence (MU) au Canada. Tout d'abord, nous avons cherché à obtenir le point de vue d'experts et de la communauté médicale sur les meilleurs résultats cliniques et éducationnels, par un processus de consultation structuré, réalisé à l'aide d'une enquête ciblée en ligne. Nous avons par la suite procédé à une analyse qualitative des réponses afin de dresser une liste de résultats suggérés. Enfin, nous avons présenté les résultats à une assistance diversifiée, composée d’éducateurs, de stagiaires et de cliniciens, à l'occasion du symposium de la section des affaires universitaires de l'ACMU, afin de recueillir ses observations et d'obtenir son assentiment par vote.

Conclusion

Les participants au symposium de la section des affaires universitaires ont approuvé le concept de la mesure des résultats cliniques et éducationnels aux fins de la FMFC et celui du lien entre les deux types de résultats. Il s'est dégagé au total 25 résultats (15 éducationnels et 10 cliniques) de l'analyse qualitative des réponses au questionnaire d'enquête, et la démarche a permis l’établissement des résultats les plus importants à court terme et à long terme (tant éducationnels que cliniques). Ces résultats peuvent aider à mesurer l'incidence de la FMFC sur la pratique en MU au Canada, et permettent de s'assurer que cette forme d'enseignement répond tant aux besoins des stagiaires qu’à ceux des patients.

Type
CAEP Academic Symposium Paper
Copyright
Copyright © Canadian Association of Emergency Physicians 2020

INTRODUCTION

Competency-based medical education (CBME) is a form of outcomes-based education for the health professions.1 In the North American postgraduate medical education context, it has come to fruition with the Accreditation Council for Graduate Medical Education Milestones Project and the Royal College of Physicians and Surgeons of Canada's (RCPSC) Competence by Design (CBD) program.

For many years, medical educators have sought to define and measure the impact of education on clinical outcomes. Outcomes measurement is particularly important in CBME because accountability is the cornerstone of its value as an educational framework.2 Specifically, outcome measurement for CBME is required to: (1) demonstrate the desired impact on physician training; (2) ensure that no harm is being done through unanticipated consequences of the initiative; (3) investigate concerns raised in the community about the need to justify the significant investment of resources; (4) determine which aspects of CBME components and implementations were and were not effective; and (5) identify the contextual elements that lead to the resultant outcomes.2,3 In this study, we sought to define both educational and clinical outcomes within emergency medicine (EM) through a consensus conference.

Assembling an expansive collection of outcomes that includes educational as well as clinical outcomes is of the utmost importance.4,5 Selecting outcomes that are purely patient-centered suffers from several key limitations, including problems with study feasibility, biased outcome selection, difficulty in attributing outcomes to learners, and the inadvertent de-emphasis of other important skills.6–8 Our broad set of outcomes includes factors, such as learner supervision and resident cognitive load, that account for the effects of supervision and context on resident performance, in keeping with the current literature on the complexity of isolating trainee performance.8,9 Residency is a decisive period in which to examine trainee performance, as it encompasses a critical inflection point toward ultimate skill attainment. Therefore, as we transition to CBME, it is important to define and measure these outcomes to help residents along the path to both competence and mastery.

The implementation of CBD in Canadian EM in July 2018 presents us with an opportunity to examine its outcomes. Evaluation of the educational and clinical outcomes that arise from this new training model represents a direct form of accountability for the resources spent on its design and implementation. Because a detailed list of anticipated outcomes was not defined during the design of CBD, we aimed to use a consensus conference process to define a list of educational and clinical outcomes that EM training programs can reference when assessing implementation success.

METHODS

We conducted a three-phase (divergent, analysis, and convergent) study to solicit opinions from the emergency medicine community on the important educational and clinical outcomes related to CBME that should be measured. The divergent phase consisted of a structured expert consultation process to generate an expansive list of possible outcomes. Our study team then analyzed the results of the divergent phase using a generic thematic analysis, which produced lists of clinical and educational outcomes. We presented these outcomes at the 2019 Canadian Association of Emergency Physicians (CAEP) Academic Symposium, where they were stratified by their level of endorsement.

Conceptual framework

Outcomes were conceptualized as being at the micro (resident, patient), meso (program, institution, or region), or macro (system) level, and as being either clinical or educational in nature. In the fall of 2018, we reviewed key medical education literature related to measurable outcomes of CBME. Our review strategy consisted of a structured Google Scholar search augmented by an open social media request for suggestions from experts in the medical education (#MedEd) community on Twitter.

Survey development

We developed a survey-based tool with the goal of soliciting diverse perspectives on potential clinical and educational outcomes for CBME. The survey consisted of three discrete parts: Part 1, open free-text suggestions of outcomes in the clinical and educational domains; Part 2, a vignette-based brainstorming exercise; and Part 3, open free-text entry for new or final thoughts.

Part 2 provided participants with three vignettes that situated emergency medicine residents at different stages of training and in different clinical environments. Participants were asked to read the vignettes and then respond to questions designed to identify potentially measurable clinical and educational outcomes at the learner (micro), institutional (meso), and health system (macro) levels. The vignettes and the survey are presented in Appendix 1. The survey was developed and piloted internally by our study team before deployment, and the vignettes were edited for clarity, readability, and appropriateness based on feedback from the pilot.

Survey deployment

The survey was sent to 175 people, including EM program directors; CBD implementation leads; competence committee chairs; chief residents from the EM training programs of both the RCPSC and the College of Family Physicians of Canada (CFPC); one resident in addition to the chief resident at each program site; hand-selected physician leaders in administration, medical education, and quality improvement; deans; postgraduate leads; and national and international experts in the field of medical education. Participants were sent one initial email and two reminder emails spaced approximately 2 weeks apart.

Analysis of survey results

After the survey closed, the data were qualitatively analyzed by the authorship team (F.Z., Q.S.P., A.H., B.T., T.C.), and themes were generated within the educational and clinical outcome spheres. All coding was reviewed by at least three authors, with B.T. and T.C. reviewing all codes. B.T. and T.C. then summarized the content of each theme into outcome statements.

Convergent phase: academic symposium voting

The list of outcomes was presented at the CAEP Academic Symposium on posters that organized the items by the themes from our thematic analysis. Attendees were given stickers and asked to vote on the educational and clinical outcomes that they believed were (a) most feasible to measure in the immediate (short-term) post-implementation period and (b) most important to measure over the long term.

Participants were instructed to place stickers next to each outcome to represent the weight of their endorsement. The number of stickers next to each outcome was tabulated on site, and the results were presented to the audience at the end of the consensus conference session.
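For evaluators who capture this kind of endorsement data electronically rather than with physical stickers, the tally-and-rank step is straightforward to script. The minimal sketch below is purely illustrative (the outcome labels and vote counts are hypothetical, not our symposium data); it simply counts the endorsements recorded for each outcome and lists the outcomes from most to least endorsed.

```python
from collections import Counter

# Hypothetical ballot data: each entry represents one endorsement (one "sticker")
# placed next to an outcome. These labels and counts are illustrative only.
ballots = [
    "EPA observation counts", "Feedback quality", "Trainee wellness",
    "EPA observation counts", "Patient safety metrics", "Trainee wellness",
    "EPA observation counts",
]

# Tally endorsements per outcome and rank from most to least endorsed.
tally = Counter(ballots)
for rank, (outcome, votes) in enumerate(tally.most_common(), start=1):
    print(f"{rank}. {outcome}: {votes} endorsement(s)")
```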

SUMMARY OF RECOMMENDATIONS

A total of 30 individuals (17.1% response rate) completed our survey. Respondents included representatives from all the targeted groups (see Table 1). The results of the qualitative analysis are outlined in Table 2 (Educational Outcomes) and Table 3 (Clinical Outcomes). Box 1 details the recommendations that were most heavily favored during our symposium process.

Table 1. Demographics of survey respondents, n (%)

CAEP = Canadian Association of Emergency Physicians; CFPC = College of Family Physicians of Canada; EM = Emergency Medicine; MD = Medical Doctor; PhD = Doctor of Philosophy; RCPSC = Royal College of Physicians and Surgeons of Canada

*Because individuals may hold several roles or certifications simultaneously, the percentages in each section may sum to more than 100%.

Table 2. Symposium educational outcomes

Table 3. Symposium clinical outcomes

Legend for abbreviations: CBME = Competency-Based Medical Education; ED = Emergency Department; $ = money

Box 1. Initial outcomes, most strongly endorsed by participants of the 2019 Academic Symposium, that should be measured in the early phase of the CBME rollout

Summary of short-term outcomes

1. Outcomes at the micro- & meso-level

Several short-term outcomes could serve as easily measurable indicators of the value of CBME. For instance, many educators endorsed measuring the number of entrustable professional activity (EPA) observations and determining the quality of feedback. Previous studies of workplace-based assessment systems, such as In-Training Evaluation Reports (ITERs), have shown that systematically examining the artifacts they generate can improve the overall quality of assessment data.10

Going forward, it may be prudent for educators to form chains of evidence linking these immediately measurable outcomes (e.g., EPAs acquired during the first year) to later outcomes (e.g., the likelihood of requiring remediation11 or of completing areas of specific enhanced competency12). Program leaders might harness the power of ongoing program evaluation or quality improvement; under our national research ethics policy (Tri-Council Policy Statement 2),13 these efforts are usually deemed exempt from research ethics review. As such, measuring the more accessible and top-rated short-term outcomes is both necessary and pragmatic for robust program improvement.

2. Wellness and trainee efficacy

Trainee wellness and resident self-efficacy were ranked among the top five educational outcomes to be measured in the short term, which aligns well with the Institute for Healthcare Improvement's Quadruple Aim. We might aim to determine whether we are training physicians who are committed to continuing professional development, receptive to feedback, and adept at providing it. Poor faculty feedback to trainees has been cited as a contributing factor to depression in trainees,14 whereas improved feedback has the potential to foster a sense of competence, autonomy, and relatedness.15 Because CBME advocates for an increase in direct observation and the provision of objective, high-quality feedback, program evaluators should attempt to study their effects on learner well-being.

3. Macro-level outcomes

Patient safety and quality-of-care metrics for CBME trainees (e.g., evidence-based practice, efficiency, and effects on hospital flow) constituted one of the clinical outcomes ranked as most important to follow in both the short and the long term. From a practical perspective, this appears daunting, both in gathering the data and in establishing causal links between specific clinical outcomes and the care provided. Collaboration between educators and quality improvement researchers will be essential to examine these complex systems. Other clinical outcomes, such as multi-source assessments of CBME residents in clinical settings, are more easily tracked and can provide useful information, especially when linked to educational outcomes such as individual EPA metrics.

Summary of long-term outcomes

1. Health systems impact

Postgraduate training programs should focus on the endorsed long-term educational outcomes because these programs sit within a broader health care system in which budgets and health human resource management are important. Specifically, one of the most endorsed long-term educational impacts might be a change in the number of trainees pursuing subspecialty training after their qualifying exams. Traditionally, RCPSC training programs in EM have had an embedded area of focused learning within the postgraduate year 4 of most programs.16 With regard to costs, the expense of time-variable training within CBME was likely viewed as important for our nationally funded system. Time-variability may be ideal for the trainee17 but can add cost and uncertainty to the health care system. Such financial outcomes are more system-focused and easier to measure than outcomes that attempt to quantify trainee quality.

2. Trainee focused impact

Trainee-focused clinical outcomes may serve as proxies for better physicians and are typically cited by education scholars as potential benefits of CBME18; however, defining and measuring these outcomes will be quite difficult. How can we measure physician quality in EM?19 Especially in environments like EM, where trainees can so readily access their supervising physicians, how can we tease apart their individual performance?7,9 Or should we instead be aiming to find resident-sensitive markers of performance?7,9,20 These considerations are important but beyond the scope of the present project. The quality of physician care is undoubtedly a complex construct that involves numerous competing factors.

3. New methods for evaluation of outcomes

While outcome measurement is important, Gruppen and colleagues warn us not to focus too narrowly on demonstrating that graduates of CBME programs are “better.”2 Instead, we must match the inherent complexity of outcomes with sophisticated study designs, such as collaborative and multicenter studies focusing on individuals, teams, institutions, and settings. We should also aim to include novel models for program evaluation, such as rapid-cycle evaluation21 and realist evaluation (see Appendix 2). Recent work in EM using rapid-cycle evaluation to study the early implementation of CBME has yielded important lessons about fidelity of implementation and, perhaps, short-term outcomes such as behavior change among stakeholders.21 Moving forward, these strategies and others will help us tackle the difficult task of high-quality, outcome-based evaluation.

Limitations

Our recommendation process has several limitations. Most obviously, the population consulted in the initial (divergent) phase of our study had a relatively low response rate, which may reflect the reluctance of trainees and senior leaders to participate in this type of consultation. That said, the dataset we acquired reached sufficiency for our qualitative analysis and was, therefore, unlikely to have significantly affected the ultimate outcome of this study. Another limitation is the narrow nature of our study question: we were specifically seeking to derive a list of clinical and educational outcomes relevant to EM training in Canada. For this reason, our results may have limited generalizability to other specialties and jurisdictions.

Final recommendations

In this three-phase study, we developed consensus recommendations for clinical and educational CBME outcomes. We defined a novel framework that stratifies outcomes by their inherent nature (clinical or educational), the timing of their impact, and their scope of impact (micro, meso, or macro). This framework allowed us to identify a broad range of outcomes and organize them into a list of 15 educational and 10 clinical themes, ranked by importance. The main strength of our study was the ability to obtain input from all targeted expert groups across the country and to prioritize these outcomes during the education summit. Our results will help operationalize CBME program evaluation and determine its impact on our programs and on clinical care in EM.

Through our three-phase study design, we have derived a set of general recommendations regarding outcomes and program evaluation of CBME:

1. We should measure educational and clinical outcomes in CBME to ensure that this model of assessment is meeting the needs of our patients.

2. Initial educational outcomes in the early rollout of CBME should include: measuring EPA observations and feedback, determining the amount of simulated assessment used in CBME, and monitoring and measuring trainee wellness and health.

3. Initial clinical outcomes in the early rollout of CBME should include: measuring patient safety and quality care metrics associated with CBME residents and graduates, demonstrating evidence-based practice of CBME residents and graduates, and monitoring the efficiency/impact of CBME on hospital flow and function.

4. Specialty-specific micro (patient/training site, learner/training program), meso (institution or region), and macro (national) outcomes will need to be measured by means of robust research methods linking educational and clinical outcomes.

CONCLUSIONS

We have created a key list of educational and clinical outcomes that should be targeted in both the short and long term to determine the impact of CBME on the practice of Canadian EM. Clinician educators and education scholars should use these recommendations to inform future CBME program evaluation initiatives.

Supplemental material

The supplemental material for this article can be found at https://doi.org/10.1017/cem.2019.491.

Acknowledgments

We thank Dr. Jeff Perry for his leadership of the academic section of the Canadian Association of Emergency Physicians and Dr. Ian Stiell for championing educational scholarship, both in his foundational work with the academic section and as Editor-in-Chief of the Canadian Journal of Emergency Medicine.

Competing interests

None.

REFERENCES

1. McGaghie WC, Miller GE, Sajid AW, Telder TV. Competency based curriculum in medical education: an introduction. Public Health Pap. 1978:11–91.
2. Gruppen L, Frank JR, Lockyer J, et al. Toward a research agenda for competency-based medical education. Med Teach. 2017;39(6):623–30, doi: 10.1080/0142159X.2017.1315065.
3. Holmboe ES, Sherbino J, Englander R, Snell L, Frank JR. A call to action: the controversy of and rationale for competency-based medical education. Med Teach. 2017;39(6):574–81, doi: 10.1080/0142159X.2017.1315067.
4. Touchie C, Ten Cate O. The promise, perils, problems and progress of competency-based medical education. Med Educ. 2016;50(1):93–100, doi: 10.1111/medu.12839.
5. Ross S, Hauer KE, van Melle E. Outcomes are what matter: competency-based medical education gets us to our goal. MedEdPublish. 2018;7(2):15, doi: 10.15694/mep.2018.0000085.1.
6. Cook DA, West CP. Reconsidering the focus on “outcomes research” in medical education: a cautionary note. Acad Med. 2013;88(2):162–7, doi: 10.1097/ACM.0b013e31827c3d78.
7. Schumacher DJ, Holmboe ES, Van Der Vleuten C, Busari JO, Carraccio C. Developing resident-sensitive quality measures: a model from pediatric emergency medicine. Acad Med. 2018;93(7):1071–8, doi: 10.1097/ACM.0000000000002093.
8. Smirnova A, Sebok-Syer SS, Chahine S, et al. Defining and adopting clinical performance measures in graduate medical education: where are we now and where are we going? Acad Med. 2019;94(5):671–7, doi: 10.1097/ACM.0000000000002620.
9. Sebok-Syer SS, Chahine S, Watling CJ, Goldszmidt M, Cristancho S, Lingard L. Considering the interdependence of clinical performance: implications for assessment and entrustment. Med Educ. 2018;52(9):970–80, doi: 10.1111/medu.13588.
10. Chan T, Sherbino J, McMAP Collaborators. The McMaster Modular Assessment Program (McMAP): a theoretically grounded work-based assessment system for an emergency medicine residency program. Acad Med. 2015;90:900–5, doi: 10.1097/acm.0000000000000707.
11. Ross S, Binczyk NM, Hamza DM, et al. Association of a competency-based assessment system with identification of and support for medical residents in difficulty. JAMA Netw Open. 2018;1(7):e184581, doi: 10.1001/jamanetworkopen.2018.4581.
12. Binczyk NM, Babenko O, Schipper S, Ross S. Unexpected result of competency-based medical education: 9-year application trends to enhanced skills programs by family medicine residents at a single institution in Canada. Educ Prim Care. 2019;30(3):16, doi: 10.1080/14739879.2019.1573108.
13. Canadian Institutes of Health Research, Natural Sciences and Engineering Research Council of Canada, Social Sciences and Humanities Research Council. Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans. Ottawa, ON; December 2018.
14. Pereira-Lima K, Gupta RR, Guille C, Sen S. Residency program factors associated with depressive symptoms in internal medicine interns: a prospective cohort study. Acad Med. 2019;94(6):869–75, doi: 10.1097/ACM.0000000000002567.
15. Sargeant J. Future research in feedback: how to use feedback and coaching conversations in a way that supports development of the individual as a self-directed learner and resilient professional. Acad Med. 2019;94:S9–S10, doi: 10.1097/ACM.0000000000002911.
16. Thoma B, Mohindra R, Woods RA. Enhanced training in emergency medicine: the search and application process. CJEM. 2015;17(5):565–8, doi: 10.1017/cem.2015.61.
17. Nousiainen MT, Mironova P, Hynes M, et al. Eight-year outcomes of a competency-based residency training program in orthopedic surgery. Med Teach. 2018;40(10):1042–54, doi: 10.1080/0142159X.2017.1421751.
18. Frank JR, Snell LS, Ten Cate O, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–45, doi: 10.3109/0142159X.2010.501190.
19. Pines JM, Alfaraj S, Batra S, et al. Factors important to top clinical performance in emergency medicine residency: results of an ideation survey and Delphi panel. AEM Educ Train. 2018;2(4):269–76, doi: 10.1002/aet2.10114.
20. Sebok-Syer SS, Goldszmidt M, Watling CJ, Chahine S, Venance SL, Lingard L. Using electronic health record data to assess residents’ clinical performance in the workplace: the good, the bad, and the unthinkable. Acad Med. 2019;94(6):853–60, doi: 10.1097/ACM.0000000000002672.
21. Hall A, Rich J, Dagnone J, et al. It's a marathon, not a sprint: rapid evaluation of CBME program implementation. Acad Med. 2019, doi: 10.1097/ACM.0000000000003040.