
Introducing trainees to research using an online, asynchronous course

Published online by Cambridge University Press:  29 June 2023

Jason T. Blackard*
Affiliation:
Division of Digestive Diseases, Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA Center for Clinical and Translational Science and Training, University of Cincinnati, Cincinnati, OH, USA
Jacqueline M. Knapke
Affiliation:
Center for Clinical and Translational Science and Training, University of Cincinnati, Cincinnati, OH, USA Department of Family and Community Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
Stephanie Schuckman
Affiliation:
Center for Clinical and Translational Science and Training, University of Cincinnati, Cincinnati, OH, USA
Jennifer Veevers
Affiliation:
Center for Clinical and Translational Science and Training, University of Cincinnati, Cincinnati, OH, USA
William D. Hardie
Affiliation:
Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
Patrick H. Ryan
Affiliation:
Center for Clinical and Translational Science and Training, University of Cincinnati, Cincinnati, OH, USA Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA Division of Biostatistics and Epidemiology, Cincinnati Children’s Hospital, Cincinnati, OH, USA
*
Corresponding author: J. Blackard, PhD; Email: jason.blackard@uc.edu

Abstract

Introduction:

Research is an important aspect of many students’ training. However, formal research training is rarely included in curricula. Thus, we developed an online, asynchronous series of modules to introduce trainees to multiple topics that are relevant to the conduct of research.

Methods:

Research 101 was utilized by first-year medical students and undergraduate students conducting mentored research projects. Students’ knowledge, confidence, and satisfaction were assessed using pre- and post-module surveys with five-point Likert-scale questions, open-ended text responses, and a final quiz.

Results:

Pre-module survey results showed that learners felt most confident with the Conducting a literature search and Race and racism in medicine modules and least confident with the Submitting an Institutional Review Board protocol at UC module. Post-module survey responses were significantly increased compared to pre-module results for all modules and questions (p < 0.0001). The statement “The content of this module met my needs” was endorsed across all modules (84.9% “yes” responses). A final quiz of 25 multiple-choice questions was completed by 92 participants, who achieved a median score of 21. Content analysis of open-ended post-module survey responses identified several strengths and opportunities for improvement in course content and instructional methods.

Conclusions:

These data demonstrate that significant learning resulted from completion of Research 101, as post-module survey scores were significantly higher than pre-module survey scores for all modules and questions. Final quiz scores were generally high but also highlighted opportunities for additional trainee learning and will guide the evolution of future modules.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© University of Cincinnati, 2023. Published by Cambridge University Press on behalf of The Association for Clinical and Translational Science

Introduction

Research is an important component of any medical professional’s training and professional development. A 2015 meta-analysis of medical students found that 72% were interested in conducting research and 31% were interested in a career that included research [1]. Students participating in research activities in medical school were 3.55 times more likely to be interested in research as part of their future careers [1]. The 2021 Association of American Medical Colleges (AAMC) Medical School Graduation Questionnaire reported that 83.9% of medical school graduates participated in a research project with a faculty member, compared to 69.3% in 2014 and 77.3% in 2017 [2,3]. Overall, 61.2% of 2021 medical school graduates submitted a paper for publication compared to 48.6% of 2017 respondents, and 51.1% planned to participate in research during their careers [3].

Medical trainees may conduct scholarly activities at various times during their training, including summer electives, mandatory curricular activities, extracurricular research activities, and longitudinal research experiences. The AAMC core competencies for entering medical students include quantitative reasoning, critical thinking, and written communication [2]. A study by Amgad et al. reported that career advancement is a significant motivation for performing research during medical school [1]. While Green et al. found that program directors often ranked research experience lower among selection criteria when all specialties were grouped together, research experience was ranked highly in competitive specialties such as plastic surgery, radiation oncology, dermatology, and neurosurgery [4]. DeFranco and Sowa suggest that rigorous hands-on training in the scientific method will aid in the integration of basic science knowledge with clinical decision-making and ultimately enhance patient care [5]. Nonetheless, formal training in the scientific method and the conduct of research is often fragmented across the medical school curriculum. For instance, Stone et al. noted that curricular efforts varied widely across institutions and that research was often buried within the curriculum and not obvious [6].

The Accreditation Council for Graduate Medical Education (ACGME) mandates that residents participate in scholarly activity prior to the completion of their training [7]. Nearly all residency programs have established guidelines for scholarly activities that align with accreditation requirements; yet organized, comprehensive research curricula are often lacking [8]. Benefits of resident research exposure have been well described in the literature (reviewed in [9,10]) and include increased lifelong learning, improved patient care, increased satisfaction with training, and a higher likelihood of pursuing academic careers. Nonetheless, several potential barriers exist, including a lack of mentors, research infrastructure, trainee interest, financial support, and research curricula, as well as the high demands of clinical responsibilities.

To establish fundamental research skills and fill gaps within training curricula, we previously piloted an online, asynchronous set of modules – called Research 101 – to introduce medical students to various topics that are germane to the conduct of research [11]. Post-module mean scores were significantly higher than pre-module results for all modules, indicating significant learning from completing Research 101. Here, we evaluated the use of Research 101 across a larger and broader spectrum of learners participating in structured research at the University of Cincinnati College of Medicine and compared learning outcomes across participant groups.

Methods

The creation and pilot study of Research 101 have been described in detail elsewhere [11]. Briefly, Research 101 modules were offered asynchronously through the online educational platform Canvas (Salt Lake City, UT). The first module – Getting started with Research 101 – provided a brief introduction to the course content. Each module consisted of several elements, including learning objectives, assignments, a pre-module survey, and a post-module survey. To complete a module, participants completed (1) a pre-module survey before reviewing any of the assignments within the module, (2) all assignments within the module, and (3) a post-module survey at the conclusion of the module. The pre-module survey included questions based on the learning objectives of the particular module, with responses provided on a 5-point Likert scale. For instance, “I am confident in my ability to…identify my skills as a mentee/trainee” (learning objective #1 for the Aligning Expectations module) or “I am confident in my ability to…describe possible barriers to an effective mentor-mentee relationship” (learning objective #4). The post-module survey included the same learning objective-based questions as the pre-module survey, plus two items with yes/no/unsure response options: (1) “The content of this module met my needs” and (2) “Would you recommend this module to a friend if it was not a requirement?” Additionally, three open-ended text questions were included: (1) What did you like most about this module?; (2) What did you like least about this module?; and (3) If you could change one thing about this module, what would it be?
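To make the survey structure concrete, the sketch below shows one way a module and its pre-/post-module items could be represented in code. This is a minimal Python illustration only: the module name, learning objectives, and item wording are taken from the description above, but the data structure itself is an assumption for illustration, not the actual Canvas or REDCap configuration.

```python
from dataclasses import dataclass, field
from typing import List

# 5-point Likert response options used for the learning objective items
LIKERT_5 = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

@dataclass
class Module:
    """One Research 101 module: Likert items are generated from its learning objectives."""
    name: str
    objectives: List[str]
    yes_no_unsure: List[str] = field(default_factory=lambda: [
        "The content of this module met my needs",
        "Would you recommend this module to a friend if it was not a requirement?",
    ])
    open_ended: List[str] = field(default_factory=lambda: [
        "What did you like most about this module?",
        "What did you like least about this module?",
        "If you could change one thing about this module, what would it be?",
    ])

    def pre_survey(self) -> List[str]:
        # The same Likert items appear on the pre- and post-module surveys
        return [f"I am confident in my ability to…{obj}" for obj in self.objectives]

    def post_survey(self) -> List[str]:
        # The post-module survey adds the yes/no/unsure and open-ended items
        return self.pre_survey() + self.yes_no_unsure + self.open_ended

# Illustrative example using the Aligning Expectations module named above
aligning = Module(
    name="Aligning Expectations",
    objectives=[
        "identify my skills as a mentee/trainee",
        "describe possible barriers to an effective mentor-mentee relationship",
    ],
)
print(aligning.post_survey())
```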

All survey data were collected and managed using the Research Electronic Data Capture (REDCap) tool hosted at the University of Cincinnati [12]. Qualitative survey responses were analyzed using an inductive content analysis approach, which offers a systematic and objective method for classifying words and phrases into meaningful categories and permits the analyst(s) to discern key ideas from a larger body of text [13]. Changes in module Likert scale scores were assessed by subtracting pre-module Likert scores from post-module scores, such that positive differences indicated increased confidence in knowledge regarding the module content. Statistical significance of pre-post differences was tested using a paired t-test (SAS Version 9.4). For group comparisons, pooled p-values were reported when the p-value for the test of equality of variances (Folded F) was > 0.05; when it was < 0.05, the p-value from the Satterthwaite method was reported.
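As a companion to the description above, the following is a minimal sketch of the pre/post analysis re-expressed in Python/SciPy rather than SAS 9.4. The scores shown are illustrative values, not the study data.

```python
import numpy as np
from scipy import stats

# Illustrative 5-point Likert scores for one learning objective,
# paired by participant (not the study data)
pre = np.array([3, 2, 4, 3, 3, 2, 4, 3])
post = np.array([4, 4, 5, 4, 4, 3, 5, 4])

# Change score: post-module minus pre-module, so positive values
# indicate increased confidence after completing the module
change = post - pre

# Paired t-test on the pre/post scores (equivalent to a one-sample
# t-test of the change scores against zero)
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean change = {change.mean():.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```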

A final quiz consisting of 25 multiple-choice questions with one correct answer per question was required for all participants. Participants had access to all Research 101 content during the quiz, feedback on incorrect responses was provided, and there was no time limit. The final quiz score was utilized for reporting purposes only and to refine the module content.

The University of Cincinnati Institutional Review Board (IRB) reviewed the study and determined the research qualified as having minimal risk to participants and was exempt from most of the requirements of the Federal Policy for the Protection of Human Subjects.

Results

During the 2021–22 academic year, 132 individuals were registered for Research 101, including 99 first-year medical students, 23 undergraduates or international medical students, and 10 internal medicine residents. Pre-module and post-module survey results are shown in Table 1. Due to program-specific requirements and/or missing data, the number of individuals responding to specific survey questions is not consistent across all modules. Prior to completing the modules, learners were most confident with the Race and racism in medicine (4.15–4.21) and the Conducting a literature search (4.04–4.10) modules and least confident with the Submitting an IRB protocol at UC (2.23–2.37) and the Study design and data analysis basics (3.13–3.27) modules. Post-module mean scores were significantly increased compared to pre-module scores for all modules and all learning objectives (p < 0.0001).

Table 1. Pre-module and post-module survey results for the Research 101 modules

*p < 0.0001.

As shown in Fig. 1a, the statement “The content of this module met my needs” was endorsed highly across all modules (84.9% “yes” responses). “No” and “unsure” responses were highest for the Submitting an IRB protocol at UC module and lowest for the Introduction to human subjects research and protection and Effective writing for publication modules. Across all modules, the question “Would you recommend this module to a friend if it was not a requirement?” received 52.6% “yes” responses, 22.4% “no” responses, and 25.0% “unsure” responses (Fig. 1b). “No” and “unsure” responses were highest for the Introduction to research module and lowest for the Presenting your summer research module.

Figure 1. (a) Responses to the statement “The content of this module met my needs” and (b) the question “Would you recommend this module to a friend if it was not a requirement?” Yes responses are black, no responses are hashed, and unsure responses are gray.

Because of potential differences across learner types, we compared findings from the two largest groups of participants, namely medical students and undergraduate students. Several differences in the magnitude of learning – as measured by the average change in survey scores (post-module minus pre-module) by group – were observed (Table 2). Each of these differences reflected a greater change among undergraduate students than among medical students. For instance, the change in average score was higher for undergraduate students than for medical students for the learning objective “identify the steps of the scientific method” (1.32 versus 0.60; p < 0.001). Similarly, the change in average score was higher for undergraduate students than for medical students for the learning objective “understand how to write a clear and concise research project description” (1.65 versus 0.79; p = 0.001). Overall, the average change in score was higher for undergraduates than for medical students for 17 of 41 (41.5%) learning objectives.
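For readers who want to reproduce this type of group comparison outside SAS, the following is a minimal Python/SciPy sketch of the decision rule described in the Methods: a Folded F test for equality of variances determines whether the pooled or the Satterthwaite (Welch) two-sample t-test p-value is reported. The change scores shown are illustrative, not the study data.

```python
import numpy as np
from scipy import stats

# Illustrative change scores (post minus pre) for one learning objective
undergrad = np.array([2, 1, 2, 1, 1, 2, 1, 2, 1, 1], dtype=float)
medstudent = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0], dtype=float)

# Folded F test: ratio of the larger sample variance to the smaller,
# with a two-sided p-value (mirrors SAS PROC TTEST's equality-of-variances test)
v_u, v_m = undergrad.var(ddof=1), medstudent.var(ddof=1)
if v_u >= v_m:
    f_ratio, df_num, df_den = v_u / v_m, len(undergrad) - 1, len(medstudent) - 1
else:
    f_ratio, df_num, df_den = v_m / v_u, len(medstudent) - 1, len(undergrad) - 1
p_folded_f = min(1.0, 2 * stats.f.sf(f_ratio, df_num, df_den))

# Report the pooled t-test when variances look equal (p > 0.05),
# otherwise the Satterthwaite/Welch t-test
equal_var = p_folded_f > 0.05
t_stat, p_value = stats.ttest_ind(undergrad, medstudent, equal_var=equal_var)
label = "pooled" if equal_var else "Satterthwaite"
print(f"Folded F p = {p_folded_f:.3f}; {label} t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```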

Table 2. Average change in Likert scores (post-pre) by undergraduate versus medical student status

*Pooled p-values were used when the p-value for the test of equality of variances (Folded F) was > 0.05; when it was < 0.05, the p-value from the Satterthwaite method was reported. p-values < 0.05 are highlighted in blue.

The final quiz was completed by 92 participants, including 81 medical students and 11 undergraduate students. As shown in Fig. 2, the average score (out of a possible 25) was higher for medical students than for undergraduate students (21.4 versus 19.7; p = 0.0072).

Figure 2. Final quiz scores (out of a possible 25) for 81 medical students and 11 undergraduate students (21.4 versus 19.7; p = 0.0072).

Content analysis of the open-ended text responses gathered from the post-module surveys revealed several strengths of the Research 101 modules. First, students appreciated videos that (1) were succinct, easily understood, and visual, (2) presented diverse opinions from dynamic speakers, and (3) struck an appropriate balance between an introductory level for basic topics and a more comprehensive level for topics that required detailed information. For instance, one participant stated, “[This module] covered important topics in engaging way with short videos.” A second strength that participants identified was case studies that demonstrated real-world scenarios: “Case studies highlighted real potential problems one might encounter entering a research setting.” A third strength was the practical and relevant information presented in the form of examples (e.g., IRB, past projects, stages of CTR research, peer review process, mentoring relationships, misconduct, journal clubs), activities that allowed participants to apply their learning to their interests and work (e.g., finding an article, writing clearly and concisely, reading scientific articles, reflecting on their research question, designing a poster), and resources (e.g., Google Scholar, PubMed, NIH RePORTER) that could be referenced later. Participants stated, “[This module] discussed a very important topic and gave good examples” and “I like having these resources available for if I ever need them in the future.” Many students appreciated examples that were specific to our institution (e.g., UC Radiation Study, UC Libraries, UC IRB), modules that discussed difficult topics such as racism and human subjects violations, and historical examples, primarily regarding research ethics, that were particularly impactful: “I appreciated the attention giv[en] to Cincinnati’s own involvement in unethical research practices.” Finally, participants noted the diversity of speakers, viewpoints, and topics as a strength.

Qualitative data also identified several suggestions for improvement with respect to Research 101 content, including videos that were too long (“Could be more concise”); modules with too many links, resources, or discussion board questions (“[I did not like] all of the separate links, just lots of clicking and navigating”); repetition within Research 101 and/or redundancy with other medical school curricular components (“It is information I was mostly familiar with before”); content that was too introductory or broad to be useful (“It was too broad to develop a good understanding of the material”); and a need for examples and resources that are more diverse in discipline and setting, more concrete and real-world, more current, and/or more practical (“It talked about posters and lab presentations--I wish it also discussed other ways of communicating research”). Some students felt that certain modules were missing information, including how to obtain funding, opportunities to present research, poster formatting, clinical trial phases, writing a research project description, and an overview of the IRB process (“I wish the module addressed ways to bring forward complaints and frustrations in a productive way,” “I wish there was a video on how to find opportunities to present research,” “I wish there was more conversations on how to get funding,” “I wish they went over the different labels of clinical trials [phases]”). Students were mixed on the use of discussion boards; however, the majority would have preferred fewer assigned discussion boards and responding to each other rather than to the main thread (“[I did not like having] 4 discussion boards in 1 module”). Technical issues (e.g., nonfunctioning links, audio problems, or the need to add closed captioning to videos) were a minor theme.

Discussion

Research training is a cornerstone of education and professional development, and resources to provide this training in a comprehensive, student-centered manner are highly sought after. Unfortunately, significant barriers exist to pursuing research training during medical school, such as lack of infrastructure, a paucity of high-quality faculty mentors, insufficient institutional incentives for those conducting research, limited awareness of local research opportunities, and the absence of a research office or coordinator for training [14,15]. In a survey of US medical students, 19.4% reported taking a required course on research methods, and 28.7% reported that a research elective was available at their institution [15]. Structured training in research is important for residency programs as well and is included in current ACGME requirements [7]. It has been suggested that resident research may improve clinical care by fostering clinical evaluation skills, clinical reasoning, and lifelong learning [16,17]. Early exposure to research [18] may also increase the number of physician scientists [17,19]. Furthermore, residency training programs with organized programs/curricula, including protected time for research, were associated with increased productivity [20]. Yet, there is limited information on what specific topics should be taught, and existing curricula may not be readily available to those interested in modifying them for their own programmatic needs [8,17]. A systematic review of research curricula for residents found that the most common objectives were to increase research productivity and to enhance critical evaluation skills [17].

Research 101 was developed to provide a structured introduction to important research topics in a highly accessible format. Its online, asynchronous format offers a basic training infrastructure that is sufficiently flexible to enable individualized learning and/or program-specific adaptations. Evaluating the expansion of Research 101 to other learner types demonstrated significant learning from completing the course (i.e., post-module survey scores were significantly higher than pre-module scores for all modules and all learning objectives). In general, final quiz scores were high; however, mean scores differed between medical students and undergraduate students, highlighting the opportunity for additional learning by all future participants enrolled in Research 101.

This study has several limitations to consider. First, Research 101 was piloted initially with a small number of medical students and subsequently expanded to include additional training programs and learner types. Different educational needs based on learner type are likely, and modules may need to be tailored to these specific needs in the future once more participant data have been gathered. Second, individual learners may complete a different set of modules based on their programmatic requirements (i.e., not all participants complete all modules). Third, selection bias may exist among the students who completed Research 101; those who participated may have been more inclined toward research or may have had previous research experience compared to their nonparticipant counterparts. Nonetheless, the qualitative data collected for each module are helpful for regular enhancement of Research 101 for future audiences that may have limited exposure to scholarly activities. Fourth, while Research 101 is offered asynchronously, some topics may require more direct interaction with learners. Thus, it is important to highlight that Research 101 should not replace in-person interactions; rather, it provides an additional option given learners’ distinct learning styles, varying interests and backgrounds, and the limited space for new content within existing medical school curricula.

In conclusion, Research 101 is a valuable addition to the research training toolkit that can be utilized by distinct learner types and results in demonstrable learning. Research 101 can bolster current training and/or fill existing gaps within undergraduate and medical school degree programs.

Funding statement

This work was supported by the University of Cincinnati Center for Clinical and Translational Science and Training through grant UL1TR001425 and short-term medical student training grants T35 DK060444 to JTB and T35 HL113229 to WH.

Competing interests

The authors have no conflicts of interest to declare.

References

1. Amgad, M, Man Kin Tsui, M, Liptrott, SJ, Shash, E. Medical student research: an integrated mixed-methods systematic review and meta-analysis. PLoS One. 2015;10(6):e0127470.
2. Association of American Medical Colleges. Medical School Graduation Questionnaire: 2018 All Schools Summary Report. Association of American Medical Colleges, 2018.
3. Association of American Medical Colleges. Medical School Graduation Questionnaire: 2021 All Schools Summary Report. Association of American Medical Colleges, 2021.
4. Green, M, Jones, P, Thomas, JX. Selection criteria for residency: results of a national program directors survey. Acad Med. 2009;84(3):362–367.
5. DeFranco, DB, Sowa, G. The importance of basic science and research training for the next generation of physicians and physician scientists. Mol Endocrinol. 2014;28(12):1919–1921.
6. Stone, C, Dogbey, GY, Klenzak, S, Van Fossen, K, Tan, B, Brannan, GD. Contemporary global perspectives of medical students on research during undergraduate medical education: a systematic literature review. Med Educ Online. 2018;23(1):1537430.
7. ACGME Common Program Requirements (Residency). (https://www.acgme.org/globalassets/pfassets/programrequirements/cprresidency_2022v3.pdf) Accessed March 1, 2023.
8. Alguire, PC, Anderson, WA, Albrecht, RR, Poland, GA. Resident research in internal medicine training programs. Ann Intern Med. 1996;124(3):321–328.
9. Zimmerman, R, Alweis, R, Short, A, Wasser, T, Donato, A. Interventions to increase research publications in graduate medical education trainees: a systematic review. Arch Med Sci. 2019;15(1):1–11.
10. Stevenson, MD, Smigielski, EM, Naifeh, MM, Abramson, EL, Todd, C, Li, ST. Increasing scholarly activity productivity during residency: a systematic review. Acad Med. 2017;92(2):250–266.
11. Blackard, JT, Knapke, JM, Ryan, PH, et al. Research 101: an online course introducing medical students to research. J Clin Transl Sci. 2022;6(1):e102.
12. Harris, PA, Taylor, R, Thielke, R, Payne, J, Gonzalez, N, Conde, JG. Research electronic data capture (REDCap) – a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42(2):377–381.
13. Elo, S, Kyngäs, H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107–115.
14. de Oliveira, NA, Luz, MR, Saraiva, RM, Alves, LA. Student views of research training programmes in medical schools. Med Educ. 2011;45(7):748–755.
15. Chakraborti, CBID, Gleeson, E, Gunderson, W. Identifying barriers to successful research during medical school. Med Educ Dev. 2012;2(1):e2.
16. Does research make for better doctors? Lancet. 1993;342(8879):1063–1064.
17. Hebert, RS, Levine, RB, Smith, CG, Wright, SM. A systematic review of resident research curricula. Acad Med. 2003;78(1):61–68.
18. Hall, RP. Training physician–scientists: a call for a cultural shift in our approach. JID Innov. 2022;2(1):100093.
19. Rosenberg, LE. Young physician-scientists: internal medicine’s challenge. Ann Intern Med. 2000;133(10):831–832.
20. Laupland, KB, Edwards, F, Dhanani, J. Determinants of research productivity during postgraduate medical education: a structured review. BMC Med Educ. 2021;21(1):567.