
Disrupted Learning about Democracy: Instructor Strategies for Navigating Temporary Modality Shifts

Published online by Cambridge University Press:  11 January 2023

Joshua M. Jansa
Affiliation:
Oklahoma State University, USA
Eve M. Ringsmuth
Affiliation:
Oklahoma State University, USA

Abstract

Within-semester shifts in course modality in response to pandemics, weather, or accommodation for travel and health are increasingly common and can interrupt student learning. We tracked temporary modality changes across 10 sections of “Introduction to American Government” to examine the extent to which instructors have tools to help students successfully navigate such changes and mitigate learning loss. We find that students rated instructors’ handling of shifts well if they made course material engaging, communicated clearly, and effectively used technology. The analysis suggests that instructors can mitigate the impact of unplanned changes to modality on students’ learning when there are three or fewer shifts during a semester.

Type
Article
Creative Commons
Creative Commons License: CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of the American Political Science Association

The COVID-19 pandemic created unprecedented challenges for college-student learning. After initially pivoting to fully online learning in March 2020, course modalities shifted between and within semesters as policies and conditions changed. Changes from in-person to online meetings often were abrupt and unpredictable, requiring students to repeatedly relearn how to attend class. Within-semester shifts in modality in response to viral spread, natural disasters, or accommodation for travel or health absences are increasingly common. Although studies in the field during the Spring 2020 semester were able to examine the impact of the emergency shift online on student learning and satisfaction (e.g., Seitz and Rediske 2021), no study has examined the effect of repeated, temporary shifts on student learning about politics and government.

What effect do within-semester shifts in modality have on student learning and how can instructors best execute these shifts? We addressed this question by tracking 10 sections of “Introduction to American Government” at a large university during the Fall 2021 semester.[1] We surveyed students about their self-assessed growth in political knowledge as well as their assessment of instructors’ handling of modality shifts. Students thought instructors handled temporary modality shifts well if they made course material engaging, communicated clearly, and effectively used technology. These strategies increased students’ confidence in their understanding of course material in sections that experienced a few modality shifts. This study illuminates how the stress and uncertainty of shifting modalities can affect students’ political knowledge and be mitigated by tools within instructors’ control.

BACKGROUND

The extant scholarship on teaching and learning has investigated whether and how modality can affect student learning (Alpert, Couch, and Harmon 2016; Botsch and Botsch 2012; Clawson, Deen, and Oxley 2002; Daigle and Stuvland 2020; Glazier et al. 2020; Hamann, Pollock, and Wilson 2009; Pollock and Wilson 2002; Wilson, Pollock, and Hamann 2007; Xu and Jaggers 2014). These studies, however, treat modality as a fixed course structure. Studies outside of political science have leveraged the emergency shift online due to COVID-19, finding some evidence that students learned more after moving online (Seitz and Rediske 2021) but also that students were less confident in their learning (Prokes and Housel 2021), less satisfied with their experience (Kumalasari and Akmal 2021), and felt more stress and disengagement (Besser, Flett, and Ziegler-Hill 2022).

Unconsidered in political science and beyond are repeated modality changes throughout the semester. Although within-semester shifts in modality are a relatively new phenomenon for scholars of teaching and learning to study, they are likely here to stay. Internet-based technology, the proliferation of course modalities, and students’ and instructors’ experience with different modalities have created a new equilibrium in which courses can continue on schedule despite occasionally being unable to be conducted in person. However, these temporary modality shifts place unique demands on students and instructors that could come at a cost to student learning. The degree to which shifts induce uncertainty and stress and foster disengagement can hamper learning (e.g., Vogel and Schwabe 2016). The potential negative impact of shifts on learning may require instructors to adapt how they communicate with students and present course material to mitigate learning loss.


RESEARCH DESIGN

We address our research question by tracking the number of shifts in modality across 10 sections of “Introduction to American Government”—a required general education course at a large university—taught by five different instructors in Fall 2021 (Jansa and Ringsmuth 2022). In the event of a positive COVID-19 case among students enrolled in the course, instructors were allowed to (1) remain face-to-face with masks required; (2) switch to a hybrid format with masks required for the face-to-face portion; or (3) switch to a fully online (synchronous or asynchronous) format. Changes in modality were required to persist for two weeks, but additional positive COVID-19 cases could lead to an extension for an additional two weeks from the time the positive case was reported. We asked instructors to report their experiences with temporary modality shifts and then coded the number of shifts for each section (e.g., transitioning from face-to-face to online due to a positive COVID-19 case counted as one shift; moving back to face-to-face learning counted as a second shift). As table 1 illustrates, we observed some sections with no shifts in modality, whereas other sections experienced up to five during the semester. This number-of-shifts variable is used as a key independent variable in the following analyses.
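The shift-counting rule described above can be sketched in a few lines. This is a minimal illustration with made-up modality labels, not the study’s actual coding script:

```python
def count_shifts(timeline):
    """Count within-semester modality shifts: each change between
    consecutive modality states in a section counts as one shift."""
    return sum(prev != cur for prev, cur in zip(timeline, timeline[1:]))

# A section that goes face-to-face -> online (shift 1) -> back to
# face-to-face (shift 2), matching the coding example in the text.
section = ["f2f", "f2f", "online", "online", "f2f"]
count_shifts(section)  # -> 2
```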

Table 1 Variation in Modality Shifts by Section (Fall 2021)

Students were asked to participate in a two-wave survey to gauge changes in their self-assessed knowledge about American government. Wave 2 also measured students’ perceptions of how well instructors handled temporary modality shifts if their section experienced at least one shift during the semester. Additionally, we gathered data on students’ demographics and their academic background.[2] In all, 928 students completed wave 1, a response rate of 54.9%; 724 students completed wave 2, a response rate of 42.9%. A total of 587 students completed both waves, which allowed us to pair their responses and measure growth during the course of the semester.

STUDENTS THOUGHT INSTRUCTORS EFFECTIVELY HANDLED MODALITY SHIFTS

To understand student satisfaction with how instructors handled changing formats, we asked students to register their agreement, on a five-point scale ranging from “strongly disagree” (0) to “strongly agree” (4), with the following statement: “Overall, my instructor handled the temporary change(s) to delivering the course well.” The distribution in figure 1 is notably skewed: students resoundingly were pleased with how their instructors navigated the temporary shifts from face-to-face classes to online sessions (mean = 3.2; median = 4.0). The data suggest that it is possible for instructors to guide students through significant and unpredictable disruptions to one of the most fundamental aspects of a college course—that is, the format of content delivery—in a positive way. By this time—the second full semester during the pandemic—instructors had found ways to navigate frequent, unplanned shifts from face-to-face meetings to remote learning to the satisfaction of students.

Figure 1 Distribution of Instructor Handling of Temporary Modality Changes

CLARITY, EFFECTIVE TECHNOLOGY USE, AND ENGAGING MATERIAL MAKE SHIFTS NAVIGABLE

To understand what made instructors successful at handling modality shifts, we examine three potential contributing factors: whether students thought instructors (1) provided clear instructions; (2) used technology effectively; and (3) made course material engaging (all measured on five-point scales).[3] Figure 2 displays coefficients and 95% confidence intervals for a model in which students’ overall assessment of the instructors’ handling of temporary modality changes (five-point scale) is the dependent variable; the model includes instructor fixed effects to control for instructor idiosyncrasies.[4]

Figure 2 Instructor Handling of Temporary Modality Changes

Ordinary least squares regression with instructor fixed effects.

The results indicate that how instructors use technology to facilitate a temporary shift to online learning, communicate when executing modality changes, and present course material during shifts each relate significantly to students’ views on how well an instructor handled unplanned modality changes. Combined, these results suggest that under challenging circumstances—when available options and instructor autonomy may be restricted (e.g., by institutional policies or personal circumstances)—instructors retain tools to significantly mitigate the disruptiveness of changes in course modality.
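A minimal sketch of this kind of fixed-effects regression, using pandas and NumPy with entirely hypothetical data (the values below are illustrative, not the study’s; in practice a regression package that reports standard errors and confidence intervals would be used):

```python
import numpy as np
import pandas as pd

# Hypothetical student ratings (five-point scales, 0-4) nested within instructors
df = pd.DataFrame({
    "handling":   [4, 3, 4, 2, 3, 4, 1, 2],   # dependent variable
    "clarity":    [4, 3, 4, 1, 3, 4, 2, 2],
    "technology": [3, 3, 4, 2, 2, 4, 1, 1],
    "engaging":   [4, 2, 4, 1, 3, 3, 0, 2],
    "instructor": ["A", "A", "B", "B", "C", "C", "D", "D"],
})

# Instructor fixed effects enter as dummy variables (one instructor omitted
# as the reference category)
X = pd.get_dummies(df[["clarity", "technology", "engaging", "instructor"]],
                   columns=["instructor"], drop_first=True, dtype=float)
X.insert(0, "const", 1.0)

# Ordinary least squares fit via NumPy's least-squares solver
beta, *_ = np.linalg.lstsq(X.to_numpy(), df["handling"].to_numpy(dtype=float),
                           rcond=None)
coefs = dict(zip(X.columns, beta))
```

The point of the sketch is only to show how instructor fixed effects are encoded alongside the three predictors; the coefficient magnitudes on toy data carry no substantive meaning.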


A NEGATIVE IMPACT OF MODALITY SHIFTS ON STUDENT LEARNING

Although the general consensus among survey participants that instructors handled the modality shifts well is encouraging, we also are interested in whether these shifts influenced learning. We operationalize student learning as change in students’ self-assessment of political knowledge because it can be measured consistently across sections and instructors.[5] Moreover, it is positively associated with a key objective of American government and similar civics education courses: encouraging students to become active citizens by participating in politics (e.g., Lee and Matsuo 2018).

Students were asked to self-assess their political knowledge in five areas: (1) what distinguishes the two parties from one another; (2) how elections work; (3) what the Constitution says; (4) how laws are made; and (5) how power is divided among the three branches of government. Participants rated their confidence in understanding on a five-point scale ranging from “not confident at all” to “extremely confident.”

We calculated each student’s score as an additive index in which the maximum score was 20 (i.e., “extremely confident” on all questions) and the minimum score was 0 (i.e., “not confident at all” on all questions). Students’ responses to these five questions created a strongly reliable scale with Cronbach’s alphas of 0.83 for wave 1 and 0.86 for wave 2. We took each student’s wave 2 score and subtracted the wave 1 score to capture change in knowledge confidence. Overall, students exhibited an average increase of 2.38 points, or about 12% growth during the semester.
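The index construction and reliability check described above can be sketched as follows. The responses are hypothetical toy data (the paper reports alphas of 0.83 and 0.86 on the real responses), and Cronbach’s alpha is computed from its standard variance-ratio formula:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 4 students x 5 knowledge items, each scored 0-4
wave1 = np.array([[1, 2, 1, 2, 1],
                  [3, 3, 3, 3, 3],
                  [0, 1, 0, 1, 0],
                  [2, 2, 3, 2, 2]])
wave2 = wave1 + 1            # every student gains one point per item

index1 = wave1.sum(axis=1)   # additive index, range 0-20
index2 = wave2.sum(axis=1)
growth = index2 - index1     # change in knowledge confidence
```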

As a first step, we examine the bivariate relationship between shifts and growth. Mean growth in knowledge confidence is greater for students who experienced no modality shifts (3.33) than for those who experienced at least one (2.33), although the difference is only marginally significant (p = 0.1). This suggests a possible weak impact of modality shifts on student learning. Using chi-square tests, we also find that students who experienced more shifts were less likely to report feeling actively engaged and less likely to attend class (p < 0.01 for both). This reinforces the idea that within-semester shifts can negatively impact students by reducing interaction, attendance, and student learning.
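The two-group comparison above amounts to a difference-of-means test. A sketch with made-up growth scores, implementing Welch’s t-statistic directly so the example stays dependency-light:

```python
import numpy as np

def welch_t(a, b):
    """Welch's t-statistic for two independent samples with unequal variances."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return (a.mean() - b.mean()) / se

# Hypothetical knowledge-confidence growth scores
no_shifts   = np.array([4, 3, 5, 2, 4, 3])  # sections with zero modality shifts
some_shifts = np.array([2, 3, 1, 3, 2, 3])  # sections with at least one shift

t = welch_t(no_shifts, some_shifts)  # positive: the no-shift group grew more
```

In practice, `scipy.stats.ttest_ind(a, b, equal_var=False)` computes the same statistic along with a p-value.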

HANDLING TRANSITIONS WELL CAN MITIGATE LEARNING LOSS FOR A FEW SHIFTS

These initial results show a weak, negative impact of shifts on students’ learning and stronger negative impacts on students’ attendance and perceived opportunities for interaction. However, the manner in which professors provide instructions, use technology, and present course material can help students to navigate short-term modality shifts. Given these results and what we know from the extant literature, we expect that the impact of temporary modality shifts on student learning will be lessened when instructors handle these disruptions well.

To measure instructor handling of modality shifts, we create an additive index that combines whether an instructor provided clear instructions, effectively used technology, and made course material engaging with the overall instructor handling question. The index ranges from 0 (“strongly disagree” on all questions) to 16 (“strongly agree” on all questions) and has a mean of 12.2, which again reflects a generally positive assessment of how instructors handled temporary modality changes. These four questions create a strongly reliable scale (Cronbach’s alpha = 0.88).

We use this measure as a key independent variable in a series of hierarchical linear models that test the impact of the number of shifts and an instructor’s handling of them on learning. The models account for the nested nature of the data with instructor- and section-level random effects. Change in knowledge confidence serves as the dependent variable in each model. We use the number of shifts measure to capture modality changes. The results are displayed in table 2.

Table 2 Change in Confidence in Knowledge Hierarchical Linear Model

Note: Standard errors in parentheses; +p<0.1, *p<0.05, **p<0.01.

Model 1 examines only the number of shifts. As expected, the number of modality changes is negatively related to growth in students’ knowledge confidence. Each additional modality shift in a section corresponds to a 0.19-point decrease in students’ assessment of their knowledge, further suggesting that within-semester modality shifts can negatively impact student learning (p < 0.1).

Model 2 adds the instructor handling index as an independent variable, which excludes students who did not experience a modality shift (i.e., online-only sections).[6] We find in model 2 that a one-unit increase in students’ rating of an instructor’s handling of modality shifts corresponds to a 0.08-point increase in knowledge confidence during the semester. Furthermore, the negative effect of the number of shifts dissipates with the inclusion of the instructor handling index. The results suggest that skilled instructor handling of modality shifts can enhance student learning.

To examine whether instructor handling can mitigate the impact of modality shifts on student learning, we include an interaction between the number of shifts and the instructor handling index in model 3. We also control for students’ wave 1 knowledge confidence because students who start with higher levels of confidence have less room for growth (Meirick and Wackman 2004) and may overestimate their confidence before learning about politics (Rogers and Gooch 2021). In addition, we control for student-level factors (i.e., class standing, GPA, gender, racial/ethnic minority status, and parents’ education) that may have varied across types of sections and respondents and affected students’ self-assessment of learning.[7]

The results in model 3 suggest that instructors can take actions—particularly when there are three or fewer modality shifts in a semester—that shield students from problems related to unplanned changes to course modality. Figure 3 plots the conditional marginal effect of the instructor handling index on student learning across the number of modality shifts. We hold the instructor handling index at its mean value, which, given its distribution, represents a positive assessment of how instructors handled modality shifts. The x-axis displays the frequencies of modality changes present in the data. Figure 3 demonstrates that the marginal effect of good instructor handling on students’ knowledge confidence is strongest when the number of modality shifts is low (a 0.16 increase in knowledge confidence, p = 0.01, at two shifts) and decreases as the number of shifts increases (0.11, p = 0.01; 0.05, p = 0.09; and 0, not significant, at three, four, and five shifts, respectively). Although there are limits to instructors’ power, they can take steps to offset the potential negative consequences of a few unplanned modality shifts during a semester.
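The conditional marginal effect in figure 3 follows directly from the interaction term: with growth = b0 + b1·shifts + b2·handling + b3·(shifts × handling) + …, the effect of handling at a given number of shifts is b2 + b3·shifts. The coefficients below are illustrative values chosen only to reproduce the shape of the reported pattern; they are not the paper’s estimates:

```python
# Illustrative (not estimated) interaction-model coefficients
b2 = 0.266   # main effect of the instructor handling index
b3 = -0.053  # shifts x handling interaction

def marginal_effect_of_handling(shifts):
    """Conditional marginal effect of the handling index on knowledge-
    confidence growth at a given number of modality shifts."""
    return b2 + b3 * shifts

# The effect shrinks as shifts accumulate, approaching zero at five shifts
effects = {s: marginal_effect_of_handling(s) for s in range(2, 6)}
```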


Figure 3 Margins Plot for Shifts × Instructor Handling Index

The model also indicates that minority students reported less growth in knowledge confidence, reinforcing the idea that there are ongoing challenges to equity in civic education. Additionally, students with a higher GPA and higher knowledge confidence at the beginning of the semester reported less growth in knowledge confidence, which suggests that students in advantaged positions (i.e., a strong academic record and confidence in their political knowledge) may have less room to increase their understanding of politics.[8]

DISCUSSION AND CONCLUSION

Modality is one of the most fundamental aspects of a college course. Temporary changes in modality increased dramatically during the COVID-19 pandemic but are also used during extreme weather conditions and when instructors need to travel or meet personal obligations. When circumstances precipitate a modality shift, instructors have tools at their disposal to guide students through these disruptions in a positive way. We found that when students thought that their instructors provided clear instructions, effectively used technology, and made course material engaging, students gained more confidence in their understanding of course material despite experiencing a few modality shifts and being in relatively large sections in which they often feel less engaged.

Overall, the results are encouraging, but the level of disruption brought on by numerous modality shifts can be too much to overcome. Additional data on the formats, pedagogies, and practices used during shifts (e.g., synchronous versus asynchronous, online lectures versus discussions, and mode and frequency of communication) as well as instructors’ and students’ past experience with online courses would allow us to further probe how instructors[9] can mitigate the consequences of within-semester modality shifts. More observations (e.g., a larger range of class sizes) or an alternative measure of learning also could provide greater certainty about the negative impact of shifts on learning, as the effect we found with our indicator of learning is only tenuously significant.

The lessons in this study provide important insights for students, faculty, and administrators who are adjusting to a new normal. Sound pedagogy, clear communication, and effective technology can make a difference in students’ knowledge confidence. In a course such as “Introduction to American Government,” instructors are able to use their soft skills and expertise to foster learning about government and politics—an impressive and important feat in an especially disruptive and stressful time for education and democracy.

DATA AVAILABILITY STATEMENT

Research documentation and data that support the findings of this study are openly available at the PS: Political Science & Politics Harvard Dataverse at https://doi.org/10.7910/DVN/C61HF9.

Supplementary Materials

To view supplementary material for this article, please visit http://doi.org/10.1017/S1049096522001305.

CONFLICTS OF INTEREST

The authors declare that there are no ethical issues or conflicts of interest in this research.

Footnotes

1. Approval for this study was obtained from the Oklahoma State University Institutional Review Board (i.e., IRB-21-308 and IRB-21-312).

2. We emailed both waves directly to students via the Qualtrics platform and provided two reminder emails for each wave. The emails contained information about earning extra credit for completing the survey. We also provided guidance to instructors on how to communicate with students about the survey. Despite uniform recruitment procedures, we obtained differential response rates by section (see table 1).

3. Question wording for all items used in this study is available in the online appendix.

4. The results are robust to using section fixed effects and controlling for students’ attendance in the course, class standing, GPA, gender, racial/ethnic minority status, expected course grade, and parents’ education.

5. Although all sections used the same textbook, instructors had discretion to emphasize different topics and to assess grades differently, making it challenging to develop an unbiased test of objective knowledge. We could measure confidence in knowledge equitably across sections and instructors.

6. Students who experienced zero modality shifts are used as a comparison group in model 1 in table 2 as well as in the bivariate tests reported previously. Online-only students were not asked about instructor handling of within-semester modality shifts because these sections did not experience modality changes. Thus, online-only students are excluded from models that include the instructor handling index as an independent variable.

7. The results are also robust to including students’ attendance in the course as a control and to excluding wave 1 knowledge confidence.

8. Bayesian Information Criterion indicates that model 3 is the best-fitting model; the addition of the interaction term and control variables improves the model’s explanatory power without overfitting.

9. Future research also could pursue how instructor traits shape students’ perception of how shifts are handled.

REFERENCES

Alpert, William T., Couch, Kenneth A., and Harmon, Oskar R. 2016. “A Randomized Assessment of Online Learning.” American Economic Review 106 (5): 378–82.
Besser, Avi, Flett, Gordon L., and Ziegler-Hill, Virgil. 2022. “Adaptability to a Sudden Transition to Online Learning During the COVID-19 Pandemic: Understanding the Challenges for Students.” Scholarship of Teaching and Learning in Psychology 8 (2): 85–105. DOI:10.1037/stl0000198.
Botsch, Robert E., and Botsch, Carol S. 2012. “Audiences and Outcomes in Online and Traditional American Government Classes Revisited.” PS: Political Science & Politics 45 (3): 493–500.
Clawson, Rosalee A., Deen, Rebecca E., and Oxley, Zoe M. 2002. “Online Discussion Across Three Universities: Student Participation and Pedagogy.” PS: Political Science & Politics 35 (4): 713–18.
Daigle, Delton T., and Stuvland, Aaron. 2020. “Teaching Political Science Research Methods Across Delivery Modalities: Comparing Outcomes Between Face-to-Face and Distance-Hybrid Courses.” Journal of Political Science Education 17 (1): 1–23.
Glazier, Rebecca A., Hamann, Kerstin, Pollock, Philip H., and Wilson, Bruce M. 2020. “Age, Gender, and Student Success: Mixing Face-to-Face and Online Courses in Political Science.” Journal of Political Science Education 16 (2): 142–57.
Hamann, Kerstin, Pollock, Philip H., and Wilson, Bruce M. 2009. “Learning from ‘Listening’ to Peers in Online Political Science Classes.” Journal of Political Science Education 5 (1): 1–11.
Jansa, Joshua, and Ringsmuth, Eve. 2022. “Replication Data for ‘Disrupted Learning about Democracy: Instructor Strategies for Navigating Temporary Modality Shifts.’” PS: Political Science & Politics. DOI:10.7910/DVN/C61HF9.
Kumalasari, Dewi, and Akmal, Sari Zakiah. 2021. “Less Stress, More Satisfaction with Online Learning During the COVID-19 Pandemic: The Moderating Role of Academic Resilience.” Psychological Research on Urban Society 4 (1): Article 12.
Lee, Seonghui, and Matsuo, Akitaka. 2018. “Decomposing Political Knowledge: What Is Confidence in Knowledge and Why It Matters.” Electoral Studies 51: 1–13.
Meirick, Patrick, and Wackman, Daniel. 2004. “Kids Voting and Political Knowledge.” Social Science Quarterly 85 (5): 1161–77.
Pollock, Phillip H., and Wilson, Bruce M. 2002. “Evaluating the Impact of Internet Teaching: Preliminary Evidence from American National Government Classes.” PS: Political Science & Politics 35 (3): 561–66.
Prokes, Christopher, and Housel, Jacqueline. 2021. “Community College Student Perceptions of Remote Learning Shifts Due to COVID-19.” TechTrends 65: 576–88.
Rogers, Michael T., and Gooch, Donald M. 2021. “Overconfidence in Politics and Civic Education: Testing for the Dunning–Kruger Effect.” Paper presented at the Annual Meeting of the American Political Science Association, Washington, DC.
Seitz, Heather, and Rediske, Andrea. 2021. “Impact of COVID-19 Curricular Shifts on Learning Gains on the Microbiology for Health Sciences Concept Inventory.” Journal of Microbiology and Biology Education 22 (1): 22.1.73.
Vogel, Susanne, and Schwabe, Lars. 2016. “Learning and Memory Under Stress: Implications for the Classroom.” npj Science of Learning 1: 16011.
Wilson, Bruce M., Pollock, Philip H., and Hamann, Kerstin. 2007. “Does Active Learning Enhance Learner Outcomes? Evidence from Discussion Participation in Online Classes.” Journal of Political Science Education 3 (2): 131–42.
Xu, Di, and Jaggers, Shanna S. 2014. “Performance Gaps Between Online and Face-to-Face Courses: Differences Across Types of Students and Academic Subject Areas.” Journal of Higher Education 85 (5): 633–59.