
Audiences and Outcomes in Online and Traditional American Government Classes Revisited

Published online by Cambridge University Press:  12 June 2012

Robert E. Botsch, University of South Carolina Aiken
Carol S. Botsch, University of South Carolina Aiken

Abstract

In 1997 we began offering American government classes online as well as face-to-face. We administered pre- and posttests to our students to measure their general knowledge of American government, political attitudes, demographics, and some behaviors. Following an initial report in 2001, we continued to gather data for 10 more years; the current study covers nearly 3,200 students over 13 years. We examine the sample as a whole and changes over time in audiences and outcomes for the two teaching formats. Although the kinds of students taking online classes have become more similar to those taking face-to-face classes, a few differences persist. Differences in learning outcomes continue to be insignificant. Neither format has a clear advantage in changing students' attitudes, but the online classes did more to increase students' newspaper reading. Class dropout rates and faculty workload both favor face-to-face classes, but flexibility in scheduling and student demand clearly favor online classes.

Type
The Teacher
Copyright
Copyright © American Political Science Association 2012

When future historians study the impact of technology in the late twentieth and early twenty-first centuries, they will doubtless note the sea change in how courses were taught that came with the emergence of the Internet. In the late 1990s, we began to explore alternative formats for delivering our program's “bread and butter” class, American National Government. Our own motivation was more basic than trying out a new technology. At that time, our university, a regional campus of the state university with about 3,000 full- and part-time students, was an entirely commuter campus in a small town that drew its students mostly from the surrounding counties. Many were first-generation college students who shared their parents' disdain for government and politics and saw college mainly as a stepping stone to a good job. Because an American government course was not required for graduation for most students, we struggled to attract students. Thus, our web-delivered courses were born out of desperation as much as technological possibility.

After one of us received release time to develop an online class, we conducted a pilot project in the fall of 1997 with four students. The following spring (1998), we offered the University of South Carolina Aiken's first completely web-based class. We have offered one or more American government online classes every semester since then and developed several web-based upper-level political science courses. Of course, online courses, and even entirely “virtual” universities, are now common around the world.

Distance education has often been viewed with suspicion. Regardless of the method of course delivery, “… classroom instruction has been the standard to match.”Footnote 1 Although many of our colleagues initially thought web-based teaching would lack the academic rigor of the face-to-face format, our early research helped put those fears to rest. We began to administer pre- and posttests to our students to determine their general knowledge and political attitudes when they began and completed the class. We also gathered demographic data on the students. After several years, we, like many others who conducted research in the field, concluded that there was “no significant difference” in the factual knowledge gained between web-based and face-to-face classes (Anstine and Skidmore 2005; Botsch and Botsch 2000; Botsch and Botsch 2001; Dolan 2008). Web-based courses in a variety of disciplines are no longer unusual at USC Aiken and, during registration, are frequently among the first classes to close. Having concluded this 13-year quasi-experimental studyFootnote 2 in the spring of 2010 with a sample increased tenfold from the original study, we now reexamine our original conclusions and the changes that have occurred.

THE TWO TEACHING FORMATS

Because we are long-time colleagues who frequently consult each other, our classes, whether face-to-face or online, have a great deal in common, in large part because we planned the web-based class to closely resemble the face-to-face format. We used the same standard text until 2009, when we switched to a free, online text that one of us authored.Footnote 3 All students answer “reading mastery” questions based on the material in the text. To reinforce learning, students in both classes are required to access a web page and/or read a newspaper story on the topic of the week and write a short essay relating the material to concepts in the text. Both classes have opportunities to attend campus events and write reports for extra credit. Both classes have discussions, but discussions in the web classes are entirely asynchronous and virtual, with no real-time interaction among students and faculty. All class interaction takes place via the Internet, although students may contact their teachers individually in their offices, on the phone, or by e-mail. Students must respond within a specified time to a specific discussion question.Footnote 4

Other than the obvious difference that discussions in face-to-face classes are verbal, a significant difference is that meaningful participation in discussion counts toward the course grade in the web-based classes but is generally not graded in the face-to-face classes. To compensate for the greater emphasis on formal discussions in the online classes, in face-to-face classes we attempt to stimulate student interest in news reading with occasional extra-credit news quizzes and engage students through in-class exercises and activities that reinforce their readings. Testing procedures also differ. Face-to-face classes use mixed-format, closed-book tests: about 70% to 80% of each test consists of objective questions, plus one or two essay questions. All testing for web-based students is online, open-book, and essay-based, which is consistent with keeping the schedule highly flexible. Each test, in either class format, generally covers three or four chapters. The maximum enrollment is 20 students for the web-based classes and 25 for the face-to-face classes (plus a few overrides in compelling circumstances). Until a few years ago the maximum in web-based classes was 15, but enrollment was increased to improve productivity in tight budgetary times.

Despite these differences, the basic format for both web-based and face-to-face classes was well established during our many years of teaching American government before taking it online: close reading of the text with questions to answer, discussions based on short writing assignments, and current events assignments to illustrate concepts. This format has remained in place during the entire 13-year period of this study in both the web-based and face-to-face classes. The only variations were minor fluctuations in the precise number of news and web assignments from semester to semester and professor to professor, and the use of the general knowledge posttest as part of the final examination in a few selected classes for a few years.Footnote 5 In sum, the two types of classes have much in common despite their different delivery formats. The major differences between the web-based and face-to-face classes are the testing procedures and the more formalized and graded discussions in the online classes.Footnote 6

COMPARATIVE MEASUREMENTS

Through the spring of 2010, we compiled data on nearly 3,200 students: 2,525 who took the course in the traditional face-to-face format and 659 who took it online. Having such a large sample allows us to detect with statistical significance much smaller differences than those we reported more than a decade ago, when we had an N of 321 students: 105 online and 215 face-to-face (Botsch and Botsch 2001). The data include a wide range of information, including GPA, major, age, gender, ethnicity, parents' education, and perceived course difficulty. We also gathered information on knowledge, attitudes, and behaviors that might have changed from the beginning of the course to the end (including general political knowledge about American government, political interest, political efficacy, political trust, and daily general newspaper reading). Increases in knowledge,Footnote 7 interest, efficacy, and exposure to news were goals for a course that purports to improve the civic culture of citizens.

The act of measurement often affects the qualities being measured. Because this study was a long-term field experiment, we cannot be certain that we did not change the way we taught in response to the less-than-stellar posttest scores. However, as experienced professors reasonably confident that we were using tried-and-tested teaching techniques before the experiment began, we made no conscious, dramatic changes driven by test scores.

AUDIENCES

As we reported a decade ago (Botsch and Botsch 2001), we found and continue to find both similarities and differences (although shrinking) in the audiences for online and traditional classes. Students who enrolled in online classes during the 13 years of this study differ significantly from those who enrolled in traditional lecture classes in terms of age, GPA, gender, major, initial level of information about government and politics, daily newspaper reading, political efficacy, trust, and ethnicity, as shown in table 1.

Table 1 Comparisons of Traditional and Online American National Government Students, All (Spring 1998–Spring 2010) and Initial (Spring 1998–Fall 1999)

a E-mail the authors for a copy of the 59-item test with 63 possible correct answers.

b “Would you say that you follow what's going on in government and public affairs (4) most of the time, (3) some of the time, (2) only now and then, or (1) hardly at all?”

c “How much of the time do you think you can trust the government to do what is right? (1) none of the time, (2) only some of the time, (3) most of the time, or (4) just about always.”

d “Sometimes politics and government seem so complicated that a person like me can't really understand what's going on. Do you (1) strongly agree, (2) agree, (3) have mixed feelings, (4) disagree, or (5) strongly disagree?”

e “How many days in the past week did you read a daily newspaper?”

f “How would you compare the difficulty and workload of this course with others you have taken? (1) Easier (2) About the same (3) Or harder?”

On average, web-based students were 2.6 years older than students who enrolled in the face-to-face classes, down dramatically from the six-year difference we found a decade ago, as shown in table 1. Tracked over time, the average age of students taking the online classes steadily declined during the 13 years of this study (by 10.4 years). The average age of students in the face-to-face classes also declined, but that decline, although statistically significant, was small (1.4 years). Some of this overall decline is explained by increased enrollment of younger traditional students at the university. Although online students are typically older than face-to-face students (Tallent-Runnels et al. 2006), the age mix of students in our web-based classes clearly became much closer to that of students in the face-to-face classes. For some time, the scheduling flexibility and convenience of online classes have attracted older, nontraditional students (Anstine and Skidmore 2005; Garson 1998; Harrington and Loffredo 2010; Lei and Gupta 2010). Now, traditional students also seem to place a higher value on scheduling flexibility. This flexibility is especially important as rising tuition costs mean that more traditional students must work to pay for school. Like others (Kreb 2009), we found that student-athletes like online classes because these courses fit easily around tight practice and game schedules.

Students taking the online version of the class had a slightly but significantly higher GPA than students in the face-to-face class. Part of this difference is related to age, as age is a significant predictor of GPA (p = 0.000). However, it also may be related to the kinds of students who preregister and enroll in high-demand classes that fill quickly and close. Students with high GPAs are probably better planners and less likely to procrastinate during the registration process.

Women have comprised almost exactly two-thirds of the students on our campus throughout the 13 years of this study (Dawe 2011; Herrin 1999). For the first three years of this study, women were significantly less likely to enroll in the web-based classes than in the face-to-face classes (an average of nearly 10 percentage points lower). Yet during 2000–01 the percentages flipped. Since then, women consistently have been more likely to enroll in online classes (an average of nearly 12 percentage points higher).

Although the gender change in 2000–01 may have been a statistical artifact,Footnote 8 we offer an explanation for the years that followed. This explanation has implications beyond gender. In 2001–02, about one-third of the way through our study, a change in general education requirements across campus had a dramatic effect. All students were required to complete an American Political Institutions requirement that could be satisfied by taking either the American government course or an American history course. Prior to this, only education and political science majors were required to take American government, and relatively few students from other majors took the course. We continued to see relatively few nonsocial science or education majors in the face-to-face classes until this requirement changed. The online classes, however, were drawing a different mix even before the 2001–02 change in general education. They drew especially well among science majors, who made up nearly the same percentage of the web-based classes as social science majors (11%), more than double their share of the face-to-face classes (5%). After the requirement change, the mix of majors taking all American government classes began to better reflect the overall mix of majors across campus, and the number of students in American government classes increased dramatically, jumping by more than 65% the first year. Nursing, a major of primarily female students, contributed significantly to this gender shift. Nursing students increased from 6% to 18% of all online students after the requirement change. We note that web-based classes were drawing relatively more nursing students than face-to-face classes even prior to the general education requirement change (6% and 2%, respectively). Nursing students face significant scheduling difficulties with their many required nursing classes, so the convenience of online classes has great appeal.

The gender shift was not entirely explained by changes in the composition of majors taking the web-based classes. We examined the percentage of women in web-based classes over time for each major. Although the percentages fluctuated because of low Ns for some years in some majors, the overall trend for each major was a higher percentage of women in the web-based classes during the 13-year period. This trend suggests two other possible explanatory factors: the erasing of the gender digital divide and the increasing importance of convenience for women. In the 1990s, men were more likely than women to be Internet users, but that is no longer true (“The Internet” 1997; Krantz 2000; Stoughton and Walker 1999).Footnote 9 Women have become relatively more comfortable with taking online classes, so any differences between men and women should have been expected to disappear. Yet women did more than erase the difference. Perhaps the increasing pressures on all college students are felt disproportionately by women, some of whom are single mothers and all of whom live in a traditional culture in which women are expected to help out more at home.

While the composition by academic major of our American government students now better reflects the overall student body in both the web-based and face-to-face classes, academic major still makes a slight, but significant, difference. Students in scientific fields, business, and nursing were more likely, by a few percentage points, to take the course online. Students who were undecided about a major, in the humanities/fine arts, or in the social sciences were slightly less likely to take the course online. Education majors matched the overall average. As noted earlier, one of our original motivations for creating the online version of the American government course was to attract students from disciplines far removed from political science. Our data suggest that this benefit still exists, although it is smaller than it was a decade ago, largely because students are now more comfortable using the web. The remaining differences are likely to grow smaller as time passes.

Studies conducted in past years have confirmed the existence of a racial “digital divide” in addition to a well-documented income gap (“Digital Divide” 2000; Gladieux and Swail 1999; “Income Gap” 2007; “Survey Shows” 2000). Income differences are positively associated with Internet access (Fairlie 2003; Martin and Robinson 2007). Pew Center researchers found, however, that by 2010, minorities had become as likely as whites to own laptops, although whites were still more likely to have broadband access at home. Changes in technology and the advent of a range of frequently cheaper and portable devices also have given minorities increased Internet access (Washington 2011).Footnote 10 Indeed, in recent years we have seen many of our students, including African Americans, answering Blackboard reading mastery questions with their smartphones.

These trends are reflected in our data. Most of the nonwhites on our campus are African Americans. The percentage has remained relatively constant, increasing slightly from 23% in 1999 (Herrin) to 27% in 2010–11 (Dawe). Over the 13 years of the study, significantly fewer minority students enrolled in the online classes than in the traditional classes (19% and 30% respectively), as shown in table 1.

However, if we track the ethnic composition year by year, the differences have dissipated. The biggest break point was 2006–07, when the percentage of African American students increased about eight percentage points. Since then, the percentage of African Americans has remained relatively stable. From 1997–98 through 2005–06, the percentage of African Americans in online classes averaged 16%, and between 2006–07 and 2009–10, the average was 23% (a statistically significant increase, p = .05). Although most of the difference is gone, some remains. Our hopes to attract relatively more minorities to the online classes have been only partially fulfilled. We suspect that this remaining difference is largely due to the same factors national studies have identified. Our nonwhite students come from homes in which parents have significantly lower levels of education than the parents of white students (p = 0.000). Those homes are less likely to have had computers or broadband access.

A 1997 survey of our university's students found that their knowledge of American government was about the same as or below that of the general public (Botsch 1998/99). Similarly, most students entering our classes knew little about American government and politics. In general, Americans know little about the Constitution and their system of government (“Americans' Awareness of First Amendment Freedoms” 2006; Delli Carpini and Keeter 1996; Gallup 2003; “Knowing It by Heart” 2002; “Our Fading Heritage” 2008; “We the People” 1997). Over the years, studies conducted by survey research organizations consistently find that Americans have low levels of political knowledge, that many are unable to identify important political figures, and that many do not closely follow current events.Footnote 11 In this abyss of ignorance, web-based students, however, proved to be relatively better informed than students in traditional classes. A decade ago web-based students scored about 7 points higher on the general knowledge pretest than face-to-face students, and that difference remains about the same today. Over the entire period, out of a possible score of 63, the average score was 11.6 for the face-to-face classes and 18.9 for the web-based classes, a large and significant difference (p = 0.000). We must add that these are depressingly low scores for students in both kinds of classes. We face great challenges as teachers.

Together, the factors we measured (GPA, gender, ethnicity, political interest, political efficacy, daily newspaper reading, and age) explained 27% of the total variation in pretest scores for all students. Students who chose the web-based classes differed significantly on many of these factors in ways that help explain their relatively higher scores, as shown in table 1. Web-based students had significantly higher GPAs, were more likely to be white, read newspapers more frequently, and were a little older, as previously discussed. On political interest, no difference existed between online and traditional students. Gender and political efficacy played no significant role in the combined model, although both genderFootnote 12 and efficacy, by themselves, were strongly associated with low scores.Footnote 13
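Stated formally, the analysis described above corresponds to an ordinary least squares model of roughly the following form. This is only a sketch: the text reports the combined explained variance but not the individual coefficients, and the coding of the categorical predictors is our assumption.

\[
\text{Pretest}_i = \beta_0 + \beta_1\,\text{GPA}_i + \beta_2\,\text{Female}_i + \beta_3\,\text{White}_i + \beta_4\,\text{Interest}_i + \beta_5\,\text{Efficacy}_i + \beta_6\,\text{NewsDays}_i + \beta_7\,\text{Age}_i + \varepsilon_i, \qquad R^2 \approx .27
\]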

OUTCOMES

Format did have an impact on changes in attitudes and behaviors. Most of the changes, although significant, were relatively small and did not follow any strong pattern favoring one format more than the other, as shown in table 1. As we explore these changes, we isolate the impact of delivery format by accounting for differences in the students who were taking the two class formats.

We begin with factual knowledge. When online teaching began, educators asked two questions: first, whether it would be comparable to traditional teaching, and second, whether it would be better. With more than a decade of experience and collected data measuring gains in factual knowledge, our answers are “yes” to the first question and a qualified “no” to the second. Students in both groups improved their scores from the pretest to the posttest in a near statistical tie, as shown in table 1. That students in the online classes had higher GPAs made little difference.Footnote 14 We are doing just as well, or just as poorly, with both groups in the transmission of knowledge.Footnote 15

Improvements in critical thinking are harder to measure. Online students complete relatively more assignments requiring critical thinking. If we could measure critical thinking improvements, we suspect that high-GPA students in online classes would have more improvement here than high-GPA face-to-face students, because grades in the online classes require more analytical thinking.Footnote 16

Many studies chronicle decreases in political interest among AmericansFootnote 17 (Bennet 1997; Norris 2004). As teachers, we aim to increase political interest. Political interest increased significantly in both types of classes, but it increased slightly more in the web-based classes than in the face-to-face classes, as shown in table 1.

Like political interest, political trust has also declined in the United States. Many people mark the beginning of the decline with the Vietnam War and the events surrounding Watergate (“The Trust in Government” 2008). The long, costly wars in Afghanistan and Iraq, a seemingly never-ending series of scandals involving political officeholders behaving very badly, an economic crisis that necessitated a costly and highly unpopular financial bailout, and ballooning deficits and cuts in education and many services have not improved attitudes. All our students began the course with low trust, with the younger and less-knowledgeable face-to-face students having slightly less. Students' political trust improved in both classes, with face-to-face students increasing slightly more, so that they caught up with the online students, as shown in table 1. Knowing more about how government operates seems to increase political trust. We note that, ultimately, trust rests more on policy successes than on anything professors can do.

For the last six decades, between 59% and 71% of American citizens have agreed that politics is “too complicated to understand” (The ANES Guide 2008; “Politics Is Too Complicated” 1999). Although our web-based students scored lower, all our students scored low on political efficacy as they entered the classes. Web-based students improved relatively more, so that both groups ended up with about the same level of confidence, as shown in table 1. We note that while we met our goal of improving efficacy, coming close to only having “mixed feelings” is less than impressive.

For a long time, Americans have paid little attention to current events.Footnote 18 Recently, we have seen little improvement.Footnote 19 Of course, one of the best ways to keep up with current events in any detail is to read a newspaper every day. We asked students how many days during the past week they had read a newspaper. On average, students in both types of classes read the newspaper about two days a week when they entered the course, as shown in table 1. This number is down from what we observed a decade ago, which is not surprising given the long-term decline of traditional newspapers. Web-based students read a paper slightly more often. Both groups increased their reading by nearly a day a week by the end of the semester, with web-based students increasing significantly more, expanding the advantage they enjoyed at the beginning of the semester, as shown in table 1. Most likely, these gains are attributable to the nearly weekly news assignments we give students in both classes. Whether the gains will last is the critical question.

Web-based students' final grades were five points higher than those of face-to-face students, as shown in table 1. Most of this difference is due to online students being somewhat stronger students who entered the class with higher GPAs. Some of the grade difference resulted from different grading criteria in the two formats. Performance in the web-based classes is measured relatively more by essay questions and completion of written assignments, including discussions. Consequently, effort counts for more in the online classes, so students who make more effort are rewarded relatively more.Footnote 20

However, despite the higher grades, the online class is not easier, and students perceived the difference. This is no accident: we designed the online class to be rigorous and academically demanding. A question on the posttest asked students to rate the difficulty of the course compared to other classes they had taken. Students in both formats rated their classes as harder than other classes (a rating of 2.0 on the 1-to-3 scale would have indicated the same level of difficulty), as shown in table 1. But relative to the face-to-face students, web-based students felt their class was even harder.

Yet, despite the fact that students perceive the online class to be relatively more difficult, demand for this class remains high and is growing. For most of the time covered by this study, we offered only a single American government web-based class each semester. Now, we regularly offer two web-based sections. The web-based classes fill faster and regularly have waiting lists. In response to this demand, we now teach an increasing variety of other classes in an asynchronous online format as well as in the usual face-to-face formats. This experience suggests that students may value scheduling convenience more than they fear demanding courses.

When this study began in 1997–98, we saw a class dropout rate of around 25%, about average for this kind of course according to the literature at the time (Merisotis 1999). More recent research suggests dropout rates 15% to 20% higher than in “traditional” classes (Parry 2010).Footnote 21 Over the next few years, however, our dropout rates fell quickly: when we published our first study, we saw no difference, as shown in table 1. For the first few years, students who were unwilling or unable to face a new format seemed to avoid the web-based classes. However, as more traditional students chose the online format and became more comfortable with computers, a better cross section of students chose web-based classes—including many who lacked the necessary self-discipline. Dropout rates for web-based classes are now significantly higher than for face-to-face classes, as shown in table 1. What may have seemed like an easy option—never having to go to class—turned out to be more difficult.

Another kind of outcome is the impact on professors' workloads. Much research indicates that web-based teaching takes more time than face-to-face teaching (Berdichevsky 1999; Bradshaw and Weston 1999; Clark-Ibanez and Scott 2008; Maguire 2005; Tallent-Runnels et al. 2006). This has certainly been true for us, even with technological changes that save significant grading time, like the Blackboard reading mastery tests we use now. Although we save time by not having to grade these routine objective quizzes, we also have to create them and compose comments on right and wrong answers that explain and clarify. Any time saved here is easily used up in making more detailed responses in online class discussions. Reflecting on comments made by 20 to 25 students about the material, commenting on current news articles that presumably illustrate concepts in the chapters, and grading essay exams are all time-consuming. Of course, much of this also happens in our face-to-face classes, where we also save some time by using Blackboard for the reading mastery tests, which gives us a little extra time for commenting on the assignments students turn in. The biggest differences are that (1) in the web-based classes the exams are all essay and open book, and consequently longer than the in-class essays that face-to-face students write, and (2) web discussions are all typed and often require individual responses to each student, whereas in face-to-face classes not all students speak and our verbal responses to those who do take little time. The only way to save significant time would be to reduce the online discussions or to move from essay exams to objective timed tests on Blackboard, which in our view would reduce the quality of the course.

Although we both enjoy web-based teaching, we would not like to do it exclusively, because we also enjoy the physical and social dynamics and spontaneity that take place in the classroom. Moreover, some exercises and skills, such as simulations or group decision-making or research and statistical exercises that are necessary in research methods classes, do not lend themselves as well to an online format.

CONCLUSIONS

We draw several conclusions about differences between face-to-face and web-based classes from our 13-year experience. First, the audiences for the classes have become far more similar than they were when we started offering classes online. Still, some lingering differences remain. Web-based students are a bit older, have higher GPAs, are more knowledgeable about the subject matter, read newspapers more, are more likely to be women, are slightly more likely to be in majors that rely more on computers, and are slightly less likely to be African American. The driving forces behind these remaining differences are convenience and flexibility in scheduling, comfort in using computers, and being future-oriented enough to preregister for these popular classes.

Second, and perhaps most important, is what we did not find. We found no significant difference in factual knowledge gained between the two delivery formats. Moreover, differences in changes in political interest, trust, and efficacy, although statistically significant, were slight and had no clear pattern. Carefully constructed web-based classes in American government can be as effective as traditional lecture/discussion classes in nurturing an interested, trusting, confident, and knowledgeable citizenry.

Third, some differences were larger and seemed more important to us. Gains in newspaper reading were significantly greater among web-based students, although we wonder whether the gains will last. Web-based students earned significantly higher grades, although that may be explained mostly by their stronger student skills. Yet despite the higher grades, web-based students perceived their classes to be relatively more difficult than face-to-face students did, because of the greater writing workload in the online format. Dropout rates were also significantly higher for the web-based classes. Although we did not keep hard data on demand, we perceive demand to be higher for the web-based classes, which almost always close out first and have a much better chance of making the minimum enrollment necessary in the summer semester.Footnote 22 Finally, we have found our own workload to be greater in web-based classes than in face-to-face classes, at least as we teach them. Given these tradeoffs, our own preference is to teach one web-based class a semester, although two can work well if we want more flexible schedules and want to reduce the pressure to teach at odd hours or in remote locations in an environment of limited classroom space. Given the workload, three web-based classes would be too demanding. Reducing the workload by relying more on standardized testing and less on written essays and give-and-take in online discussions would, in our view, sacrifice too much quality. Moreover, if we taught most or all of our courses online, we would miss the interpersonal interactions that take place in face-to-face classes, some of which are certainly an important part of learning essential political skills.

Online classes are now rather standard fare on most college campuses and are more accepted and more representative of all students. Here, we have shown that carefully constructed web-based classes can produce similar outcomes to traditional face-to-face classes. Yet some important differences remain in audiences and outcomes. Some of these remaining differences may also dissipate over time as all students become comfortable taking classes online and as face-to-face classes use more online resources to become more like web-based classes. We can still exploit some of these differences to attract more and slightly different students, add flexibility to our own schedules, create a better fit with the busy lives of students, and provide a greater diversity of course experiences that better match a world in which more and more human interaction is online.

Footnotes

1 Distance education courses, broadly defined, include courses offered via correspondence, television, and videotape. See Tallent-Runnels et al. (2006) and Larreamendy-Joerns and Leinhardt (2006) for historical reviews of distance education.

2 The quasi-experimental design of this study was discussed in our first published report. See Botsch and Botsch (2001, 135).

3 The courses, including syllabi and assignments as well as a link to the online text we both use, are available at the following URL: http://web.usca.edu/polisci/course-links.dot.

4 We have used several different formats for this over the years, primarily an e-mail listserv, and more recently, a blog or the Blackboard discussion forum. Everyone in the web-based class can see and read everyone else's comments, and can reply to those comments, although not many students do.

5 For a few years with selected classes we did count the posttest as part of the final exam. Not surprisingly, students in both formats studied for the posttest and dramatically improved their scores. Given the difficulty of scheduling web-based students to come in and take the posttest in a supervised setting, this little experiment within the larger field study did not last long. We excluded these classes from the analysis on knowledge gains.

6 We note, however, that our face-to-face classes are far from what is often described as “blended” or “hybrid” classes (“How Blended” n.d.), although students use web tools such as Blackboard and use the web for reading newspapers and for looking at assignments and syllabi. For example, most of the students in the face-to-face classes are now sending in their assignments via e-mail, as do web students, and often from a smartphone. The face-to-face classes have become almost paperless.

7 Increases in knowledge were measured by administering a pre- and posttest with 59 standard questions about American government. The scores on this portion of the test ranged from 0 to 63: students could list up to five correct answers for the question on First Amendment rights, each of which was counted separately, so the 58 remaining one-point questions plus that question yield a maximum of 63.

8 In 2000–01 the number of men taking the course via the web dropped for some unknown reason, while the number of women taking the course in each format did not change much.

9 As noted by Pew Center researchers (Taylor and Keeter 2009), “millennials” of both genders are, by their own assessment, the most technologically competent of all generations.

10 See Washington (2011) and Pew Research Center, “Americans Spending More Time” (2010a) for further discussion of ethnic/racial differences in use of the Internet and electronic devices.

11 See, for example, Kohut, Morin, and Keeter (2007); Keeter and Suls (2007); and Pew Research Center, “Well Known: Twitter; Little Known: John Roberts” (2010e). A survey conducted several weeks after the November 2010 midterm elections found, for example, that fewer than half of those surveyed knew that the Republicans had won a majority in only the House of Representatives, although 75% knew that the Republicans had done better than the Democrats in the midterm elections (Pew Research Center 2010c, “Public Knows”).

12 Ford (2002) found a gap in knowledge as well as in political interest between men and women.

13 Female students scored significantly lower on the pretest than male students (11.8 and 16.3, respectively), and as already discussed, women were relatively more likely than men to choose the web classes for the last 10 years of the study. Web students had significantly lower political efficacy (p = 0.001) when they entered the course (in part because they were more likely to be female), and political efficacy was related to higher knowledge scores (p = 0.000). Yet, despite having more female students and more students with low efficacy, the web-based classes scored higher on the pretest.

14 In their study of the relationship between taking American government classes and knowledge gain, Champney and Edelman (2010) found that students with low and high GPAs had similar patterns of improvement in background knowledge, but students with higher GPAs gained relatively more knowledge of current events. They note that other studies indicate that higher GPAs are associated with greater gains in students' background knowledge. Our knowledge test mixed background knowledge and current events, although most of the questions would probably be considered background knowledge. We did find that GPA was significantly associated with knowledge gains, but the GPA difference between the web-based and face-to-face classes was not enough to make a significant difference in knowledge improvement between the classes.

15 As noted earlier, for a few years some students in both types of classes had the posttest count as part of their final exam. Not surprisingly, their posttest scores, and consequently their improvement, increased dramatically. They are not included in this analysis of improved scores. But when we look at these students alone, the differences in improvement between formats were also not statistically significant.

16 See also Carr (2000) for some interesting comparisons in a study of web-based and face-to-face classes.

17 However, Gallup found a gradual increase in the percentage of Americans following national news during the 2001–09 period (with the expected jumps during presidential election periods and a high of 43% in 2008). See Saad (2009). In an analysis of public interest and press coverage in 2010, however, Pew Center researchers found that on many issues the public had little interest in stories about politics or Washington, DC, unless the story might have some personal impact, like health care reform. See Pew Research Center, “Press Coverage and Public Interest” (2010b). By 2010, however, Americans' news consumption had returned to levels not seen since the 1990s, although news consumption on television had remained stable and the well-documented declines in news consumption from radio and print newspapers had continued. Increases in online news consumption, including through social networks, podcasts, and cell phones, accounted for the difference. See Pew Research Center, “Americans Spending More Time Following the News” (2010a). Whether this will positively affect political interest is difficult to say at this juncture.

18 A 2000 Pew Research Center report listed the most closely followed stories of the previous 15 years. Only on 36 of the more than 600 stories listed did more than half say they followed that story “very closely” (Pew Research Center 2000).

19 In 2010, stories about two natural disasters, the earthquake in Haiti and the Gulf oil spill, led the list of those that captured the public's attention (60% and 59%, respectively), while stories about the economy, not surprisingly, generated a great deal of public interest throughout the year. Only 4 of the 15 top stories captured the interest of more than 50% of those queried. See Pew Research Center (2010d).

20 A multiple regression to explain course grade showed GPA to be by far the most important factor (Standardized Beta = .51). Course format was a distant second (.13), but format was not much more important than other factors that were also significant at less than the .01 level: age (.11), professor (.09), and political interest and ethnicity (.07 each). Together these variables explained 37% of the total variation in final grades.
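Written in standardized form, the regression reported in this footnote corresponds roughly to the following equation. This is only a sketch based on the coefficients given above; we assume that course format, professor, and ethnicity enter as dummy variables, which the footnote does not specify.

\[
z_{\text{grade}} \approx .51\,z_{\text{GPA}} + .13\,\text{Format} + .11\,z_{\text{Age}} + .09\,\text{Professor} + .07\,z_{\text{Interest}} + .07\,\text{Ethnicity}, \qquad R^2 = .37
\]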

21 See Diaz (2002) for a discussion of factors that affect online dropout rates; it suggests that the picture is far more complicated than simply measuring completion rates to determine success.

22 The American government web-based classes draw more students in summer classes than either face-to-face American government classes or the American history classes (which are all face-to-face format), either of which students may choose to satisfy the general education requirement put in place in 2001–02. In the summer of 2008 the face-to-face American government class was almost cancelled for low enrollment while the web-based class was oversubscribed. Since then we rarely offer face-to-face American government in the summer. In the summers of 2008, 2009, and 2010, enrollment in the web-based American government classes was at capacity while enrollment in the competing American history classes ran below full enrollment.

References

Achen, Christopher H. 1986. The Statistical Analysis of Quasi-Experiments. Berkeley: University of California Press.
“Americans' Awareness of First Amendment Freedoms.” 2006. McCormick Tribune Freedom Museum. http://www.mccormickfoundation.org/mccormickmuseum/pdf/Survey_Results_Report.pdf.
ANES Guide to Public Opinion and Electoral Behavior. 2008. “Politics Is Too Complicated 1952–2008.” The American National Election Studies. http://www.electionstudies.org/nesguide/text/tab5b_1.txt.
Anstine, Jeff, and Skidmore, Mark. 2005. “A Small Sample Study of Traditional and Online Courses with Sample Selection Adjustment.” Journal of Economic Education 36 (2): 107–27.
Bennet, Stephen E. 1997. “Why Young Americans Hate Politics and What We Should Do About It.” PS: Political Science and Politics 30 (1): 47–52.
Berdichevsky, Cristina. 1999. “Teaching in Cyberspace.” Footnotes 20 (Fall): 1.
Botsch, Carol S., and Botsch, Robert E. 2001. “Audiences and Outcomes in Online and Traditional American Government Classes.” PS: Political Science and Politics 34 (1): 135–47.
Botsch, Carol S., and Botsch, Robert E. 2000. “Gaining Faculty Acceptance for Online Courses at a Traditional College.” The Technology Source (July–August). http://horizon.unc.edu/TS/cases/2000-07.asp.
Botsch, Robert E. 1998/99. “Political Attitudes and Knowledge of USC Aiken Students: Are We Nurturing Healthy Citizens?” Social and Behavioral Sciences Journal 19: 31–41. http://www.usca.sc.edu/polisci/sbsjournal/volxvix/botsch.htm.
Bradshaw, Lynn, and Weston, Laurie. 1999. “Distance Learning in East Carolina University's Educational Leadership Program.” The Technology Source (July/August). http://horizon.unc.edu/TS/cases/1999-07.asp.
Carr, Sarah. 2000. “Online Psychology Instruction Is Effective, but Not Satisfying, Study Finds.” The Chronicle of Higher Education, March 10, A48.
Champney, Leonard, and Edelman, Paul. 2010. “Assessing Student Learning Outcomes in United States Government Courses.” PS: Political Science and Politics 43 (1): 127–31.
Clark-Ibanez, Marisol, and Scott, Linda. 2008. “Learning to Teach Online.” Teaching Sociology 36 (1): 34–41.
Dawe, Lloyd. 2011. USC Aiken Institutional Research. E-mails, February 10 and 15.
Delli Carpini, Michael X., and Keeter, Scott. 1996. What Americans Know about Politics and Why It Matters. New Haven, CT: Yale University Press.
Diaz, David P. 2002. “Online Drop Rates Revisited.” The Technology Source. http://technologysource.org/article/online_drop_rates_revisited/.
“Digital Divide.” 2000. All Things Considered, January 28. Radio broadcast. www.npr.org/programs/asc/archives.htm.
Dolan, Kathleen. 2008. “Comparing Modes of Instruction: The Relative Efficacy of On-Line and In-Person Teaching for Student Learning.” PS: Political Science and Politics 41 (2): 387–91.
Fairlie, Robert W. 2003. “Is There a Digital Divide? Ethnic and Racial Differences in Access to Technology and Possible Explanations.” University of California, Latino Policy Institute and California Policy Research Center. http://www2.ucsc.edu/cjtc/docs/r_techreport5.pdf.
Ford, Lynne. 2002. Women and Politics: The Pursuit of Equality. Boston: Houghton Mifflin.
Gallup, George H. Jr. 2003. “How Many Americans Know US History? Part I.” Gallup, October 21. http://www.gallup.com/poll/9526/How-Many-Americans-Know-US-History-Part.aspx.
Garson, G. David. 1998. “Evaluating Implementation of Web-Based Teaching in Political Science.” PS: Political Science and Politics 31 (3): 585–90.
Gladieux, Lawrence E., and Swail, Watson Scott. 1999. “The Internet: New Engine of Inequality?” On the Horizon 7 (July/August): 89.
Harrington, Rick, and Loffredo, Donald A. 2010. “MBTI Personality Type and Other Factors That Relate to Preference for Online versus Face-to-Face Instruction.” Internet and Higher Education 13: 89–95.
Herrin, Jody. 1999. USC Aiken Institutional Research. Telephone interview, February 10.
“How Blended (Hybrid) Classes Work.” N.d. Grossmont-Cuyamaca Community College District. http://www.gcccd.edu/online/blended_classes.htm.
“Income Gap between Blacks, Whites Expands.” 2007. National Public Radio, November 13. http://www.npr.org/templates/story/story.php?storyId=16257374.
“The Internet Got Bigger in 1997, But Not Always Better.” 1997. The State, December 28, D7.
Keeter, Scott, and Suls, Robert. 2007. “Political Knowledge Update—Most of the Public Is Familiar with Key Political and Iraq Facts.” Pew Research Center, September 24. http://pewresearch.org/pubs/political-knowledge-update.
“Knowing It by Heart: Americans Consider the Constitution and Its Meaning.” 2002. National Constitution Center. http://ratify.constitutioncenter.org/CitizenAction/CivicResearchResults/asset_upload_file173_2678.pdf.
Kohut, Andrew, Morin, Richard, and Keeter, Scott. 2007. “What Americans Know: 1989–2007. Public Knowledge of Current Affairs Little Changed by News and Information Revolutions.” Pew Research Center, April 15. http://people-press.org/report/319/public-knowledge-of-current-affairs-little-changed-by-news-and-information-revolutions.
Krantz, Michael. 2000. “The Great Online Makeover.” Time, January 31, 64–65.
Kreb, Sigrid D. 2009. “Innovations in Higher Ed—Course Delivery Options for Student Athletes.” VAHPERD Journal 30 (1) (Spring). http://www.freepatentsonline.com/article/VAHPERD-Journal/20668941.html.
Larreamendy-Joerns, Jorge, and Leinhardt, Gaea. 2006. “Going the Distance with Online Education.” Review of Educational Research 76 (4): 567–605.
Lei, Simon A., and Gupta, Rajeev K. 2010. “College Distance Education Courses: Evaluating Benefits and Costs from Institutional, Faculty and Students' Perspectives.” Education 130 (4): 616–31.
Maguire, Loreal L. 2005. “Literature Review: Faculty Participation in Online Distance Education: Barriers and Motivators.” Online Journal of Distance Learning Administration VIII (1) (Spring). http://www.westga.edu/~distance/ojdla/spring81/maguire81.htm.
Martin, Steven P., and Robinson, John P. 2007. “The Income Digital Divide: Trends and Predictions for Levels of Internet Use.” Social Problems 54 (1): 1–22.
Merisotis, Jamie P. 1999. “The ‘What's-the-Difference?’ Debate.” Academe 85 (September–October): 47–51.
Norris, Pippa. 2004. “Do Campaigns Matter for Civic Engagement? American Elections from Eisenhower to Bush.” http://www.hks.harvard.edu/fs/pnorris/Acrobat/Farrell&Schmitt-Beck%20Chapter%209.pdf.
“Our Fading Heritage.” 2008. Intercollegiate Studies Institute American Civic Literacy Program. http://www.americancivicliteracy.org/2008/major_findings_finding1.html.
Parry, Marc. 2010. “Preventing Online Dropouts: Does Anything Work?” The Chronicle of Higher Education, September 22. http://chronicle.com/blogPost/blogPost-content/27108/.
Pew Research Center. 2000. “Public Attentiveness to News Stories: 1986–2000.” www.people-press.org/database.htm.
Pew Research Center. 2010a. “Americans Spending More Time Following the News.” September 12. http://people-press.org/report/?pageid=1793.
Pew Research Center. 2010b. “Press Coverage and Public Interest.” June 11. http://pewresearch.org/pubs/1850/public-media-priorities-comparison-2010.
Pew Research Center. 2010c. “Public Knows Basic Facts about Politics, Economics, But Struggles with Specifics.” November 18. http://people-press.org/report/677/.
Pew Research Center. 2010d. “Top Stories of 2010: Haiti Earthquake, Gulf Oil Spill.” December 21. http://people-press.org/report/687/.
Pew Research Center. 2010e. “Well Known: Twitter; Little Known: John Roberts.” Political Knowledge Update, July 15. http://people-press.org/report/635/.
“Politics Is Too Complicated: 1952–1996.” 1999. The National Election Studies. http://www.umich.edu/~nes/nesguide/toptable/tab5b_1.htm.
Saad, Lydia. 2009. “More Americans Plugged into Political News.” Gallup, September 28. http://www.gallup.com/poll/123203/Americans-Plugged-Into-Political-News.aspx.
Stoughton, Stephanie, and Walker, Leslie. 1999. “Web Retailers Court Women.” The State, November 4, 1A, 11A.
“Survey Shows Widespread Enthusiasm for High Technology.” 2000. The NPR/Kaiser/Kennedy School Poll. http://www.npr.org/programs/specials/poll/technology/.
Tallent-Runnels, Mary K., Thomas, Julie A., Lan, William Y., Cooper, Sandi, Ahern, Terence C., Shaw, Shana M., and Liu, Xiaoming. 2006. “Teaching Courses Online: A Review of the Research.” Review of Educational Research 76 (1): 93–135.
Taylor, Paul, and Keeter, Scott, eds. 2009. “Millennials: Confident. Connected. Open to Change.” Pew Research Center. http://pewsocialtrends.org/assets/pdf/millennials-confident-connected-open-to-change.pdf.
“The Trust in Government Index 1958–2008.” 2008. The National Election Studies. http://www.electionstudies.org/nesguide/text/tab5a_5.txt.
Washington, Jesse. 2011. “For Minorities, New ‘Digital Divide’ Seen.” Associated Press, January 1. http://www.msnbc.msn.com/id/41023900/ns/us_news-life/.
“We The People Do Not Know the Constitution.” 1997. The State, September 16, 1D.