
An Entrance to Exit Polling: Strategies for Using Exit Polls as Experiential Learning Projects

Published online by Cambridge University Press:  12 June 2012

Michael J. Berry
Affiliation:
University of Colorado, Denver
Tony Robinson
Affiliation:
University of Colorado, Denver

Abstract

Engaging students in the design, administration, and postelection analysis of an exit poll can be an excellent experiential learning activity. Lelieveldt and Rossen (2009) argue that exit polls are a “perfect teaching tool” because they provide students with a cooperative (rather than competitive) learning experience; help students better connect theory, methodology, and course substance; and allow students to move outside of the classroom by branching out into the community. As professors at the University of Colorado, Denver (UCD), we have organized student exit polling during the 2008 and 2010 elections in the Denver area for research methods and elections classes. Although we have found these exit polls to be rewarding experiences for instructors and students alike, the reality is that conducting an exit poll with a group of polling neophytes, in the confines of a single semester, can be challenging. In this article, we discuss strategies and issues for instructors to consider when using an exit poll as an experiential learning exercise.

Type
The Teacher
Copyright
Copyright © American Political Science Association 2012

Among researchers there is consensus that experiential learning assignments generate class excitement, stimulate student interest, build political research skills, and help students master concepts and facts more completely (Cole 2003; Currin-Percival and Johnson 2010; Evans and Lagergren 2007; Gourgey 2000; Lelieveldt and Rossen 2009; McBride 1994). At the very founding of the American Political Science Association (APSA) in 1903, the new association recommended active learning to supplement formal classroom teaching in politics (Carpini and Keeter 2000)—a recommendation that was repeated in the 1997 “Statement of Purpose” of the APSA's Task Force on Civic Education. Experiential learning projects may be particularly useful for research methods courses, which are among the most challenging of undergraduate courses to teach. “Social science research methods are widely avoided and maligned by students,” notes McBride (1994, 553). Much of the reason for students' perpetual dislike of methods courses is that students rarely apply research skills to real-world, substantive questions of interest to them (Winn 1995). As a result, students are required to master the language and techniques of various scientific concepts—sampling, probability theory, reliability, and validity—“all matters for which students see little or no purpose” (McBride 1994, 553). In the end, these courses often have the perverse result of reinforcing students' aversion to scientific methodology (Lelieveldt and Rossen 2009).

Survey research in general, and election exit polls in particular, are among the most widely used political science research tools, and they can make for a particularly beneficial experiential learning project. By assigning students to apply survey research skills in assessing local arts programs, for example, Gregory, Mattern, and Mitchell conclude, “Most of the students felt that it was easier and more effective to learn research methods by actually putting them into practice and that the applied nature of the course increased their commitment, motivation, and enthusiasm” (2001, 123). Survey research through exit polling has the added benefit of having students directly engage issues and topics related to the election while helping them better understand course material (Cole 2003). A class polling project presents students with an applied problem-solving experience, builds student understanding of multiple stages of the research process, allows students to engage the community and connect methodology with substance, and produces original survey data (Lelieveldt and Rossen 2009). Students rank hands-on exit polling projects very favorably. Reviewing exit polling projects at seven universities, Cole (2003) concluded that most students enjoyed exit polling as an innovative way to improve their understanding of a number of research-related topics, while also furthering their interest in the election (see also Evans and Lagergren 2007; Gourgey 2000; Winn 1995).
Professors report exceptional student effort in such projects, as students take ownership of their research and move beyond being passive vessels of knowledge supplied by the teacher toward confidence as budding social scientists in their own right (Gregory, Mattern, and Mitchell 2001; Lelieveldt and Rossen 2009).

In our exit polling projects at the University of Colorado, Denver, during the 2008 and 2010 election cycles, we have involved more than 75 students (from three separate classes) in designing, administering, and analyzing exit polls, and we have found the same results. In our research methods course, polling projects engage students in topics that are commonly covered in an undergraduate methods course, including survey design, question wording, measurement error, reliability, validity, sampling methods, sample size, margin of error, data coding, and empirical analysis. In our course on campaigns and elections, students learn basic research skills as well. They generate polling questions tailored to specific issues and candidates, which allows them to run cross-tabs analyses and produce an original investigation of voter attitudes and behavior rather than simply consuming existing research on a particular topic. These benefits need not be limited to research methods or elections classes. An exit polling/survey research project can be a useful student engagement assignment in a wide range of upper-level political science courses, allowing students to investigate class themes by conducting primary research of their own. Class sizes ranging from 25 to 40 students are ideal for the type of exit polling project described in this article. Classes of this size allow students to administer the exit polls in pairs, generate a substantial number of poll respondents, and keep the exercise manageable for the instructor.

For students to reap the benefits of participating in a classroom exit poll, instructors should be mindful that such an experiential learning exercise is challenging. In the sections that follow, we discuss strategies for ensuring that a class exit poll project is a success, from preclass preparations to postpoll analysis.

PRE-CLASS PREPARATIONS

A single semester is not much time to complete a polling project, including design, execution, and postpoll analysis, especially given that the exit poll will not be the centerpiece of the entire class. By attending to the significant organizational planning required before the course even begins, professors will save valuable class time and ensure that the polling process itself runs smoothly (Cole 2003). Here are some details to attend to before the course begins.

Notify Local Election Officials

The county clerk of elections (or a similar official) oversees the election process in the jurisdiction, while local election judges oversee voting in individual precincts. Both officials play a critical role in shaping the success of a class exit poll. Because many jurisdictions interpret exit polling as a form of electioneering, both county election clerks and local precinct judges have ordered exit pollsters to remain more than 100 feet away from any entrance to a building in which voting is taking place—just as anyone campaigning for a candidate or issue must do (Cole 2003; Evans and Lagergren 2007; Lelieveldt and Rossen 2009).

During our classes, we experienced this problem at several precincts and learned that conducting an exit poll from such a distance is impractical for many reasons. For instance, voters often park their cars within the 100-foot “banned circle” and vote and leave without a pollster ever being able to request a survey. This serious challenge to a polling project can be minimized if professors notify local election officials about the exit polling project before the election and work out mutually acceptable ground rules for conducting the poll. In our experience conducting exit polls in two election cycles across seven different counties, the response of election officials varies. Many times a county election clerk quickly concludes that exit polling does not count as electioneering and grants approval for pollsters to stand close to the building entrance (Lelieveldt and Rossen 2009). Better yet, obtain this approval in writing, so that students can show the approval letter to any skeptical local precinct judges who continue to order them to remain more than 100 feet away. We have also encountered county election clerks who see their role as keeping the election process quiet and orderly, who interpret exit polls as intruding on this duty, and who therefore insist that pollsters remain more than 100 feet from the entrance to any voting location.

These situations can be resolved. In our case, about a month before our 2010 polling began, we contacted the Colorado Secretary of State's office to discuss our project and request state-level communication to specify to local election officials that exit polling was not an electioneering activity and that exit pollsters were permitted to poll within the 100-foot circle. Following our request, lawyers with the Secretary of State's office quickly confirmed, in a written bulletin sent to election officials across the state, the right of exit pollsters to be within the 100-foot radius where electioneering is prohibited (see footnote 1).

Having official approval letters in hand before students hit the ground significantly increased the success of our project. Nevertheless, our experience indicates that professors and students should count on a number of local precinct judges who will remain intransigent, even when given formal letters, and some of these officials are still likely to order students to leave their polling stations. Therefore, professors should be in telephone contact with students throughout the polling day, and backup plans should be in place in terms of alternative precincts to poll, should it become clear that a political battle with the street-level bureaucrats (Lipsky 1980) of the polling station is not worth the energy or time lost on polling day.

Cover Administrative Bases with the University

Some universities require that students involved in off-campus educational experiences sign a liability release form in case of problems. Professors should also be attentive to the expectations of the Human Subjects Research Committee (HSRC) on their campus and obtain any necessary review and approval of the project early in the semester. Because a class exit poll is typically not intended to result in generalizable, published research, and because the risk to human subjects from such surveys is minimal, gaining HSRC approval for the poll (although often required) should not be a major obstacle.

Obtain Supplies

Some supplies are necessary for completing a class exit poll: clipboards, pens, and student identification badges (to list the most vital). A name badge for each student (purchased from an office supply store), listing the student's “pollster ID number,” the name of the school, and perhaps the school logo, will go a long way toward helping students feel professional in the field and will help survey respondents see the project as credible through its association with a local college or university. Also, depending on the size of the class and the number of surveys to be collected, it is important to plan for a substantial number of photocopies, which may or may not fit within departmental expectations of an allowable expense.

DESIGNING THE POLL AND TRAINING STUDENTS

When the semester begins, students are directly involved in four key activities that have pedagogical benefits: developing research questions, designing survey questions, strategizing sample selection, and pollster training. To execute these activities successfully, structure the course schedule to allow sufficient time for each. In our case, we discussed the class poll early in the term and spent considerable time designing the poll and training student pollsters during the weeks before the election. Introducing the exit poll concept during the first week of class, describing the exit poll activity in the course syllabus, and distributing information sheets gives students time to discuss concerns with the professor (see footnote 2). Providing sufficient details about the project at the start of the semester also allows students to drop the course if they wish.

Research Question Considerations

After introducing the polling project, we facilitated class discussion about what questions we should explore in our exit poll. There was no debate that the poll should ask about voter demographics and voting choices—but beyond that, students wanted the poll to ask questions directed at important political issues and voter attitudes. As students grew excited about shaping their own research questions to investigate, discussion became vibrant and more students participated than in typical discussions. Students debated the types of questions we could explore (e.g., “Is the news source one watches correlated with voting habits?” or “Are white people nervous about Obama?” or “Are Tea Party advocates knowledgeable about public affairs?”) and had vigorous discussions about the worth of various research topics and how to investigate these through an exit poll. We dedicated parts of two class sessions to exploring such issues and then allowed students to vote on the topics for our poll.

Survey Question Design

After we settled on core research questions, students designed the wording of the questions. Involving students in the creation of the exit poll questions required them to consider issues related to survey design and measurement error, which has been found to contribute to better student understanding of such issues (McCarthy and Anderson 2000). Before brainstorming poll questions, students read the survey research chapter in Johnson, Reynolds, and Mycoff (2008), which covers a number of survey research topics. In addition, students read a short primer on the American National Election Studies (ANES) and examined some of the questions posed to voters by the ANES (DeBell 2010). Additional readings on survey research contributions to political science (Brady 2000) or considerations for exit poll practitioners (Barreto et al. 2006) could also be assigned, depending on the focus of the course.

We spent one class period discussing the importance of validity and reliability with regard to measurement of concepts through survey research. This class concluded with brainstorming question topics for inclusion in the poll. Many of the suggested topics were related to voter demographics. Students also suggested topics such as voting history and frequency, candidate preferences, family and employment situation, religiosity, partisanship, ideology, and sexual orientation, among others. Students energetically discussed topics regarding various campaign issues, coming to consensus about which issues were the most interesting. Following instruction on such principles as avoiding question ambiguity, bias, and excessively complicated questions, students were each assigned topics for which they were responsible for drafting question language. Students brought their questions in writing to the following class.

The subsequent class period, devoted to vetting student-written exit poll questions, involved participation from all students. Students discussed the strengths and weaknesses of student-submitted questions, again framed in terms of measurement issues such as validity, reliability, and level of measurement. As part of the discussion, the class decided how questions would be structured. Would the poll have a blank where respondents could enter their age, or would interval age categories be provided where respondents could mark the appropriate box? Would voters be allowed an open-ended response to a question about the issues most important to them, or would only closed-ended questions be offered? How would we break down racial and religious categories offered to voters? Discussions of such themes allowed students to grapple first-hand with the same thorny methodological issues facing all survey researchers. At the end of the session, students chose the initial questions and the wording to be included in the poll.

One topic of classroom debate was survey length. Although some professors have found that a short poll (no more than two pages) brings the best response rate (Evans and Lagergren 2007), we wanted our project to collect data on numerous topics so that a wide range of questions could be investigated in students' final papers. We ended up with a 35-question poll spanning eight pages. We did find that some respondents refused the poll because of its length, but overall we found voters eager to respond to this university exit poll. In 2008 and 2010, our 75 students collectively obtained 1,500 poll responses by putting in their assigned five hours of exit polling; polling occurred during both the early voting period and on Election Day (see footnote 3).

Sample Selection and Size

Involving students in working through issues of sample selection offers another opportunity to teach methodology. In our classes, we discussed the relationship between population size, sample size, and confidence levels and then applied these principles to arrive at a target sample size to reach our desired margin of error. The practical implications and challenges of different random sampling techniques were made clear to students, enhancing discussion of these methodological principles. To achieve a random sample, the class settled on a cluster sampling technique, whereby the class identified different areas of town and times of day (the clusters) in which to do the exit polling; then they selected random respondents from these clusters. The clusters were chosen by identifying exit-polling precincts across the region that were located in a diverse range of communities and then assigning pollsters to those precincts at differing time intervals throughout the day.
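To make the sample-size arithmetic concrete, the standard formula for estimating a proportion can be sketched in a few lines of Python. This is a minimal illustration, not part of our course materials; the function name and defaults are ours, with 1.96 as the z-score for 95% confidence and p = 0.5 as the most conservative assumption.

```python
import math

def sample_size(moe, z=1.96, p=0.5, population=None):
    """Required sample size for a proportion at a given margin of error.

    moe: desired margin of error (e.g., 0.05 for +/-5 points)
    z: z-score for the confidence level (1.96 -> 95%)
    p: assumed proportion (0.5 is the most conservative choice)
    population: if given, apply the finite population correction
    """
    n = (z ** 2) * p * (1 - p) / moe ** 2
    if population is not None:
        n = n / (1 + (n - 1) / population)  # finite population correction
    return math.ceil(n)

# ~385 respondents for +/-5 points at 95% confidence in a large population
print(sample_size(0.05))
```

Working through numbers like these in class shows students why tighter margins get expensive quickly: a ±3-point margin requires about 1,068 respondents, nearly three times the ±5-point figure.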

In thinking through polling day dynamics, we were aware that students might be tempted to request interviews only of people in their “comfort zone” (younger voters, or voters appearing racially or socioeconomically similar to the student). Therefore, we discussed the need for interval sampling, whereby students were instructed to request interviews of every nth voter who exited the precinct, rather than only of voters who looked friendly to them. Because some polling places or times of day are busier than others, and because we wanted to keep pollsters' morale high by keeping them busy collecting surveys, we settled on a quasi-interval sampling strategy: whenever a student was free from administering a survey, the student asked the very next exiting voter to complete a poll. In that way, students were always either administering a survey or asking the next voter who exited the polling station to take one. Although this method is not a strictly formal interval sampling strategy, it kept students busy and avoided bias in which voters students asked for surveys (Lelieveldt and Rossen 2009).
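The quasi-interval rule just described can be illustrated with a tiny simulation (purely a sketch; the exit times and the five-minute survey length are invented for the example):

```python
def quasi_interval_sample(exit_times, survey_duration):
    """Simulate the 'ask the next exiting voter whenever free' rule.

    exit_times: sorted times (in minutes) at which voters leave the precinct
    survey_duration: minutes a pollster spends administering one survey
    Returns the indices of the voters the pollster approaches.
    """
    approached = []
    free_at = 0.0  # the pollster is free at the start of the shift
    for i, t in enumerate(exit_times):
        if t >= free_at:  # pollster is idle: ask this voter
            approached.append(i)
            free_at = t + survey_duration
    return approached

# With voters exiting every 2 minutes and 5-minute surveys,
# the rule reaches roughly every third voter.
print(quasi_interval_sample([0, 2, 4, 6, 8, 10, 12], 5))  # -> [0, 3, 6]
```

When traffic is steady, the rule behaves like every-nth-voter sampling; when traffic is light, it degenerates to asking everyone, which matches the goal of keeping pollsters continuously busy.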

Training Students

We trained students for the polling day by preparing and distributing a polling instruction sheet (reminding students of their polling day expectations and of key principles such as the interval sampling rule), reviewing those instructions as a class, and giving students their supplies (piles of blank exit polls, pens, clipboards, and an envelope or box for collecting polls). After this training, most students were quite eager for their upcoming hands-on research project.

Polling Day Considerations

As students headed out for their polling shift, they had four key supplies to help ensure a productive day:

  • An identification name tag with the school logo and their pollster ID. These badges are helpful in validating the students to local precinct judges and to voters.

  • A cover letter from the course professor announcing the poll and its purpose, and offering a contact number to call the professor in case of any problems. Recurring voter questions have taught us that this cover letter should also state whether any of the polling results will be made available to the public and, if so, how voters can view them.

  • A manila envelope, or covered box, in which respondents could deposit their completed polls without disclosing their answers to the pollster.

  • The student's cell phone, with the professor's phone number. Through the day, the instructor should be available by phone, as unforeseen problems will likely emerge.

Instructors may be concerned about whether all students actually do their polling as assigned on polling day. As in other classes, students may shirk their assigned work, or may become demoralized and tempted either not to poll many people or (worse) to fill out polls themselves and submit them as valid. Beyond building good esprit de corps in the class and addressing such possibilities in advance to inoculate against them, there are additional strategies instructors can adopt to minimize these risks.

First, students can be assigned to poll in pairs. Polling with a partner helps keep student morale high while also increasing the chances that students will do all assigned polling honestly. A disadvantage of this choice is that only half as many polling stations can be targeted, which can have a substantial effect on the randomness of the sample. Second, as recommended by Cole (2003), the instructor can inform students that he or she will be doing site visits during the day, to check in, replenish supplies, and troubleshoot problems. Third, students can be required to drop off completed polls with the professor immediately after their polling shift. Falsification of surveys is less likely if the student does not have time after Election Day to ponder and execute such a plan. Lastly, instructors may want to avoid grading students on the number of completed polls submitted to avoid incentives for cheating.

POSTELECTION ACTIVITIES

Our class periods following Election Day were partially reserved for discussion of field experiences. The lively discussion further built a sense of camaraderie and accomplishment among students. Subsequently, the real work of coding and analyzing polling data began.

Data Coding

Our polling projects focused on gathering data that could inform postelection analysis of voter opinions on a wide range of issues and the behavior of voting blocs, once all the data were coded. Coding 1,500 surveys of 35 questions each is a daunting task, but it becomes manageable (with good pedagogical effect) when students are required to code the data. If students are not expected to be familiar with statistical programs like SPSS or Stata, they can easily use Microsoft Excel to enter the data into a worksheet. Excel has the further advantage of an easy-to-use cross-tabs function (called “pivot tables”) that students can quickly learn in order to run their own analyses on the data after coding (for a scholarly review of teaching basic data analysis with Excel's pivot tables, see Palocsay, Markham, and Markham 2010).

To prepare for student coding of data, we built and distributed to students an Excel database template, together with a coding instruction sheet. Instructions were also covered in class; an additional preparation session outside of class was offered. Each student received about 30 surveys to code. Students were instructed to put a respondent identification number on each survey, so that instructors or other students could check for coder reliability when the coding was completed. Coding of results took each student a few hours; students sent the Excel file by e-mail to the instructor. The instructors compiled all files into a master Excel document. The polling data were now ready for the big pay-off: original analysis and reporting of findings.
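Instructors comfortable with a bit of scripting could automate this compile-and-check step. The following pandas sketch is hypothetical (the folder layout and the respondent_id column name are our invention, and reading .xlsx files requires the openpyxl package); it stacks the per-student files and flags duplicated respondent IDs for the coder-reliability check:

```python
import pandas as pd

def compile_master(frames):
    """Stack per-student coding sheets into one master table and flag
    duplicated respondent IDs for a coder-reliability check."""
    master = pd.concat(frames, ignore_index=True)
    dupes = master[master["respondent_id"].duplicated(keep=False)]
    return master, dupes

# Example: read every student file from a (hypothetical) folder and
# save the combined result as the master document.
if __name__ == "__main__":
    from pathlib import Path
    files = sorted(Path("student_coding").glob("*.xlsx"))
    master, dupes = compile_master([pd.read_excel(f) for f in files])
    if not dupes.empty:
        print("Check these respondent IDs:", sorted(dupes["respondent_id"].unique()))
    master.to_excel("master_poll_data.xlsx", index=False)
```

The duplicate check is the scripted counterpart of the manual reliability review described above: any respondent ID appearing in two students' files signals either a coding error or an opportunity to compare coders.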

Data Analysis

Students can be taught the basics of cross-tabs analysis so that they can run queries on the polling data and create their own tables, graphs, and charts for class assignments, such as a final data analysis paper (see footnote 4). By mastering Excel's pivot table function, students can query the polling data to analyze a number of questions, such as “What percent of each racial group voted for each party?” or “What percent of young women ranked abortion rights among their reasons for their voting choice?” and so on, limited only by the questions built into the exit poll. Tables, pie charts, line graphs, and other figures can be constructed in Excel, so students can explore the data as they please and enjoy the satisfaction of building their own visual aids to convey their findings. In the end, students can report findings from independent, primary research—a unique achievement for undergraduates. Allowing students to share their findings in classroom presentations, or even in a campus postelection forum, is a positive way to wrap up the experience, perhaps giving some students their first experience as a “scholar” by formally presenting findings.
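The same pivot-table queries can also be reproduced outside Excel. As an illustrative sketch (the mini-dataset and column names are invented for the example), pandas' crosstab with normalize="index" yields the row percentages a pivot table would produce:

```python
import pandas as pd

# Invented mini-dataset standing in for the coded poll results
poll = pd.DataFrame({
    "race":  ["White", "White", "Black", "Latino", "Black", "White"],
    "party": ["R", "D", "D", "D", "R", "R"],
})

# Percent of each racial group voting for each party: the same
# question a student would answer with an Excel pivot table
pct = pd.crosstab(poll["race"], poll["party"], normalize="index") * 100
print(pct.round(1))
```

Each row of the resulting table sums to 100, so students can read off, for example, what share of each racial group in the invented data voted for each party.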

Local Media Coverage

Instructors should also consider the possibility of media interest in poll findings or in the exit poll process itself. We notified local press during our polling process, and were pleased that a local television station followed students during their polling day and reported on the project on the evening news. The campus newspaper is also likely to be interested. As students complete their data analysis, some substantive findings may interest the local press, which brings positive attention to innovative student projects on the campus (Lelieveldt and Rossen 2009; Winn 1995).

CONCLUSION

A class exit poll can be an innovative way to excite students about research methods and about election dynamics—and the pedagogical benefits can be substantial. Students bond during their participatory work in building and executing an exit poll—and perhaps even receive media coverage of their work. As professors, it can be immensely satisfying to witness the creative work students can do analyzing polling results and presenting their primary findings, complete with sophisticated data tables, eye-catching pie charts, and revealing bar graphs. Although there is little debate among teachers about the utility of such class projects, the reality is that executing such a project can be challenging and typically requires the professor's additional time and attention. Having done such exit polls in several classes over the last two election cycles, we have learned that with careful management of the details, the challenges can be overcome and an exit poll project will be a fondly remembered success among students.

Footnotes

1 In preparing for any discussions with election officials on this subject, professors may find it useful to know that courts in the last seven years have overturned 100-foot circle bans on exit polling in Florida, Georgia, Kentucky, Minnesota, Montana, Ohio, New Jersey, Washington, and Wyoming.

2 For a number of reasons some students are not well suited to a requirement that they survey strangers in the field, and alternative assignments may need to be arranged.

3 Many jurisdictions offer an early-voting period during which some vote center locations are open for people to vote or drop off their mail-in ballots. To approach a random sample, it is important to assign students to do some early polling at these early voting locations, which also can be arranged as cluster samples in which students visit targeted polling locations at an assigned variety of times during the early voting period.

4 Instructors can find numerous online tutorials regarding Excel's pivot table functions to share with students.

References

Barreto, Matt A., Guerra, Fernando, Marks, Mara, Nuno, Stephen A., and Woods, Nathan D. 2006. “Controversies in Exit Polling: Implementing a Racially Stratified Homogenous Precinct Approach.” PS: Political Science & Politics 39 (3): 477–83.
Brady, Henry E. 2000. “Contributions of Survey Research to Political Science.” PS: Political Science & Politics 33 (1): 47–57.
Carpini, Michael X. Delli, and Keeter, Scott. 2000. “What Should Be Learned through Service Learning?” PS: Political Science & Politics 33 (3): 635–37.
Cole, Alexandra. 2003. “To Survey or Not to Survey: The Use of Exit Polling as a Teaching Tool.” PS: Political Science & Politics 36 (2): 245–52.
Currin-Percival, Mary, and Johnson, Martin. 2010. “Understanding Sample Surveys: Selective Learning about Social Science Research Methods.” PS: Political Science & Politics 43 (4): 533–40.
DeBell, Matthew. 2010. “How to Analyze ANES Survey Data.” ANES Technical Report Series no. nes012492. Palo Alto, CA, and Ann Arbor, MI: Stanford University and the University of Michigan.
Evans, Jocelyn, and Lagergren, Olivia. 2007. “See You at the Polls: Exit Polling as a Tool for Teaching Research Methods and Promoting Civic Engagement.” Paper presented at the APSA Teaching and Learning Conference, February 9–11, 2007.
Gourgey, Annette F. 2000. “A Classroom Simulation Based on Political Polling to Help Students Understand Sampling Distributions.” Journal of Statistics Education 8 (3): 1–10.
Gregory, Jill, Mattern, Mark, and Mitchell, Shari. 2001. “From Ivory Tower to Urban Street: Using the Classroom as a Community Research and Development Tool.” PS: Political Science & Politics 33 (March): 119–24.
Johnson, Janet Buttolph, Reynolds, H. T., and Mycoff, Jason D. 2008. Political Science Research Methods. 6th ed. Washington, DC: CQ Press.
Lelieveldt, Herman, and Rossen, Gregor. 2009. “Why Exit Polls Make Good Teaching Tools.” European Political Science 8 (1): 113–22.
Lipsky, Michael. 1980. Street-Level Bureaucracy: Dilemmas of the Individual in Public Service. New York: Russell Sage Foundation.
McBride, Allan. 1994. “Teaching Research Methods Using Appropriate Technology.” PS: Political Science & Politics 27 (3): 553–57.
McCarthy, J. Patrick, and Anderson, Liam. 2000. “Active Learning Techniques versus Traditional Teaching Styles: Two Experiments from History and Political Science.” Innovative Higher Education 24 (4): 279–94.
Palocsay, Susan W., Markham, Ina S., and Markham, Steven E. 2010. “Utilizing and Teaching Data Tools in Excel for Exploratory Analysis.” Journal of Business Research 63 (2): 191–206.
Winn, Sandra. 1995. “Learning by Doing: Teaching Research Methods through Student Participation in a Commissioned Research Project.” Studies in Higher Education 20 (2): 203–14.