
Paying Attention to Inattentive Survey Respondents

Published online by Cambridge University Press:  16 January 2019

R. Michael Alvarez*
Professor of Political Science, California Institute of Technology, USA
Lonna Rae Atkeson
Professor of Political Science, University of New Mexico, USA
Ines Levin
Assistant Professor of Political Science, University of California, Irvine, USA
Yimeng Li
Graduate Student, Division of the Humanities and Social Sciences, California Institute of Technology, USA


Does attentiveness matter in survey responses? Do more attentive survey participants give higher quality responses? Using data from a recent online survey that identified inattentive respondents using instructed-response items, we demonstrate that ignoring attentiveness provides a biased portrait of the distribution of critical political attitudes and behavior. We show that this bias occurs in the context of both typical closed-ended questions and list experiments. Inattentive respondents are common and are more prevalent among the young and less educated. Those who do not pass the trap questions interact with the survey instrument in distinctive ways: they take less time to respond, are more likely to report nonattitudes, and display lower consistency in their reported choices. Inattentiveness does not occur completely at random, and failing to properly account for it may lead to inaccurate estimates of the prevalence of key political attitudes and behaviors, both sensitive and more prosaic.

Copyright © The Author(s) 2019. Published by Cambridge University Press on behalf of the Society for Political Methodology. 



Authors’ note: Levin thanks Sean Ingham for his collaboration in collecting the survey data used in this paper, and the University of Georgia for providing the financial support to conduct this survey. The survey data were collected using procedures approved by the Institutional Review Board at the University of Georgia. Previous versions of this research were presented at the 34th Annual Meeting of the Society for Political Methodology (2017), at the 2017 Annual Meeting of the American Political Science Association (APSA), and at the 2nd Annual Southern California Methods Workshop at UCSB (September 19–20, 2017). We thank participants at these meetings for their comments and suggestions; in particular, we thank Michael James Ritter for his comments after our presentation at APSA, and Leah Stokes for her comments on our presentation at the Southern California Methods Workshop. Replication materials for this paper are available (Alvarez et al. 2018).

Contributing Editor: Jeff Gill


Alvarez, R. M. 1997. Information and Elections. Ann Arbor, MI: University of Michigan Press.
Alvarez, R. M., and Brehm, J. 2002. Hard Choices, Easy Answers: Values, Information, and American Public Opinion. Princeton, NJ: Princeton University Press.
Alvarez, R. M., and Franklin, C. H. 1994. “Uncertainty and Political Perceptions.” The Journal of Politics 56(3):671–688.
Alvarez, R. M., Atkeson, L. R., Levin, I., and Li, Y. 2018. “Replication Data for: Paying Attention to Inattentive Survey Respondents.” Harvard Dataverse, V1, UNF:6:ZHc1mHgkrXEorZvXXJnURQ== [fileUNF].
Anduiza, E., and Galais, C. 2016. “Answering Without Reading: IMCs and Strong Satisficing in Online Surveys.” International Journal of Public Opinion Research 29(3):497–519.
Ansolabehere, S., and Schaffner, B. F. 2018. “Taking the Study of Political Behavior Online.” In The Oxford Handbook of Polling and Survey Methods, edited by Atkeson, L. R. and Alvarez, R. M., 76–96. New York: Oxford University Press.
Atkeson, L. R., and Adams, A. N. 2018. “Mixing Survey Modes and Its Implications.” In The Oxford Handbook of Polling and Survey Methods, edited by Atkeson, L. R. and Alvarez, R. M., 53–75. New York: Oxford University Press.
Atkeson, L. R., Adams, A. N., and Alvarez, R. M. 2014. “Nonresponse and Mode Effects in Self- and Interviewer-Administered Surveys.” Political Analysis 22(3):304–320.
Bailey, M. A. 2017. “Selection Sensitive Survey Design: Moving Beyond Weighting.” Presented at the 2017 Annual Meetings of the American Political Science Association, San Francisco, CA.
Barber, L. K., Barnes, C. M., and Carlson, K. D. 2013. “Random and Systematic Error Effects of Insomnia on Survey Behavior.” Organizational Research Methods 16(4):616–649.
Berinsky, A. J., Huber, G. A., and Lenz, G. S. 2012. “Evaluating Online Labor Markets for Experimental Research: Amazon.com’s Mechanical Turk.” Political Analysis 20(3):351–368.
Berinsky, A. J., Margolis, M. F., and Sances, M. W. 2014. “Separating the Shirkers from the Workers? Making Sure Respondents Pay Attention on Self-Administered Surveys.” American Journal of Political Science 58(3):739–753.
Berinsky, A. J., Margolis, M. F., and Sances, M. W. 2016. “Can We Turn Shirkers into Workers?” Journal of Experimental Social Psychology 66:20–28.
Blair, G., and Imai, K. 2012. “Statistical Analysis of List Experiments.” Political Analysis 20(1):47–77.
Bowling, N. A., et al. 2016. “Who Cares and Who is Careless? Insufficient Effort Responding as a Reflection of Respondent Personality.” Journal of Personality and Social Psychology 111(2):218–229.
Clifford, S., and Jerit, J. 2015. “Do Attempts to Improve Respondent Attention Increase Social Desirability Bias?” Public Opinion Quarterly 79(3):790–802.
Curran, P. G. 2016. “Methods for the Detection of Carelessly Invalid Responses in Survey Data.” Journal of Experimental Social Psychology 66:4–19.
Dillman, D. A., Smyth, J. D., and Christian, L. M. 2009. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 3rd edn. Hoboken, NJ: John Wiley & Sons.
Downes-Le Guin, T. 2005. “Satisficing Behavior in Online Panelists.” Presented at the MRA Annual Conference & Symposium, Chicago, IL.
Droitcour, J., et al. 1991. “The Item Count Technique as a Method of Indirect Questioning: A Review of its Development and a Case Study Application.” In Measurement Errors in Surveys, edited by Biemer, P. P. et al., 185–210. Hoboken, NJ: John Wiley & Sons.
Eady, G. 2016. “Replication Data for: The Statistical Analysis of Misreporting on Sensitive Survey Questions.” (September 16, 2018).
Eady, G. 2017. “The Statistical Analysis of Misreporting on Sensitive Survey Questions.” Political Analysis 25(2):241–259.
Edwards, A. L. 1957. The Social Desirability Variable in Personality Assessment and Research. Fort Worth, TX: Dryden Press.
Fisher, R. J. 1993. “Social Desirability Bias and the Validity of Indirect Questioning.” Journal of Consumer Research 20(2):303–315.
Glynn, A. N. 2013. “What Can We Learn with Statistical Truth Serum? Design and Analysis of the List Experiment.” Public Opinion Quarterly 77(S1):159–172.
Groves, R. M., and Lyberg, L. 2010. “Total Survey Error: Past, Present, and Future.” Public Opinion Quarterly 74(5):849–879.
Huang, J. L., et al. 2012. “Detecting and Deterring Insufficient Effort Responding to Surveys.” Journal of Business and Psychology 27(1):99–114.
Imai, K. 2011. “Multivariate Regression Analysis for the Item Count Technique.” Journal of the American Statistical Association 106(494):407–416.
Johnson, J. A. 2005. “Ascertaining the Validity of Individual Protocols from Web-based Personality Inventories.” Journal of Research in Personality 39(1):103–129.
Jones, M. S., House, L. A., and Gao, Z. 2015. “Attribute Non-Attendance and Satisficing Behavior in Online Choice Experiments.” Proceedings in Food System Dynamics 2015:415–432.
Kapelner, A., and Chandler, D. 2010. “Preventing Satisficing in Online Surveys: A ‘Kapcha’ to Ensure Higher Quality Data.” In Proceedings of CrowdConf 2010, San Francisco, CA.
Kiewiet de Jonge, C. P., and Nickerson, D. W. 2014. “Artificial Inflation or Deflation? Assessing the Item Count Technique in Comparative Surveys.” Political Behavior 36(3):659–682.
King, G., Honaker, J., Joseph, A., and Scheve, K. 2001. “Analyzing Incomplete Political Science Data: An Alternative Algorithm for Multiple Imputation.” American Political Science Review 95(1):49–69.
Krosnick, J. A. 1991. “Response Strategies for Coping with the Cognitive Demands of Attitude Measures in Surveys.” Applied Cognitive Psychology 5(3):213–236.
Little, R. J. A. 1992. “Regression with Missing X’s: A Review.” Journal of the American Statistical Association 87(420):1227–1237.
Maccoby, E. E., Maccoby, N., and Lindzey, G. 1954. “The Interview: A Tool of Social Science.” In Handbook of Social Psychology, edited by Lindzey, G., 449–487. Reading, MA: Addison-Wesley.
Maniaci, M. R., and Rogge, R. D. 2014. “Caring About Carelessness: Participant Inattention and its Effects on Research.” Journal of Research in Personality 48:61–83.
Meade, A. W., and Craig, S. B. 2012. “Identifying Careless Responses in Survey Data.” Psychological Methods 17(3):437–455.
Miller, J. 2006. “Research Reveals Alarming Incidence of ‘Undesirable’ Online Panelists.” Research Conference Report, RFL Communications, Skokie, IL, USA.
Miller, J. D. 1984. “A New Survey Technique for Studying Deviant Behavior.” PhD thesis, George Washington University.
Oppenheimer, D. M., Meyvis, T., and Davidenko, N. 2009. “Instructional Manipulation Checks: Detecting Satisficing to Increase Statistical Power.” Journal of Experimental Social Psychology 45(4):867–872.
Pepinsky, T. B. 2018. “A Note on Listwise Deletion Versus Multiple Imputation.” Political Analysis 26(4):480–488.
Rosenfeld, B., Imai, K., and Shapiro, J. N. 2016. “An Empirical Validation Study of Popular Survey Methodologies for Sensitive Questions.” American Journal of Political Science 60(3):783–802.
Simon, H. A. 1956. “Rational Choice and the Structure of the Environment.” Psychological Review 63(2):129–138.
Vannette, D. 2017. “Using Attention Checks in Your Surveys May Harm Data Quality.” Qualtrics (June 14, 2018).
Ward, M. K., and Pond, S. B. 2015. “Using Virtual Presence and Survey Instructions to Minimize Careless Responding on Internet-Based Surveys.” Computers in Human Behavior 48:554–568.
Warner, S. L. 1965. “Randomized Response: A Survey Technique for Eliminating Evasive Answer Bias.” Journal of the American Statistical Association 60(309):63–69.
Zagorsky, J. L., and Rhoton, P. 2008. “The Effects of Promised Monetary Incentives on Attrition in a Long-Term Panel Survey.” Public Opinion Quarterly 72(3):502–513.
Supplementary material: Alvarez et al. supplementary material (File, 685 KB)