
Paying Attention to Inattentive Survey Respondents

Published online by Cambridge University Press:  16 January 2019

R. Michael Alvarez*
Affiliation:
Professor of Political Science, California Institute of Technology, USA. Email: rma@caltech.edu
Lonna Rae Atkeson
Affiliation:
Professor of Political Science, University of New Mexico, USA
Ines Levin
Affiliation:
Assistant Professor of Political Science, University of California, Irvine, USA
Yimeng Li
Affiliation:
Graduate Student, Division of the Humanities and Social Sciences, California Institute of Technology, USA

Abstract

Does attentiveness matter in survey responses? Do more attentive survey participants give higher-quality responses? Using data from a recent online survey that identified inattentive respondents with instructed-response items, we demonstrate that ignoring attentiveness yields a biased portrait of the distribution of critical political attitudes and behavior. We show that this bias occurs both in typical closed-ended questions and in list experiments. Inattentive respondents are common and are more prevalent among the young and less educated. Respondents who fail the trap questions interact with the survey instrument in distinctive ways: they take less time to respond, are more likely to report nonattitudes, and display lower consistency in their reported choices. Inattentiveness does not occur completely at random, and failing to account for it properly may lead to inaccurate estimates of the prevalence of key political attitudes and behaviors, both sensitive and more prosaic.
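
The screening logic described above can be illustrated with a toy simulation (a minimal sketch using synthetic data; the sample size, trap-question pass rates, and support rates below are assumptions for illustration, not the paper's actual data or estimates). Attentive respondents pass an instructed-response ("trap") item and report a true attitude; inattentive respondents answer haphazardly, which pulls the full-sample estimate toward their random responses:

```python
import random

random.seed(42)

TRUE_SUPPORT = 0.60      # assumed support rate among attentive respondents
ATTENTIVE_SHARE = 0.80   # assumed share of attentive respondents

def simulate_respondent(attentive):
    """Return (passes_trap, supports) for one synthetic respondent."""
    if attentive:
        passes_trap = True                      # attentive respondents follow the instruction
        supports = random.random() < TRUE_SUPPORT
    else:
        passes_trap = random.random() < 0.25    # inattentives mostly fail the trap item
        supports = random.random() < 0.50       # effectively random answers
    return passes_trap, supports

sample = [simulate_respondent(random.random() < ATTENTIVE_SHARE)
          for _ in range(10_000)]

def support_rate(rows):
    return sum(supports for _, supports in rows) / len(rows)

rate_all = support_rate(sample)
rate_screened = support_rate([r for r in sample if r[0]])

print(f"estimated support, all respondents:   {rate_all:.3f}")
print(f"estimated support, trap-passers only: {rate_screened:.3f}")
```

Because the inattentive respondents' random answers sit near 0.50, the full-sample estimate is pulled below the attentive respondents' true rate of 0.60, while the screened estimate lands closer to it; the paper's broader point is that this contamination is not random across subgroups, so simple screening is itself only a partial fix.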

Type
Articles
Copyright
Copyright © The Author(s) 2019. Published by Cambridge University Press on behalf of the Society for Political Methodology. 


Footnotes

Authors’ note: Levin thanks Sean Ingham for his collaboration in collecting the survey data used in this paper, and the University of Georgia for providing the financial support to conduct this survey. The survey data were collected using procedures approved by the Institutional Review Board at the University of Georgia. Previous versions of this research were presented at the 34th Annual Meeting of the Society for Political Methodology (2017), at the 2017 Annual Meeting of the American Political Science Association (APSA), and at the 2nd Annual Southern California Methods Workshop at UCSB (September 19–20, 2017). We thank participants at these meetings for their comments and suggestions, and in particular we thank Michael James Ritter for his comments after our presentation at APSA, and Leah Stokes for her comments on our presentation at the Southern California Methods Workshop. Replication materials for this paper are available (Alvarez et al. 2018).

Contributing Editor: Jeff Gill

References

Alvarez, R. M. 1997. Information and Elections. Ann Arbor, MI: University of Michigan Press.
Alvarez, R. M., and Brehm, J. 2002. Hard Choices, Easy Answers: Values, Information, and American Public Opinion. Princeton, NJ: Princeton University Press.
Alvarez, R. M., and Franklin, C. H. 1994. “Uncertainty and Political Perceptions.” The Journal of Politics 56(3):671–688.
Alvarez, R. M., Atkeson, L. R., Levin, I., and Li, Y. 2018. “Replication Data for: Paying Attention to Inattentive Survey Respondents.” https://doi.org/10.7910/DVN/TUUYLQ, Harvard Dataverse, V1, UNF:6:ZHc1mHgkrXEorZvXXJnURQ== [fileUNF].
Anduiza, E., and Galais, C. 2016. “Answering Without Reading: IMCs and Strong Satisficing in Online Surveys.” International Journal of Public Opinion Research 29(3):497–519.
Ansolabehere, S., and Schaffner, B. F. 2018. “Taking the Study of Political Behavior Online.” In The Oxford Handbook of Polling and Survey Methods, edited by Atkeson, L. R. and Alvarez, R. M., 76–96. New York: Oxford University Press.
Atkeson, L. R., and Adams, A. N. 2018. “Mixing Survey Modes and Its Implications.” In The Oxford Handbook of Polling and Survey Methods, edited by Atkeson, L. R. and Alvarez, R. M., 53–75. New York: Oxford University Press.
Atkeson, L. R., Adams, A. N., and Alvarez, R. M. 2014. “Nonresponse and Mode Effects in Self- and Interviewer-Administered Surveys.” Political Analysis 22(3):304–320.
Bailey, M. A. 2017. “Selection Sensitive Survey Design: Moving Beyond Weighting.” Presented at the 2017 Annual Meetings of the American Political Science Association, San Francisco, CA.
Barber, L. K., Barnes, C. M., and Carlson, K. D. 2013. “Random and Systematic Error Effects of Insomnia on Survey Behavior.” Organizational Research Methods 16(4):616–649.
Berinsky, A. J., Huber, G. A., and Lenz, G. S. 2012. “Evaluating Online Labor Markets for Experimental Research: Amazon.com’s Mechanical Turk.” Political Analysis 20(3):351–368.
Berinsky, A. J., Margolis, M. F., and Sances, M. W. 2014. “Separating the Shirkers from the Workers? Making Sure Respondents Pay Attention on Self-Administered Surveys.” American Journal of Political Science 58(3):739–753.
Berinsky, A. J., Margolis, M. F., and Sances, M. W. 2016. “Can We Turn Shirkers into Workers?” Journal of Experimental Social Psychology 66:20–28.
Blair, G., and Imai, K. 2012. “Statistical Analysis of List Experiments.” Political Analysis 20(1):47–77.
Bowling, N. A., et al. 2016. “Who Cares and Who is Careless? Insufficient Effort Responding as a Reflection of Respondent Personality.” Journal of Personality and Social Psychology 111(2):218–229.
Clifford, S., and Jerit, J. 2015. “Do Attempts to Improve Respondent Attention Increase Social Desirability Bias?” Public Opinion Quarterly 79(3):790–802.
Curran, P. G. 2016. “Methods for the Detection of Carelessly Invalid Responses in Survey Data.” Journal of Experimental Social Psychology 66:4–19.
Dillman, D. A., Smyth, J. D., and Christian, L. M. 2009. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 3rd edn. Hoboken, NJ: John Wiley & Sons.
Downes-Le Guin, T. 2005. “Satisficing Behavior in Online Panelists.” Presented at the MRA Annual Conference & Symposium, Chicago, IL.
Droitcour, J., et al. 1991. “The Item Count Technique as a Method of Indirect Questioning: A Review of its Development and a Case Study Application.” In Measurement Errors in Surveys, edited by Biemer, P. P., et al., 185–210. Hoboken, NJ: John Wiley & Sons.
Eady, G. 2016. “Replication Data for: The Statistical Analysis of Misreporting on Sensitive Survey Questions.” https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/PZKBUX (September 16, 2018).
Eady, G. 2017. “The Statistical Analysis of Misreporting on Sensitive Survey Questions.” Political Analysis 25(2):241–259.
Edwards, A. L. 1957. The Social Desirability Variable in Personality Assessment and Research. Fort Worth, TX: Dryden Press.
Fisher, R. J. 1993. “Social Desirability Bias and the Validity of Indirect Questioning.” Journal of Consumer Research 20(2):303–315.
Glynn, A. N. 2013. “What Can We Learn with Statistical Truth Serum? Design and Analysis of the List Experiment.” Public Opinion Quarterly 77(S1):159–172.
Groves, R. M., and Lyberg, L. 2010. “Total Survey Error: Past, Present, and Future.” Public Opinion Quarterly 74(5):849–879.
Huang, J. L., et al. 2012. “Detecting and Deterring Insufficient Effort Responding to Surveys.” Journal of Business and Psychology 27(1):99–114.
Imai, K. 2011. “Multivariate Regression Analysis for the Item Count Technique.” Journal of the American Statistical Association 106(494):407–416.
Johnson, J. A. 2005. “Ascertaining the Validity of Individual Protocols from Web-based Personality Inventories.” Journal of Research in Personality 39(1):103–129.
Jones, M. S., House, L. A., and Gao, Z. 2015. “Attribute Non-Attendance and Satisficing Behavior in Online Choice Experiments.” Proceedings in Food System Dynamics 2015:415–432.
Kapelner, A., and Chandler, D. 2010. “Preventing Satisficing in Online Surveys: A ‘Kapcha’ to Ensure Higher Quality Data.” In Proceedings of CrowdConf 2010, San Francisco, CA.
Kiewiet de Jonge, C. P., and Nickerson, D. W. 2014. “Artificial Inflation or Deflation? Assessing the Item Count Technique in Comparative Surveys.” Political Behavior 36(3):659–682.
King, G., Honaker, J., Joseph, A., and Scheve, K. 2001. “Analyzing Incomplete Political Science Data: An Alternative Algorithm for Multiple Imputation.” American Political Science Review 95(1):49–69.
Krosnick, J. A. 1991. “Response Strategies for Coping with the Cognitive Demands of Attitude Measures in Surveys.” Applied Cognitive Psychology 5(3):213–236.
Little, R. J. A. 1992. “Regression with Missing X’s: A Review.” Journal of the American Statistical Association 87(420):1227–1237.
Maccoby, E. E., Maccoby, N., and Lindzey, G. 1954. “The Interview: A Tool of Social Science.” In Handbook of Social Psychology, edited by Lindzey, G., 449–487. Reading, MA: Addison-Wesley.
Maniaci, M. R., and Rogge, R. D. 2014. “Caring About Carelessness: Participant Inattention and its Effects on Research.” Journal of Research in Personality 48:61–83.
Meade, A. W., and Craig, S. B. 2012. “Identifying Careless Responses in Survey Data.” Psychological Methods 17(3):437–455.
Miller, J. 2006. “Research Reveals Alarming Incidence of ‘Undesirable’ Online Panelists.” Research Conference Report, RFL Communications, Skokie, IL, USA. Available at http://www.burke.com/Library/Articles/Jeff%20Miller%20RCR%20PDF.pdf.
Miller, J. D. 1984. “A New Survey Technique for Studying Deviant Behavior.” PhD thesis, George Washington University.
Oppenheimer, D. M., Meyvis, T., and Davidenko, N. 2009. “Instructional Manipulation Checks: Detecting Satisficing to Increase Statistical Power.” Journal of Experimental Social Psychology 45(4):867–872.
Pepinsky, T. B. 2018. “A Note on Listwise Deletion Versus Multiple Imputation.” Political Analysis 26(4):480–488.
Rosenfeld, B., Imai, K., and Shapiro, J. N. 2016. “An Empirical Validation Study of Popular Survey Methodologies for Sensitive Questions.” American Journal of Political Science 60(3):783–802.
Simon, H. A. 1956. “Rational Choice and the Structure of the Environment.” Psychological Review 63(2):129–138.
Vannette, D. 2017. “Using Attention Checks in Your Surveys May Harm Data Quality.” Qualtrics, https://www.qualtrics.com/blog/using-attention-checks-in-your-surveys-may-harm-data-quality/ (June 14, 2018).
Ward, M. K., and Pond, S. B. 2015. “Using Virtual Presence and Survey Instructions to Minimize Careless Responding on Internet-Based Surveys.” Computers in Human Behavior 48:554–568.
Warner, S. L. 1965. “Randomized Response: A Survey Technique for Eliminating Evasive Answer Bias.” Journal of the American Statistical Association 60(309):63–69.
Zagorsky, J. L., and Rhoton, P. 2008. “The Effects of Promised Monetary Incentives on Attrition in a Long-Term Panel Survey.” Public Opinion Quarterly 72(3):502–513.
Supplementary material: Alvarez et al. supplementary material (File, 685 KB).
