An Inconvenient Truth: Arbitrary Distinctions Between Organizational, Mechanical Turk, and Other Convenience Samples

  • Richard N. Landers and Tara S. Behrend

Abstract

Sampling strategy has critical implications for the validity of a researcher's conclusions. Despite this, sampling is frequently neglected in research methods textbooks, during the research design process, and in the reporting of our journals. The lack of guidance on this issue often leads reviewers and journal editors to rely on simple rules of thumb, myth, and tradition for judgments about sampling, which promotes the unnecessary and counterproductive characterization of sampling strategies as universally “good” or “bad.” Such oversimplification, especially by journal editors and reviewers, slows the progress of the social sciences by treating legitimate data sources as categorically unacceptable. Instead, we argue that sampling is better understood in methodological terms of range restriction and omitted variables bias. This considered approach has far-reaching implications because in industrial–organizational (I-O) psychology, as in most social sciences, virtually all samples are convenience samples. Organizational samples are not gold standard research sources; instead, they are merely a specific type of convenience sample with their own positive and negative implications for validity. This fact does not condemn the science of I-O psychology but does highlight the need for more careful consideration of how and when a finding may generalize based on the particular mix of validity-related affordances provided by each sample source that might be used to investigate a particular research question. We call for researchers to explore such considerations cautiously and explicitly both in the publication and in the review of research.
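The abstract's key methodological point, that sample quality is better judged through range restriction and omitted variables bias than through labels such as "organizational" or "convenience," can be illustrated with a brief simulation. This sketch is not from the article; the variable names, effect sizes, and selection rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed "true" population model (illustrative values, not from the article):
# a predictor (conscientiousness) and a correlated third variable (age)
# jointly determine job performance.
conscientiousness = rng.normal(size=n)
age = 0.4 * conscientiousness + rng.normal(size=n)
performance = 0.5 * conscientiousness + 0.3 * age + rng.normal(size=n)

# 1) Range restriction: an organizational sample containing only people who
#    were hired (here, the top 30% on the predictor) attenuates the observed
#    predictor-criterion correlation relative to the population value.
full_r = np.corrcoef(conscientiousness, performance)[0, 1]
hired = conscientiousness > np.quantile(conscientiousness, 0.70)
restricted_r = np.corrcoef(conscientiousness[hired], performance[hired])[0, 1]
print(f"population r = {full_r:.2f}, range-restricted r = {restricted_r:.2f}")

# 2) Omitted variables bias: regressing performance on the predictor alone
#    inflates its coefficient because age, which correlates with both, is
#    left out of the model.
X_omit = np.column_stack([np.ones(n), conscientiousness])
X_full = np.column_stack([np.ones(n), conscientiousness, age])
b_omit, *_ = np.linalg.lstsq(X_omit, performance, rcond=None)
b_full, *_ = np.linalg.lstsq(X_full, performance, rcond=None)
print(f"slope omitting age = {b_omit[1]:.2f}, slope controlling age = {b_full[1]:.2f}")
```

Under these assumed values, the hired-only subsample noticeably understates the population correlation, and dropping the correlated third variable inflates the predictor's regression weight. These are the kinds of sample-specific validity trade-offs the authors argue should be weighed for any data source, organizational or otherwise.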

Corresponding author

Correspondence concerning this article should be addressed to Richard N. Landers, 250 Mills Godwin Building, Department of Psychology, Old Dominion University, Norfolk, VA 23529, rnlanders@odu.edu
