
14 - Quasi-Experimental Research

from Part III - Data Collection

Published online by Cambridge University Press:  25 May 2023

Austin Lee Nichols, Central European University, Vienna
John Edlund, Rochester Institute of Technology, New York
Summary

In this chapter, we discuss the logic and practice of quasi-experimentation. Specifically, we describe four quasi-experimental designs – the one-group pretest–posttest design, the non-equivalent group design, the regression discontinuity design, and the interrupted time-series design – and their statistical analyses in detail. Both simple quasi-experimental designs and embellishments of these simple designs are presented. Potential threats to internal validity are illustrated, along with means of addressing them so that their biasing effects can be minimized. In contrast to quasi-experiments, randomized experiments are often considered the gold standard for estimating the effects of treatment interventions. However, circumstances frequently arise in which quasi-experiments can usefully supplement randomized experiments or can fruitfully be used in their place. Researchers need to appreciate the relative strengths and weaknesses of the various quasi-experimental designs so that they can choose among pre-specified designs or craft their own.
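The regression discontinuity design mentioned in the summary can be sketched with simulated data. The variable names, cutoff, and effect size below are illustrative assumptions, not values from the chapter; the sketch fits the standard local linear specification by ordinary least squares, where the coefficient on the treatment indicator estimates the effect at the cutoff.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated sharp regression discontinuity (illustrative assumptions only):
# units scoring at or above the cutoff on the running variable receive treatment.
n = 2000
cutoff = 50.0
running = rng.uniform(0, 100, n)             # assignment (running) variable
treated = (running >= cutoff).astype(float)  # sharp assignment rule at the cutoff
true_effect = 5.0
outcome = 10.0 + 0.2 * running + true_effect * treated + rng.normal(0, 2.0, n)

# Center the running variable at the cutoff and fit
#   outcome = b0 + b1*treated + b2*centered + b3*(treated*centered) + error
# by ordinary least squares; b1 estimates the treatment effect at the cutoff,
# and the interaction term lets the slope differ on each side of the cutoff.
centered = running - cutoff
X = np.column_stack([np.ones(n), treated, centered, treated * centered])
coefs, *_ = np.linalg.lstsq(X, outcome, rcond=None)
rd_estimate = coefs[1]
print(rd_estimate)
```

In practice the fit is usually restricted to observations within a bandwidth around the cutoff rather than the full range used in this sketch, since the linear approximation is only trusted locally.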

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2023


