
15 - Non-equivalent Control Group Pretest–Posttest Design in Social and Behavioral Research

from Part III - Data Collection

Published online by Cambridge University Press: 25 May 2023

Austin Lee Nichols
Affiliation:
Central European University, Vienna
John Edlund
Affiliation:
Rochester Institute of Technology, New York

Summary

Experimental research designs feature two essential ingredients: manipulation of an independent variable and random assignment of subjects. In a quasi-experimental design, by contrast, subjects are assigned to groups based on non-random criteria. The design still allows manipulation of the independent variable with the aim of examining the causal relationship between an intervention and an outcome. In social and behavioral research, this design is useful when a randomized controlled design, the "gold standard," is not logistically or ethically feasible. Although not as strong as a true experiment, non-equivalent control group pretest–posttest designs are usually higher in internal validity than correlational designs. Overcoming possible threats to internal and external validity in a non-equivalent control group pretest–posttest design, such as confounding variables, is discussed in relation to sample selection, power, effect size, and specific methods of data analysis.
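To make the analytic ideas in the summary concrete, the following is a minimal sketch of how a non-equivalent control group pretest–posttest dataset might be analyzed with an ANCOVA (posttest regressed on group, adjusting for the pretest) together with an a priori power check. The column names (`pre`, `post`, `group`), the simulated data, and the specific effect sizes are hypothetical illustrations, not the chapter's own example or analysis plan.

```python
# Minimal sketch: ANCOVA for a non-equivalent control group pretest-posttest design.
# Assumptions: hypothetical column names `pre`, `post`, `group`; simulated data
# stands in for a real study with non-random group assignment.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.power import TTestIndPower

rng = np.random.default_rng(42)
n = 60  # per group (illustrative)

# Non-equivalent groups: the intervention group starts higher at pretest,
# mimicking the selection bias that non-random assignment can introduce.
pre_control = rng.normal(50, 10, n)
pre_treat = rng.normal(55, 10, n)
post_control = pre_control + rng.normal(2, 8, n)       # no intervention effect
post_treat = pre_treat + 5 + rng.normal(2, 8, n)       # intervention adds ~5 points

df = pd.DataFrame({
    "pre": np.concatenate([pre_control, pre_treat]),
    "post": np.concatenate([post_control, post_treat]),
    "group": ["control"] * n + ["treatment"] * n,
})

# ANCOVA: the group coefficient estimates the intervention effect
# net of baseline (pretest) differences between the non-equivalent groups.
ancova = smf.ols("post ~ pre + C(group)", data=df).fit()
print(ancova.summary())

# A priori power check: sample size per group needed to detect a medium
# standardized effect (Cohen's d = 0.5) at alpha = .05 with power = .80.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"n per group for d = 0.5: {n_per_group:.0f}")
```

Adjusting for the pretest in this way is one common option; gain-score analysis is another, and the choice between them involves the trade-offs around reliability and bias discussed in the chapter.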

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2023


