Exploring Bias in Student Evaluations: Gender, Race, and Ethnicity

  • Kerry Chávez and Kristina M. W. Mitchell

Abstract

Research continues to accumulate showing that students are biased against women in instructor evaluations. This article extends these analyses by examining the dynamics between evaluations and gender and race/ethnicity. In a quasi-experimental design, faculty members teaching identical online courses recorded welcome videos that were presented to students at the onset of the course, constituting the students’ sole exposure to perceived gender and race/ethnicity. This design enables exploration of whether and to what degree the instructors’ characteristics influenced student evaluations, even after holding all other course factors constant. Findings show that instructors who are women and persons of color receive lower scores on ordinal student evaluations than those who are white men. Overall, we add further evidence to a growing literature calling for reform of student evaluations of teaching (SETs) and extend it to encompass the effects on racial/ethnic minorities in addition to women.
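The article’s full estimation strategy is not reproduced on this page, but a minimal sketch of the kind of ordered-logit analysis commonly applied to ordinal SET scores appears below. The variable names (set_score, female, poc) and the simulated data are hypothetical illustrations of the approach, not the authors’ data or code.

import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 300

# Hypothetical data: perceived instructor characteristics and a 1-5 ordinal rating.
female = rng.integers(0, 2, n)              # 1 if the perceived instructor is a woman
poc = rng.integers(0, 2, n)                 # 1 if the perceived instructor is a person of color
latent = -0.4 * female - 0.5 * poc + rng.logistic(size=n)
score = pd.cut(latent,
               bins=[-np.inf, -1.5, -0.5, 0.5, 1.5, np.inf],
               labels=[1, 2, 3, 4, 5]).astype(int)

df = pd.DataFrame({"set_score": score, "female": female, "poc": poc})

# Ordered logit on the 1-5 rating: negative coefficients indicate lower expected scores
# for the corresponding instructor characteristic, holding the other constant.
model = OrderedModel(df["set_score"], df[["female", "poc"]], distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())

In a design like the one described above, where course content is identical and the welcome video is the only cue to gender and race/ethnicity, coefficients of this kind can be read as the association between perceived instructor identity and the ordinal evaluation score.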


Supplementary materials

Chávez and Mitchell supplementary material: Tables A1–A2 (PDF, 42 KB)
