Algorithmic unfairness mitigation in student models: When fairer methods lead to unintended results

Frank Stinar, Nigel Bosch

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Systematically unfair education systems lead to different levels of learning for students from different demographic groups, which, in the context of AI-driven education, has inspired work on mitigating unfairness in machine learning methods. However, unfairness mitigation methods may lead to unintended consequences for classrooms and students. We examined preprocessing and postprocessing unfairness mitigation algorithms in the context of a large dataset, the State of Texas Assessments of Academic Readiness (STAAR) outcome data, to investigate these issues. We evaluated each unfairness mitigation algorithm across multiple machine learning models using different definitions of fairness. We then evaluated how unfairness mitigation impacts classifications of students across different combinations of machine learning models, unfairness mitigation methods, and definitions of fairness. On average, unfairness mitigation methods led to a 22% improvement in fairness. When examining the impacts of unfairness mitigation methods on predictions, we found that these methods led to models that can and did overgeneralize groups. Consequently, predictions made by such models may not reach the intended audiences. We discuss the implications for AI-driven interventions and student support.
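
The sketch below is a minimal illustration (not the authors' implementation) of the kind of pipeline the abstract describes: one fairness definition (demographic parity) and one simple postprocessing mitigation step (per-group decision thresholds). The synthetic data, the logistic regression model, and the thresholding rule are all hypothetical placeholders chosen only to show how equalizing group selection rates can smooth over within-group differences.

# Minimal sketch (hypothetical data and model), showing a demographic-parity
# check before and after a simple per-group-threshold postprocessing step.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "student" data: two demographic groups with different base rates.
n = 2000
group = rng.integers(0, 2, n)                       # protected attribute (0/1)
x = rng.normal(size=(n, 3)) + group[:, None] * 0.5  # features correlated with group
y = (x.sum(axis=1) + rng.normal(scale=1.0, size=n) > 1.0).astype(int)

model = LogisticRegression().fit(np.c_[x, group], y)
scores = model.predict_proba(np.c_[x, group])[:, 1]

def demographic_parity_diff(pred, group):
    """Absolute difference in positive-prediction rates between the two groups."""
    return abs(pred[group == 0].mean() - pred[group == 1].mean())

# Before mitigation: a single 0.5 threshold for everyone.
pred_raw = (scores >= 0.5).astype(int)
print("DP gap, single threshold:", demographic_parity_diff(pred_raw, group))

# Postprocessing mitigation: choose a per-group threshold so both groups get
# roughly the same positive-prediction rate. Forcing equal selection rates can
# overgeneralize groups, the kind of unintended effect the paper examines.
target_rate = pred_raw.mean()
pred_fair = np.zeros(n, dtype=int)
for g in (0, 1):
    thr = np.quantile(scores[group == g], 1 - target_rate)
    pred_fair[group == g] = (scores[group == g] >= thr).astype(int)
print("DP gap, per-group thresholds:", demographic_parity_diff(pred_fair, group))

The same structure extends to other fairness definitions (e.g., equalized odds) by swapping the metric function, which is how multiple definitions can be compared across models.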

Original language: English (US)
Title of host publication: Proceedings of the 15th International Conference on Educational Data Mining, EDM 2022
Publisher: International Educational Data Mining Society
ISBN (Electronic): 9781733673631
DOIs
State: Published - 2022
Event: 15th International Conference on Educational Data Mining, EDM 2022 - Hybrid, Durham, United Kingdom
Duration: Jul 24 2022 - Jul 27 2022

Publication series

Name: Proceedings of the 15th International Conference on Educational Data Mining, EDM 2022

Conference

Conference: 15th International Conference on Educational Data Mining, EDM 2022
Country/Territory: United Kingdom
City: Hybrid, Durham
Period: 7/24/22 - 7/27/22

Keywords

  • Data science applications in education
  • Fairness
  • Machine learning
  • Unfairness mitigation

ASJC Scopus subject areas

  • Computer Science Applications
  • Information Systems
