Revival of test bias research in preemployment testing

Herman Aguinis, Steven A. Culpepper, Charles A. Pierce

Research output: Contribution to journal › Article › peer-review

Abstract

We developed a new analytic proof and conducted Monte Carlo simulations to assess the effects of methodological and statistical artifacts on the relative accuracy of intercept- and slope-based test bias assessment. The main simulation design included 3,185,000 unique combinations of a wide range of values for true intercept- and slope-based test bias, total sample size, proportion of minority group sample size to total sample size, predictor (i.e., preemployment test scores) and criterion (i.e., job performance) reliability, predictor range restriction, correlation between predictor scores and the dummy-coded grouping variable (e.g., ethnicity), and mean difference between predictor scores across groups. Results based on 15 billion 925 million individual samples of scores and more than 8 trillion 662 million individual scores raise questions about the established conclusion that test bias in preemployment testing is nonexistent and that, when it does exist, it involves only intercept-based differences that favor minority group members. Because of the prominence of test fairness in the popular media, legislation, and litigation, our results point to the need to revive test bias research in preemployment testing.
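The sketch below is a minimal illustration (not the authors' code) of what one cell of a Monte Carlo design like the one described in the abstract could look like, assuming the classic Cleary moderated-regression model of test bias, in which job performance is regressed on test scores, a dummy-coded group variable, and their product, so that the group coefficient captures intercept-based bias and the product coefficient captures slope-based bias. All parameter values, function names, and data-generating choices here are illustrative assumptions, not taken from the article.

```python
# Hypothetical sketch of one Monte Carlo cell for intercept-/slope-based test bias
# assessment under measurement error and range restriction. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def simulate_sample(n_total=200, p_minority=0.3,
                    true_intercept_bias=0.0, true_slope_bias=0.0,
                    rxx=0.8, ryy=0.8, selection_ratio=0.7, d_predictor=0.5):
    """Generate one sample of observed test scores, group codes, and performance."""
    group = (rng.random(n_total) < p_minority).astype(float)   # 1 = minority group
    true_test = rng.normal(-d_predictor * group, 1.0)          # predictor mean difference
    # True criterion follows the bias model on TRUE scores
    perf_true = (0.5 * true_test
                 + true_intercept_bias * group
                 + true_slope_bias * true_test * group
                 + rng.normal(0.0, 1.0, n_total))
    # Add measurement error to reach the target predictor/criterion reliabilities
    obs_test = np.sqrt(rxx) * true_test + np.sqrt(1 - rxx) * rng.normal(size=n_total)
    obs_perf = (np.sqrt(ryy) * perf_true / perf_true.std()
                + np.sqrt(1 - ryy) * rng.normal(size=n_total))
    # Direct range restriction: keep only applicants above the selection cut on the test
    cut = np.quantile(obs_test, 1 - selection_ratio)
    keep = obs_test >= cut
    return obs_test[keep], group[keep], obs_perf[keep]

def fit_bias_model(test, group, perf):
    """OLS fit of performance on test, group, and test*group; returns coefficients."""
    X = np.column_stack([np.ones_like(test), test, group, test * group])
    beta, *_ = np.linalg.lstsq(X, perf, rcond=None)
    return dict(zip(["intercept", "test", "group", "test_x_group"], beta))

# Repeat many times to study how intercept and slope estimates behave in this cell
estimates = [fit_bias_model(*simulate_sample()) for _ in range(1000)]
print("mean estimated group (intercept) effect:",
      np.mean([e["group"] for e in estimates]))
print("mean estimated test*group (slope) effect:",
      np.mean([e["test_x_group"] for e in estimates]))
```

A full design in this spirit would sweep the arguments of simulate_sample over many values and record how often intercept- and slope-based bias is detected in each cell; the specific factor levels used in the article are not reproduced here.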

Original language: English (US)
Pages (from-to): 648-680
Number of pages: 33
Journal: Journal of Applied Psychology
Volume: 95
Issue number: 4
DOIs
State: Published - Jul 2010
Externally published: Yes

Keywords

  • Employee selection
  • Human resource management
  • Selection fairness
  • Staffing
  • Testing practices

ASJC Scopus subject areas

  • Applied Psychology
