Effects of Differential Item Functioning on Examinees' Test Performance and Reliability of Test

Yi Hsuan Lee, Jinming Zhang

Research output: Contribution to journal › Article

Abstract

Simulations were conducted to examine the effect of differential item functioning (DIF) on measurement consequences such as total scores, item response theory (IRT) ability estimates, and test reliability, the latter expressed both as the ratio of true-score variance to observed-score variance and as the standard error of estimation for the IRT ability parameter. The objective was to provide bounds on the likely DIF effects on these measurement consequences. Five factors were manipulated: test length, percentage of DIF items per form, item type, sample size, and level of group ability difference. Results indicate that the greatest DIF effect was less than 2 points on the 0 to 60 total score scale and about 0.15 on the IRT ability scale. DIF had a limited effect on the ratio of true-score variance to observed-score variance, but its influence on the standard error of estimation for the IRT ability parameter was evident for certain ability values.
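The reliability index discussed in the abstract follows the classical test theory definition: reliability is the ratio of true-score variance to observed-score variance. The following is a minimal illustrative sketch of that ratio under assumed variances, not a reproduction of the authors' simulation design; the score scale, variance values, and sample size are hypothetical.

```python
import numpy as np

# Illustrative sketch (hypothetical values, not the study's design):
# under classical test theory, observed score = true score + error,
# and reliability = var(true) / var(observed).
rng = np.random.default_rng(0)

n_examinees = 10_000
true_scores = rng.normal(loc=30.0, scale=6.0, size=n_examinees)  # assumed 0-60 scale
errors = rng.normal(loc=0.0, scale=3.0, size=n_examinees)        # assumed error SD
observed = true_scores + errors

reliability = true_scores.var() / observed.var()
print(round(reliability, 3))  # near 36 / (36 + 9) = 0.8 for these assumed variances
```

With independent errors, the observed-score variance is approximately the sum of the true-score and error variances, so the ratio here lands near 0.8; DIF that inflates error variance for a subgroup would lower this ratio.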

Original language: English (US)
Pages (from-to): 23-54
Number of pages: 32
Journal: International Journal of Testing
Volume: 17
Issue number: 1
DOIs
State: Published - Jan 2 2017

Keywords

  • differential item functioning
  • expected a posteriori
  • standard error of estimation
  • test information function
  • total score

ASJC Scopus subject areas

  • Social Psychology
  • Education
  • Modeling and Simulation
