An Item-Driven Adaptive Design for Calibrating Pretest Items

Usama S. Ali, Hua-hua Chang

Research output: Contribution to journal › Article › peer-review


Adaptive testing is advantageous in that it provides more efficient ability estimates with fewer items than linear testing does. Item-driven adaptive pretesting may offer similar advantages for item calibration, and verifying this hypothesis was the main objective of this study. A suitability index (SI) was introduced to adaptively select pretest items, enabling an easy-to-implement calibration methodology, the adaptive design. A simulation study was conducted to evaluate the proposed adaptive design against existing methodologies. Results indicate that the adaptive design has many desirable features in item calibration, yielding less biased and more accurate parameter estimates than the existing methods. The SI is promising and flexible enough to accommodate additional constraints on the calibration sample and on the pretest items, for example, constraints on response time. It can also be used to try out intact item modules such as those used in multistage testing. Study limitations and future research are also covered in this report.
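The abstract does not give the SI formula, so the following is only a hypothetical sketch of the general idea of item-driven adaptive pretest selection: score each uncalibrated item by how well it matches the examinee's current ability estimate (informative responses for difficulty estimation) and how many more responses it still needs. The function names, the exponential match term, and the target sample size of 300 are all illustrative assumptions, not the paper's actual index.

```python
import math

def suitability_index(theta, b_provisional, n_responses, target_n=300):
    """Hypothetical SI: higher means a better examinee-item match.

    theta          -- examinee's current ability estimate
    b_provisional  -- pretest item's provisional difficulty estimate
    n_responses    -- responses collected for this item so far
    target_n       -- assumed calibration sample size per item
    """
    # Reward items whose provisional difficulty is near the examinee's
    # ability, where responses are most informative for calibration.
    match = math.exp(-abs(theta - b_provisional))
    # Down-weight items that already have enough responses, so the
    # calibration sample is spread across the pretest pool.
    need = max(0.0, 1.0 - n_responses / target_n)
    return match * need

def select_pretest_item(theta, pool):
    """Administer the pretest item with the highest SI for this examinee."""
    return max(pool, key=lambda item: suitability_index(
        theta, item["b_hat"], item["n"]))

# Toy pool: an easy and a hard pretest item, neither yet calibrated.
pool = [
    {"id": "A", "b_hat": 0.0, "n": 0},
    {"id": "B", "b_hat": 2.0, "n": 0},
]
```

In this sketch a low-ability examinee would be routed to the easier pretest item and a high-ability examinee to the harder one, while an item that has reached its target sample size drops out of contention (its SI falls to zero).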
Original language: English (US)
Pages (from-to): 1-12
Journal: ETS Research Report Series
Issue number: 2
State: Published - Dec 2014


  • adaptive design
  • computerized adaptive testing
  • item calibration
  • item response theory
  • pretest items
  • suitability index

