Effects of the testing situation on item responding: Cause for concern

Stephen Stark, Oleksandr S. Chernyshenko, Kim Yin Chan, Wayne C. Lee, Fritz Drasgow

Research output: Contribution to journal › Article › peer-review

Abstract

The effects of faking on personality test scores have been studied previously by comparing (a) experimental groups instructed to fake or answer honestly, (b) subgroups created from a single sample of applicants or nonapplicants by using impression management scores, and (c) job applicants and nonapplicants. In this investigation, the latter 2 methods were used to study the effects of faking on the functioning of the items and scales of the Sixteen Personality Factor Questionnaire. A variety of item response theory methods were used to detect differential item/test functioning, interpreted as evidence of faking. The presence of differential item/test functioning across testing situations suggests that faking adversely affects the construct validity of personality scales and that it is problematic to study faking by comparing groups defined by impression management scores.

Original language: English (US)
Pages (from-to): 943-953
Number of pages: 11
Journal: Journal of Applied Psychology
Volume: 86
Issue number: 5
DOIs
State: Published - Oct 2001

ASJC Scopus subject areas

  • Applied Psychology
