TY - JOUR
T1 - Response pattern analysis
T2 - Assuring data integrity in extreme research settings
AU - Christensen, Lisa Jones
AU - Siemsen, Enno
AU - Branzei, Oana
AU - Viswanathan, Madhu
N1 - Publisher Copyright:
Copyright © 2016 John Wiley & Sons, Ltd.
PY - 2017/2/1
Y1 - 2017/2/1
AB - Research summary: Strategy scholars increasingly conduct research in nontraditional contexts. Such efforts often require the assistance of third-party intermediaries who understand local culture, norms, and language. This reliance on intermediation in primary or secondary data collection can elicit agency breakdowns that call into question the reliability, analyzability, and interpretability of responses. Herein, we investigate the causes and consequences of intermediary bias in the form of faked data, and we offer Response Pattern Analysis as a statistical solution for identifying and removing such problematic data. By explicating the effect, illustrating how we detected it, and performing a controlled field experiment in a developing country to test the effectiveness of our methodological solution, we encourage researchers to continue to seek data and build theory from unique and understudied settings. Managerial summary: Any form of survey research carries the risk of interviewers faking data. This risk is particularly difficult to mitigate in Base-of-Pyramid or developing country contexts, where researchers have to rely on intermediaries and forms of control are limited. We provide a statistical technique to identify a faking interviewer ex post data collection and remove the associated data prior to analysis. Using a field experiment in which we instruct interviewers to fake the data, we demonstrate that the algorithm we employ achieves 90 percent accuracy in differentiating faking from nonfaking interviewers.
KW - nontraditional contexts
KW - research methods
KW - survey administration
KW - survey design
KW - survey research
UR - http://www.scopus.com/inward/record.url?scp=84960868916&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84960868916&partnerID=8YFLogxK
U2 - 10.1002/smj.2497
DO - 10.1002/smj.2497
M3 - Article
AN - SCOPUS:84960868916
SN - 0143-2095
VL - 38
SP - 471
EP - 482
JO - Strategic Management Journal
JF - Strategic Management Journal
IS - 2
ER -