TY - GEN
T1 - Testing the Keystone Framework by Analyzing Positive Citations to Wakefield’s 1998 Paper
AU - Addepalli, Amulya
AU - Subin, Karen Ann
AU - Schneider, Jodi
N1 - Publisher Copyright:
© 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2022
Y1 - 2022
AB - Science is constantly developing as new information is discovered. Papers discredited by the scientific community may be retracted. Such papers might have been cited before they were retracted (as well as afterwards), which could potentially spread a chain of unreliable information. To address this, Fu and Schneider (2020) introduced the keystone framework for auditing how and whether a paper fundamentally depends on another paper, and proposed that an alerting system be developed to flag papers that fundamentally depend on retracted papers. The main challenge of such alerting systems is the need for expert labor. This paper tests whether a flowchart process for non-experts could accurately assess dependencies between papers, reducing the need for expert assessment. We do this by developing such a process and testing it on citations to one highly cited retracted paper. In our case study, non-experts using our process can resolve the question of dependency in about half the cases. Two annotators had 92.9% agreement on 85 annotated papers, with 100% agreement after discussion. In future work we will assess the reliability of non-experts’ decisions as compared to experts, and identify possibilities for automation.
KW - Keystone citations
KW - Knowledge maintenance
KW - Misinformation in science
KW - Retracted papers
KW - Wakefield
UR - http://www.scopus.com/inward/record.url?scp=85126267209&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85126267209&partnerID=8YFLogxK
DO - 10.1007/978-3-030-96957-8_9
M3 - Conference contribution
AN - SCOPUS:85126267209
SN - 9783030969561
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 79
EP - 88
BT - Information for a Better World
A2 - Smits, Malte
PB - Springer
T2 - 17th International Conference on Information for a Better World: Shaping the Global Future, iConference 2022
Y2 - 28 February 2022 through 4 March 2022
ER -