TY - GEN
T1 - EVEDIT
T2 - 2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
AU - Liu, Jiateng
AU - Yu, Pengfei
AU - Zhang, Yuji
AU - Li, Sha
AU - Zhang, Zixuan
AU - Small, Kevin
AU - Sarikaya, Ruhi
AU - Ji, Heng
N1 - Publisher Copyright:
© 2024 Association for Computational Linguistics.
PY - 2024
Y1 - 2024
N2 - The dynamic nature of real-world information necessitates knowledge editing (KE) in large language models (LLMs). This edited knowledge should propagate and facilitate the deduction of new information based on existing model knowledge. We define the existing related knowledge in an LLM that serves as the origin of knowledge propagation as “deduction anchors”. However, most current KE approaches operate only on (subject, relation, object) triples. Both theoretically and empirically, we observe that this simplified setting often leads to uncertainty when determining the deduction anchors, causing LLMs to generate low-confidence responses. To mitigate this issue, we propose a novel task of event-based knowledge editing that pairs facts with event descriptions. This task is both a closer simulation of real-world editing scenarios and a more logically sound setting, since it implicitly defines the deduction anchors and enables LLMs to propagate knowledge confidently. We curate a new benchmark dataset, EVEDIT, derived from the COUNTERFACT dataset and validate its superiority in improving model confidence. Moreover, as we observe that the event-based setting is notably challenging for existing approaches, we propose a novel approach, Self-Edit, that achieves stronger performance, with a 55.6% consistency improvement while maintaining the naturalness of generation.
AB - The dynamic nature of real-world information necessitates knowledge editing (KE) in large language models (LLMs). This edited knowledge should propagate and facilitate the deduction of new information based on existing model knowledge. We define the existing related knowledge in an LLM that serves as the origin of knowledge propagation as “deduction anchors”. However, most current KE approaches operate only on (subject, relation, object) triples. Both theoretically and empirically, we observe that this simplified setting often leads to uncertainty when determining the deduction anchors, causing LLMs to generate low-confidence responses. To mitigate this issue, we propose a novel task of event-based knowledge editing that pairs facts with event descriptions. This task is both a closer simulation of real-world editing scenarios and a more logically sound setting, since it implicitly defines the deduction anchors and enables LLMs to propagate knowledge confidently. We curate a new benchmark dataset, EVEDIT, derived from the COUNTERFACT dataset and validate its superiority in improving model confidence. Moreover, as we observe that the event-based setting is notably challenging for existing approaches, we propose a novel approach, Self-Edit, that achieves stronger performance, with a 55.6% consistency improvement while maintaining the naturalness of generation.
UR - http://www.scopus.com/inward/record.url?scp=85217760398&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85217760398&partnerID=8YFLogxK
U2 - 10.18653/v1/2024.emnlp-main.282
DO - 10.18653/v1/2024.emnlp-main.282
M3 - Conference contribution
AN - SCOPUS:85217760398
T3 - EMNLP 2024 - 2024 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
SP - 4907
EP - 4926
BT - EMNLP 2024 - 2024 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
A2 - Al-Onaizan, Yaser
A2 - Bansal, Mohit
A2 - Chen, Yun-Nung
PB - Association for Computational Linguistics (ACL)
Y2 - 12 November 2024 through 16 November 2024
ER -