TY - GEN
T1 - Ontology Enrichment for Effective Fine-grained Entity Typing
AU - Ouyang, Siru
AU - Huang, Jiaxin
AU - Pillai, Pranav
AU - Zhang, Yunyi
AU - Zhang, Yu
AU - Han, Jiawei
N1 - Publisher Copyright:
© 2024 Copyright held by the owner/author(s).
PY - 2024/8/25
Y1 - 2024/8/25
N2 - Fine-grained entity typing (FET) is the task of identifying specific entity types at a fine-grained level for entity mentions based on their contextual information. Conventional methods for FET require extensive human annotation, which is time-consuming and costly given the massive scale of data. Recent studies have been developing weakly supervised or zero-shot approaches. We study the setting of zero-shot FET where only an ontology is provided. However, most existing ontology structures lack rich supporting information and even contain ambiguous relations, making them ineffective in guiding FET. Recently developed language models, though promising in various few-shot and zero-shot NLP tasks, may face challenges in zero-shot FET due to their lack of interaction with task-specific ontology. In this study, we propose our method, in which we (1) enrich each node in the ontology structure with two categories of extra information: instance information for training sample augmentation and topic information to relate types with contexts, and (2) develop a coarse-to-fine typing algorithm that exploits the enriched information by training an entailment model with contrasting topics and instance-based augmented training samples. Our experiments show that our method achieves high-quality fine-grained entity typing without human annotation, outperforming existing zero-shot methods by a large margin and rivaling supervised methods. Our method also enjoys strong transferability to unseen and finer-grained types. We will open source this work upon acceptance.
AB - Fine-grained entity typing (FET) is the task of identifying specific entity types at a fine-grained level for entity mentions based on their contextual information. Conventional methods for FET require extensive human annotation, which is time-consuming and costly given the massive scale of data. Recent studies have been developing weakly supervised or zero-shot approaches. We study the setting of zero-shot FET where only an ontology is provided. However, most existing ontology structures lack rich supporting information and even contain ambiguous relations, making them ineffective in guiding FET. Recently developed language models, though promising in various few-shot and zero-shot NLP tasks, may face challenges in zero-shot FET due to their lack of interaction with task-specific ontology. In this study, we propose our method, in which we (1) enrich each node in the ontology structure with two categories of extra information: instance information for training sample augmentation and topic information to relate types with contexts, and (2) develop a coarse-to-fine typing algorithm that exploits the enriched information by training an entailment model with contrasting topics and instance-based augmented training samples. Our experiments show that our method achieves high-quality fine-grained entity typing without human annotation, outperforming existing zero-shot methods by a large margin and rivaling supervised methods. Our method also enjoys strong transferability to unseen and finer-grained types. We will open source this work upon acceptance.
KW - fine-grained entity typing
KW - language models
KW - natural language inference
KW - zero-shot learning
UR - http://www.scopus.com/inward/record.url?scp=85203669221&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85203669221&partnerID=8YFLogxK
U2 - 10.1145/3637528.3671857
DO - 10.1145/3637528.3671857
M3 - Conference contribution
AN - SCOPUS:85203669221
T3 - Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
SP - 2318
EP - 2327
BT - KDD 2024 - Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
PB - Association for Computing Machinery
T2 - 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2024
Y2 - 25 August 2024 through 29 August 2024
ER -