TY - GEN
T1 - SIR-ABSC: Incorporating Syntax into RoBERTa-based Sentiment Analysis Models with a Special Aggregator Token
T2 - 2023 Findings of the Association for Computational Linguistics: EMNLP 2023
AU - Cho, Ikhyun
AU - Jung, Yoonhwa
AU - Hockenmaier, Julia
N1 - Publisher Copyright:
© 2023 Association for Computational Linguistics.
PY - 2023
Y1 - 2023
AB - We present a simple but effective method to incorporate syntactic dependency information directly into transformer-based language models (e.g., RoBERTa) for tasks such as Aspect-Based Sentiment Classification (ABSC), where the desired output depends on specific input tokens. In contrast to prior approaches to ABSC that capture syntax by combining language models with graph neural networks over dependency trees, our model, Syntax-Integrated RoBERTa for ABSC (SIR-ABSC), incorporates syntax directly into the language model by using a novel aggregator token. SIR-ABSC outperforms these more complex models, yielding new state-of-the-art results on ABSC.
UR - http://www.scopus.com/inward/record.url?scp=85183312099&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85183312099&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85183312099
T3 - Findings of the Association for Computational Linguistics: EMNLP 2023
SP - 8535
EP - 8550
BT - Findings of the Association for Computational Linguistics: EMNLP 2023
PB - Association for Computational Linguistics (ACL)
Y2 - 6 December 2023 through 10 December 2023
ER -