TwHIN-BERT: A Socially-Enriched Pre-trained Language Model for Multilingual Tweet Representations at Twitter

Xinyang Zhang, Yury Malkov, Omar Florez, Serim Park, Brian McWilliams, Jiawei Han, Ahmed El-Kishky

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution
