Unsupervised Word Embedding Learning by Incorporating Local and Global Contexts

Yu Meng, Jiaxin Huang, Guangyuan Wang, Zihan Wang, Chao Zhang, Jiawei Han

Research output: Contribution to journal › Article › peer-review

Abstract

Word embedding has benefited a broad spectrum of text analysis tasks by learning distributed word representations that encode word semantics. Word representations are typically learned by modeling local contexts of words, assuming that words sharing similar surrounding words are semantically close. We argue that local contexts can only partially define word semantics in unsupervised word embedding learning. Global contexts, referring to broader semantic units such as the document or paragraph in which a word appears, capture different aspects of word semantics and complement local contexts. We propose two simple yet effective unsupervised word embedding models that jointly model local and global contexts to learn word representations. We provide theoretical interpretations of the proposed models to demonstrate how local and global contexts are jointly modeled, assuming a generative relationship between words and contexts. We conduct a thorough evaluation on a wide range of benchmark datasets. Our quantitative analysis and case study show that, despite their simplicity, our two proposed models achieve superior performance on word similarity and text classification tasks.
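To make the abstract's central idea concrete, the following is a minimal, hypothetical sketch of combining local and global contexts in an embedding objective. It is not the authors' actual model: it simply predicts a center word from the average of its window-word vectors (local context) plus a per-document vector (global context), trained with negative sampling. All names, hyperparameters, and the toy corpus are illustrative assumptions.

```python
# Hypothetical sketch: joint local + global context word embeddings.
# NOT the paper's model; only illustrates the local/global idea.
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus: each document is a list of tokens (document = global context).
docs = [
    "the cat sat on the mat".split(),
    "dogs and cats are common pets".split(),
    "stocks fell as markets reacted to the news".split(),
]

vocab = sorted({w for d in docs for w in d})
w2i = {w: i for i, w in enumerate(vocab)}
V, D, dim, window, neg_k, lr = len(vocab), len(docs), 16, 2, 3, 0.05

W_in = rng.normal(0, 0.1, (V, dim))    # word vectors used as context
W_out = rng.normal(0, 0.1, (V, dim))   # word vectors used as targets
G = rng.normal(0, 0.1, (D, dim))       # one vector per document (global context)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for epoch in range(200):
    for d_idx, doc in enumerate(docs):
        for pos, word in enumerate(doc):
            target = w2i[word]
            # Local context: words within a small window around the target.
            ctx = [w2i[doc[j]]
                   for j in range(max(0, pos - window),
                                  min(len(doc), pos + window + 1))
                   if j != pos]
            if not ctx:
                continue
            # Combined context = mean of local word vectors + document vector.
            h = W_in[ctx].mean(axis=0) + G[d_idx]
            # Negative sampling: the true target plus neg_k random words.
            negatives = rng.integers(0, V, size=neg_k)
            for label, idx in [(1.0, target)] + [(0.0, n) for n in negatives]:
                score = sigmoid(h @ W_out[idx])
                grad = score - label          # gradient of logistic loss w.r.t. score
                g_h = grad * W_out[idx]       # gradient w.r.t. combined context
                W_out[idx] -= lr * grad * h   # update target vector
                G[d_idx] -= lr * g_h          # update global (document) context
                for c in ctx:                 # update local context words
                    W_in[c] -= lr * g_h / len(ctx)

# After training, W_in (or W_in + W_out) can serve as the word embeddings.
print("embedding for 'cat':", np.round(W_in[w2i["cat"]], 3))
```

In this sketch the document vector plays the same role as the local window average, so words that co-occur in the same documents are pulled together even when their immediate neighbors differ; the paper's actual models and their generative interpretation differ in the details.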

Original language: English (US)
Article number: 9
Journal: Frontiers in Big Data
Volume: 3
DOIs
State: Published - Mar 11 2020

Keywords

  • global contexts
  • local contexts
  • unsupervised learning
  • word embedding
  • word semantics

ASJC Scopus subject areas

  • Computer Science (miscellaneous)
  • Artificial Intelligence
  • Information Systems
