Doubly sparsifying network

Zhangyang Wang, Shuai Huang, Jiayu Zhou, Thomas S. Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose the doubly sparsifying network (DSN), drawing inspiration from the double sparsity model for dictionary learning. DSN emphasizes the joint exploitation of both the problem structure and the parameter structure: it simultaneously sparsifies the output features and the learned model parameters under one unified framework. DSN enjoys an intuitive model interpretation, compact model size, and low complexity. We compare DSN against several carefully designed baselines and verify its consistently superior performance across a wide range of settings. Encouraged by its robustness to insufficient training data, we explore the applicability of DSN to brain signal processing, a challenging interdisciplinary area. DSN is evaluated on two mainstream tasks, electroencephalographic (EEG) signal classification and blood oxygenation level dependent (BOLD) response prediction, and achieves promising results in both cases.
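The abstract describes the mechanism only at a high level. As a rough illustration (not the authors' released implementation), the sketch below applies soft-thresholding to sparsify each layer's output features while an L1 penalty sparsifies the layer's own weights; the layer sizes, threshold value, and regularization weight are illustrative assumptions.

```python
# Minimal sketch of "double" sparsification: soft-thresholded features
# plus an L1 penalty on the layer's parameters. All hyperparameters here
# (dimensions, theta, the 1e-4 weight) are illustrative assumptions.
import torch
import torch.nn as nn


def soft_threshold(x, theta):
    # Element-wise shrinkage: zeros out small entries, promoting sparse features.
    return torch.sign(x) * torch.relu(torch.abs(x) - theta)


class DoublySparseLayer(nn.Module):
    def __init__(self, in_dim, out_dim, theta=0.1):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim, bias=False)
        self.theta = theta

    def forward(self, x):
        # Sparsify the output features of the layer.
        return soft_threshold(self.linear(x), self.theta)

    def l1_penalty(self):
        # Penalize the weights themselves, sparsifying the learned parameters.
        return self.linear.weight.abs().sum()


# Toy usage: one training step on random data.
model = nn.Sequential(DoublySparseLayer(64, 32), DoublySparseLayer(32, 10))
x, y = torch.randn(8, 64), torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(model(x), y)
loss = loss + 1e-4 * sum(m.l1_penalty() for m in model
                         if isinstance(m, DoublySparseLayer))
loss.backward()
```

The two mechanisms target the two structures the abstract names: the soft threshold keeps feature maps sparse at inference time, while the L1 term drives many weights toward zero during training, which is what yields the compact model size claimed above.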

Original language: English (US)
Title of host publication: 26th International Joint Conference on Artificial Intelligence, IJCAI 2017
Editors: Carles Sierra
Publisher: International Joint Conferences on Artificial Intelligence
Pages: 3020-3026
Number of pages: 7
ISBN (Electronic): 9780999241103
State: Published - 2017
Event: 26th International Joint Conference on Artificial Intelligence, IJCAI 2017 - Melbourne, Australia
Duration: Aug 19, 2017 - Aug 25, 2017

Publication series

Name: IJCAI International Joint Conference on Artificial Intelligence
Volume: 0
ISSN (Print): 1045-0823

Other

Other: 26th International Joint Conference on Artificial Intelligence, IJCAI 2017
Country/Territory: Australia
City: Melbourne
Period: 8/19/17 - 8/25/17

ASJC Scopus subject areas

  • Artificial Intelligence
