Joint adaptive loss and l2/l0-norm minimization for unsupervised feature selection

Mingjie Qian, Chengxiang Zhai

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Unsupervised feature selection is a useful tool for reducing the complexity and improving the generalization performance of data mining tasks. In this paper, we propose an Adaptive Unsupervised Feature Selection (AUFS) algorithm with explicit l2/l0-norm minimization. We use a joint adaptive loss for data fitting and an l2/l0-norm minimization for feature selection. We solve the optimization problem with an efficient iterative algorithm and prove that all the expected properties of unsupervised feature selection can be preserved. We also show that the computational complexity and memory use are only linear in the number of instances and quadratic in the number of clusters. Experiments show that our algorithm outperforms state-of-the-art methods on seven different benchmark data sets.
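The optimization details appear only in the full paper; as a rough illustration of the l2/l0-norm idea alone, the sketch below ranks features by the l2 norms of the rows of a feature-to-cluster weight matrix W and keeps exactly k nonzero rows (so the row-wise l2/l0 "norm" of W equals k). The function select_features_l20, the random W, and k = 10 are hypothetical placeholders and do not reproduce the AUFS adaptive loss or its iterative update rules.

```python
import numpy as np

def select_features_l20(W, k):
    """Illustrative l2/l0-style selection (not the AUFS algorithm):
    keep the k rows of W (one row per feature) with the largest l2 norms
    and zero out the rest, enforcing exactly k nonzero rows."""
    row_norms = np.linalg.norm(W, axis=1)      # l2 norm of each feature's row
    keep = np.argsort(row_norms)[-k:]          # indices of the k strongest rows
    W_sparse = np.zeros_like(W)
    W_sparse[keep] = W[keep]                   # l2/l0 constraint: k nonzero rows
    return W_sparse, np.sort(keep)

# Hypothetical usage: 100 features projected onto 5 cluster indicators
rng = np.random.default_rng(0)
W = rng.standard_normal((100, 5))
W_sparse, selected = select_features_l20(W, k=10)
print(selected)  # indices of the 10 retained features
```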

Original language: English (US)
Title of host publication: 2015 International Joint Conference on Neural Networks, IJCNN 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781479919604
DOIs
State: Published - Sep 28 2015
Event: International Joint Conference on Neural Networks, IJCNN 2015 - Killarney, Ireland
Duration: Jul 12 2015 – Jul 17 2015

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2015-September

Other

Other: International Joint Conference on Neural Networks, IJCNN 2015
Country: Ireland
City: Killarney
Period: 7/12/15 – 7/17/15

Keywords

  • Laplace equations
  • Optimization

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
