Unsupervised feature selection is a useful tool for reducing the complexity and improving the generalization performance of data mining tasks. In this paper, we propose an Adaptive Unsupervised Feature Selection (AUFS) algorithm with explicit l2/l0-norm minimization. We use a joint adaptive loss for data fitting and an l2/l0 minimization for feature selection. We solve the optimization problem with an efficient iterative algorithm and prove that all the expected properties of unsupervised feature selection are preserved. We also show that the computational complexity and memory use are only linear in the number of instances and quadratic in the number of clusters. Experiments show that our algorithm outperforms state-of-the-art methods on seven benchmark data sets.
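The abstract's core idea, selecting features under an l2/l0 constraint, can be illustrated with a common baseline pattern: regress a cluster-indicator matrix on the features and keep the k features whose coefficient rows have the largest l2 norms (an l2,0 constraint keeps exactly k nonzero rows). This is a minimal sketch of that generic pattern, not the AUFS algorithm itself; the toy data, ridge regularizer, and fixed indicator matrix are all illustrative assumptions.

```python
# Illustrative l2,0-style unsupervised feature selection (NOT the AUFS
# algorithm from the paper): score each feature by the l2 norm of its
# row in a regression coefficient matrix, then keep the top-k rows.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples, 10 features; only features 0 and 1 carry the
# two-cluster structure, the remaining features are pure noise.
n, d, c = 200, 10, 2
X = rng.normal(size=(n, d))
X[:100, 0] += 4.0
X[100:, 1] += 4.0

# Pseudo cluster-indicator matrix F (n x c). Here it is built from the
# known split for simplicity; in practice it would come from k-means or
# spectral clustering on X.
F = np.zeros((n, c))
F[:100, 0] = 1.0
F[100:, 1] = 1.0

# Ridge regression: W = (X^T X + lam * I)^{-1} X^T F, shape (d, c).
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ F)

# Row-wise l2 norms score the features; enforcing an l2,0 constraint
# amounts to zeroing all but the k highest-scoring rows of W.
scores = np.linalg.norm(W, axis=1)
k = 2
selected = sorted(np.argsort(scores)[-k:].tolist())
print(selected)  # the two informative features
```

The row-norm scoring is what ties the cluster structure back to individual features: a feature whose coefficient row is near zero contributes nothing to reconstructing the indicator matrix and can be dropped.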
|Title of host publication
|2015 International Joint Conference on Neural Networks, IJCNN 2015
|Institute of Electrical and Electronics Engineers Inc.
|9781479919604
|Published - Sep 28 2015
|International Joint Conference on Neural Networks, IJCNN 2015 - Killarney, Ireland
Duration: Jul 12 2015 → Jul 17 2015
|Proceedings of the International Joint Conference on Neural Networks
- Laplace equations
ASJC Scopus subject areas
- Artificial Intelligence