Complementary dimension reduction

Na Cui, Jianjun Hu, Feng Liang

Research output: Contribution to journal › Article › peer-review


The goal of supervised dimension reduction (SDR) is to find a compact yet informative representation of the feature vector. Most SDR algorithms are formulated to solve sequential optimization problems with objective functions being linear functions of the L2 norm of the data, for example, the well-known Fisher's discriminant analysis (FDA). A drawback of such objective functions is that they favor directions that result in large between-class distances; however, if the large between-class distance comes mainly from classes that have already been well separated by prior directions, the new direction yields a negligible improvement in classification accuracy. To address this issue, we introduce an objective function that directly quantifies classification accuracy, and present an efficient algorithm that retrieves directions sequentially from this nonlinear objective function. A key feature of our algorithm is that each newly added direction works complementarily with the previously solved directions to boost the discriminative power of the reduced space as a whole. We therefore name our new algorithm “Complementary Dimension Analysis” (CDA). We have further generalized CDA to retrieve sparse directions that involve only a small fraction of the features. Finally, we demonstrate the utility of our algorithms on several simulated and real datasets.
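The sequential, accuracy-driven search described in the abstract can be illustrated with a minimal toy sketch. This is not the paper's actual optimization procedure: here each new direction is chosen by random candidate search, scored by the training accuracy of a nearest-centroid classifier in the reduced space, and orthogonalized against earlier directions. All function names and the scoring rule are illustrative assumptions.

```python
import numpy as np

def nearest_centroid_accuracy(Z, y):
    # Training accuracy of a nearest-centroid classifier in the reduced space Z.
    classes = np.unique(y)
    centroids = np.stack([Z[y == c].mean(axis=0) for c in classes])
    dists = ((Z[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    pred = classes[dists.argmin(axis=1)]
    return (pred == y).mean()

def cda_sketch(X, y, k=2, n_candidates=500, seed=0):
    """Toy greedy search in the spirit of CDA: each new direction is picked
    to maximize classification accuracy of the reduced space as a whole,
    constrained to be orthogonal to the directions found so far."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    W = np.zeros((p, 0))
    for _ in range(k):
        best_acc, best_w = -1.0, None
        for _ in range(n_candidates):
            w = rng.standard_normal(p)
            # Orthogonalize against previously found directions, then normalize.
            w -= W @ (W.T @ w)
            norm = np.linalg.norm(w)
            if norm < 1e-12:
                continue
            w /= norm
            # Score the *whole* reduced space, so the new direction is rewarded
            # only for improving on what earlier directions already separate.
            acc = nearest_centroid_accuracy(X @ np.column_stack([W, w]), y)
            if acc > best_acc:
                best_acc, best_w = acc, w
        W = np.column_stack([W, best_w])
    return W
```

The key design point the abstract emphasizes is in the scoring step: the candidate direction is evaluated jointly with all prior directions, so a direction that merely re-separates already-separated classes gains nothing.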

Original language: English (US)
Pages (from-to): 31-40
Number of pages: 10
Journal: Statistical Analysis and Data Mining
Issue number: 1
State: Published - Feb 2021


Keywords

  • dimension reduction
  • sparse eigen-decomposition

ASJC Scopus subject areas

  • Analysis
  • Information Systems
  • Computer Science Applications


