Model selection with covariance matching based non-negative lasso

Arash Owrang, Yoram Bresler, Magnus Jansson

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the problem of model selection for high-dimensional linear regressions in the context of support recovery with multiple measurement vectors available. Here, we assume that the regression coefficient vectors have a common support and the elements of the additive noise vector are potentially correlated. Accordingly, to estimate the support, we propose a non-negative Lasso estimator that is based on covariance matching techniques. We provide deterministic conditions under which the support estimate of our method is guaranteed to match the true support. Further, we use the extended Fisher information criterion to select the tuning parameter in our non-negative Lasso. We also prove that the extended Fisher information criterion can find the true support with probability one as the number of rows in the design matrix grows to infinity. The numerical simulations confirm that our support estimate is asymptotically consistent. Finally, the simulations also show that the proposed method is robust to high correlation between columns of the design matrix.
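To make the core idea concrete, below is a minimal sketch of a generic non-negative Lasso solved by projected gradient descent: minimize 0.5·||Ax − y||² + λ·Σxᵢ subject to x ≥ 0, where the support of x is read off from its nonzero entries. This is an illustrative simplification only; it does not implement the paper's covariance-matching formulation, correlated-noise model, or the extended Fisher information criterion for choosing λ, and the function name and parameters are assumptions for the example.

```python
import numpy as np

def nonneg_lasso(A, y, lam, n_iter=500):
    """Illustrative non-negative Lasso via projected gradient descent.

    Solves min_{x >= 0}  0.5 * ||A x - y||^2 + lam * sum(x).
    Because x >= 0, the l1 penalty reduces to a linear term lam * sum(x).
    NOTE: a generic sketch, not the covariance-matching estimator of the paper.
    """
    # Step size 1/L, with L the Lipschitz constant of the smooth gradient
    # (squared spectral norm of A).
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y) + lam   # gradient of smooth + linear part
        x = np.maximum(x - grad / L, 0.0)  # project onto the nonnegative orthant
    return x

# Example: recover a sparse nonnegative coefficient vector.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[[1, 4]] = [2.0, 3.0]        # true support is {1, 4}
y = A @ x_true                     # noiseless measurements for illustration
x_hat = nonneg_lasso(A, y, lam=0.1)
support_estimate = np.flatnonzero(x_hat > 1e-3)
```

In the paper the tuning parameter λ is selected by the extended Fisher information criterion rather than fixed by hand as above; for λ large enough (λ ≥ max(Aᵀy)) the all-zero vector is a fixed point and the estimated support is empty.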

Original language: English (US)
Article number: 107431
Journal: Signal Processing
Volume: 170
DOIs
State: Published - May 2020

Keywords

  • Covariance matching
  • Extended Bayesian information criterion
  • Generalized least squares
  • High-dimensional inference
  • Model selection
  • Non-negative lasso
  • Regularization
  • Sparse multiple measurement vector model

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
