TY - JOUR
T1 - Scalable algorithms for locally low-rank matrix modeling
AU - Gu, Qilong
AU - Trzasko, Joshua D.
AU - Banerjee, Arindam
N1 - Funding Information:
The research was supported by NSF grants IIS-1563950, IIS-1447566, IIS-1447574, IIS-1422557, CCF-1451986, CNS-1314560, IIS-0953274, IIS-1029711, CCF:CIF:Small:1318347, NASA grant NNX12AQ39A, the "Mayo Clinic Discovery Translation Program", and gifts from Adobe, IBM, and Yahoo.
Publisher Copyright:
© 2019, Springer-Verlag London Ltd., part of Springer Nature.
PY - 2019/12/1
Y1 - 2019/12/1
N2 - We consider the problem of modeling data matrices with locally low-rank (LLR) structure, a generalization of the popular low-rank structure widely used in a variety of real-world application domains ranging from medical imaging to recommendation systems. While LLR modeling has been found to be promising in such domains, limited progress has been made on the design of scalable algorithms for these structures. In this paper, we consider a convex relaxation of the LLR structure and propose an efficient algorithm based on dual projected gradient descent (D-PGD) for computing the proximal operator. While the original problem is non-smooth, so that primal (sub)gradient algorithms will be slow, we show that the proposed D-PGD algorithm has a geometric convergence rate. We present several practical ways to further speed up the computations, including acceleration and approximate SVD computations. With experiments on both synthetic data and real data from MRI (magnetic resonance imaging) denoising, we illustrate the superior performance of the proposed D-PGD algorithm compared to several baselines.
KW - Geometrical convergence
KW - Locally low rank
KW - MRI
KW - Projected gradient descent
UR - http://www.scopus.com/inward/record.url?scp=85061209240&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85061209240&partnerID=8YFLogxK
U2 - 10.1007/s10115-018-1320-9
DO - 10.1007/s10115-018-1320-9
M3 - Article
AN - SCOPUS:85061209240
SN - 0219-1377
VL - 61
SP - 1457
EP - 1484
JO - Knowledge and Information Systems
JF - Knowledge and Information Systems
IS - 3
ER -