TY - JOUR
T1 - Smooth convex approximation to the maximum eigenvalue function
AU - Chen, Xin
AU - Qi, Houduo
AU - Qi, Liqun
AU - Teo, Kok Lay
N1 - Funding Information:
We thank two anonymous referees for their detailed comments and constructive suggestions on the organization of the paper. In particular, we thank one referee for explicitly pointing out a straightforward extension to the more general case, which is briefly discussed above and may be of interest to a wider audience. This work was supported by the Research Grants Council of Hong Kong.
PY - 2004/11
Y1 - 2004/11
N2 - In this paper, we consider smooth convex approximations to the maximum eigenvalue function. To make the study applicable to a wide class of applications, it is conducted on the composite of the maximum eigenvalue function and a linear operator mapping ℝ^m to S^n, the space of n-by-n symmetric matrices. The composite function is, in turn, the natural objective function for minimizing the maximum eigenvalue function over an affine space in S^n. This leads to a sequence of smooth convex minimization problems governed by a smoothing parameter; as the parameter goes to zero, the original problem is recovered. We then develop a computable formula for the Hessian of the smooth convex functions, give a matrix representation of the Hessian, and study regularity conditions that guarantee the nonsingularity of the Hessian matrices. The study of the well-posedness of the smooth convex functions leads to a globally convergent regularization method.
AB - In this paper, we consider smooth convex approximations to the maximum eigenvalue function. To make the study applicable to a wide class of applications, it is conducted on the composite of the maximum eigenvalue function and a linear operator mapping ℝ^m to S^n, the space of n-by-n symmetric matrices. The composite function is, in turn, the natural objective function for minimizing the maximum eigenvalue function over an affine space in S^n. This leads to a sequence of smooth convex minimization problems governed by a smoothing parameter; as the parameter goes to zero, the original problem is recovered. We then develop a computable formula for the Hessian of the smooth convex functions, give a matrix representation of the Hessian, and study regularity conditions that guarantee the nonsingularity of the Hessian matrices. The study of the well-posedness of the smooth convex functions leads to a globally convergent regularization method.
KW - Matrix representation
KW - Spectral function
KW - Symmetric function
KW - Tikhonov regularization
UR - http://www.scopus.com/inward/record.url?scp=14644435032&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=14644435032&partnerID=8YFLogxK
U2 - 10.1007/s10898-004-8271-2
DO - 10.1007/s10898-004-8271-2
M3 - Article
AN - SCOPUS:14644435032
SN - 0925-5001
VL - 30
SP - 253
EP - 270
JO - Journal of Global Optimization
JF - Journal of Global Optimization
IS - 2
M1 - PIPS5118271
ER -