Abstract
In this paper, we consider smooth convex approximations to the maximum eigenvalue function. To cover a wide class of applications, the study is conducted on the composition of the maximum eigenvalue function with a linear operator mapping ℝ^m into S^n, the space of n-by-n symmetric matrices. This composite function is the natural objective in minimizing the maximum eigenvalue over an affine subset of S^n. The smoothing leads to a sequence of smooth convex minimization problems governed by a smoothing parameter; as the parameter goes to zero, the original problem is recovered. We then develop a computable formula for the Hessian of the smooth convex functions, give a matrix representation of the Hessian, and study regularity conditions that guarantee nonsingularity of the Hessian matrices. The study of the well-posedness of the smooth convex functions leads to a regularization method that is globally convergent.
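The abstract does not reproduce the smoothing construction itself. A standard choice in this literature is the log-sum-exp (exponential) smoothing of the eigenvalues, f_ε(x) = ε log Σ_i exp(λ_i(A(x))/ε), which satisfies λ_max(A(x)) ≤ f_ε(x) ≤ λ_max(A(x)) + ε log n and therefore recovers the original max-eigenvalue objective as ε → 0. The sketch below is an illustration of that idea only, under assumptions not taken from the paper: the affine operator form A(x) = A₀ + Σ_k x_k A_k, and the names `smoothed_max_eig` and `eps` are hypothetical.

```python
import numpy as np

def smoothed_max_eig(x, A0, As, eps):
    """Log-sum-exp smoothing of the maximum eigenvalue of the affine map
    A(x) = A0 + sum_k x[k] * As[k].  A standard smooth convex approximation;
    the paper's specific construction may differ."""
    M = A0 + sum(xk * Ak for xk, Ak in zip(x, As))
    lam = np.linalg.eigvalsh(M)          # eigenvalues of the symmetric matrix A(x)
    lam_max = lam.max()
    # Shifted log-sum-exp for numerical stability:
    # eps*log(sum_i exp(lam_i/eps)) = lam_max + eps*log(sum_i exp((lam_i - lam_max)/eps))
    return lam_max + eps * np.log(np.exp((lam - lam_max) / eps).sum())

# Hypothetical example: A(x) = A0 + x1*A1 with 3x3 symmetric matrices.
rng = np.random.default_rng(0)
A0 = rng.standard_normal((3, 3)); A0 = (A0 + A0.T) / 2
A1 = rng.standard_normal((3, 3)); A1 = (A1 + A1.T) / 2
x = np.array([0.5])
exact = np.linalg.eigvalsh(A0 + x[0] * A1).max()
for eps in (1.0, 0.1, 0.01):
    print(eps, smoothed_max_eig(x, A0, [A1], eps) - exact)  # gap shrinks toward 0
```

Under these assumptions, the printed gap is bounded by ε log n, consistent with the claim that the original problem is recovered as the smoothing parameter goes to zero.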
| Original language | English (US) |
| --- | --- |
| Article number | PIPS5118271 |
| Pages (from-to) | 253-270 |
| Number of pages | 18 |
| Journal | Journal of Global Optimization |
| Volume | 30 |
| Issue number | 2 |
| DOIs | |
| State | Published - Nov 2004 |
Keywords
- Matrix representation
- Spectral function
- Symmetric function
- Tikhonov regularization
ASJC Scopus subject areas
- Control and Optimization
- Applied Mathematics
- Business, Management and Accounting (miscellaneous)
- Computer Science Applications
- Management Science and Operations Research