TY - GEN
T1 - Information complexity of black-box convex optimization
T2 - 2009 47th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2009
AU - Raginsky, Maxim
AU - Rakhlin, Alexander
PY - 2009
Y1 - 2009
N2 - This paper revisits the information complexity of black-box convex optimization, first studied in the seminal work of Nemirovski and Yudin, from the perspective of feedback information theory. These days, large-scale convex programming arises in a variety of applications, and it is important to refine our understanding of its fundamental limitations. The goal of black-box convex optimization is to minimize an unknown convex objective function from a given class over a compact, convex domain using an iterative scheme that generates approximate solutions by querying an oracle for local information about the function being optimized. The information complexity of a given problem class is defined as the smallest number of queries needed to minimize every function in the class to some desired accuracy. We present a simple information-theoretic approach that not only recovers many of the results of Nemirovski and Yudin, but also gives some new bounds pertaining to optimal rates at which iterative convex optimization schemes approach the solution. As a bonus, we give a particularly simple derivation of the minimax lower bound for a certain active learning problem on the unit interval.
AB - This paper revisits the information complexity of black-box convex optimization, first studied in the seminal work of Nemirovski and Yudin, from the perspective of feedback information theory. These days, large-scale convex programming arises in a variety of applications, and it is important to refine our understanding of its fundamental limitations. The goal of black-box convex optimization is to minimize an unknown convex objective function from a given class over a compact, convex domain using an iterative scheme that generates approximate solutions by querying an oracle for local information about the function being optimized. The information complexity of a given problem class is defined as the smallest number of queries needed to minimize every function in the class to some desired accuracy. We present a simple information-theoretic approach that not only recovers many of the results of Nemirovski and Yudin, but also gives some new bounds pertaining to optimal rates at which iterative convex optimization schemes approach the solution. As a bonus, we give a particularly simple derivation of the minimax lower bound for a certain active learning problem on the unit interval.
UR - http://www.scopus.com/inward/record.url?scp=77949573608&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=77949573608&partnerID=8YFLogxK
U2 - 10.1109/ALLERTON.2009.5394945
DO - 10.1109/ALLERTON.2009.5394945
M3 - Conference contribution
AN - SCOPUS:77949573608
SN - 9781424458714
T3 - 2009 47th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2009
SP - 803
EP - 810
BT - 2009 47th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2009
Y2 - 30 September 2009 through 2 October 2009
ER -