TY - GEN
T1 - Optimal robust smoothing extragradient algorithms for stochastic variational inequality problems
AU - Yousefian, Farzad
AU - Nedic, Angelia
AU - Shanbhag, Uday V.
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2014
Y1 - 2014
N2 - We consider stochastic variational inequality problems where the mapping is monotone over a compact convex set. We present two robust variants of stochastic extragradient algorithms for solving such problems. The first scheme employs an iterative averaging technique with a generalized choice of weights in the averaged sequence. Our first contribution is to show that, for an appropriate choice of these weights, a suitably defined gap function attains the optimal convergence rate of O(1/√k). In the second part of the paper, under an additional weak-sharpness assumption, we update the stepsize sequence through a recursive rule that leverages problem parameters. Our second contribution is to show that, with such a stepsize sequence, the extragradient algorithm converges to the solution of the problem both almost surely and in a mean-squared sense at the rate O(1/k). Motivated by the absence of a Lipschitzian parameter, both schemes employ a locally randomized smoothing technique. Importantly, by approximating the original mapping with a smooth one, this technique enables us to estimate a Lipschitzian parameter. The smoothing parameter is updated at each iteration, and we show that both algorithms converge to the solution of the original problem.
AB - We consider stochastic variational inequality problems where the mapping is monotone over a compact convex set. We present two robust variants of stochastic extragradient algorithms for solving such problems. The first scheme employs an iterative averaging technique with a generalized choice of weights in the averaged sequence. Our first contribution is to show that, for an appropriate choice of these weights, a suitably defined gap function attains the optimal convergence rate of O(1/√k). In the second part of the paper, under an additional weak-sharpness assumption, we update the stepsize sequence through a recursive rule that leverages problem parameters. Our second contribution is to show that, with such a stepsize sequence, the extragradient algorithm converges to the solution of the problem both almost surely and in a mean-squared sense at the rate O(1/k). Motivated by the absence of a Lipschitzian parameter, both schemes employ a locally randomized smoothing technique. Importantly, by approximating the original mapping with a smooth one, this technique enables us to estimate a Lipschitzian parameter. The smoothing parameter is updated at each iteration, and we show that both algorithms converge to the solution of the original problem.
UR - http://www.scopus.com/inward/record.url?scp=84988227345&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84988227345&partnerID=8YFLogxK
U2 - 10.1109/CDC.2014.7040302
DO - 10.1109/CDC.2014.7040302
M3 - Conference contribution
AN - SCOPUS:84988227345
T3 - Proceedings of the IEEE Conference on Decision and Control
SP - 5831
EP - 5836
BT - 53rd IEEE Conference on Decision and Control, CDC 2014
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2014 53rd IEEE Annual Conference on Decision and Control, CDC 2014
Y2 - 15 December 2014 through 17 December 2014
ER -