Projected stochastic primal-dual method for constrained online learning with kernels

Alec Koppel, Kaiqing Zhang, Hao Zhu, Tamer Basar

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the problem of stochastic optimization with nonlinear constraints, where the decision variable is not vector-valued but instead a function belonging to a reproducing kernel Hilbert space (RKHS). Currently, there exist solutions to only special cases of this problem. To solve this constrained problem with kernels, we first generalize the Representer Theorem to a class of saddle-point problems defined over RKHSs. Then, we develop a primal-dual method that executes alternating projected primal/dual stochastic gradient descent/ascent on the dual-augmented Lagrangian of the problem. The primal projection sets are low-dimensional subspaces of the ambient function space, which are greedily constructed using matching pursuit. By tuning the projection-induced error to the algorithm step-size, we establish mean convergence of both the primal objective sub-optimality and the constraint violation to respective O(√T) and O(T^{3/4}) neighborhoods. Here, T is the final iteration index, and the constant step-size is chosen as 1/√T with an approximation budget of 1/T. Finally, we demonstrate experimentally the effectiveness of the proposed method for risk-aware supervised learning.
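The abstract describes the method only at a high level. The following is a minimal Python sketch of the kind of update it outlines, under stated assumptions: a squared loss, a single illustrative bound constraint f(x)² ≤ 1 entering through the Lagrangian, and a simplified destructive pruning pass standing in for the paper's matching-pursuit projection. All names, the kernel choice, and the specific loss/constraint here are illustrative, not the paper's exact formulation.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    """Gaussian kernel matrix between rows of X and rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KernelPrimalDual:
    """Sketch of projected stochastic primal-dual updates with kernels.

    The RKHS iterate is f(x) = sum_i w[i] * k(D[i], x). Each stochastic
    primal step appends a kernel element at the new sample; a greedy
    pruning pass then keeps the dictionary within the budget eps.
    """

    def __init__(self, dim, eta=0.1, eps=1e-3, gamma=1.0):
        self.D = np.zeros((0, dim))  # dictionary of kernel centers
        self.w = np.zeros(0)         # kernel expansion weights
        self.lam = 0.0               # dual variable, one scalar constraint
        self.eta, self.eps, self.gamma = eta, eps, gamma

    def predict(self, X):
        X = np.atleast_2d(X)
        if self.w.size == 0:
            return np.zeros(len(X))
        return rbf(X, self.D, self.gamma) @ self.w

    def step(self, x, y):
        """One alternating projected primal/dual stochastic update."""
        x = np.atleast_2d(np.asarray(x, dtype=float))
        fx = float(self.predict(x)[0])
        # Illustrative choices: squared loss (f(x)-y)^2 / 2 and the
        # bound constraint g(f(x)) = f(x)^2 - 1 <= 0.
        dl = fx - y                  # derivative of loss in f(x)
        dg = 2.0 * fx                # derivative of constraint in f(x)
        # Functional SGD on the Lagrangian appends k(x, .) to the iterate.
        self.D = np.vstack([self.D, x])
        self.w = np.append(self.w, -self.eta * (dl + self.lam * dg))
        # Projected dual ascent on the same sample (kept nonnegative).
        self.lam = max(0.0, self.lam + self.eta * (fx ** 2 - 1.0))
        self._prune()

    def _prune(self):
        """Destructive pruning (a simplification of matching pursuit):
        repeatedly drop the center whose removal, after re-fitting the
        remaining weights in the RKHS norm, incurs error below eps."""
        while self.w.size > 1:
            K = rbf(self.D, self.D, self.gamma)
            full = self.w @ K @ self.w
            best_err, best = None, None
            for j in range(self.w.size):
                S = [i for i in range(self.w.size) if i != j]
                Kss = K[np.ix_(S, S)]
                b = K[S, :] @ self.w
                ws = np.linalg.solve(Kss + 1e-10 * np.eye(len(S)), b)
                err = full - b @ ws  # squared Hilbert-norm error
                if best_err is None or err < best_err:
                    best_err, best = err, (S, ws)
            if best_err is None or best_err > self.eps ** 2:
                break
            S, ws = best
            self.D, self.w = self.D[S], ws
```

A toy run on a synthetic 1-D stream, to show how the pruning keeps the dictionary size well below the number of samples processed:

```python
rng = np.random.default_rng(0)
model = KernelPrimalDual(dim=1, eta=0.1, eps=1e-2)
for _ in range(200):
    x = rng.uniform(-1, 1, size=1)
    model.step(x, np.sin(3 * x[0]))
print(model.w.size, "dictionary elements retained after 200 samples")
```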

Original language: English (US)
Article number: 8678800
Pages (from-to): 2528-2542
Number of pages: 15
Journal: IEEE Transactions on Signal Processing
Volume: 67
Issue number: 10
DOIs
State: Published - May 15, 2019
Externally published: Yes

Keywords

  • Machine learning
  • optimization methods
  • radial basis function networks

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
