Sample complexity of sample average approximation for conditional stochastic optimization

YIFAN HU, XIN CHEN, NIAO HE

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we study a class of stochastic optimization problems, referred to as conditional stochastic optimization (CSO), of the form min_{x∈X} E_ξ[f_ξ(E_{η|ξ}[g_η(x, ξ)])], which finds a wide spectrum of applications including portfolio selection, reinforcement learning, robust learning, and causal inference. Assuming availability of samples from the distribution P(ξ) and samples from the conditional distribution P(η|ξ), we establish the sample complexity of the sample average approximation (SAA) for CSO under a variety of structural assumptions, such as Lipschitz continuity, smoothness, and error bound conditions. We show that the total sample complexity improves from O(d/ε⁴) to O(d/ε³) when assuming smoothness of the outer function, and further to O(1/ε²) when the empirical function satisfies the quadratic growth condition. We also establish the sample complexity of a modified SAA when ξ and η are independent. Several numerical experiments further support our theoretical findings.
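The SAA estimator described above replaces both expectations with empirical means: given n outer samples ξ_i and, for each ξ_i, m conditional samples η_{ij}, it minimizes (1/n) Σ_i f_{ξ_i}((1/m) Σ_j g_{η_{ij}}(x, ξ_i)). The following is a minimal sketch of this nested estimator, assuming toy quadratic choices of f and g, synthetic Gaussian samples, and scipy.optimize.minimize as the solver; these are illustrative assumptions and do not reproduce the paper's experiments.

```python
# Minimal sketch of the SAA objective for CSO:
#   min_x (1/n) * sum_i f( (1/m) * sum_j g(x, xi_i, eta_ij) )
# Toy f, g, and sample sizes below are assumptions for illustration only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
d, n, m = 5, 200, 50                               # dimension, outer samples, inner samples per xi

xi = rng.normal(size=(n, d))                       # samples xi_i ~ P(xi)
eta = xi[:, None, :] + rng.normal(size=(n, m, d))  # samples eta_ij ~ P(eta | xi_i)

def f(y):
    # toy smooth outer function f_xi: squared loss
    return y ** 2

def saa_objective(x):
    # inner empirical mean over eta | xi_i, i.e. (1/m) * sum_j g_eta(x, xi_i)
    # with toy inner function g_eta(x, xi) = (eta - xi)^T x
    inner = np.mean(eta @ x, axis=1) - xi @ x
    # outer empirical mean over xi_i
    return np.mean(f(inner))

result = minimize(saa_objective, x0=np.ones(d), method="L-BFGS-B")
print("SAA minimizer:", result.x)
```

Increasing m reduces the bias introduced by the inner empirical mean inside the nonlinear outer function, which is what drives the nested sample-complexity trade-offs discussed in the abstract.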

Original language: English (US)
Pages (from-to): 2103-2133
Number of pages: 31
Journal: SIAM Journal on Optimization
Volume: 30
Issue number: 3
DOIs
State: Published - 2020

Keywords

  • Large deviations theory
  • Sample average approximation
  • Stochastic optimization

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Applied Mathematics
