Sample complexity of sample average approximation for conditional stochastic optimization


Research output: Contribution to journal › Article › peer-review


In this paper, we study a class of stochastic optimization problems, referred to as conditional stochastic optimization (CSO), of the form min_{x∈X} E_ξ[f_ξ(E_{η|ξ}[g_η(x, ξ)])], which finds a wide spectrum of applications including portfolio selection, reinforcement learning, robust learning, and causal inference. Assuming availability of samples from the distribution P(ξ) and samples from the conditional distribution P(η|ξ), we establish the sample complexity of the sample average approximation (SAA) for CSO under a variety of structural assumptions, such as Lipschitz continuity, smoothness, and error bound conditions. We show that the total sample complexity improves from O(d/ε^4) to O(d/ε^3) when the outer function is smooth, and further to O(1/ε^2) when the empirical function satisfies the quadratic growth condition. We also establish the sample complexity of a modified SAA when ξ and η are independent. Several numerical experiments further support our theoretical findings.
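The SAA scheme described in the abstract can be sketched numerically: draw n outer samples ξ_i, and for each one draw m conditional samples η_ij, then minimize the nested empirical average. The sketch below uses a toy instance (the choices of f, g, and the sampling distributions are illustrative assumptions, not taken from the paper): f_ξ(y) = (y − ξ)^2 and g_η(x, ξ) = x + η with η | ξ ~ N(0, 1), so the SAA minimizer should land near E[ξ].

```python
import numpy as np

def saa_objective(x, xis, etas, f, g):
    """SAA estimate of E_xi[ f_xi( E_{eta|xi}[ g_eta(x, xi) ] ) ].

    xis  : array of n outer samples xi_1..xi_n from P(xi)
    etas : array of shape (n, m); etas[i] holds m samples drawn
           from the conditional distribution P(eta | xi_i)
    """
    # Inner empirical average over the m conditional samples, per xi_i.
    inner = np.array([g(x, etas[i], xis[i]).mean() for i in range(len(xis))])
    # Outer empirical average of f applied to the inner estimates.
    return f(inner, xis).mean()

# Toy instance (hypothetical, for illustration only).
rng = np.random.default_rng(0)
n, m = 2000, 50
xis = rng.normal(1.0, 1.0, size=n)        # outer samples, E[xi] = 1
etas = rng.normal(0.0, 1.0, size=(n, m))  # conditional samples eta | xi

f = lambda y, xi: (y - xi) ** 2           # outer function f_xi
g = lambda x, eta, xi: x + eta            # inner function g_eta

# Crude grid search for the SAA minimizer; it should sit near E[xi] = 1.
grid = np.linspace(-2.0, 4.0, 601)
vals = [saa_objective(x, xis, etas, f, g) for x in grid]
x_hat = grid[int(np.argmin(vals))]
```

Note the nested structure: the total sample budget is n·m, which is why the paper's complexity bounds trade off the outer sample size n against the inner sample size m under the different smoothness and growth assumptions.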

Original language: English (US)
Pages (from-to): 2103-2133
Number of pages: 31
Journal: SIAM Journal on Optimization
Issue number: 3
State: Published - 2020


Keywords

  • Large deviations theory
  • Sample average approximation
  • Stochastic optimization

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science


