TY - JOUR
T1 - Sample complexity of sample average approximation for conditional stochastic optimization
AU - Hu, Yifan
AU - Chen, Xin
AU - He, Niao
N1 - Publisher Copyright:
© 2020 Society for Industrial and Applied Mathematics.
PY - 2020
Y1 - 2020
N2 - In this paper, we study a class of stochastic optimization problems, referred to as conditional stochastic optimization (CSO), in the form of min_{x∈X} E_ξ[f_ξ(E_{η|ξ}[g_η(x, ξ)])], which finds a wide spectrum of applications including portfolio selection, reinforcement learning, robust learning, and causal inference. Assuming availability of samples from the distribution P(ξ) and samples from the conditional distribution P(η|ξ), we establish the sample complexity of the sample average approximation (SAA) for CSO, under a variety of structural assumptions, such as Lipschitz continuity, smoothness, and error bound conditions. We show that the total sample complexity improves from O(d/ε^4) to O(d/ε^3) when assuming smoothness of the outer function, and further to O(1/ε^2) when the empirical function satisfies the quadratic growth condition. We also establish the sample complexity of a modified SAA when ξ and η are independent. Several numerical experiments further support our theoretical findings.
AB - In this paper, we study a class of stochastic optimization problems, referred to as conditional stochastic optimization (CSO), in the form of min_{x∈X} E_ξ[f_ξ(E_{η|ξ}[g_η(x, ξ)])], which finds a wide spectrum of applications including portfolio selection, reinforcement learning, robust learning, and causal inference. Assuming availability of samples from the distribution P(ξ) and samples from the conditional distribution P(η|ξ), we establish the sample complexity of the sample average approximation (SAA) for CSO, under a variety of structural assumptions, such as Lipschitz continuity, smoothness, and error bound conditions. We show that the total sample complexity improves from O(d/ε^4) to O(d/ε^3) when assuming smoothness of the outer function, and further to O(1/ε^2) when the empirical function satisfies the quadratic growth condition. We also establish the sample complexity of a modified SAA when ξ and η are independent. Several numerical experiments further support our theoretical findings.
KW - Large deviations theory
KW - Sample average approximation
KW - Stochastic optimization
UR - http://www.scopus.com/inward/record.url?scp=85090937150&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85090937150&partnerID=8YFLogxK
U2 - 10.1137/19M1284865
DO - 10.1137/19M1284865
M3 - Article
AN - SCOPUS:85090937150
SN - 1052-6234
VL - 30
SP - 2103
EP - 2133
JO - SIAM Journal on Optimization
JF - SIAM Journal on Optimization
IS - 3
ER -