Sample Complexity for Distributionally Robust Learning under $\chi^2$-divergence
Zhengyu Zhou, Weiwei Liu; 24(230):1−27, 2023.
Abstract
This paper studies the sample complexity of learning a distributionally robust predictor under a distribution shift defined by the $\chi^2$-divergence, which is well known for its computational feasibility and statistical properties. The paper shows that any hypothesis class $\mathcal{H}$ with finite VC dimension is distributionally robustly learnable. Moreover, it proves that when the perturbation size is smaller than a constant, a finite VC dimension is also necessary for distributionally robust learning; this is established by deriving a lower bound on the sample complexity in terms of the VC dimension.
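For context, the robust risk in this setting is the worst-case expected loss over a $\chi^2$-divergence ball around the data distribution. The sketch below is only an illustration of that quantity on an empirical sample, computed via the standard dual formulation of $\chi^2$-constrained DRO; it is not the paper's construction. The function name `chi2_dro_risk`, the radius convention $\chi^2(Q\,\|\,P) \le \rho$, and the toy losses are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def chi2_dro_risk(losses, rho):
    """Worst-case expected loss over {Q : chi^2(Q || P_n) <= rho},
    where P_n is the empirical distribution of `losses`, computed via
    the standard dual form
        inf_eta  sqrt(1 + rho) * E[(loss - eta)_+^2]^{1/2} + eta.
    The radius convention here is an assumption; the paper may scale
    the constraint differently. Requires rho > 0.
    """
    losses = np.asarray(losses, dtype=float)

    def dual(eta):
        excess = np.maximum(losses - eta, 0.0)
        return np.sqrt(1.0 + rho) * np.sqrt(np.mean(excess ** 2)) + eta

    # The dual objective is convex in eta, and its minimizer lies in
    # [mean - std / sqrt(rho), max(losses)], so a bounded search suffices.
    lo = losses.mean() - losses.std() / np.sqrt(rho) - 1.0
    hi = losses.max() + 1.0
    res = minimize_scalar(dual, bounds=(lo, hi), method="bounded")
    return res.fun

# Toy comparison: ordinary vs. distributionally robust empirical risk
# for 0-1 losses of a hypothetical predictor with ~20% error rate.
rng = np.random.default_rng(0)
losses = (rng.random(1000) < 0.2).astype(float)
print("empirical risk :", losses.mean())
print("chi^2-DRO risk :", chi2_dro_risk(losses, rho=0.5))
```

For small radii the robust risk is approximately the variance-regularized value $\mathbb{E}[\ell] + \sqrt{\rho \, \mathrm{Var}(\ell)}$, which is one reason the $\chi^2$-divergence is regarded as computationally and statistically convenient.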