Minimax Nonparametric Two-Sample Test under Adversarial Losses

Rong Tang, Yun Yang

Research output: Contribution to journal › Conference article › peer-review

Abstract

In this paper, we consider the problem of two-sample hypothesis testing, which aims to detect the difference between two probability densities based on finite samples. The proposed test statistic is constructed by first truncating a sample version of a negative Besov norm and then normalizing it. Here, the negative Besov norm is the norm associated with a Besov space with negative exponent, and is shown to be closely related to a class of commonly used adversarial losses (or integral probability metrics) with smooth discriminators. Theoretically, we characterize the optimal detection boundary of two-sample testing in terms of the dimensionalities and smoothness levels of the underlying densities and of the discriminator class defining the adversarial loss. We also show that the proposed approach can simultaneously attain the optimal detection boundary under many common adversarial losses, including those induced by the ℓ1 and ℓ2 distances and by Wasserstein distances. Our numerical experiments show that the proposed test procedure tends to exhibit higher power and greater robustness than existing state-of-the-art competitors.
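The abstract only outlines the construction, so the following is a minimal illustrative sketch rather than the authors' implementation. It makes several simplifying assumptions: the Besov space is taken to be B^{-s}_{2,2} on [0, 1] (which coincides with the negative Sobolev space H^{-s}), the sample version of the norm is computed from empirical Fourier coefficients of the difference of the two empirical measures truncated at a frequency cutoff K, and the normalization step is replaced by a standard permutation calibration. The smoothness parameter s, the cutoff K, and all function names are hypothetical choices for illustration.

```python
import numpy as np

def neg_sobolev_stat(x, y, s=1.0, K=32):
    """Truncated sample version of a negative-smoothness norm (illustrative).

    Computes sum_{1 <= k <= K} |a_k|^2 / k^{2s}, where a_k is the difference
    of the empirical characteristic functions of x and y at frequency k.
    This mimics a truncated H^{-s} = B^{-s}_{2,2} norm on [0, 1]; it is an
    assumed simplification, not the paper's exact statistic.
    """
    ks = np.arange(1, K + 1)
    ax = np.exp(-2j * np.pi * np.outer(ks, x)).mean(axis=1)
    ay = np.exp(-2j * np.pi * np.outer(ks, y)).mean(axis=1)
    return np.sum(np.abs(ax - ay) ** 2 / ks ** (2 * s))

def permutation_test(x, y, s=1.0, K=32, n_perm=500, seed=None):
    """Calibrate the truncated statistic against permutations of the pooled
    sample (stand-in for the normalization step described in the abstract)."""
    rng = np.random.default_rng(seed)
    obs = neg_sobolev_stat(x, y, s, K)
    pooled = np.concatenate([x, y])
    n = len(x)
    null = np.empty(n_perm)
    for b in range(n_perm):
        perm = rng.permutation(pooled)
        null[b] = neg_sobolev_stat(perm[:n], perm[n:], s, K)
    p_value = (1 + np.sum(null >= obs)) / (1 + n_perm)
    return obs, p_value

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.beta(2, 2, size=300)  # two slightly different densities on [0, 1]
    y = rng.beta(2, 3, size=300)
    stat, p = permutation_test(x, y, seed=1)
    print(f"statistic = {stat:.4g}, permutation p-value = {p:.3f}")
```

The frequency weights k^{-2s} down-weight high frequencies, which is what makes the statistic robust to sampling noise while remaining sensitive to smooth density differences; the truncation at K plays the role of the truncation step mentioned in the abstract.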

Original language: English (US)
Pages (from-to): 6132-6165
Number of pages: 34
Journal: Proceedings of Machine Learning Research
Volume: 206
State: Published - 2023
Event: 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023 - Valencia, Spain
Duration: Apr 25, 2023 - Apr 27, 2023

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
