TY - GEN

T1 - Universal outlying sequence detection for continuous observations

AU - Bu, Yuheng

AU - Zou, Shaofeng

AU - Liang, Yingbin

AU - Veeravalli, Venugopal V.

PY - 2016/5/18

Y1 - 2016/5/18

N2 - The following detection problem is studied, in which there are M sequences of samples out of which one outlier sequence needs to be detected. Each typical sequence contains n independent and identically distributed (i.i.d.) continuous observations from a known distribution π, and the outlier sequence contains n i.i.d. observations from an outlier distribution μ, which is distinct from π but otherwise unknown. A universal test based on the Kullback-Leibler (KL) divergence is built to approximate the maximum likelihood test, with known π and unknown μ. A KL divergence estimator based on data-dependent partitions is employed, and is shown to converge to its true value exponentially fast when the density ratio satisfies 0 < K1 ≤ dμ/dπ ≤ K2, where K1 and K2 are positive constants. This convergence guarantee for the KL divergence estimator further implies that the outlier detection test is exponentially consistent. The detection performance of the KL divergence-based test is compared with that of a recently introduced test for this problem based on the machine learning approach of maximum mean discrepancy (MMD). Regimes in which the KL divergence-based test outperforms the MMD-based test are identified.

AB - The following detection problem is studied, in which there are M sequences of samples out of which one outlier sequence needs to be detected. Each typical sequence contains n independent and identically distributed (i.i.d.) continuous observations from a known distribution π, and the outlier sequence contains n i.i.d. observations from an outlier distribution μ, which is distinct from π but otherwise unknown. A universal test based on the Kullback-Leibler (KL) divergence is built to approximate the maximum likelihood test, with known π and unknown μ. A KL divergence estimator based on data-dependent partitions is employed, and is shown to converge to its true value exponentially fast when the density ratio satisfies 0 < K1 ≤ dμ/dπ ≤ K2, where K1 and K2 are positive constants. This convergence guarantee for the KL divergence estimator further implies that the outlier detection test is exponentially consistent. The detection performance of the KL divergence-based test is compared with that of a recently introduced test for this problem based on the machine learning approach of maximum mean discrepancy (MMD). Regimes in which the KL divergence-based test outperforms the MMD-based test are identified.

KW - Kullback-Leibler divergence

KW - maximum mean discrepancy

KW - outlier hypothesis testing

KW - universal exponential consistency

UR - http://www.scopus.com/inward/record.url?scp=84973333563&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84973333563&partnerID=8YFLogxK

U2 - 10.1109/ICASSP.2016.7472479

DO - 10.1109/ICASSP.2016.7472479

M3 - Conference contribution

AN - SCOPUS:84973333563

T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings

SP - 4254

EP - 4258

BT - 2016 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Proceedings

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 41st IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016

Y2 - 20 March 2016 through 25 March 2016

ER -