Self-Normalization for Time Series: A Review of Recent Developments

Research output: Contribution to journal › Review article › peer-review

Abstract

This article reviews some recent developments in the inference of time series data using the self-normalized approach. We aim to provide a detailed discussion of the use of self-normalization in different contexts, and to highlight the distinctive features associated with each problem as well as the connections among these recent developments. The topics covered include: confidence interval construction for a parameter in a weakly dependent stationary time series setting, change point detection in the mean, robust inference in regression models with weakly dependent errors, inference for nonparametric time series regression, inference for long memory time series, locally stationary time series and near-integrated time series, change point detection and two-sample inference for functional time series, as well as the use of self-normalization for spatial data and spatial-temporal data. Some new variations of the self-normalized approach are also introduced, with additional simulation results. We also provide a brief review of related inferential methods, such as blockwise empirical likelihood and subsampling, which were recently developed under the fixed-b asymptotic framework. We conclude the article with a summary of the merits and limitations of self-normalization in the time series context and potential topics for future investigation.
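As a concrete companion to the first topic listed above (confidence interval construction for a parameter of a weakly dependent stationary series), the sketch below illustrates the standard self-normalized statistic for the mean, in which the usual long-run variance estimator is replaced by a normalizer built from recursive centered partial sums, so no bandwidth or block-length choice is needed. This is a minimal illustration assuming the formulation of Lobato (2001) and Shao (2010); the function name, the Monte Carlo approximation of the pivotal limiting distribution, and the AR(1) example are illustrative choices, not code from the article.

```python
import numpy as np

def sn_confidence_interval(x, level=0.95, n_mc=10000, m=1000, seed=0):
    """Self-normalized confidence interval for the mean of a weakly
    dependent stationary series (illustrative sketch).

    The pivotal limit W(1)^2 / int_0^1 (W(r) - r W(1))^2 dr is
    approximated by Monte Carlo with random-walk proxies for Brownian
    motion; n_mc and m control the accuracy of that approximation.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()

    # Recursive centered partial sums S_t = sum_{i<=t} (X_i - xbar)
    s = np.cumsum(x - xbar)
    # Self-normalizer V_n = n^{-2} * sum_t S_t^2 (no tuning parameter)
    v_n = np.sum(s**2) / n**2

    # Monte Carlo critical value of the pivotal limiting distribution
    rng = np.random.default_rng(seed)
    incr = rng.standard_normal((n_mc, m)) / np.sqrt(m)
    w = np.cumsum(incr, axis=1)              # W(r) on a grid
    r = np.arange(1, m + 1) / m
    bridge = w - r * w[:, -1:]               # W(r) - r W(1)
    pivot = w[:, -1]**2 / np.mean(bridge**2, axis=1)
    crit = np.quantile(pivot, level)

    # Invert n*(xbar - mu)^2 / V_n <= crit to get the interval for mu
    half_width = np.sqrt(crit * v_n / n)
    return xbar - half_width, xbar + half_width

# Example: AR(1) series with mean 1 (hypothetical data)
rng = np.random.default_rng(1)
e = rng.standard_normal(500)
x = np.empty(500)
x[0] = e[0]
for t in range(1, 500):
    x[t] = 0.5 * x[t - 1] + e[t]
x += 1.0
print(sn_confidence_interval(x))
```

Because the normalizer and the numerator are affected by the unknown dependence in the same way, the limiting distribution is pivotal, which is the key feature that lets the interval avoid any bandwidth selection.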

Original language: English (US)
Pages (from-to): 1797-1817
Number of pages: 21
Journal: Journal of the American Statistical Association
Volume: 110
Issue number: 512
DOIs
State: Published - Oct 2, 2015

Keywords

  • Dependence
  • Inference
  • Locally stationary
  • Long memory
  • Resampling
  • Studentization

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
