Sparse transition matrix estimation for high-dimensional and locally stationary vector autoregressive models

Xin Ding, Ziyi Qiu, Xiaohui Chen

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the estimation of the transition matrix in high-dimensional time-varying vector autoregression (TV-VAR) models. Our model builds on a general class of locally stationary VAR processes that evolve smoothly in time. We propose a hybrid kernel-smoothing and ℓ1-regularized method to directly estimate the sequence of time-varying transition matrices. Under a sparsity assumption on the transition matrix, we establish the rate of convergence of the proposed estimator and show that the convergence rate depends on the smoothness of the locally stationary VAR processes only through the smoothness of the transition matrix function. In addition, for our estimator followed by thresholding, we prove that the false positive rate (type I error) and false negative rate (type II error) in the pattern recovery can asymptotically vanish in the presence of weak signals, without assuming a minimum nonzero signal strength condition. Favorable finite-sample performance over the ℓ2-penalized least-squares estimator and the unstructured maximum likelihood estimator is demonstrated on simulated data. We also provide two real-data examples, estimating dependence structures in financial stock prices and economic exchange rates.
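To make the estimator in the abstract concrete, the following is a minimal NumPy sketch of a kernel-weighted lasso of this flavor: at a rescaled time t, each lag-1 regression is fit with kernel weights centered at t and an ℓ1 penalty, solved here by plain proximal gradient (ISTA). The Epanechnikov kernel, the ISTA solver, and all names and parameters (tv_var_lasso, bandwidth, lam) are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def epanechnikov(u):
        """Epanechnikov kernel, one common choice for local smoothing."""
        return 0.75 * np.maximum(1.0 - u ** 2, 0.0)

    def tv_var_lasso(X, t, bandwidth, lam, n_iter=500):
        """Kernel-smoothed, l1-penalized estimate of the VAR(1) transition
        matrix A(t) at rescaled time t in (0, 1).

        X : (n, p) array of observations X_1, ..., X_n.
        Minimizes  sum_i w_i ||X_i - A X_{i-1}||_2^2 + lam * ||A||_1
        with kernel weights w_i = K((i/n - t) / bandwidth), via ISTA.
        """
        n, p = X.shape
        times = np.arange(1, n) / n            # rescaled times of the responses
        w = epanechnikov((times - t) / bandwidth)
        w = w / w.sum()                        # normalize the kernel weights

        Y = X[1:]                              # responses X_i
        Z = X[:-1]                             # lagged predictors X_{i-1}
        WZ = w[:, None] * Z
        G = Z.T @ WZ                           # weighted Gram matrix
        C = Y.T @ WZ                           # weighted cross-moment matrix

        step = 1.0 / (2.0 * np.linalg.eigvalsh(G).max() + 1e-12)
        A = np.zeros((p, p))
        for _ in range(n_iter):                # ISTA: gradient step + soft-threshold
            V = A - step * 2.0 * (A @ G - C)
            A = np.sign(V) * np.maximum(np.abs(V) - step * lam, 0.0)
        return A

The subsequent hard-thresholding step mentioned in the abstract would then amount to zeroing small entries, e.g. A_thr = A_hat * (np.abs(A_hat) >= tau) for some threshold tau (again an illustrative choice of notation). Repeating the fit over a grid of t values yields the estimated sequence of transition matrices.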

Original language: English (US)
Pages (from-to): 3871-3902
Number of pages: 32
Journal: Electronic Journal of Statistics
Volume: 11
Issue number: 2
DOIs
State: Published - 2017

Keywords

  • High-dimension
  • Kernel smoothing
  • Locally stationary processes
  • Sparsity
  • Time-varying parameters
  • Vector autoregression

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
