Model selection for correlated data with diverging number of parameters

Hyunkeun Cho, Annie Qu

Research output: Contribution to journal › Article › peer-review

Abstract

High-dimensional longitudinal data arise frequently in biomedical and genomic research. It is important to select relevant covariates when the dimension of the parameters diverges as the sample size increases. We propose the penalized quadratic inference function to perform model selection and estimation simultaneously in the framework of a diverging number of regression parameters. The penalized quadratic inference function easily takes correlation information from clustered data into account, yet does not require specifying the likelihood function. This is advantageous compared to existing model selection methods for discrete data with large cluster sizes. In addition, the proposed approach enjoys the oracle property: it identifies the non-zero components consistently with probability tending to 1, and any finite linear combination of the estimated non-zero components has an asymptotic normal distribution. We propose an efficient algorithm that selects an effective tuning parameter to solve the penalized quadratic inference function. Monte Carlo simulation studies show that the proposed method selects the correct model with high frequency and estimates covariate effects accurately even when the dimension of the parameters is high. We illustrate the proposed approach by analyzing periodontal disease data.
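For orientation, the objective combines the quadratic inference function with a sparsity-inducing penalty. The schematic form below follows the standard quadratic inference function literature; the notation is illustrative and not quoted from the paper. The quadratic inference function is

Q_n(\beta) = n \, \bar{g}_n(\beta)^{\top} \, \bar{C}_n(\beta)^{-1} \, \bar{g}_n(\beta),

where \bar{g}_n(\beta) is the sample average of extended score vectors built from basis matrices approximating the inverse of the working correlation, and \bar{C}_n(\beta) is their sample covariance. The penalized estimator then minimizes

Q_n(\beta) + n \sum_{j=1}^{p_n} p_{\lambda_n}(|\beta_j|),

with p_{\lambda_n}(\cdot) the SCAD penalty, \lambda_n the tuning parameter, and p_n the (diverging) number of regression parameters.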

Original language: English (US)
Pages (from-to): 901-927
Number of pages: 27
Journal: Statistica Sinica
Volume: 23
Issue number: 2
DOIs
State: Published - Apr 2013
Externally published: Yes

Keywords

  • Diverging number of parameters
  • Longitudinal data
  • Model selection
  • Oracle property
  • Quadratic inference function
  • SCAD

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
