Abstract

A common problem that arises in adaptive filtering, autoregressive modeling, or linear prediction is the selection of an appropriate order for the underlying linear parametric model. We address this problem for linear prediction, but instead of fixing a specific model order, we develop a sequential prediction algorithm whose sequentially accumulated average squared prediction error for any bounded individual sequence is as good as the performance attainable by the best sequential linear predictor of order less than some M. This predictor is found by transforming linear prediction into a problem analogous to the sequential probability assignment problem from universal coding theory. The resulting universal predictor uses essentially a performance-weighted average of all predictors for model orders less than M. Efficient lattice filters are used to generate the predictions of all the models recursively, resulting in a complexity of the universal algorithm that is no larger than that of the largest model order. Examples of prediction performance are provided for autoregressive and speech data as well as an example of adaptive data equalization.
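As a rough illustration of the model order weighting idea described above (not the paper's lattice-filter implementation), the following Python sketch combines the predictions of linear predictors of orders 1 through M using exponential weights based on each order's accumulated squared prediction error. The per-order NLMS updates, the weight scale `c`, and the step size `mu` are illustrative assumptions; the paper instead derives recursive least-squares predictions for all orders jointly via lattice filters.

```python
import numpy as np

def universal_predict(x, M=8, c=1.0, mu=0.5, eps=1e-8):
    """Sequentially predict x[t] as a performance-weighted average of
    linear predictors of orders 1..M.

    Weights are exponential in each order's accumulated squared prediction
    error, so better-performing orders dominate the combination over time.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    w = [np.zeros(m) for m in range(1, M + 1)]   # coefficient vector per order
    loss = np.zeros(M)                           # accumulated squared error per order
    preds = np.zeros(n)
    for t in range(n):
        # Predictions of every order from the most recent samples.
        p = np.zeros(M)
        for m in range(1, M + 1):
            past = x[max(0, t - m):t][::-1]      # x[t-1], ..., x[t-m]
            if len(past) == m:
                p[m - 1] = w[m - 1] @ past
        # Performance weighting (normalized exponential weights).
        u = np.exp(-(loss - loss.min()) / (2.0 * c))
        u /= u.sum()
        preds[t] = u @ p
        # Update each order's predictor and its accumulated loss.
        for m in range(1, M + 1):
            past = x[max(0, t - m):t][::-1]
            if len(past) == m:
                err = x[t] - p[m - 1]
                loss[m - 1] += err ** 2
                w[m - 1] += mu * err * past / (past @ past + eps)  # NLMS update
    return preds
```

For example, calling `universal_predict` on a synthetic AR(2) sequence returns the sequence of combined predictions; the weighted average tends toward the better-matched model orders as the accumulated losses separate.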

Original language: English (US)
Pages (from-to): 2685-2699
Number of pages: 15
Journal: IEEE Transactions on Signal Processing
Volume: 47
Issue number: 10
DOIs
State: Published - 1999

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
