Universal linear prediction over parameters and model orders

Andrew C. Singer, Meir Feder

Research output: Contribution to journal › Article › peer-review


A common problem that arises in adaptive filtering, autoregressive modeling, or linear prediction is the selection of an appropriate order for the underlying linear parametric model. We address this problem and develop a sequential algorithm whose sequentially accumulated mean squared prediction error is as good as that of any linear predictor of order less than some M, where the parameters may be tuned to the data. The linear prediction problem is transformed into one of sequential probability assignment from universal coding theory. In this context, we prove that the algorithm achieves this performance uniformly for every individual sequence at the cost of a model redundancy term, which is at most proportional to n^{-1} ln(M), and a parameter redundancy term, which is proportional to n^{-1} ln(n), where n is the length of the data. Universal prediction and equalization algorithms that use a performance-weighted average of all model orders less than M are presented. Efficient lattice filters are used to generate all of the models recursively, so that the complexity of the universal algorithm is no larger than that of the largest model order. Examples of prediction performance are provided for autoregressive and speech data, as well as an example of adaptive data equalization.
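The core idea of the performance-weighted mixture can be sketched in a few lines. The snippet below is a naive illustration, not the paper's algorithm: the paper generates all model orders recursively with efficient lattice filters, whereas this sketch refits each fixed-order least-squares predictor from scratch at every step, and the weighting constant `c` is an assumed free parameter.

```python
import numpy as np

def universal_predict(x, M, c=1.0):
    """Sequentially predict x[t] as a performance-weighted average of
    least-squares linear predictors of orders 1..M (illustrative sketch;
    the exponential weighting constant c is an assumption)."""
    n = len(x)
    losses = np.zeros(M)   # accumulated squared error of each order
    preds = np.zeros(n)
    for t in range(1, n):
        per_order = np.zeros(M)
        for m in range(1, M + 1):
            if t > m:
                # fit order-m linear predictor on all data seen so far
                A = np.array([x[i - m:i][::-1] for i in range(m, t)])
                b = x[m:t]
                w, *_ = np.linalg.lstsq(A, b, rcond=None)
                # predict x[t] from the m most recent samples
                per_order[m - 1] = x[t - m:t][::-1] @ w
        # exponentially weight each order by its past performance
        weights = np.exp(-losses / (2 * c))
        weights /= weights.sum()
        preds[t] = weights @ per_order
        losses += (per_order - x[t]) ** 2
    return preds
```

Because the weights decay exponentially in each predictor's accumulated loss, the mixture concentrates on whichever order has predicted best so far, which is the mechanism behind the model-redundancy cost of roughly n^{-1} ln(M).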

Original language: English (US)
Number of pages: 1
Journal: IEEE Transactions on Signal Processing
Issue number: 2
State: Published - 1998
Externally published: Yes

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering

