Abstract

Online kernel algorithms have an important computational drawback: their computational complexity grows linearly over time. This makes them difficult to use in real-time signal processing applications that must continuously process data over prolonged periods of time. In this paper, we present a way of overcoming this problem by approximating kernel evaluations using finite-dimensional inner products in a randomized feature space. We apply this idea to the Kernel Least Mean Square (KLMS) algorithm, which has recently been proposed as a non-linear extension to the famed LMS algorithm. Our simulations show that the proposed method achieves constant computational complexity with no observable loss in performance.
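One standard way to realize the finite-dimensional inner products the abstract describes is via random Fourier features, which approximate a shift-invariant kernel such as the Gaussian. The sketch below combines such features with an LMS update so that per-sample cost stays constant in the number of features rather than growing with the number of samples. This is an illustrative reconstruction, not the authors' implementation; the bandwidth `sigma`, feature count `D`, step size `eta`, and the toy target `sin(3x)` are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random Fourier features approximating a Gaussian kernel
# k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
D = 200       # number of random features (hypothetical choice)
d = 1         # input dimension
sigma = 0.5   # kernel bandwidth (hypothetical choice)

W = rng.normal(0.0, 1.0 / sigma, size=(D, d))  # random frequencies
b = rng.uniform(0.0, 2 * np.pi, size=D)        # random phases

def phi(x):
    """Finite-dimensional feature map with phi(x) . phi(y) ~ k(x, y)."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

# LMS in the random feature space: a fixed-size weight vector replaces
# the growing sum over all past samples, so each update costs O(D).
eta = 0.2       # step size (hypothetical choice)
w = np.zeros(D)

def predict(x):
    return w @ phi(x)

def update(x, y):
    global w
    e = y - predict(x)        # prediction error
    w = w + eta * e * phi(x)  # LMS update on the feature weights
    return e

# Toy usage: learn y = sin(3x) online from streaming samples.
for _ in range(5000):
    x = rng.uniform(-1.0, 1.0, size=d)
    update(x, np.sin(3 * x[0]))
```

Because the weight vector `w` has fixed dimension `D`, memory and per-sample computation do not grow with time, in contrast to the growing dictionary of a standard kernel online learner.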

Original language: English (US)
Title of host publication: 2012 IEEE International Workshop on Machine Learning for Signal Processing - Proceedings of MLSP 2012
DOIs
State: Published - 2012
Event: 2012 22nd IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2012 - Santander, Spain
Duration: Sep 23 2012 - Sep 26 2012

Publication series

Name: IEEE International Workshop on Machine Learning for Signal Processing, MLSP
ISSN (Print): 2161-0363
ISSN (Electronic): 2161-0371

Other

Other: 2012 22nd IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2012
Country/Territory: Spain
City: Santander
Period: 9/23/12 - 9/26/12

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Signal Processing
