Online learning with kernels: Overcoming the growing sum problem

Abhishek Singh, Narendra Ahuja, Pierre Moulin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Online kernel algorithms have an important computational drawback: their computational complexity grows linearly over time. This makes them difficult to use for real-time signal processing applications that must continuously process data over prolonged periods. In this paper, we present a way of overcoming this problem by approximating kernel evaluations with finite-dimensional inner products in a randomized feature space. We apply this idea to the Kernel Least Mean Square (KLMS) algorithm, which has recently been proposed as a non-linear extension of the famed LMS algorithm. Our simulations show that the proposed method achieves constant computational complexity with no observable loss in performance.
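The idea the abstract describes (replacing kernel evaluations with inner products of finite-dimensional random features, then running LMS in that fixed feature space so the per-sample cost stays constant) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the Gaussian-kernel choice, the random Fourier feature construction (Rahimi & Recht style), and all function names and parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_rff(dim_in, num_features, sigma):
    """Sample a random Fourier feature map z(.) whose inner products
    approximate a Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    (Assumed kernel; the construction follows Rahimi & Recht, 2007.)"""
    W = rng.normal(0.0, 1.0 / sigma, size=(num_features, dim_in))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)

    def z(x):
        # z(x)^T z(y) ~= k(x, y); dimension is fixed at num_features.
        return np.sqrt(2.0 / num_features) * np.cos(W @ x + b)

    return z

def rff_klms(stream, dim_in, num_features=200, sigma=1.0, mu=0.5):
    """LMS run in the randomized feature space. Unlike standard KLMS,
    whose dictionary (and per-step cost) grows with every sample, each
    update here costs O(num_features), independent of time."""
    z = make_rff(dim_in, num_features, sigma)
    w = np.zeros(num_features)
    errors = []
    for x, d in stream:
        phi = z(x)
        y = w @ phi           # prediction from the fixed-size weight vector
        e = d - y             # instantaneous error
        w += mu * e * phi     # LMS update in feature space
        errors.append(e)
    return w, np.array(errors)
```

As a quick sanity check, feeding the filter a stream of (x, sin(3x)) pairs should drive the instantaneous error down over time, which is the "no observable loss in performance" behavior the abstract claims, at constant per-step cost.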

Original language: English (US)
Title of host publication: 2012 IEEE International Workshop on Machine Learning for Signal Processing - Proceedings of MLSP 2012
DOIs: 10.1109/MLSP.2012.6349811
State: Published - Dec 12 2012
Event: 2012 22nd IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2012 - Santander, Spain
Duration: Sep 23 2012 - Sep 26 2012

Publication series

Name: IEEE International Workshop on Machine Learning for Signal Processing, MLSP
ISSN (Print): 2161-0363
ISSN (Electronic): 2161-0371

Other

Other: 2012 22nd IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2012
Country: Spain
City: Santander
Period: 9/23/12 - 9/26/12

Fingerprint

  • Computational complexity
  • Signal processing

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Signal Processing

Cite this

Singh, A., Ahuja, N., & Moulin, P. (2012). Online learning with kernels: Overcoming the growing sum problem. In 2012 IEEE International Workshop on Machine Learning for Signal Processing - Proceedings of MLSP 2012 [6349811] (IEEE International Workshop on Machine Learning for Signal Processing, MLSP). https://doi.org/10.1109/MLSP.2012.6349811

Online learning with kernels : Overcoming the growing sum problem. / Singh, Abhishek; Ahuja, Narendra; Moulin, Pierre.

2012 IEEE International Workshop on Machine Learning for Signal Processing - Proceedings of MLSP 2012. 2012. 6349811 (IEEE International Workshop on Machine Learning for Signal Processing, MLSP).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Singh, A, Ahuja, N & Moulin, P 2012, Online learning with kernels: Overcoming the growing sum problem. in 2012 IEEE International Workshop on Machine Learning for Signal Processing - Proceedings of MLSP 2012., 6349811, IEEE International Workshop on Machine Learning for Signal Processing, MLSP, 2012 22nd IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2012, Santander, Spain, 9/23/12. https://doi.org/10.1109/MLSP.2012.6349811
Singh A, Ahuja N, Moulin P. Online learning with kernels: Overcoming the growing sum problem. In 2012 IEEE International Workshop on Machine Learning for Signal Processing - Proceedings of MLSP 2012. 2012. 6349811. (IEEE International Workshop on Machine Learning for Signal Processing, MLSP). https://doi.org/10.1109/MLSP.2012.6349811
Singh, Abhishek ; Ahuja, Narendra ; Moulin, Pierre. / Online learning with kernels : Overcoming the growing sum problem. 2012 IEEE International Workshop on Machine Learning for Signal Processing - Proceedings of MLSP 2012. 2012. (IEEE International Workshop on Machine Learning for Signal Processing, MLSP).
@inproceedings{6fe6bfeae90740dc8a70252d047d7827,
title = "Online learning with kernels: Overcoming the growing sum problem",
abstract = "Online kernel algorithms have an important computational drawback: their computational complexity grows linearly over time. This makes them difficult to use for real-time signal processing applications that must continuously process data over prolonged periods. In this paper, we present a way of overcoming this problem by approximating kernel evaluations with finite-dimensional inner products in a randomized feature space. We apply this idea to the Kernel Least Mean Square (KLMS) algorithm, which has recently been proposed as a non-linear extension of the famed LMS algorithm. Our simulations show that the proposed method achieves constant computational complexity with no observable loss in performance.",
author = "Abhishek Singh and Narendra Ahuja and Pierre Moulin",
year = "2012",
month = "12",
day = "12",
doi = "10.1109/MLSP.2012.6349811",
language = "English (US)",
isbn = "9781467310260",
series = "IEEE International Workshop on Machine Learning for Signal Processing, MLSP",
booktitle = "2012 IEEE International Workshop on Machine Learning for Signal Processing - Proceedings of MLSP 2012",

}

TY - GEN

T1 - Online learning with kernels

T2 - Overcoming the growing sum problem

AU - Singh, Abhishek

AU - Ahuja, Narendra

AU - Moulin, Pierre

PY - 2012/12/12

Y1 - 2012/12/12

N2 - Online kernel algorithms have an important computational drawback: their computational complexity grows linearly over time. This makes them difficult to use for real-time signal processing applications that must continuously process data over prolonged periods. In this paper, we present a way of overcoming this problem by approximating kernel evaluations with finite-dimensional inner products in a randomized feature space. We apply this idea to the Kernel Least Mean Square (KLMS) algorithm, which has recently been proposed as a non-linear extension of the famed LMS algorithm. Our simulations show that the proposed method achieves constant computational complexity with no observable loss in performance.

AB - Online kernel algorithms have an important computational drawback: their computational complexity grows linearly over time. This makes them difficult to use for real-time signal processing applications that must continuously process data over prolonged periods. In this paper, we present a way of overcoming this problem by approximating kernel evaluations with finite-dimensional inner products in a randomized feature space. We apply this idea to the Kernel Least Mean Square (KLMS) algorithm, which has recently been proposed as a non-linear extension of the famed LMS algorithm. Our simulations show that the proposed method achieves constant computational complexity with no observable loss in performance.

UR - http://www.scopus.com/inward/record.url?scp=84870706213&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84870706213&partnerID=8YFLogxK

U2 - 10.1109/MLSP.2012.6349811

DO - 10.1109/MLSP.2012.6349811

M3 - Conference contribution

AN - SCOPUS:84870706213

SN - 9781467310260

T3 - IEEE International Workshop on Machine Learning for Signal Processing, MLSP

BT - 2012 IEEE International Workshop on Machine Learning for Signal Processing - Proceedings of MLSP 2012

ER -