ManifoldBoost: Stagewise function approximation for fully-, semi- and un-supervised learning

Nicolas Loeff, David Alexander Forsyth, Deepak Ramachandran

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We describe a manifold learning framework that naturally accommodates supervised learning, partially supervised learning and unsupervised clustering as particular cases. Our method chooses a function by minimizing loss subject to a manifold regularization penalty. This augmented cost is minimized using a greedy, stagewise, functional minimization procedure, as in Gradientboost. Each stage of boosting is fast and efficient. We demonstrate our approach using both radial basis function approximations and trees. The performance of our method is at the state of the art on many standard semi-supervised learning benchmarks, and we produce results for large scale datasets.
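The abstract's recipe — minimize a loss plus a manifold regularization penalty by greedy, stagewise functional gradient descent, as in Gradientboost — can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes a squared loss on the labeled points, a kNN-graph Laplacian as the manifold penalty, and depth-1 regression stumps as the weak learners; all function and parameter names (`knn_laplacian`, `manifold_boost`, `gamma`, `nu`) are invented for this sketch.

```python
import numpy as np

def knn_laplacian(X, k=5):
    """Graph Laplacian D - W of a symmetrized kNN graph over the rows of X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    n = len(X)
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(d2[i])[1:k + 1]] = 1.0  # skip self at index 0
    W = np.maximum(W, W.T)
    return np.diag(W.sum(1)) - W

def fit_stump(X, r):
    """Least-squares depth-1 stump fit to the pseudo-residuals r."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            m = X[:, j] <= t
            if m.all() or not m.any():
                continue
            a, b = r[m].mean(), r[~m].mean()
            err = ((r - np.where(m, a, b)) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, a, b)
    _, j, t, a, b = best
    return lambda Z: np.where(Z[:, j] <= t, a, b)

def manifold_boost(X, y, labeled, gamma=0.1, nu=0.1, rounds=100):
    """Stagewise minimization of  0.5 * sum_labeled (f - y)^2  +  gamma * f' L f.

    y holds labels in {-1, +1}; entries outside `labeled` are ignored.
    Each round fits a weak learner to the negative functional gradient
    and takes a small step nu, exactly the Gradientboost pattern.
    """
    L = knn_laplacian(X)
    f = np.zeros(len(X))
    learners = []
    for _ in range(rounds):
        g = np.zeros(len(X))
        g[labeled] = f[labeled] - y[labeled]   # gradient of the labeled loss
        g += 2.0 * gamma * (L @ f)             # gradient of the manifold penalty
        h = fit_stump(X, -g)
        f += nu * h(X)
        learners.append(h)
    return f, learners
```

With `gamma = 0`, this reduces to ordinary gradient boosting on the labeled points alone; the Laplacian term is what lets the unlabeled points shape the function, which is why the same loop covers the supervised, semi-supervised, and (with no labels at all) unsupervised cases the abstract describes.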

Original language: English (US)
Title of host publication: Proceedings of the 25th International Conference on Machine Learning
Pages: 600-607
Number of pages: 8
State: Published - 2008
Event: 25th International Conference on Machine Learning - Helsinki, Finland
Duration: Jul 5, 2008 - Jul 9, 2008



ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Software

Cite this

Loeff, N., Forsyth, D. A., & Ramachandran, D. (2008). ManifoldBoost: Stagewise function approximation for fully-, semi- and un-supervised learning. In Proceedings of the 25th International Conference on Machine Learning (pp. 600-607).

@inproceedings{fd04e52a08d94318a363699e2578cc6c,
  title     = "ManifoldBoost: Stagewise function approximation for fully-, semi- and un-supervised learning",
  abstract  = "We describe a manifold learning framework that naturally accommodates supervised learning, partially supervised learning and unsupervised clustering as particular cases. Our method chooses a function by minimizing loss subject to a manifold regularization penalty. This augmented cost is minimized using a greedy, stagewise, functional minimization procedure, as in Gradientboost. Each stage of boosting is fast and efficient. We demonstrate our approach using both radial basis function approximations and trees. The performance of our method is at the state of the art on many standard semi-supervised learning benchmarks, and we produce results for large scale datasets.",
  author    = "Loeff, Nicolas and Forsyth, {David Alexander} and Ramachandran, Deepak",
  year      = "2008",
  language  = "English (US)",
  isbn      = "9781605582054",
  pages     = "600--607",
  booktitle = "Proceedings of the 25th International Conference on Machine Learning",
  url       = "http://www.scopus.com/inward/record.url?scp=56449119811&partnerID=8YFLogxK",
}
