TY - GEN
T1 - ManifoldBoost
T2 - 25th International Conference on Machine Learning
AU - Loeff, Nicolas
AU - Forsyth, David
AU - Ramachandran, Deepak
PY - 2008
Y1 - 2008
AB - We describe a manifold learning framework that naturally accommodates supervised learning, partially supervised learning and unsupervised clustering as particular cases. Our method chooses a function by minimizing loss subject to a manifold regularization penalty. This augmented cost is minimized using a greedy, stagewise, functional minimization procedure, as in Gradientboost. Each stage of boosting is fast and efficient. We demonstrate our approach using both radial basis function approximations and trees. The performance of our method is at the state of the art on many standard semi-supervised learning benchmarks, and we produce results for large scale datasets.
UR - https://www.scopus.com/pages/publications/56449119811
M3 - Conference contribution
AN - SCOPUS:56449119811
SN - 9781605582054
T3 - Proceedings of the 25th International Conference on Machine Learning
SP - 600
EP - 607
BT - Proceedings of the 25th International Conference on Machine Learning
Y2 - 5 July 2008 through 9 July 2008
ER -