On Kernel Methods for Relational Learning

Chad Cumby, Dan Roth

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Kernel methods have gained a great deal of popularity in the machine learning community as a method to learn indirectly in high-dimensional feature spaces. Those interested in relational learning have recently begun to cast learning from structured and relational data in terms of kernel operations. We describe a general family of kernel functions built up from a description language of limited expressivity and use it to study the benefits and drawbacks of kernel learning in relational domains. Learning with kernels in this family directly models learning over an expanded feature space constructed using the same description language. This allows us to examine issues of time complexity in learning with these and other relational kernels, and how these relate to generalization ability. The tradeoff between using kernels in a very high-dimensional implicit space and using a restricted feature space is highlighted through two experiments, in bioinformatics and in natural language processing.
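The abstract's central observation — that a kernel in this family computes the same inner product one would get from an explicit feature expansion under the description language — can be illustrated with a toy sketch. Everything below is a hypothetical illustration, not the paper's actual kernel or feature language: examples are sets of ground atoms, the "description language" is limited to conjunctions of at most `max_conj` atoms, and the kernel counts conjunctions the two examples share.

```python
from itertools import combinations

def expand_features(atoms, max_conj=2):
    """Explicit expansion: every conjunction of up to `max_conj`
    atoms (a stand-in for a description language of limited
    expressivity). Returns the set of active features."""
    feats = set()
    for k in range(1, max_conj + 1):
        for combo in combinations(sorted(atoms), k):
            feats.add(combo)
    return feats

def relational_kernel(atoms_x, atoms_z, max_conj=2):
    """Implicit view: count the conjunctions common to both
    examples. This equals the dot product of the 0/1 indicator
    vectors over the expanded feature space."""
    return len(expand_features(atoms_x, max_conj)
               & expand_features(atoms_z, max_conj))

# Two toy relational examples (hypothetical NLP-style atoms).
x = {"word(the)", "tag(DT)", "before(the,cat)"}
z = {"word(the)", "tag(DT)", "before(the,dog)"}

# The implicit kernel agrees with the explicit dot product.
fx, fz = expand_features(x), expand_features(z)
assert relational_kernel(x, z) == len(fx & fz)
```

The sketch also makes the paper's tradeoff concrete: the explicit expansion grows combinatorially with `max_conj` (the implicit space can be very high-dimensional), while the kernel only ever touches the features both examples activate.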

Original language: English (US)
Title of host publication: Proceedings, Twentieth International Conference on Machine Learning
Editors: T. Fawcett, N. Mishra
Number of pages: 8
State: Published - 2003
Externally published: Yes
Event: Proceedings, Twentieth International Conference on Machine Learning - Washington, DC, United States
Duration: Aug 21 2003 - Aug 24 2003

Publication series

Name: Proceedings, Twentieth International Conference on Machine Learning


Other: Proceedings, Twentieth International Conference on Machine Learning
Country/Territory: United States
City: Washington, DC

ASJC Scopus subject areas

  • General Engineering


