Abstract
Recent advances in multimedia research have generated a large collection of concept models, e.g., LSCOM and MediaMill 101, which have become accessible to other researchers. While most current research effort still focuses on building new concepts from scratch, little has been done to construct new concepts from the existing models already in the warehouse. To address this issue, we develop a new framework in this paper, termed LEGO, that seamlessly integrates both the new target training examples and the existing primitive concept models. LEGO treats the primitive concept models as LEGO bricks from which a potentially unlimited vocabulary of new concepts can be constructed. Specifically, LEGO first formulates logic operations as the LEGO connectors that combine existing concept models hierarchically into probabilistic logic ontology trees. LEGO then simultaneously incorporates the new target training information to efficiently disambiguate the underlying logic tree and correct error propagation. We present extensive experimental results on a large vehicle-domain data set from ImageNet, and demonstrate significantly superior performance over existing state-of-the-art approaches that build new concept models from scratch.
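The core idea of combining primitive concept scores through logic connectors can be sketched with standard probabilistic logic under a conditional-independence assumption (AND as a product, OR as noisy-OR, NOT as complement). The tree structure, concept names, and scores below are illustrative assumptions, not details taken from the paper:

```python
def p_and(probs):
    # P(A AND B) = P(A) * P(B), assuming independence
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(probs):
    # Noisy-OR: P(A OR B) = 1 - (1 - P(A)) * (1 - P(B))
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

def p_not(p):
    return 1.0 - p

def evaluate(node, scores):
    """Recursively evaluate a logic ontology tree.

    Leaves are primitive concept names looked up in `scores`;
    internal nodes are (operator, children) pairs.
    """
    if isinstance(node, str):
        return scores[node]
    op, children = node
    if op == "AND":
        return p_and(evaluate(c, scores) for c in children)
    if op == "OR":
        return p_or(evaluate(c, scores) for c in children)
    if op == "NOT":
        return p_not(evaluate(children[0], scores))
    raise ValueError(f"unknown operator: {op}")

# Hypothetical new concept built from existing primitive models:
# "pickup truck" = vehicle AND (truck OR open_cargo_bed)
tree = ("AND", ["vehicle", ("OR", ["truck", "open_cargo_bed"])])
scores = {"vehicle": 0.9, "truck": 0.6, "open_cargo_bed": 0.7}
print(round(evaluate(tree, scores), 4))  # 0.9 * (1 - 0.4 * 0.3) = 0.792
```

In the paper's framework the tree is additionally disambiguated and corrected with the target training examples; this sketch only shows the connector semantics.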
| Original language | English (US) |
|---|---|
| Article number | 6729585 |
| Pages (from-to) | 979-984 |
| Number of pages | 6 |
| Journal | Proceedings - IEEE International Conference on Data Mining, ICDM |
| DOIs | |
| State | Published - 2013 |
| Event | 13th IEEE International Conference on Data Mining, ICDM 2013 - Dallas, TX, United States |
| Duration | Dec 7 2013 → Dec 10 2013 |
Keywords
- Concept recycling
- Logical operations
- Model warehouse
- Multimedia LEGO
- Probabilistic logic ontology tree
ASJC Scopus subject areas
- General Engineering
Title
Multimedia LEGO: LEarning structured model by probabilistic loGic Ontology tree