Boosting web image search by co-ranking

Jingrui He, Changshui Zhang, Nanyuan Zhao, Hanghang Tong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

To maximally improve the precision among top-ranked images returned by a web image search engine without putting extra burden on the user, we propose in this paper a novel co-ranking framework which re-ranks the retrieved images to move the irrelevant ones to the tail of the list. The characteristics of the proposed framework can be summarized as follows: (1) making use of the decisions from multiple views of the images to boost retrieval performance; (2) generalizing existing multi-view algorithms, which need labeled data for initialization, to the unsupervised case so that no extra user interaction is required. To implement the framework, we use one-class support vector machines to train the basic learner, and propose different schemes for combination. Experimental results demonstrate the effectiveness of the proposed framework.
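The combination step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the paper trains a one-class SVM per image view, whereas here each view is represented only by hypothetical decision scores, and the combination scheme shown (averaging per-view scores) is one simple possibility among the "different schemes" the paper proposes.

```python
# Hypothetical sketch of the co-ranking combination step.
# Assumption: each view (e.g. color features, texture features) has already
# produced a relevance score per image; the paper obtains such scores from
# one-class SVM learners, which are omitted here.

def co_rank(image_ids, view_scores):
    """Re-rank images by averaging relevance scores across views.

    image_ids   : list of image identifiers in initial search order
    view_scores : list of per-view score lists, aligned with image_ids
                  (higher score = more likely relevant)
    """
    combined = []
    for i, img in enumerate(image_ids):
        # One simple combination scheme: mean of per-view scores.
        score = sum(scores[i] for scores in view_scores) / len(view_scores)
        combined.append((img, score))
    # Sorting by combined score moves irrelevant images to the tail.
    combined.sort(key=lambda pair: pair[1], reverse=True)
    return [img for img, _ in combined]

# Toy example: the two views disagree on img_b and img_c;
# averaging their scores settles the final order.
ids = ["img_a", "img_b", "img_c"]
color_view = [0.9, 0.2, 0.6]    # scores from a color-feature learner
texture_view = [0.8, 0.4, 0.1]  # scores from a texture-feature learner
print(co_rank(ids, [color_view, texture_view]))  # → ['img_a', 'img_c', 'img_b']
```

Here img_c ends up ahead of img_b because its mean score (0.35) exceeds img_b's (0.3), even though the texture view alone would rank it last.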

Original language: English (US)
Title of host publication: 2005 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP '05 - Proceedings - Image and Multidimensional Signal Processing, Multimedia Signal Processing
Pages: II409-II412
State: Published - Dec 1 2005
Externally published: Yes
Event: 2005 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP '05 - Philadelphia, PA, United States
Duration: Mar 18 2005 - Mar 23 2005

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: II
ISSN (Print): 1520-6149


ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering
