The Heisenbot uncertainty problem: Challenges in separating bots from chaff

Chris Kanich, Kirill Levchenko, Brandon Enright, Geoffrey M. Voelker, Stefan Savage

Research output: Contribution to conference › Paper › peer-review

Abstract

In this paper we highlight a number of challenges that arise in using crawling to measure the size, topology, and dynamism of distributed botnets. These challenges include traffic due to unrelated applications, address aliasing, and other active participants on the network such as poisoners. Based upon experience developing a crawler for the Storm botnet, we describe each of the issues we encountered in practice, our approach for managing the underlying ambiguity, and the kind of errors we believe it introduces into our estimates.

Original language: English (US)
State: Published - 2008
Externally published: Yes
Event: 1st USENIX Workshop on Large-Scale Exploits and Emergent Threats: Botnets, Spyware, Worms, and More, LEET 2008 - San Francisco, United States
Duration: Apr 15 2008 → …

ASJC Scopus subject areas

  • Information Systems
  • Artificial Intelligence
  • Computer Science Applications
