Abstract
In this paper we highlight a number of challenges that arise in using crawling to measure the size, topology, and dynamism of distributed botnets. These challenges include traffic due to unrelated applications, address aliasing, and other active participants on the network such as poisoners. Based on our experience developing a crawler for the Storm botnet, we describe each of the issues we encountered in practice, our approach for managing the underlying ambiguity, and the kinds of errors we believe each introduces into our estimates.
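To make the address-aliasing challenge concrete, below is a minimal sketch (not the authors' implementation) of a breadth-first peer-to-peer crawl. It assumes a hypothetical `get_peers` query standing in for a real protocol request (e.g., an Overnet route request), and shows why peer-ID counts and IP counts diverge when several IDs sit behind one address (NAT) or one host reappears under many IDs.

```python
# Hypothetical sketch of a breadth-first P2P crawl, illustrating how
# address aliasing skews population estimates. `get_peers` is a mock
# stand-in for a real protocol query.
from collections import deque

def get_peers(peer_id):
    """Hypothetical protocol query: returns (peer_id, ip) neighbors."""
    # Mock topology: id-c and id-d share one IP (NAT-style aliasing).
    topology = {
        "id-a": [("id-b", "10.0.0.2"), ("id-c", "10.0.0.3")],
        "id-b": [("id-c", "10.0.0.3")],
        "id-c": [("id-d", "10.0.0.3")],
        "id-d": [],
    }
    return topology.get(peer_id, [])

def crawl(seed_id):
    """Breadth-first crawl that tracks both peer IDs and distinct IPs."""
    seen_ids, seen_ips = {seed_id}, set()
    queue = deque([seed_id])
    while queue:
        current = queue.popleft()
        for peer_id, ip in get_peers(current):
            seen_ips.add(ip)
            if peer_id not in seen_ids:
                seen_ids.add(peer_id)
                queue.append(peer_id)
    return seen_ids, seen_ips

if __name__ == "__main__":
    ids, ips = crawl("id-a")
    # The two counts disagree whenever aliasing is present, so neither
    # is by itself a reliable estimate of the bot population.
    print(f"{len(ids)} peer IDs observed at {len(ips)} distinct IPs")
```

Running this mock crawl reports 4 peer IDs at only 2 distinct IPs; a real crawler faces the same gap at scale, which is one source of the estimation error the paper discusses.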
| Original language | English (US) |
| --- | --- |
| State | Published - 2008 |
| Externally published | Yes |
| Event | 1st USENIX Workshop on Large-Scale Exploits and Emergent Threats: Botnets, Spyware, Worms, and More, LEET 2008 - San Francisco, United States |
| Duration | Apr 15 2008 → … |
Conference
| Conference | 1st USENIX Workshop on Large-Scale Exploits and Emergent Threats: Botnets, Spyware, Worms, and More, LEET 2008 |
| --- | --- |
| Country/Territory | United States |
| City | San Francisco |
| Period | 4/15/08 → … |
ASJC Scopus subject areas
- Information Systems
- Artificial Intelligence
- Computer Science Applications