Shy Guys: A Light-Weight Approach to Detecting Robots on Websites

Rémi Van Boxem
Tom Barbette
Cristel Pelsser
Ramin Sadre
Main: 7 pages, 6 figures, 3 tables; Bibliography: 2 pages; Appendix: 1 page
Abstract

Automated bots now account for roughly half of all web requests, and an increasing number deliberately spoof their identity, either to evade detection or to disregard this http URL. Existing countermeasures are either resource-intensive (JavaScript challenges, CAPTCHAs), cost-prohibitive (commercial solutions), or degrade the user experience. This paper proposes a lightweight, passive approach to bot detection that combines user-agent string analysis with favicon-based heuristics, operating entirely on standard web server logs with no client-side interaction. We evaluate the method on over 4.6 million requests containing 54,945 unique user-agent strings, collected from websites hosted around the world. Our approach detects 67.7% of bot traffic while maintaining a false-positive rate of 3%, outperforming the state of the art (which detects less than 20%). This method can serve as a first line of defence, routing only genuinely ambiguous requests to active challenges and preserving the experience of legitimate users.
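The abstract does not detail the exact heuristics, but the general idea of combining user-agent analysis with a favicon signal over server logs can be sketched as follows. This is a hypothetical illustration, not the paper's actual method: the bot signature regex, the tuple-based log format, and the `classify` function are all assumptions for the sake of the example. The favicon intuition is that mainstream browsers automatically request `/favicon.ico`, while many simple bots never do.

```python
import re
from collections import defaultdict

# Hypothetical sketch (not the paper's implementation): label a client as a
# bot if its user-agent matches common bot signatures, and as suspicious if
# it browses pages but never fetches the favicon, which real browsers
# typically request automatically.

BOT_UA = re.compile(r"(bot|crawler|spider|curl|wget|python-requests)", re.I)

def classify(requests):
    """requests: iterable of (client_ip, user_agent, path) tuples from a log."""
    fetched_favicon = defaultdict(bool)
    user_agents = {}
    for ip, ua, path in requests:
        user_agents[ip] = ua
        if path == "/favicon.ico":
            fetched_favicon[ip] = True
    labels = {}
    for ip, ua in user_agents.items():
        if BOT_UA.search(ua):
            labels[ip] = "bot"          # self-declared or known bot UA
        elif not fetched_favicon[ip]:
            labels[ip] = "likely-bot"   # no favicon request: suspicious
        else:
            labels[ip] = "likely-human" # browser-like behaviour
    return labels
```

Such a purely passive classifier fits the paper's stated role as a first line of defence: only clients that end up in an ambiguous bucket would be escalated to active challenges.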
