Fingerprinting HTTP smuggling

Overview

The Hypertext Transfer Protocol (HTTP) is the foundation of the World Wide Web and is used to load web pages and other resources through hypertext links. When clicking on a link, the user’s browser communicates with a backend server to request a web page. The user expects to connect directly to the server delivering the page but, more often than not, their HTTP request is first handled by some sort of proxy, whether for caching, load balancing, or security reasons.

Such an application-layer proxy interprets the HTTP request and may modify it by inserting, removing, or rewriting headers. It may also have to translate the request and/or the response from one version of HTTP to another when the initial client and the final server do not speak the same version of the protocol (typically HTTP/2 vs. HTTP/1.1).
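To make this concrete, here is a minimal, illustrative sketch (not the code of any specific proxy) of the kind of header rewriting an intermediary performs before forwarding a request upstream: hop-by-hop headers are stripped and the original client address is recorded.

```python
# Illustrative sketch of proxy-side header rewriting; the function name
# and the exact set of rewritten headers are assumptions for illustration.
def rewrite_request(raw: bytes, client_ip: str) -> bytes:
    head, _, body = raw.partition(b"\r\n\r\n")
    request_line, *headers = head.split(b"\r\n")
    # Drop hop-by-hop headers, which must not be forwarded.
    headers = [h for h in headers
               if not h.lower().startswith((b"connection:", b"keep-alive:"))]
    # Record the original client address for the upstream server.
    headers.append(b"X-Forwarded-For: " + client_ip.encode())
    return b"\r\n".join([request_line, *headers]) + b"\r\n\r\n" + body

raw = (b"GET /index.html HTTP/1.1\r\n"
       b"Host: example.com\r\n"
       b"Connection: keep-alive\r\n\r\n")
print(rewrite_request(raw, "203.0.113.7").decode())
```

Every such rewrite is an opportunity for the proxy and the backend server to disagree about what the request means, which is exactly what the attacks below exploit.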

This intermediate processing exposes a fairly large attack surface, and the techniques for abusing it are collectively known as ‘HTTP smuggling’. In this research project, we aim to leverage the architecture developed by Ilies Benhabbour in his PhD thesis (see Detecting semi active intra net components) to systematically detect the presence of such proxies and fingerprint them, in order to test whether they are vulnerable to these attacks and, if so, to which ones.
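As an example of the ambiguity such attacks exploit, the sketch below builds a classic CL.TE probe: a request carrying both a Content-Length and a Transfer-Encoding header. A front end that honours Content-Length forwards the whole message as one request, while a back end that honours Transfer-Encoding sees the chunked body end at "0\r\n\r\n" and treats the trailing bytes as the start of a second, smuggled request. (The payload and target host are hypothetical; this only constructs the bytes, it sends nothing.)

```python
# Hypothetical CL.TE smuggling probe: the two framing headers disagree
# about where the request ends, so a proxy and a backend that prioritise
# different headers will desynchronise.
body = b"0\r\n\r\nSMUGGLED"
probe = (b"POST / HTTP/1.1\r\n"
         b"Host: example.com\r\n"
         b"Content-Length: " + str(len(body)).encode() + b"\r\n"
         b"Transfer-Encoding: chunked\r\n"
         b"\r\n" + body)
print(probe.decode())
```

Whether a given proxy/backend pair is vulnerable to this or other variants (TE.CL, TE.TE, version-downgrade issues) depends on which framing rules each component applies, which is precisely what fingerprinting aims to reveal.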

Contacts