So this is where an attacker profiles a target website, crawling through it and recording, for each page, the requests it triggers and the number and size of the responses that come back. In other words, something like "going to page X triggers Y separate requests of a particular size". The number and size of the resources requested are likely to differ between pages, and the pattern of page progression also helps pinpoint where on the site a user is, because users typically follow a predictable sequence of page visits; that is, after all, how the site is designed to be navigated.
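To make that concrete, here's a rough sketch of the profiling idea in Python. All the page names and byte counts are made up for illustration; the point is just that a fingerprint per page, matched against the sizes seen on the wire, is enough to guess the page without decrypting anything.

```python
# Fingerprints gathered during the profiling phase: page -> list of
# observed response sizes in bytes (illustrative values only).
fingerprints = {
    "/home":    [14200, 3100, 3100, 980, 512],
    "/pricing": [22050, 7800, 980, 512],
    "/account": [9400, 4400, 4400, 980],
}

def guess_page(observed_sizes):
    """Return the profiled page whose size pattern best matches the
    sizes seen on the wire (count must match, sizes compared loosely)."""
    best_page, best_score = None, float("inf")
    observed = sorted(observed_sizes)
    for page, sizes in fingerprints.items():
        if len(sizes) != len(observed):
            continue  # different number of requests -> different page
        score = sum(abs(a - b) for a, b in zip(sorted(sizes), observed))
        if score < best_score:
            best_page, best_score = page, score
    return best_page

# An eavesdropper who sees five responses of roughly these sizes would
# guess the victim loaded /home, without ever decrypting the traffic.
print(guess_page([512, 985, 3099, 3105, 14180]))  # -> /home
```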
Clever enough stuff, but it does require that the site has already been profiled, probably extensively and more than once... and no doubt re-profiled regularly in case the site changes. That requirement limits this attack vector quite substantially.
The fix, of course, is either to make the page progression vary (pissing off users and making the website hard to use) or to vary the number and size of requests for each page as part of a site-wide randomisation (or normalisation) plan. If the website always produces, e.g., 25 requests of a consistent size for every page, then it'll be impossible to track the page progression from the traffic alone.
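A minimal sketch of that normalisation idea, assuming you control the server's response path (the bucket size, the 25-request target and the /pad/ URLs are all illustrative, not any particular framework's API): pad every response body up to a fixed bucket, and top every page up to the same number of fetches with throwaway resources, so neither count nor size distinguishes pages.

```python
BUCKET = 16 * 1024      # pad every response up to a 16 KiB multiple
REQUESTS_PER_PAGE = 25  # every page triggers exactly this many fetches

def pad_body(body: bytes) -> bytes:
    """Pad a response body with junk bytes to the next BUCKET boundary."""
    shortfall = -len(body) % BUCKET
    return body + b"\0" * shortfall

def dummy_resources(real_count: int) -> list[str]:
    """URLs of throwaway resources the page should also request so the
    total request count is constant regardless of the page's content."""
    missing = max(0, REQUESTS_PER_PAGE - real_count)
    return [f"/pad/{i}" for i in range(missing)]

# Example: a 3100-byte stylesheet leaves the server as exactly 16 KiB,
# and a page with 7 real resources also fetches 18 dummy ones.
print(len(pad_body(b"x" * 3100)))  # -> 16384
print(len(dummy_resources(7)))     # -> 18
```

The padding obviously costs bandwidth, which is the usual trade-off with this kind of traffic-shaping defence.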