This privacy risk is actually documented in the official HSTS specification itself, in section 14.9 of<p><a href="https://www.rfc-editor.org/rfc/rfc6797.txt" rel="nofollow">https://www.rfc-editor.org/rfc/rfc6797.txt</a><p>However, the spec doesn't propose a mitigation for it. I'm afraid many new security policy mechanisms can be used to track users or devices this way, because you can probe whether the browser has heard about a particular security policy by observing its behavior when you ask it to violate that policy. If you tell different devices about different policies, their behavior will differ (as if you told different kids who were going to visit a park about different rules for how to behave in the park, and then observed who obeyed and who violated which rules as a way of identifying individual kids).<p>For example, you can also get tracking out of public key pinning by selectively pinning certs for some subdomains and not others, and then seeing which subresources load successfully when you present a huge number of pin violations. (I think that's also documented in the HPKP spec.)
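To make the HSTS variant concrete, here's a hypothetical sketch of the read/write bookkeeping (the subdomain names and the 32-bit width are made up, and this is obviously not anyone's production code): the tracker sets HSTS on the subdomains corresponding to the 1 bits of an identifier, then later requests an http:// resource from every subdomain and records which requests come back upgraded to https.<p><pre><code>
BITS = 32
SUBDOMAINS = ["bit%d.tracker.example" % i for i in range(BITS)]

def subdomains_to_pin(identifier):
    # "Write" visit: serve a Strict-Transport-Security header from exactly
    # these subdomains, so the browser remembers a 1 for those bit positions.
    return [SUBDOMAINS[i] for i in range(BITS) if (identifier >> i) & 1]

def identifier_from_upgrades(upgraded_hosts):
    # "Read" visit: given the subdomains whose http:// requests arrived
    # upgraded to https (i.e. the browser had an HSTS entry), rebuild the ID.
    ident = 0
    for i, host in enumerate(SUBDOMAINS):
        if host in upgraded_hosts:
            ident |= 1 << i
    return ident

# Round trip: an ID "written" earlier is recovered from the observed upgrades.
ident = 0xDEADBEEF
assert identifier_from_upgrades(set(subdomains_to_pin(ident))) == ident
</code></pre>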
Firefox stores HSTS entries in a SQLite database, which you can query by running:<p><pre><code> echo "SELECT * FROM moz_hosts WHERE type='sts/use';" | sqlite3 permissions.sqlite
</code></pre>
from inside your profile directory.<p>To clear HSTS entries (which the "Clear recent history" UI does <i>not</i> delete), you can do:<p><pre><code> echo "DELETE FROM moz_hosts WHERE type='sts/use';" | sqlite3 permissions.sqlite
</code></pre>
I've been periodically monitoring this database for HSTS supercookies over the last couple years and have yet to see any in the wild.
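If you want to look for them yourself, a supercookie tends to show up as many sibling subdomains of one base domain all carrying HSTS entries. A rough sketch of that kind of check (the naive base-domain parsing and the threshold are arbitrary, just for illustration):<p><pre><code>
import collections
import sqlite3

# Run from inside the profile directory, same as the queries above.
conn = sqlite3.connect("permissions.sqlite")
hosts = [row[0] for row in
         conn.execute("SELECT host FROM moz_hosts WHERE type='sts/use'")]

# Group by a naive "base domain" (last two labels) and flag domains with
# an unusually large number of HSTS-bearing subdomains.
by_base = collections.Counter(".".join(h.split(".")[-2:]) for h in hosts)
for base, count in by_base.most_common():
    if count > 10:  # arbitrary threshold
        print("suspicious:", base, "-", count, "subdomains with HSTS set")
</code></pre>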
> The impact is that it's possible for a site to track you even if you choose to use "incognito" or "private" browsing features in an effort to avoid such tracking.<p>I've always thought that (despite user hopes) the point of 'private' browsing was explicitly and only to avoid leaving traces on the <i>user's</i> computer anyway. (For example, I used it when shopping for Christmas presents.) The Firefox new private window has a warning to this effect:<p>> While this computer won't have a record of your browsing history, your employer or internet service provider can still track the pages you visit.
>However, unlike cookies, existing HSTS flags are still shared with sites when using "incognito" or "private" windows.<p>FWIW, though Firefox is listed there as "leaks across private mode", I get an entirely new ID when I open a private window (Firefox 34.0.5).
This has been known for quite a while. I managed to find a case where HSTS allowed information leakage between private/non-private frames within the same browser in Firefox, but I think that's been fixed.<p>In general, the browser vendors seem to think that HSTS is worth the potential privacy leak. I've also heard some people say they're monitoring to see if anyone does it and will respond if it becomes a problem.
I'm actually pretty irritated that this researcher presents it as an iOS-only thing; it feels like he/she just didn't care to try anything other than the device they had in front of them.<p>Chrome on Android behaves the same way the researcher described (fingerprinting works in Incognito tabs), but Chrome, Opera, Firefox, and IE on Windows all get different IDs.
There's a nice survey paper from 2012 that lists dozens of supercookie vectors, including HSTS.<p><a href="https://cyberlaw.stanford.edu/files/publication/files/trackingsurvey12.pdf" rel="nofollow">https://cyberlaw.stanford.edu/files/publication/files/tracki...</a><p>FTA: "A website can encode a globally unique pseudonymous device identifier into any stateful web technology so long as it persists at least log2 n bits, where n is the number of Internet-connected devices (presently roughly 5 billion, requiring 33 bits)."
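For concreteness, the 33-bit figure quoted above is just ceil(log2(5 billion)):<p><pre><code>
import math
devices = 5 * 10**9                    # rough count of Internet-connected devices
print(math.ceil(math.log2(devices)))   # 33, since 2**32 (about 4.3e9) < 5e9 <= 2**33
</code></pre>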
I'm not sure I wouldn't actually _expect_ incognito mode to respect HSTS. I'd think that you would use incognito mode for ~sensitive~ tasks.<p>Defaulting to https because of a known HSTS flag seems like a good thing in that case; otherwise every incognito session would start out with a blank HSTS store, right? (I'm ignoring the browser vendor's built-in preload list.)
It sounds like HTTPS Everywhere overlaps in functionality with HSTS. Is there some way HTTPS Everywhere could just inject HSTS rules rather than looking up every URL and rewriting it before sending a request?
It seems like the main use case for HSTS is the site the user requests in the URI bar, to protect the cookies and login credentials associated with that domain.<p>There doesn't seem to be a major use case for secondary resources: the images, css, javascript, etc. loaded by the page itself, which serve as the vector in this attack. Such resources must be requested via https on an https site anyway.<p>So wouldn't it be better to restrict HSTS protocol overrides to just the main domain the user requests in the URI bar?
Am I right in understanding that this can even work as a server-side cookie, i.e. that it can't be killed by disabling javascript (since the server can see whether each request arrived over http or https)?
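Something like this is what I have in mind (a hypothetical Flask sketch, assuming a TLS-terminating proxy that sets X-Forwarded-Proto; the routes and subdomain scheme are made up):<p><pre><code>
from flask import Flask, request

app = Flask(__name__)

def over_https():
    # Behind a TLS-terminating proxy, X-Forwarded-Proto says how the
    # browser actually made the request.
    return request.headers.get("X-Forwarded-Proto") == "https"

@app.route("/write/<int:i>")
def write(i):
    # The "write" page loads https://bitN.tracker.example/write/N only for
    # the bits that should be 1; the HSTS header makes the browser treat
    # that subdomain as https-only from then on.
    resp = app.make_response(b"GIF89a")
    if over_https():
        resp.headers["Strict-Transport-Security"] = "max-age=31536000"
    return resp

@app.route("/read/<int:i>")
def read(i):
    # The "read" page embeds http://bitN.tracker.example/read/N for every
    # bit; a request that arrives upgraded to https reveals a stored HSTS
    # entry, i.e. a 1. No JavaScript involved anywhere.
    app.logger.info("bit %d = %d", i, int(over_https()))
    return app.make_response(b"GIF89a")
</code></pre>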
Wow, I can't get it to go away on Firefox 33.0 on Ubuntu. I was able to clear it by manually deleting the entries from the permissions.sqlite database as described by agwa.<p>Very clever!
IMHO it's not practical to perform dozens of http requests just to create a device fingerprint; on mobile networks especially, it will take a long time for all of the requests to complete.