TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.

Bucket Stream: Finding S3 Buckets by watching certificate transparency logs

122 points | by Chris911 | over 7 years ago | 9 comments

matt_wulfeck, over 7 years ago

> *Randomise your bucket names! There is no need to use company-backup.s3.amazonaws.com*

This is really poor advice. It offers no real benefit, especially since any asset you access will betray your bucket name: it's part of the DNS resolution. Bucket names are emphatically public, just as DNS names are public.
[Comment #15828919 not loaded]
[Comment #15828488 not loaded]
[Comment #15828425 not loaded]
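The point that the bucket name rides along in DNS can be seen from the URL shape itself: in virtual-hosted-style requests the bucket is the leading hostname label, so every lookup exposes it. A minimal sketch (the function name is mine):

```python
from urllib.parse import urlparse

def bucket_from_url(url: str) -> str:
    """Extract the bucket name from a virtual-hosted-style S3 URL.

    The bucket is the hostname label before ".s3" (or ".s3-<region>"),
    so any DNS resolution of the URL reveals it; randomising the name
    only hides the bucket until the first asset is served from it.
    """
    host = urlparse(url).hostname or ""
    if host.endswith(".amazonaws.com"):
        bucket, sep, _ = host.partition(".s3")
        if sep:  # the ".s3" separator was present
            return bucket
    return ""

print(bucket_from_url("https://company-backup.s3.amazonaws.com/dump.sql"))
# company-backup
```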
feelin_googley, over 7 years ago

Could be more general: finding subdomains by watching CT logs.

So what is the problem here? How to "hide" private subdomains? How to "securely" configure S3 buckets?

IMO, the problem is in the use of the CA system, where control over "names" (e.g. subdomains) is shared with third parties (certificate issuers) instead of resting solely with the user who wants to reserve names.

It is possible to have a non-CA PKI system where the user controls both the issuance of the public key *and* the associated name she will use. In such a system, no third party has control over names. People learn the user's name and the user's key from the same source: the user.

Thus there is no issue of trust in third parties, and no need to monitor which names those third parties are issuing, e.g. via "certificate transparency" logs. CT logs would not need to exist.

This is not a new idea and it has been proven to work. I can prepare a post with examples if anyone is interested.
[Comment #15827969 not loaded]
[Comment #15828849 not loaded]
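The generalisation cuts both ways: a CT-log watcher can turn every hostname it sees in the certificate stream into candidate bucket names to probe. A hedged sketch of that candidate-generation step (the permutation list is illustrative, not Bucket Stream's actual rules):

```python
def candidates(domain: str) -> list[str]:
    """Derive plausible S3 bucket names from a domain seen in a CT log.

    A watcher would then probe https://<candidate>.s3.amazonaws.com
    for each result. The suffix list here is an assumption for
    illustration; real tooling uses larger permutation sets.
    """
    # Strip any wildcard prefix ("*.company.com" -> "company.com")
    # and keep the leftmost label as the organisation name.
    base = domain.lower().lstrip("*.").split(".")[0]
    suffixes = ["", "-backup", "-assets", "-dev", "-prod"]
    return [base + s for s in suffixes]

print(candidates("*.company.com"))
# ['company', 'company-backup', 'company-assets', 'company-dev', 'company-prod']
```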
notyourwork, over 7 years ago

> Randomise your bucket names! There is no need to use company-backup.s3.amazonaws.com.

I don't think this is a globally true statement. Random bucket names are hard to work with: not everyone uses S3 from code with configuration, so a memorable bucket name actually matters.
jstanley, over 7 years ago

Passive DNS might be another good way to get S3 bucket names.

There doesn't seem to be a Wikipedia article on Passive DNS, but this article explains it quite well: https://help.passivetotal.org/passive_dns.html

Basically, some resolvers submit all (some?) of their DNS query responses to a central database so that it can be searched later. It seems you can also install a passive "sensor" in your network that (presumably) passively MITMs DNS queries and then sends off the responses.

I don't know how hard it is to get access to the data, but:

> programs like RiskIQ's DNSIQ allow organizations to install a sensor on their network that reports back to RiskIQ and in exchange, the organization gains access to all the passive DNS traffic inside the central repository.

EDIT: VirusTotal has some passive DNS data publicly available: e.g. look in "observed subdomains" at https://www.virustotal.com/en/domain/s3-us-west-2.amazonaws.com/information/

EDIT2: And a bunch of them appear to be unprotected...
jcims, over 7 years ago

I did some analysis a few months ago and collected the names of approximately 100,000 buckets in the wild. Rough numbers: about 5% are open to the public for anonymous read, and about 5% of those are open for anonymous write.

I'm convinced that Chris Vickery, the guy behind a good many of the open-bucket finds this year, has access to enterprise firewall/proxy logs. Not because the buckets would have been hard to find, but because you could spend a lifetime looking through thousands upon thousands of open buckets before you find anything interesting.
kaivi, over 7 years ago

Love stuff like this! I've quickly wrapped a prettifier for S3 XML listings in a userscript, so you can use it with Tampermonkey Beta. Tested on Chrome under OS X.

https://gist.github.com/kaivi/8114cbc2080da78d67c94238af64210d

Edit: Okay, the userscript won't run on larger XML files; I'll have to figure that out later.
michaelbuckbee, over 7 years ago

This is concerning because there have been a number of high-profile data breaches caused by over-reliance on S3 bucket obscurity: buckets left with minimal or misconfigured permissions and GBs of data there for the downloading.
[Comment #15828149 not loaded]
[Comment #15828303 not loaded]
[Comment #15831181 not loaded]
realusername, over 7 years ago

I was curious, so I tried to see if I could find anything compromising with it, and it's mostly just public buckets of images used for websites, nothing strange. Maybe the README is a bit too dramatic.
ceejayoz, over 7 years ago

I'm confused. Aren't S3 buckets secured by pre-existing wildcard certs?
[Comment #15827028 not loaded]
[Comment #15827646 not loaded]