科技回声 (Tech Echo)

A tech news platform built with Next.js, providing global tech news and discussion.


© 2025 科技回声 (Tech Echo). All rights reserved.

Hash collisions and exploitations (2019)

117 points · by lnyan · over 1 year ago

3 comments

RcouF1uZ4gsC · over 1 year ago
I personally find it amazing that everything we have made - every photograph, every book, every movie, and everything we will ever make can be given a unique 256 bit fingerprint that has a negligible chance of collision before the heat death of the universe.
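The "negligible chance of collision" claim can be made concrete with the standard birthday-bound approximation. A minimal sketch (the 10^15 object count is an assumption for illustration, not a real inventory of human artifacts):

```python
def collision_probability(n_items: int, hash_bits: int) -> float:
    # Birthday-bound approximation: for n items hashed into a b-bit
    # space, the collision probability is roughly n^2 / 2^(b+1).
    # Valid when the result is much smaller than 1.
    return n_items ** 2 / 2 ** (hash_bits + 1)

# Hypothetical: 10^15 distinct objects (every photo, book, movie, ...)
p = collision_probability(10 ** 15, 256)
print(p)  # on the order of 1e-48 -- effectively zero
```

Even with a wildly generous count of objects, the probability stays dozens of orders of magnitude below anything observable, which is the intuition behind the comment.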
jedisct1 · over 1 year ago
I still frequently see MD5 and SHA1 being used "because the output is smaller than other hash functions and we only need a unique identifier". There's also a belief that an implication is that these functions are faster.

While a 128-bit output is indeed perfectly fine for most applications, MD5 and SHA1, in addition to being affected by practical attacks, are *slow* compared to SHA256, BLAKE, etc. Most importantly, it's perfectly fine to truncate the output of a cryptographic hash function to any length. So, if you need a 128-bit hash, just use SHA256 and truncate the output to 128 bits. This is faster than MD5 and more secure (even against length extension attacks).
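The truncation advice above can be sketched with Python's standard `hashlib`; the function name `short_id` is made up for illustration:

```python
import hashlib

def short_id(data: bytes) -> bytes:
    # Take the first 16 bytes (128 bits) of SHA-256: same identifier
    # size as MD5, but backed by an unbroken hash function. Truncation
    # also removes the internal state that length-extension attacks need.
    return hashlib.sha256(data).digest()[:16]

fingerprint = short_id(b"hello world")
print(fingerprint.hex())  # 32 hex chars, i.e. a 128-bit identifier
```

The truncated output is exactly the size an MD5-based scheme would expect, so it can often be dropped in without changing storage formats.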
randomstring · over 1 year ago
I wonder if you could construct a hash collision for high-pagerank sites in the google (or Bing) index. You would need to know what hash algorithm google uses to store URLs. This is assuming that they hash the URLs for their indexing. Which surely they do. MD5 and SHA1 existed when google was founded, but hash collisions weren't a big concern until later IIRC. You'd want a fast algorithm because you're having to run your hashing algorithm on every URL you encounter on every page, and that adds up quickly.

The max legal length of URLs is 2048, but I wouldn't be surprised if there are plenty of non-compliant URLs longer than that in the wild. If you were limited to 2048 characters and a valid URL format, I suspect it would be hard if not impossible to build a URL with the same MD5 as an arbitrary high-ranking URL like https://nytimes.com/. But what if you just wanted to piggyback on the pagerank of any mid- to high-rank site? Is there a URL in the top million ranked URLs you could MD5-hash-collide with?

I doubt google would use a URL hash as strong and as slow as MD5. Maybe Page and Brin weren't even thinking about cryptographic hashes, and just a good mixing function.
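As a sketch of why such a collision would matter, assuming a hypothetical index keyed directly by MD5 (every name and URL here is invented for illustration; the comment only speculates that search engines work this way):

```python
import hashlib

def url_key(url: str) -> str:
    # Hypothetical: key index entries by the MD5 of the URL, as the
    # comment speculates an early search engine might have done.
    return hashlib.md5(url.encode("utf-8")).hexdigest()

index: dict[str, str] = {}
index[url_key("https://nytimes.com/")] = "high-pagerank entry"

# If an attacker could craft a different URL whose MD5 matches the
# victim's, its key would land on the victim's slot and share (or
# clobber) that entry -- the piggybacking attack described above.
crafted = "https://attacker.example/colliding-url"  # hypothetical, no real collision
if url_key(crafted) in index:
    print("collision: crafted URL shares the victim's index slot")
```

The 2048-character budget matters because known MD5 collision constructions need room for attacker-controlled blocks, and producing a *second preimage* for one fixed URL (rather than a collision between two chosen URLs) remains infeasible even for MD5.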