科技回声 (Tech Echo)

A tech news platform built with Next.js, providing global tech news and discussion.


Mutable Value Chains

28 points | by c-rack | almost 10 years ago

5 comments

gardnr · almost 10 years ago

It sounds smart. Just swap out the sha1 for public-key signatures and rename the "value chain" to "block chain" and Bob's your uncle.

No but seriously: "SHA which I have chosen in this article has 160 bits, so after more than 2^160 entries have been put into the DHT collisions will occur"

Collisions are largely related to the content that you are hashing. If you are hashing sequential 160-bit chunks of data then Joe's math is theoretically correct. If you are hashing DVD ISO images then you can use math to predict why you should never use sha1.

If you needed to cram it into 160 bits for some hardware reason then you could implement some form of write-failure recovery to redistribute writes over the remaining available key space, but this would get slower and slower the less space you have, unless you implement some kind of free-space map.
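The collision worry gardnr pushes back on can be quantified with the standard birthday approximation (a sketch of the usual bound, not the article's own analysis): for n uniformly random 160-bit keys, P(collision) ≈ n² / 2¹⁶¹.

```python
# Birthday-bound sketch (standard approximation, not from the article):
# for n uniformly random b-bit keys, P(at least one collision) ~ n^2 / 2^(b+1).
def collision_probability(n: int, bits: int = 160) -> float:
    return n * n / 2 ** (bits + 1)

# Even after 10^18 entries, a SHA-1-sized key space is nowhere near exhausted.
p = collision_probability(10**18)
print(f"{p:.2e}")  # on the order of 1e-13
```

So for any realistic chain length, accidental key collisions are negligible; the practical risk with SHA-1 is deliberate collision attacks, not exhaustion of the key space.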
amelius · almost 10 years ago

> I've been wondering how to do this for years, and suddenly it occurred to me how to do this. One way that doesn't work is to view the sequence of values as some kind of linked list, where each value has an invisible write-once-only pointer to the next value. The write-once-pointer gets updated when the value is updated, unfortunately this is difficult to implement and there are problems with consistency.

Why?

> The solution is blindingly simple (once you see it) - we represent the sequence as a set of Key-Value pairs where the Keys are taken from an iterated sequence of SHA1 checksums. The first key is a random number, thereafter Key[i] = sha(Key[i-1]).

Why not simply use Key[0] = 0 and Key[i] = Key[i-1] + 1, or, in other words, Key[i] = i?
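For concreteness, the scheme amelius quotes can be sketched in a few lines of Python. A plain in-memory dict stands in for the DHT, and `next_key` is a hypothetical helper name, not something from the article:

```python
import hashlib
import os

def next_key(key: bytes) -> bytes:
    """Derive the next key in the chain: Key[i] = sha1(Key[i-1])."""
    return hashlib.sha1(key).digest()

genesis = os.urandom(20)  # Key[0]: a random 160-bit value
store = {}                # stand-in for the DHT / KV storage

# Append three successive versions of the value under iterated keys.
key = genesis
for version in ["v1", "v2", "v3"]:
    store[key] = version
    key = next_key(key)

# A reader that knows Key[0] walks the chain until a key is absent;
# the last value found is the current one.
key, current = genesis, None
while key in store:
    current = store[key]
    key = next_key(key)
print(current)  # -> v3
```

One answer to amelius's question is that sequential integer keys are trivially guessable and collide across independent chains, whereas a random Key[0] makes the whole key sequence unguessable to anyone not told the starting point.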
contravariant · almost 10 years ago

> If the sequence of X's is huge then multiple workers can be started tell them to start computing values at different points in the value chain.

How? All the keys need to be calculated sequentially, which, depending on the application, can take a significant amount of time. Worse, this time grows linearly with the number of cores, so massively parallel calculations with several hundreds or thousands of cores will become increasingly inefficient.
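contravariant's objection can be made concrete: to start at position p in the chain, a worker must first perform p hashes from Key[0], because each key depends on the previous one. A sketch of the setup cost (the `iterate` helper is illustrative, not from the article):

```python
import hashlib

def iterate(key: bytes, steps: int) -> bytes:
    """Apply Key[i] = sha1(Key[i-1]) `steps` times; cost is linear in `steps`."""
    for _ in range(steps):
        key = hashlib.sha1(key).digest()
    return key

# Splitting N entries over W workers: worker w must first burn w * (N // W)
# hashes just to reach its starting key, so the combined setup cost is about
# N * (W - 1) / 2 hashes -- it grows with the worker count, as the comment notes.
N, W = 1_000, 4
key0 = b"\x00" * 20  # placeholder Key[0] for illustration
starts = [iterate(key0, w * (N // W)) for w in range(W)]
```

The only obvious mitigation is for the writer to record periodic checkpoint keys out of band, which trades storage for the ability to hand workers precomputed starting points.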
kushti · almost 10 years ago

So Datomic over KV storage or DHT? Could be useful, given a very efficient implementation.
pjc50 · almost 10 years ago
Part CRDT, part blockchain.