Fuzz me wrong – How QuickCheck destroyed my favourite theory

126 points by lrngjcb over 4 years ago

7 comments

Aissen over 4 years ago
There was a great introduction to QuickCheck and many other interesting concepts at 36c3: https://www.youtube.com/watch?v=_2tK9G7rKQQ

I'd recommend it if you don't understand much of the article.
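For readers new to QuickCheck, a minimal sketch of what it does (the property and names here are illustrative, not taken from the article or the talk):

    import Test.QuickCheck

    -- QuickCheck generates random inputs and checks that the stated
    -- property holds for all of them, reporting a (shrunk)
    -- counterexample when one fails.
    prop_reverseInvolutive :: [Int] -> Bool
    prop_reverseInvolutive xs = reverse (reverse xs) == xs

    main :: IO ()
    main = quickCheck prop_reverseInvolutive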
dreamcompiler over 4 years ago
To me this says the parallel library the author described isn't as efficient as it could be because it's enforcing an unexpected ordering constraint.
millstone over 4 years ago
It seems like the monoid-thinking just obscures things. Positive integers under addition are not a monoid (no identity), but you can still map-reduce their sum of squares. All that you really need is associativity, and associative operations are associative.
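A minimal Haskell sketch of that point (illustrative, not from the article): strictly positive integers under (+) form only a semigroup, yet a non-empty collection can still be map-reduced, because foldr1 needs no identity element.

    import Data.List.NonEmpty (NonEmpty (..))
    import qualified Data.List.NonEmpty as NE

    -- (+) is associative but has no identity among the strictly
    -- positive integers, so there is no monoid; reducing a
    -- non-empty list needs only the associativity.
    sumOfSquares :: NonEmpty Int -> Int
    sumOfSquares = foldr1 (+) . NE.map (^ 2)

    -- e.g. sumOfSquares (1 :| [2, 3]) == 14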
leblancfg over 4 years ago
That’s a great article! I have no background in either Haskell or CS but was captivated to read until the end. Great job!

Nit: I think this would read better if the author replaced “personal theory” with “hypothesis”.
dan-robertson over 4 years ago
There seems to be a lot of work to get not very much. Maybe the point of this article is more about faffing around with QuickCheck, but the conclusion should basically be:

1. Parallel [efficient] MapReduce requires a commutative monoid for deterministic results.

2. A naive implementation of parallel MapReduce in Haskell using par doesn't, because it will only do reduce operations in a certain order. Imagine if the map for every second element took 0s and the others took 10s. In normal MapReduce, you would reduce together the fast half of the results as they come in, but in this Haskell implementation you would need to sit on them and wait until the other adjacent elements come in. In the article's very naive implementation, you can't even reduce the first two elements together until you've reduced everything after them (modulo some weirdness around lazy evaluation); see the sketch below.

I think if one thinks of MapReduce as an operation on sets, not lists, it should be obvious that the monoid must be commutative.
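A minimal sketch of that second point, assuming Control.Parallel's par and pseq (illustrative, not the article's actual code): the fold nests right-to-left, so the head element cannot be combined with anything until the entire tail has been reduced.

    import Control.Parallel (par, pseq)

    -- Naive parallel map-reduce: spark each mapped element, but fold
    -- in list order. Each element can only be combined with the fully
    -- reduced tail, so a slow spark late in the list delays every
    -- combine before it, and the combining order is fixed.
    parMapReduce :: (a -> b) -> (b -> b -> b) -> b -> [a] -> b
    parMapReduce _ _ z []       = z
    parMapReduce f g z (x : xs) =
      let y  = f x
          ys = parMapReduce f g z xs
      in  y `par` (ys `pseq` g y ys)

Because g is always applied in this fixed nesting, determinism here needs only associativity; the commutativity requirement appears once results are combined in completion order, as an efficient MapReduce would do.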
rmorey over 4 years ago
oh, so not the convenience store...
tonetheman over 4 years ago
Another article that I am too stupid to understand... how do you get anything done in Haskell...