> Historically software was always low on memory, and subsequently sorting algorithms were developed that bent over backwards in order to use as little additional memory as possible. Nowadays most modern systems have several gigabytes of memory that are not used and are just sitting idle.<p>True, but on these systems, most of the time, users don't care about performance. And when they do, the dataset is usually not large enough for this kind of algorithm to shine anyway.
> Wolfsort requires 8 times the array size in swap memory for the O(n) partitioning process.<p>> Wolfsort is primarily a proof of concept, currently it only properly supports unsigned 32 bit integers.<p>A lot of sorts can beat a generic sort on an array of numbers by gathering statistics, paying better attention to locality, exploiting implementation details like SIMD, or some combination of the three.
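To make the "gathering statistics" point concrete: scan once for the min and max, and if the keys turn out to span a narrow range, a counting sort wins easily over any generic comparison sort. This is just a sketch of the idea, not anything Wolfsort itself does; `range_aware_sort_u32` and the 4×n range cutoff are made up for the example.

```c
#include <stdint.h>
#include <stdlib.h>

/* Returns 1 if it sorted the array, 0 if the caller should fall back
 * to a generic comparison sort. Illustrative sketch only. */
int range_aware_sort_u32(uint32_t *a, size_t n)
{
    if (n < 2) return 1;

    uint32_t lo = a[0], hi = a[0];
    for (size_t i = 1; i < n; i++) {        /* one cheap statistics pass */
        if (a[i] < lo) lo = a[i];
        if (a[i] > hi) hi = a[i];
    }

    uint64_t range = (uint64_t)hi - lo + 1;
    if (range > 4 * (uint64_t)n)            /* keys too spread out: give up */
        return 0;

    size_t *count = calloc((size_t)range, sizeof *count);
    if (!count) return 0;

    for (size_t i = 0; i < n; i++)          /* O(n) histogram */
        count[a[i] - lo]++;

    size_t out = 0;
    for (uint64_t v = 0; v < range; v++)    /* emit keys in sorted order */
        for (size_t c = count[v]; c > 0; c--)
            a[out++] = (uint32_t)(lo + v);

    free(count);
    return 1;
}
```

No comparisons at all, just a histogram and two linear passes, but only because the routine first looked at the data and confirmed the keys are dense enough.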
A bit OT: I very much appreciate well-written C code for its simplicity and "rawness", but how come many C codebases seem to place no importance on code comments?<p>This repo contains a few thousand lines of code without so much as a single comment. In my experience that tends to happen much less in other languages. And it's not like this C code is obvious.
It seems to me that it's using a strategy similar to SpreadSort, or am I wrong?<p>https://en.wikipedia.org/wiki/Spreadsort
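For comparison, the strategy Spreadsort is built around looks roughly like this: scatter keys into buckets by their most significant bits, then finish each bucket with an ordinary comparison sort. This is a simplified sketch of that general idea only; I'm not claiming it matches Wolfsort's actual partitioning code.

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

static int cmp_u32(const void *a, const void *b)
{
    uint32_t x = *(const uint32_t *)a, y = *(const uint32_t *)b;
    return (x > y) - (x < y);
}

/* Scatter keys into 256 buckets by their top byte, then finish each
 * bucket with qsort. Simplified illustration of the bucketing idea. */
void bucket_then_sort_u32(uint32_t *a, size_t n)
{
    size_t count[257] = {0};
    uint32_t *buf = malloc(n * sizeof *buf);
    if (!buf) return;

    for (size_t i = 0; i < n; i++)          /* histogram of the top byte */
        count[(a[i] >> 24) + 1]++;
    for (int b = 0; b < 256; b++)           /* prefix sums -> bucket starts */
        count[b + 1] += count[b];
    for (size_t i = 0; i < n; i++)          /* stable scatter into buckets */
        buf[count[a[i] >> 24]++] = a[i];

    /* after the scatter, count[b] points one past the end of bucket b */
    for (int b = 0; b < 256; b++) {
        size_t start = b ? count[b - 1] : 0;
        qsort(buf + start, count[b] - start, sizeof *buf, cmp_u32);
    }
    memcpy(a, buf, n * sizeof *a);
    free(buf);
}
```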
> Wolfsort is primarily a proof of concept, currently it only properly supports unsigned 32 bit integers.<p>Which makes it pretty useless outside of theoretical exercises (not that theoretical exercises are useless). What would be a practical use case for sorting integers? Well, the integers can be keys to some kind of records, but if your records remain unsorted, accessing them through the sorted list of keys is still very expensive.<p>Edit: Calculating the median is a practical case. Not sure how often that is needed in performance-critical contexts.
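To illustrate the keys-to-records point: you can sort an array of (key, index) pairs and then walk the records in key order, but the record accesses end up scattered all over memory, which is exactly the cost mentioned above. The `record` type and `print_in_key_order` here are hypothetical, just for the sketch.

```c
#include <stdint.h>
#include <stdlib.h>
#include <stdio.h>

struct record { uint32_t id; char payload[60]; }; /* hypothetical record */
struct keyref { uint32_t key; size_t idx; };      /* key + original position */

static int cmp_keyref(const void *a, const void *b)
{
    uint32_t x = ((const struct keyref *)a)->key;
    uint32_t y = ((const struct keyref *)b)->key;
    return (x > y) - (x < y);
}

void print_in_key_order(const struct record *recs, size_t n)
{
    struct keyref *keys = malloc(n * sizeof *keys);
    if (!keys) return;

    for (size_t i = 0; i < n; i++) {
        keys[i].key = recs[i].id;
        keys[i].idx = i;
    }
    qsort(keys, n, sizeof *keys, cmp_keyref);   /* only the keys get sorted */

    for (size_t i = 0; i < n; i++)              /* scattered record reads */
        printf("%u: %s\n", (unsigned)recs[keys[i].idx].id,
               recs[keys[i].idx].payload);

    free(keys);
}
```

A uint32-only sort can't carry the index (let alone the record) along, so in this scenario you'd end up reaching for a generic sort anyway.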