Author of Precomp here. First of all, thanks for the attention; the sudden rise in the GitHub stats surprised me, but now I know the reason. So, some comments from my side here. I'll answer some of the threads, and feel free to ask questions.<p>The project has been around for quite a long time. I started it in 2006, but as I'm basically working on it alone (though things got better with the change to open source) and don't have much spare time (studying, work, father of two kids), updates are less frequent than I'd like.<p>The upside of this long history is that the program itself is quite stable, so e.g. it's hard to find data that leads to a crash or incorrect decompression unless it was specially crafted for that purpose.<p>The biggest challenge at the moment is the codebase, which is very monolithic (one big precomp.cpp) and mixes newer parts (e.g. the preflate integration) with old C-style code. On the plus side, the code is platform independent (in the branch, it even compiles on ARM) and compiling should be no problem using CMake.<p>Another thing missing because of the lack of time is documentation. There's some basic information in the README, and the program syntax reveals the meaning of most parameters, but there could be much more. A lot of information can be found on the encode.su forum, but of course that is very unstructured and often related to bugs, questions about the program/algorithm, or problems with certain files.<p>That said, just throw your data at Precomp and see how it performs. Both ratio and duration depend heavily on what data is fed in, but since some of the supported streams like zlib/deflate or JPEG are used everywhere, there are many (sometimes surprising) examples, such as APK packages and Linux distribution images, where it outperforms the usual compressors like 7-Zip. And last, but not least, the usual GitHub things apply: feel free to check out the existing issues, create new ones, play with the source code, fork it, and create pull requests.
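For anyone curious how recompressing a deflate stream losslessly can work at all, here is a toy sketch of the core idea: decompress the stream, then search for encoder settings that reproduce the original bytes bit for bit, so the data can be stored in its more compressible decompressed form and restored exactly later. This is only an illustration using Python's zlib module with a hypothetical helper name; the real Precomp handles many more cases and nowadays uses preflate for deflate streams rather than a simple level search.

```python
import zlib

def find_reproducing_level(stream):
    """Toy version of the Precomp idea for a zlib stream:
    find a compression level that recreates `stream` bit-identically.
    Returns the level, or None if none matches (a real tool would
    then fall back to storing the stream as-is)."""
    original = zlib.decompress(stream)
    for level in range(10):  # 0 = stored, 9 = best compression
        if zlib.compress(original, level) == stream:
            return level
    return None

# If a level is found, it is enough to store (decompressed_data, level)
# and rebuild the exact original stream on decompression.
data = b"some example payload " * 200
stream = zlib.compress(data, 6)
level = find_reproducing_level(stream)
assert zlib.compress(zlib.decompress(stream), level) == stream
```

In practice this naive search fails for streams produced by other deflate implementations or unusual parameter combinations, which is exactly the gap preflate closes.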