Man, this is exactly what I needed a year ago. We had to twiddle all kinds of knobs in JPEG compression to make images from an onboard camera fit into our very limited uplink budget, and we still couldn't guarantee anything, because of the nature of JPEG. The compression-to-target-size feature is very important for actually handling images as scientific data in a constrained environment.
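For the curious, the knob-twiddling amounts to something like this: a minimal sketch assuming Pillow, with a made-up 50 KB budget and file name, binary-searching JPEG quality for the highest setting that fits. Even this gives no guarantee, since quality 1 can still overshoot.

    # Toy version of the quality-search dance; budget and file name are
    # made up for illustration. Assumes Pillow is installed.
    import io
    from PIL import Image

    def jpeg_under_budget(img: Image.Image, budget: int) -> bytes:
        lo, hi = 1, 95            # Pillow's usable JPEG quality range
        best = None
        while lo <= hi:
            q = (lo + hi) // 2
            buf = io.BytesIO()
            img.save(buf, format="JPEG", quality=q)
            if buf.tell() <= budget:
                best = buf.getvalue()   # fits: try a higher quality
                lo = q + 1
            else:
                hi = q - 1              # too big: try a lower quality
        if best is None:
            raise ValueError("cannot fit the budget at any quality")
        return best

    data = jpeg_under_budget(Image.open("frame.png").convert("RGB"), 50_000)
    print(len(data))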
I recognised the artifacts at lower quality levels, thought "this looks like JPEG2000", and, as expected, it's wavelet-based. A quick skim through the specifications shows that it is very similar to JPEG2000, but much simplified.
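For anyone who hasn't poked at wavelet codecs: the heart of the family is a multi-level 2D wavelet transform. Below is a toy single-level Haar version; I haven't checked which filter bank this spec actually uses (JPEG2000 uses 5/3 or 9/7 lifting), so treat it as a sketch of the idea, not the standard.

    # One level of a 2D Haar transform: the image splits into a coarse
    # half-resolution approximation plus three detail subbands. The
    # characteristic blur-then-ringing artifacts come from quantizing
    # the detail subbands away.
    import numpy as np

    def haar2d_level(x: np.ndarray):
        # average/difference pairs of rows
        lo_r = (x[0::2, :] + x[1::2, :]) / 2
        hi_r = (x[0::2, :] - x[1::2, :]) / 2
        # then pairs of columns of each half
        ll = (lo_r[:, 0::2] + lo_r[:, 1::2]) / 2   # coarse approximation
        lh = (lo_r[:, 0::2] - lo_r[:, 1::2]) / 2   # detail subband
        hl = (hi_r[:, 0::2] + hi_r[:, 1::2]) / 2   # detail subband
        hh = (hi_r[:, 0::2] - hi_r[:, 1::2]) / 2   # detail subband
        return ll, lh, hl, hh

    img = np.random.rand(256, 256)   # stand-in for a real grayscale image
    ll, lh, hl, hh = haar2d_level(img)
    print(ll.shape)   # (128, 128): a half-resolution preview for free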
> compressed size 69913, time taken: 0.054055

Converting the same image with bpgenc yields 19092 bytes. Probably not tolerant of transmission errors, though.
This is great. I will have a look later tonight for sure.

I used a NASA shape-from-shading algorithm as the basis of a Python script I wrote for Blender3D back in the early 2000s to turn photos into 3D bas-reliefs for carving on my 4'x8' router table. I felt pleased to get something out of my tax dollars!

I was experimenting with wavelets to analyze EKG data (there's some cool stuff out there with this). I'll have to see what happens when I run time-series EKG data through this compression!
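Roughly the experiment I have in mind, as a sketch assuming PyWavelets, with a made-up synthetic heartbeat; db4 is a common wavelet choice for ECG work but by no means the only one:

    # Decompose an EKG-like signal and look at where the energy lands
    # across subbands. The synthetic signal is fabricated: Gaussian
    # spikes at ~1 Hz over baseline wander and noise.
    import numpy as np
    import pywt

    fs = 360                                 # a common ECG sampling rate
    t = np.arange(0, 10, 1 / fs)
    ecg = sum(np.exp(-((t - b) ** 2) / 0.001) for b in np.arange(0.5, 10, 1.0))
    ecg += 0.2 * np.sin(2 * np.pi * 0.3 * t) + 0.05 * np.random.randn(t.size)

    coeffs = pywt.wavedec(ecg, "db4", level=6)
    for i, c in enumerate(coeffs):
        band = "approx" if i == 0 else f"detail {i}"
        print(f"{band}: {c.size} coeffs, energy {np.sum(c**2):.1f}")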
This image compression algorithm is optimized for progressive download.

In space, bandwidth is low and latency is high. A good progressive algorithm helps a lot.
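A toy illustration of why progressive (embedded) streams matter on a slow link: cut the coefficient stream anywhere and you still get a full-frame image, just blurrier. Sketch only, assuming PyWavelets and NumPy; real codecs order bitplanes, not whole coefficients, and the test image here is random noise standing in for a camera frame.

    import numpy as np
    import pywt

    img = np.random.rand(128, 128)           # stand-in for a camera frame
    coeffs = pywt.wavedec2(img, "haar", level=4)
    arr, slices = pywt.coeffs_to_array(coeffs)

    # send coefficients largest-magnitude first, truncating at several points
    order = np.argsort(-np.abs(arr).ravel())
    for frac in (0.01, 0.05, 0.25, 1.0):
        kept = np.zeros_like(arr).ravel()
        idx = order[: int(frac * order.size)]
        kept[idx] = arr.ravel()[idx]
        rec = pywt.waverec2(
            pywt.array_to_coeffs(kept.reshape(arr.shape), slices,
                                 output_format="wavedec2"),
            "haar",
        )
        err = np.sqrt(np.mean((rec - img) ** 2))
        print(f"{frac:4.0%} of coefficients -> RMSE {err:.4f}")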
I just learned about fountain codes about 2 seconds ago. I wonder how the trade-offs compare between something like Raptor and this algorithm.

https://en.wikipedia.org/wiki/Raptor_code
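From the same two seconds of reading: Raptor is essentially an LT code plus a precode, and a toy LT code is small enough to sketch. Everything below is made up for illustration (the degree distribution especially; real LT codes use a robust soliton distribution), but it shows the rateless property: the receiver just collects packets, in any order, until decoding completes.

    import os, random

    def encode(blocks, seed):
        """One encoded packet: XOR of a random subset of source blocks."""
        rng = random.Random(seed)
        d = rng.choice([1, 1, 2, 2, 2, 3, 4])   # toy degree distribution
        idxs = set(rng.sample(range(len(blocks)), d))
        pkt = bytes(len(blocks[0]))
        for i in idxs:
            pkt = bytes(a ^ b for a, b in zip(pkt, blocks[i]))
        return idxs, pkt

    def decode(n, packets):
        """Peeling decoder: resolve degree-1 packets until stuck or done."""
        known = {}
        progress = True
        while progress and len(known) < n:
            progress = False
            for idxs, pkt in packets:
                live = idxs - known.keys()
                if len(live) != 1:
                    continue
                for i in idxs & known.keys():    # XOR out recovered blocks
                    pkt = bytes(a ^ b for a, b in zip(pkt, known[i]))
                known[live.pop()] = pkt
                progress = True
        return [known.get(i) for i in range(n)]

    blocks = [os.urandom(32) for _ in range(20)]
    packets, seed = [], 0
    while None in (out := decode(len(blocks), packets)):
        packets.append(encode(blocks, seed))     # rateless: just send more
        seed += 1
    print(f"{seed} packets sufficed for 20 blocks; ok = {out == blocks}")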