I'm completely biased here (I know the founders), but this is the real deal. I recorded shaky, all-over-the-place video while handholding my iPhone on a bike, and it came out like this: <a href="http://luma.io/v/B2-" rel="nofollow">http://luma.io/v/B2-</a><p>It is pretty neat stuff.
Digital image stabilization is a fascinating subject. On one hand, the core idea is simple (average out the camera's motion over time, then pan and zoom the image to compensate for deviations from that average); on the other, it's hard to get right, because beyond random jitter, some camera motion, even rapid motion, is exactly what the operator intends.<p>Here's an example of an actual algorithm used for digital IS: <a href="http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.148.6856&rep=rep1&type=pdf" rel="nofollow">http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.148...</a>.
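To make the core idea concrete, here's a minimal sketch (my own illustration, not the algorithm from that paper), assuming some tracking step has already produced the camera's accumulated x-offset per frame: smooth the trajectory with a moving average and pan each frame back toward the smoothed path.

    // Hypothetical input: path[i] is the camera's accumulated x-offset at frame i,
    // obtained from some tracking step (not shown). Output: how far to pan each
    // frame when cropping so it lands back on the smoothed trajectory.
    func stabilizationOffsets(path: [Double], window: Int = 15) -> [Double] {
        let half = window / 2
        return path.indices.map { i -> Double in
            // Moving average of the trajectory around frame i -- the "intended" motion.
            let lo = max(0, i - half)
            let hi = min(path.count - 1, i + half)
            let smoothed = path[lo...hi].reduce(0, +) / Double(hi - lo + 1)
            // Deviation from that average is treated as jitter; pan the opposite way.
            return smoothed - path[i]
        }
    }

The same smoothing would apply to the y-offset and to rotation, and the crop margin you reserve limits how large a correction you can actually make.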
I’ve used this app since it was called SteadiCam (before that name was pulled for trademark reasons, of course), and there’s really nothing else like it around. Long ago I thought of applying 6-axis gyroscope readings in reverse to video, but this dev has done that and more, far better than I ever could. It’s a fantastic application of amazing technology and crazy math. Major kudos!
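For the curious, the "gyro readings in reverse" idea in its crudest form looks something like this (purely illustrative, certainly not how this app actually does it):

    // Integrate per-frame gyroscope roll rates (rad/s) into an accumulated angle,
    // then render each frame rotated by the opposite angle. A real implementation
    // would also smooth the angle so deliberate camera rotation isn't cancelled out.
    func counterRotationAngles(rollRates: [Double], frameDuration: Double) -> [Double] {
        var angle = 0.0
        return rollRates.map { rate -> Double in
            angle += rate * frameDuration
            return -angle
        }
    }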
If you're clever, you can do this kind of postprocessing without a big CPU hit on mobile phones. Phones compress video in hardware, and as part of that process the hardware looks for blocks of video that are roughly the same across frames. When it finds one, it attaches a pointer indicating where that block should go in the next frame.<p>Taken in aggregate, all of these pointers in the compressed data stream effectively show you which way the image "shook" relative to the previous frame (and how far), saving you the CPU cycles you'd otherwise spend computing this yourself. You can even detect rotation. So all you need to do is compensate for the shake by rewriting the compressed stream, "panning" (and perhaps rotating) in the opposite direction of the shake. To have room to pan, you emit a smaller rectangle than the original video.<p>This isn't sufficient for advanced stabilization, but it's a quick first pass.<p>I took a stab at writing this for the iPhone in 2010, but by then the writing was clearly on the wall: Apple was soon going to offer this functionality in hardware (and they do, on the iPhone 4S and 5), and they would only do a better job with every phone refresh. The only way to compete is to perform global optimizations across the entire video clip that the hardware encoder can't do (e.g. dynamic programming), or to apply fancier transforms that are so CPU-intensive they kill the mobile experience. Good job on the part of the developers; the video looks great. As iPhone GPUs get more powerful, stabilization algorithms will only get better.<p>One business angle here would be to give the app away for free and charge a dollar per video to deshake clips as a web service in the cloud.
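In sketch form (my own, with hypothetical types; a real pass would parse the vectors out of the encoder's actual compressed stream), the quick first pass described above looks roughly like this: treat each frame's block motion vectors as votes for a global shift, accumulate the drift, and move a smaller crop window along with it.

    // Hypothetical type: a real pass would read these out of the compressed stream.
    // Sign convention assumed here: positive dx means the image content moved right
    // relative to the previous frame (flip the sign if your encoder's vectors point
    // into the reference frame instead).
    struct MotionVector { let dx: Double; let dy: Double }

    // Robust global shift for one frame: the median of all block vectors, so blocks
    // belonging to independently moving objects don't dominate the estimate.
    func globalShift(_ vectors: [MotionVector]) -> (dx: Double, dy: Double) {
        guard !vectors.isEmpty else { return (dx: 0, dy: 0) }
        func median(_ xs: [Double]) -> Double {
            let s = xs.sorted()
            return s[s.count / 2]
        }
        return (dx: median(vectors.map { $0.dx }), dy: median(vectors.map { $0.dy }))
    }

    // Accumulate the per-frame shifts into the content's running drift and move the
    // crop window with it, clamped to the margin gained by emitting a smaller
    // rectangle than the original video.
    func cropOrigins(perFrameVectors: [[MotionVector]], margin: Double) -> [(x: Double, y: Double)] {
        var driftX = 0.0, driftY = 0.0
        return perFrameVectors.map { vectors -> (x: Double, y: Double) in
            let shift = globalShift(vectors)
            driftX += shift.dx
            driftY += shift.dy
            return (x: margin + max(-margin, min(margin, driftX)),
                    y: margin + max(-margin, min(margin, driftY)))
        }
    }

A real pass would also smooth the drift (as in the comment above about averaging motion over time) so intentional pans survive, and would rewrite the stream rather than re-encode it.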
Very very cool stuff. I downloaded the app and took this (not the ideal stress test, but the panning is incredibly smooth): <a href="http://luma.io/v/CId" rel="nofollow">http://luma.io/v/CId</a>
Hmm, the comparison videos are interesting. The one with IS turned on somehow reminds me of a first-person shooter video game, while the other video feels more 'real'.<p>Maybe game developers could introduce more shake into their games to make them feel more real?<p>Other than that: the tech is great. The videos certainly take on a cinematic feel with that kind of image stabilization.
How different is this from AVFoundation's stabilization?<p>There's an example of it in action here, but it requires a login: <a href="https://developer.apple.com/videos/wwdc/2012/?include=520#520" rel="nofollow">https://developer.apple.com/videos/wwdc/2012/?include=520#52...</a>
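For anyone curious, AVFoundation's version is essentially a switch on the capture connection. A minimal sketch (current API names; error handling and session start/stop omitted):

    import AVFoundation

    // Minimal sketch of enabling AVFoundation's built-in video stabilization.
    // The setting lives on the capture connection, not on the device.
    func makeStabilizedSession() throws -> AVCaptureSession {
        let session = AVCaptureSession()
        session.sessionPreset = .high

        guard let camera = AVCaptureDevice.default(for: .video) else {
            throw NSError(domain: "example", code: 1)  // no camera available
        }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureMovieFileOutput()
        if session.canAddOutput(output) { session.addOutput(output) }

        if let connection = output.connection(with: .video),
           connection.isVideoStabilizationSupported {
            connection.preferredVideoStabilizationMode = .cinematic
        }
        return session
    }

That gives you whatever Apple's capture-time stabilization does; an app like this can go further because it can optimize over the whole clip in post.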
Here's a better way to compare the two versions of the same video. The improvement is even clearer side by side:
<a href="http://goo.gl/xoMgQ" rel="nofollow">http://goo.gl/xoMgQ</a>
If you want to try it, here's the iTunes link: <a href="http://www.itunes.com/apps/luma-camera" rel="nofollow">http://www.itunes.com/apps/luma-camera</a>
Very awesome tech! I'm really impressed -- what is the loss in terms of frame rate and frame size? Are the missing parts of a shaky frame filled in based on previous frames?<p>Also, another thing that goes hand in hand with shaky video is bad or stuttering sound. If you handle both, I can easily see you becoming the go-to solution for filming on cell cameras.