
Luma updates its video camera app for iOS, adding cinematic image stabilization

70 points by alexkcd · about 12 years ago

13 comments

cosbynator · about 12 years ago
I'm completely biased in this (I know the founders), but this is the real deal. I've recorded shaky, all-over-the-place video handholding my iPhone on a bike, and it came out like this: http://luma.io/v/B2-

It is pretty neat stuff.
Comment #5282172 not loaded
networked · about 12 years ago
Digital image stabilization is a fascinating subject. On one hand, the core idea is simple (average out motion over time in some way, then pan and zoom the image to compensate for deviations from the average); on the other, it's not easy to get right, because beyond random jitter some camera motion, even rapid motion, can be exactly what the camera operator wants.

Here's an example of an actual algorithm used for digital IS: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.148.6856&rep=rep1&type=pdf
rubberbandage · about 12 years ago
I've used this app since it was called SteadiCam (before that name was pulled for copyright reasons, of course), and there's really nothing else like it around. Long ago I thought of applying 6-axis gyroscope readings in reverse to video, but this developer has done that and more, better than I ever could. It's a fantastic application of amazing technology and crazy math. Major kudos!
codex · about 12 years ago
If you're clever, you can do this kind of postprocessing without a big CPU hit on mobile phones. Phones compress video in hardware, and as part of that process the hardware looks for blocks of video that are roughly the same across frames. When it finds one, it attaches a pointer indicating where the block should go in the next frame.

Taken in aggregate, all of these pointers in the compressed data stream effectively show you which way the image "shook" relative to the previous frame (and how far), saving you the CPU cycles needed to determine this yourself. You can even detect rotation. So all you need to do is compensate for the shake by rewriting the compressed stream, "panning" (and perhaps rotating) in the opposite direction. In order to have room to pan, you need to emit a smaller rectangle than the original video.

This isn't sufficient for advanced stabilization, but it's a quick first pass.

I took a stab at writing this for the iPhone in 2010, but by that time the writing was clearly on the wall: Apple was soon going to offer this functionality in hardware (and they do, on the iPhone 4S and 5), and they would only do a better job with every phone refresh. The only way one can hope to compete is to perform global optimizations across the entire video clip that the hardware encoder can't do (e.g. dynamic programming), or to apply fancier transforms that are so CPU-intensive they kill the mobile experience. Good job on the part of the developers; the video looks great. As iPhone GPUs get more powerful, stabilization algorithms will only get better.

One business angle here is to give the app away for free and charge a dollar per video to deshake clips as a web service in the cloud.
Comment #5284524 not loaded
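The motion-vector trick codex describes can be sketched as follows. Everything here is a hedged illustration: the per-block motion vectors are synthetic (in practice they would be parsed out of the compressed stream), and the margin value is arbitrary. The median over all block vectors gives a cheap, robust estimate of global camera motion, and the output crop is then panned the opposite way.

```python
# Sketch of reusing the encoder's per-block motion vectors as a free
# estimate of camera shake, then panning a smaller output rectangle
# against it. Synthetic data; not a real decoder.
import numpy as np

def global_motion(block_mvs):
    """Robust global (dx, dy) for one frame: the median over all block
    motion vectors rejects blocks that track moving foreground subjects
    rather than the camera."""
    return np.median(block_mvs, axis=0)

def crop_origin(shake_dxdy, margin):
    """Top-left corner of the smaller output rectangle: start centered
    and pan opposite to the shake, clamped to the available margin
    (this is why the output must be smaller than the source frame)."""
    x = int(np.clip(margin - shake_dxdy[0], 0, 2 * margin))
    y = int(np.clip(margin - shake_dxdy[1], 0, 2 * margin))
    return x, y

# One synthetic frame: most blocks agree the camera moved (+4, -2) px,
# while a few blocks follow a foreground subject moving differently.
mvs = np.array([[4, -2]] * 20 + [[-10, 6]] * 3)
dx, dy = global_motion(mvs)
origin = crop_origin((dx, dy), margin=16)
```

As the comment notes, this is only a first pass: it inherits whatever the encoder's block matcher got wrong, and it can't do the clip-wide global optimization that a dedicated stabilizer can.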
hellopat · about 12 years ago
Very, very cool stuff. I downloaded the app and took this (not the ideal stress test, but the panning is incredibly smooth): http://luma.io/v/CId
Comment #5282387 not loaded
eclipxe · about 12 years ago
This is really impressive. I'd love to see a similar product for Android.
Comment #5282288 not loaded
kybernetyk · about 12 years ago
Hmm, the comparison videos are interesting. The one with IS turned on somehow reminds me of a first-person-shooter video game, while the other video feels more 'real'.

Maybe game developers could introduce more shake into their games to make them feel more real?

Other than that: the tech is great. The videos certainly get a cinematic feel with that kind of image stabilization.
Comment #5282790 not loaded
Comment #5284401 not loaded
koudelka · about 12 years ago
How different is this from AVFoundation's stabilization?

There's an example of it in action here, but it requires a login: https://developer.apple.com/videos/wwdc/2012/?include=520#520
Comment #5284121 not loaded
joshschreuder · about 12 years ago
Here's a better way of comparing the two versions of the video. The improvement is even clearer when viewing them side-by-side: http://goo.gl/xoMgQ
softgrow · about 12 years ago
If you want to try it, here's an iTunes link: http://www.itunes.com/apps/luma-camera
diziet · about 12 years ago
Very awesome tech! I'm really impressed. What is the loss in terms of frame rate or frame size? Are missing parts of a shaky frame filled in from previous shots?

Also, another thing that goes with shaky video is bad or stuttering sound. If you handle both, I can easily see you becoming the go-to solution for filming on cell-phone cameras.
duncans · about 12 years ago
Very nice. It's missing the ability to use the volume buttons as a record button, though.
Comment #5282475 not loaded
barista · about 12 years ago
How does this compare to the optical image stabilization on the Nokia Lumia 920?
Comment #5284978 not loaded