First impressions of Leap Motion

192 points by liangzan over 12 years ago

27 comments

flixic over 12 years ago
I also own one and have started developing a 3D painting application with it. It's magical, like playing with a touchscreen was magical.

BTW, this post reveals a bit more information than Leap Motion would like developers to share. Essentially, the OP broke the agreement.
Comment #5179880 not loaded
Comment #5179497 not loaded
DungFu over 12 years ago
I have had one since way before they shipped out the SDK, and all I have to say is that it is quite a bit more janky than the hype makes it seem.

Fingers will disappear without notice when nothing all that crazy is happening, and the frame rate of the device (which is spec'd at 120+ fps) is much closer to 45-55 fps. This leads to some major problems with long-term finger acquisition that have to be handled by the developer. Quite frustrating to do things yourself that should be handled by the SDK.

While I understand that this SDK batch is a "beta/alpha" test, it is much buggier than it should be. The SDK will hang the entire OS quite often, and there is simply no way to detect whether the device is actually plugged in. It will report invalid finger data rather than telling you that no device exists.

And the JavaScript API is so new that it is borderline useless. It doesn't even properly report finger width, which is kind of sad since that worked many versions ago.

Overall, a cool device with lots of hype, but it needs a lot more work to be even mildly useful for anything more than simple gestures.
Comment #5179841 not loaded
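To make the finger-dropping and "long-term finger acquisition" complaint above concrete, here is a minimal sketch of the bookkeeping a developer might layer on top of the SDK, assuming the early leap.js surface (Leap.loop, frame.pointables with a stable id and tipPosition); the grace period and data structure are illustrative, not part of the SDK.

    // Sketch: keep fingers "alive" for a short grace period after the device
    // stops reporting them, so a momentary dropout doesn't reset the app.
    // Assumes leap.js provides Leap.loop and frame.pointables with .id and
    // .tipPosition; GRACE_MS is an illustrative value.
    var GRACE_MS = 200;
    var tracked = {};                          // id -> { position, lastSeen }

    Leap.loop(function (frame) {
      var now = Date.now();

      frame.pointables.forEach(function (p) {
        tracked[p.id] = { position: p.tipPosition, lastSeen: now };
      });

      // Forget fingers that have been missing longer than the grace period.
      Object.keys(tracked).forEach(function (id) {
        if (now - tracked[id].lastSeen > GRACE_MS) { delete tracked[id]; }
      });

      // `tracked` is now the app's smoothed view of which fingers exist,
      // even across a few frames where the SDK reports nothing. Re-associating
      // a reappearing finger (which gets a new id) by proximity would be the
      // obvious next step.
    });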
lifeisstillgood over 12 years ago
My big question is really a paraphrase of Douglas Adams:

    Radio had advanced beyond touchscreen and into motion detection. It meant
    you could control the radio with minimal effort but had to sit annoyingly
    still if you wanted to keep listening to the same channel.

I can see it working like the Kinect - really useful in a specific and narrow use case, but there is a reason we use pens and not paint brushes for writing. Similarly, this does not seem like a tool that is easy to use for, say, day-to-day tasks.

If you have an informed (hands-on) opinion to the contrary, I would be very interested.
Comment #5180037 not loaded
Comment #5181401 not loaded
Comment #5180110 not loaded
Comment #5180233 not loaded
candeira over 12 years ago
I miss one of these every time I'm cooking or doing other things that leave my hands dirty or unavailable. I'd love to have something like this so I can answer the phone (Skype!), check out recipes/howtos, or change playlists while elbow-deep in batter, or while still holding a hot soldering gun and a fiddly piece of kit.

So, as much as Gorilla Arm would be a problem for everyday/all-day use of no-touch gestural interfaces, they are a great solution for existing problems.
Comment #5180204 not loaded
guylhem over 12 years ago
Suggestion for the OP: read more about computer vision.

Extracting gestures is indeed a problem. Most of the approaches I know depend on a state triggered by the appearance of a new input (in the video, when you add or remove a finger) and then work by doing a temporal sum of the movement to get a shape.

This of course introduces problems about how fast or how slow the person draws the shape in the air - unless you trigger that when a finger is added, a finger is removed (as explained before) *or* when you have just successfully detected a gesture - I don't mean identified it, but a quick deceleration of the finger followed by a short immobilization can reset the "frame of reading".

You may or may not have successfully grasped what came before that shape, but a human will usually stop and try again, so you get to join the right "frame of reading".

I've done a little work (computer vision MA thesis) on using Gestalt perceptual grouping on 3D+t (video) imaging. The goal was automating sign language interpretation (especially when shapes are drawn in the air, something very popular in French Sign Language - and therefore, I suppose, in American Sign Language, considering how close they are linguistically).

However, we were far from that in 2003, and we used webcams only. A lot of work went into separating each finger - depending, among many things, on its relative position to the others: at the extremity of the row you either have the index or the pinky, and you guess which one if you know which hand it is and which side is facing the camera.

I don't think it is, or even was, *that* innovative. I've stopped working on that, so I guess there must have been a lot of new innovative approaches since. So once again, go read more about computer vision. It's fascinating!

I'd be happy to send anyone a copy, but it's in French :-)
Comment #5179879 not loaded
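The "quick deceleration followed by a short immobilization" reset described above is easy to prototype. Below is a minimal sketch, assuming the early leap.js surface (Leap.loop, pointables with tipPosition and tipVelocity in mm and mm/s); the thresholds and the recognizeShape() classifier are illustrative placeholders, not real API.

    // Sketch: segment strokes using the "deceleration + brief pause" reset.
    var PAUSE_SPEED = 50;    // mm/s below which the finger counts as "still"
    var PAUSE_FRAMES = 15;   // ~1/4 second of stillness closes a stroke
    var MIN_POINTS = 10;     // ignore tiny accidental strokes
    var stroke = [];         // points in the current "frame of reading"
    var stillFrames = 0;

    function speed(v) {      // magnitude of a [x, y, z] velocity vector
      return Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    }

    Leap.loop(function (frame) {
      var finger = frame.pointables[0];
      if (!finger) { return; }                 // dropout: keep the current stroke

      if (speed(finger.tipVelocity) >= PAUSE_SPEED) {
        stroke.push(finger.tipPosition);       // collect points only while moving
        stillFrames = 0;
      } else {
        stillFrames += 1;
        if (stillFrames === PAUSE_FRAMES && stroke.length >= MIN_POINTS) {
          recognizeShape(stroke);              // hypothetical downstream classifier
          stroke = [];                         // reset the "frame of reading"
        }
      }
    });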
sbuccini over 12 years ago
I had the opportunity to get some hands-on time with the Leap before Christmas. Leap Motion sponsored a hackathon at my school and brought in dev boards for everyone to borrow, although they were not the finished product pictured in this article.

I cannot tell you how incredible this product is. I'm a first-year CS student, and I've never done anything even remotely close to gesture tracking before. But at the end of the night, I was able to play rock paper scissors with my computer. The API is that simple to use.

Yet, as mentioned, it's so incredibly accurate. One of the biggest bugs we faced was that even when we thought our hands were still, the device registered imperceptible movements, which were translated into false moves.

Overall it's a great product, especially for the price.
Comment #5179909 not loaded
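For a sense of how little code a rock-paper-scissors demo like the one above needs, here is a minimal sketch driven purely by the number of reported fingers, assuming leap.js's Leap.loop and frame.fingers; the debounce exists only to damp the jitter/false-move problem mentioned in the comment, and all thresholds are illustrative.

    // Sketch: classify rock/paper/scissors from the finger count, but only
    // after the count has been stable for a while.
    var STABLE_FRAMES = 20;
    var lastCount = -1;
    var stable = 0;

    function classify(count) {
      if (count <= 1) { return "rock"; }
      if (count === 2) { return "scissors"; }
      if (count >= 4) { return "paper"; }
      return null;                             // 3 fingers: ambiguous, ignore
    }

    Leap.loop(function (frame) {
      var count = frame.fingers.length;
      stable = (count === lastCount) ? stable + 1 : 0;
      lastCount = count;

      if (stable === STABLE_FRAMES) {          // act once per steady reading
        var move = classify(count);
        if (move) { console.log("player move:", move); }
      }
    });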
eof over 12 years ago
I also have one of these through the developer program. It is very neat, but it does not seem ready for prime time. When it is locked on to your fingers, it is *fast*; shockingly and amazingly fast and accurate. However, it drops fingers all the time; it almost never gets thumbs.

Even doing the calibration, I could never get to all four corners of a 24-inch screen. I suspect they will get it ironed out in the end; it does seem like AI/software issues rather than hardware issues.

I will say that when it's working, it feels really magical. It feels accurate, like I am truly controlling something; but beyond that it didn't feel *nearly* robust enough for real-world use.
nsoun over 12 years ago
Very interesting! Let's pair this up with an Oculus Rift and call it a day.
Comment #5179554 not loaded
Comment #5179989 not loaded
aaron695 over 12 years ago
What people never seem to get about Minority Report is that it's not cool because he used his hands.

It was cool because it had kick-ass software behind it that could do all the work.

I see this continuously with things like glass/mirrors that can be touch screens, etc.

They look awesome because the (imaginary) software demoing them does awesome things.

Leap could be a cool device, but you'll need to think outside the box to see how.

My fat ass is not going to wave at anything it can do with a mouse, let alone give up the quicker speed at which we can type/shortcut/mouse compared to physical movement.

Personally, if I was a developer I'd look at totally new things.
Comment #5182269 not loaded
Comment #5180706 not loaded
tocomment over 12 years ago
Here's my big question on the interfaces people are making for the Leap Motion:

Everyone seems to be trying to replicate iPad gestures in 3D, e.g., point your finger at something, drag something around by pointing.

How about instead we create a virtual hand that shows up on your screen, mirrors the movements of your real hand, and lets you interact with objects on the screen?

I just think it would be awesome to be able to virtually reach into your screen and move things around! And it seems like it would be quite intuitive, no?

As some examples: I'm picturing moving your hand to the top of a window, making a grabbing motion to grab the top of the window and move it around. Grab the corner of a window to resize. You could even have the hand type on a virtual keyboard shown inside the screen. What do you guys think?
Comment #5182170 not loaded
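A rough sketch of that "virtual hand" idea, assuming leap.js's Leap.loop with frame.hands[0].palmPosition (in mm) and hand.fingers; the millimetre ranges, the fist heuristic, and the startDrag/endDrag/dragTarget helpers are all illustrative, not part of any real API.

    // Sketch: mirror the palm as an on-screen cursor and treat a closed
    // fist (few visible fingers) as a grab.
    var grabbing = false;

    function toScreen(palm) {
      // Map roughly -200..200 mm (x) and 100..400 mm above the device (y)
      // onto the browser window; ranges would need per-setup calibration.
      var x = (palm[0] + 200) / 400 * window.innerWidth;
      var y = (1 - (palm[1] - 100) / 300) * window.innerHeight;
      return { x: x, y: y };
    }

    Leap.loop(function (frame) {
      var hand = frame.hands[0];
      if (!hand) { return; }

      var cursor = toScreen(hand.palmPosition);
      var fist = hand.fingers.length <= 1;     // crude "grab" heuristic

      if (fist && !grabbing) { grabbing = true;  startDrag(cursor); }
      if (!fist && grabbing) { grabbing = false; endDrag(cursor); }
      if (grabbing)          { dragTarget(cursor); }   // move the grabbed element
    });

Moving real OS windows would need OS-level hooks; in a browser the same logic could drag DOM elements around.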
codex over 12 years ago
I'm excited about the Leap because I rapidly transition my hands from the keyboard to the mouse and back again. That takes a lot of time and the mouse is hard on my wrists. I would much rather "mouse in the air" even if I lost some precision. My productivity would soar.

Furthermore, my pinkies and thumbs take a beating pressing the command, control, and shift keys. I would much rather wave my thumb or pinky in a particular direction to get those modifiers. This may not be possible with the current Leap, but will no doubt be possible soon.
clebio over 12 years ago
Seems like a fit for any sort of 3D CAD work. I'd like to pair it with unconed's MathBox (https://github.com/unconed/MathBox.js).
Comment #5179742 not loaded
Comment #5179845 not loaded
josh2600 over 12 years ago
Super excited for this. We've got a couple coming to our office.

Can you talk a little bit about the construction of the unit? How does the craftsmanship look?
Comment #5179498 not loaded
Comment #5179595 not loaded
thedaniel over 12 years ago
I have a developer unit and can confirm: it's legit. I don't want to go into too much detail because of the developer agreement, but I think it is best characterized as a "useful, portable, affordable, easy-to-integrate Kinect". I'm having a lot of fun messing around with it, though I don't have much to share yet as other work has taken precedence. SOON.
Hovertruck over 12 years ago
I spent a week or so playing with a Leap device recently, and had a great time developing a sort of Minority Report-style interface for some display screens in our office. I did end up writing all of the gesture recognition from scratch, which was a bit more difficult than I expected. :)
daralthus over 12 years ago
I guess you could use acceleration as the start/stop signal in the gesture recognition. I will try it out tomorrow.
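That start/stop idea might look something like the sketch below: approximate acceleration as the frame-to-frame change in tipVelocity and toggle recording when it spikes. The same leap.js surface as the earlier sketches is assumed, and the threshold and cooldown are made-up illustrative values.

    // Sketch: toggle gesture recording on a sharp jerk (acceleration spike).
    var ACCEL_THRESHOLD = 1500;   // mm/s change per frame, illustrative
    var COOLDOWN_FRAMES = 10;     // ignore the tail of the same jerk
    var recording = false;
    var prevVel = null;
    var cooldown = 0;

    Leap.loop(function (frame) {
      var finger = frame.pointables[0];
      if (!finger) { prevVel = null; return; }

      var v = finger.tipVelocity;
      if (prevVel) {
        var accel = Math.sqrt(
          Math.pow(v[0] - prevVel[0], 2) +
          Math.pow(v[1] - prevVel[1], 2) +
          Math.pow(v[2] - prevVel[2], 2));

        if (cooldown > 0) {
          cooldown -= 1;
        } else if (accel > ACCEL_THRESHOLD) {
          recording = !recording;              // sharp jerk toggles start/stop
          cooldown = COOLDOWN_FRAMES;
          console.log(recording ? "gesture started" : "gesture stopped");
        }
      }
      prevVel = v;
    });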
chewxy over 12 years ago
I was wondering if this is common with pre-launch product kits, but I have to sign in every time to use the SDK. What am I doing wrong?

That said, I had a fun time writing stuff. Now, if only it had proper Linux support (instead of the hacky VirtualBox workarounds I have to use).
Comment #5209708 not loaded
samstave over 12 years ago
See, now this is something that would set the Chromebook Pixel apart: be the first to integrate this. Not brag about a touchscreen.
danellis over 12 years ago
Anyone know when this thing ships for real? I pre-ordered nearly six months ago. To be fair, they haven't taken any money yet.
Comment #5180950 not loaded
ChuckMcM over 12 years ago
This is very encouraging. I so want this to be real, since a 'touch screen' at a distance is something I have many uses for.
Comment #5180000 not loaded
aiden over 12 years ago
What's the size of the SDK? How big would the download be (from both the developer's and the user's point of view)?
px43 over 12 years ago
When did they start shipping? I ordered mine last May and haven't heard a word from them since then.
sp4ke over 12 years ago
Received mine yesterday; I was quite disappointed to learn there's no Linux SDK!
julien_c over 12 years ago
Can you embed the Leap Motion JS driver in a webpage / browser extension?
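For what it's worth, embedding looks roughly like the sketch below once the leap.js driver is included on the page (via a script tag shipped with the SDK, or bundled into an extension); the driver then connects to the local Leap service and delivers frames to the page. The API surface assumed here (Leap.loop, frame.hands, frame.fingers) matches the earlier sketches and may differ between SDK versions.

    // Sketch: the smallest possible in-page use of the JS driver,
    // assuming leap.js is already loaded on the page.
    Leap.loop(function (frame) {
      document.title = frame.hands.length + " hand(s), " +
                       frame.fingers.length + " finger(s)";
    });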
felixfurtak over 12 years ago
Would be nice to see a hardware teardown to get an idea of how it works.
Comment #5179585 not loaded
solarbunny over 12 years ago
Tough times await those with Parkinson's disease...
Comment #5181303 not loaded
bane over 12 years ago
How well does this work with multiple monitors?
Comment #5179642 not loaded