The demo really sells it here [1]. It's amazingly intuitive and easy to use; it should be part of video-conferencing software.<p>[1] <a href="https://handtracking.io/draw_demo/" rel="nofollow">https://handtracking.io/draw_demo/</a>
This is a GREAT website: I can understand what it does with zero clicks and zero scrolls.<p>Really great, congratulations, I hope I can find a way to apply this lesson to my SaaS.
I've been working on a couple of chording keyboard designs and was thinking I might be able to create a virtual keyboard using this library. It would be nice to also be able to recognize the hand from the back, and a keyboard would obviously need tracking of two hands at a time.<p>How does the application deal with different skin tones?
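For the chord mapping specifically, here's a minimal sketch of the logic I have in mind, assuming the tracker reports the usual 21 normalized landmarks per hand; the `Landmark` type, the landmark indices, and the chord table below are my assumptions, not this library's API:<p><pre><code>  // Hypothetical landmark type: the library's real output format may differ.
  interface Landmark { x: number; y: number; z: number }

  // Fingertip and middle-joint indices in the common 21-point hand layout.
  const TIP = { index: 8, middle: 12, ring: 16, pinky: 20 };
  const PIP = { index: 6, middle: 10, ring: 14, pinky: 18 };
  type Finger = keyof typeof TIP;

  // A finger counts as "pressed" when its tip sits below its middle joint
  // (image y grows downward), i.e. the finger is curled.
  function pressedFingers(hand: Landmark[]): Set&lt;Finger&gt; {
    const pressed = new Set&lt;Finger&gt;();
    for (const f of Object.keys(TIP) as Finger[]) {
      if (hand[TIP[f]].y > hand[PIP[f]].y) pressed.add(f);
    }
    return pressed;
  }

  // Chord table: which combination of curled fingers produces which character.
  const CHORDS: Array&lt;[Finger[], string]&gt; = [
    [["index"], "e"],
    [["middle"], "t"],
    [["index", "middle"], "a"],
    [["index", "middle", "ring", "pinky"], " "],
  ];

  function chordToChar(pressed: Set&lt;Finger&gt;): string | null {
    for (const [fingers, ch] of CHORDS) {
      if (fingers.length === pressed.size && fingers.every(f => pressed.has(f)))
        return ch;
    }
    return null;
  }
</code></pre>Debouncing (only emitting a character once the chord has been held for a few frames) would probably be essential to avoid spurious keypresses.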
Was wondering how easy it'd be to port to native mobile, so I went looking for the source code, but it doesn't appear to actually be open source. The meat is distributed as binaries (WASM for the "backend" code and a .bin for the model weights).<p>Aside from being a cool hand tracker, it's a very clever way to distribute closed-source JavaScript packages.
An "undo" gesture seems necessary, it was a bit too easy to accidentally wipe the screen. Aside from that, this is fantastic! Love to see what WASM is enabling these days on the web.
Hi, I'm not sure whether you've looked into this, but another area that would be very interested in this sort of thing is musical gesture recognition.
What would be nice is a version that can be used to paint on the screen with your fingers, so that the lines are visible on a remotely shared screen. The use case is marking up/highlighting on a normal (non-touch) desktop monitor while screen-sharing, which is awkward with a mouse or touchpad (think circling things in source code and documents, drawing arrows, etc.).<p>That would mean (a) a camera from behind you, facing the screen, so that your fingers can touch (or almost touch) the screen and be co-located with the screen contents you want to mark up, and (b) native integration, so that the painting is done on a transparent always-on-top OS window that gets picked up by the screen-sharing software; or just exposing it as a native pointing device, since on-screen painting/diagramming software already exists.
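The overlay half of (b) is already doable with something like Electron. A rough sketch of a transparent, always-on-top, click-through window that screen-sharing software would capture; the `overlay.html` page, which would run the tracker and draw strokes onto a canvas, is assumed and not shown:<p><pre><code>  // main.ts -- Electron main process for a full-screen markup overlay.
  import { app, BrowserWindow } from "electron";

  app.whenReady().then(() => {
    const overlay = new BrowserWindow({
      transparent: true,  // no background; only the drawn strokes are visible
      frame: false,       // no title bar or borders
      alwaysOnTop: true,  // stays above the windows being marked up
      fullscreen: true,
      hasShadow: false,
    });

    // Let mouse clicks pass through to the windows underneath; the
    // "painting" input comes from the hand tracker, not the mouse.
    overlay.setIgnoreMouseEvents(true);

    // overlay.html would run the tracker and draw onto a canvas.
    overlay.loadFile("overlay.html");
  });
</code></pre>The trickier part is the calibration in (a): mapping what the rear camera sees of your fingertip to screen coordinates; that part isn't covered here.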
This looks great! Recently I've been wanting to make a hand-tracking library for video editing. I'd make an OK sign with my index finger and thumb to begin recording, and when I was done I'd make a thumbs up to keep the take or a thumbs down to delete a bad one. That way I could very easily record stuff while keeping only the good takes to sort out later.<p>Hell, the library could even stitch the takes together, omitting the moments when my hand was starting or finishing the gestures.
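Roughly the control flow I'm imagining, assuming a gesture classifier built on top of the tracker's output (the `classify` callback is a placeholder, and the MediaRecorder wiring is just one way to capture takes in the browser):<p><pre><code>  type Gesture = "ok" | "thumbs_up" | "thumbs_down" | "none";

  // classify(): assumed to be built on the tracker's landmark output.
  function recordWithGestures(stream: MediaStream, classify: () => Gesture): Blob[] {
    const keptTakes: Blob[] = []; // fills in as takes are approved
    let recorder: MediaRecorder | null = null;
    let chunks: Blob[] = [];

    setInterval(() => {
      const gesture = classify();

      if (gesture === "ok" && recorder === null) {
        // OK sign: start a new take.
        chunks = [];
        recorder = new MediaRecorder(stream);
        recorder.ondataavailable = e => { chunks.push(e.data); };
        recorder.start();
      } else if (recorder && (gesture === "thumbs_up" || gesture === "thumbs_down")) {
        // Thumbs up keeps the take; thumbs down throws it away.
        const keep = gesture === "thumbs_up";
        recorder.onstop = () => {
          if (keep) keptTakes.push(new Blob(chunks, { type: "video/webm" }));
        };
        recorder.stop();
        recorder = null;
      }
    }, 200); // poll the classifier a few times per second

    return keptTakes;
  }
</code></pre>Trimming off the seconds where the hand is making the start/stop gestures would just mean noting timestamps here and cutting them out afterwards.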
This reminds me of TAFFI [0], a pinching gesture recognition algorithm that is surprisingly easy to implement with classical computer vision techniques.<p>[0] <a href="https://www.microsoft.com/en-us/research/publication/robust-computer-vision-based-detection-pinching-one-two-handed-gesture-input/" rel="nofollow">https://www.microsoft.com/en-us/research/publication/robust-...</a>
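The core of the TAFFI idea is basically topology: when the thumb and forefinger touch, they enclose a "hole" of background inside the segmented hand blob. A rough sketch with opencv.js, assuming you already have a binary hand mask from background subtraction (the hole-area threshold of 200 is an arbitrary placeholder):<p><pre><code>  declare const cv: any; // opencv.js ships without official TypeScript typings

  // handMask: a binary cv.Mat where the hand is white and the background black.
  function detectPinch(handMask: any): boolean {
    const contours = new cv.MatVector();
    const hierarchy = new cv.Mat();
    // RETR_CCOMP yields a two-level hierarchy: outer blobs and their holes.
    cv.findContours(handMask, contours, hierarchy, cv.RETR_CCOMP, cv.CHAIN_APPROX_SIMPLE);

    let pinched = false;
    for (let i = 0; i < contours.size(); i++) {
      const parent = hierarchy.intPtr(0, i)[3]; // [next, prev, child, parent]
      if (parent !== -1) {
        // Contour i is a hole inside some blob; a big enough hole means the
        // thumb and forefinger have closed into a loop.
        const hole = contours.get(i);
        if (cv.contourArea(hole) > 200) pinched = true;
        hole.delete();
      }
    }
    contours.delete();
    hierarchy.delete();
    return pinched;
  }
</code></pre>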
Bit of feedback: the home page is pretty sparse. The video is great, but it wasn't obvious how to find the repo or where to get the package (or even what language it can be used with). I had to open the Demo, wait for it to load, click the GitHub link there, and only then did the readme tell me it was available on NPM.<p>Otherwise it looks pretty impressive! I've been looking for something like this and I may give it a whirl.
The demo doesn't seem to work on my Chromebook. Maybe it's too underpowered?<p>The web page doesn't say anything after `Warming up...`, and the latest message in the browser console is:<p><pre><code> Setting up wasm backend.
</code></pre>
I expected to see a message from my browser along the lines of "Do you want to let this site use your camera", but I saw no such message.
I wish there were a nice open-source model for tracking hands and arms from multiple viewpoints (multiple cameras), similar to commercial software like this: <a href="https://www.ipisoft.com/" rel="nofollow">https://www.ipisoft.com/</a>
BTW this would be great for spaced-repetition foreign character learning (Chinese, Arabic, Japanese, Korean, etc.): if the drawn figure is similar enough to the character the student is learning, mark it as studied.<p>Congrats again
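A crude way to score "similar enough" would be to rasterize the fingertip trail and the reference glyph onto the same small grid and compare their overlap; the 64-pixel grid, the font rendering, and the IoU threshold below are all placeholder choices, not a real handwriting metric:<p><pre><code>  // Crude check: rasterize the fingertip trail and the reference character
  // onto the same small grid, then compare overlap (intersection over union).
  type Point = { x: number; y: number }; // normalized [0, 1] coordinates

  function rasterize(draw: (ctx: OffscreenCanvasRenderingContext2D) => void, size = 64): Uint8Array {
    const canvas = new OffscreenCanvas(size, size);
    const ctx = canvas.getContext("2d")!;
    draw(ctx);
    const { data } = ctx.getImageData(0, 0, size, size);
    const mask = new Uint8Array(size * size);
    for (let i = 0; i < mask.length; i++) mask[i] = data[i * 4 + 3] > 0 ? 1 : 0; // alpha > 0
    return mask;
  }

  function strokeMask(points: Point[], size = 64): Uint8Array {
    return rasterize(ctx => {
      ctx.lineWidth = 6;
      ctx.beginPath();
      points.forEach((p, i) =>
        i === 0 ? ctx.moveTo(p.x * size, p.y * size) : ctx.lineTo(p.x * size, p.y * size));
      ctx.stroke();
    }, size);
  }

  function glyphMask(char: string, size = 64): Uint8Array {
    return rasterize(ctx => {
      ctx.font = `${size}px serif`;
      ctx.textBaseline = "top";
      ctx.fillText(char, 0, 0);
    }, size);
  }

  // Returns a score in [0, 1]; "similar enough" might be, say, > 0.4 for a demo.
  function similarity(points: Point[], char: string): number {
    const a = strokeMask(points), b = glyphMask(char);
    let inter = 0, union = 0;
    for (let i = 0; i < a.length; i++) {
      inter += a[i] & b[i];
      union += a[i] | b[i];
    }
    return union === 0 ? 0 : inter / union;
  }
</code></pre>A real implementation would probably want stroke-order-aware matching (e.g. DTW over stroke points), but something this simple could already gate a flashcard.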