Just amazing! Hardware startups are beyond sexy. There is something so compelling about tactile devices... no amount of sophisticated software engineering can come close to that feeling. Wipro's chairman was once asked why Wipro still made soaps & shampoos even though it was a billion-dollar software services shop. He just picked up a bar of soap and said: it's just very physical. Software, you can't touch, you can't see...<p>In the early 90s, there was this thing called a Centronics port. You looked behind an old dot matrix printer and found a large, ugly parallel port with 25 pins ( <a href="http://bit.ly/YT4xRh" rel="nofollow">http://bit.ly/YT4xRh</a> ). I asked my professor, "How does the printer print?" He took my question at face value and hooked the port up to a CRT... soon I realized pins 2-9 would output any goddamn digital signal I wanted. Literally anything. I wrote a QuickBASIC program to emit square, sine, and sawtooth waves, and those waves started showing up on the CRT. So it got me thinking: if I fed that 8-bit output to an 8-bit TI DAC, I could make an analog function generator! ( <a href="http://bit.ly/WmFy8o" rel="nofollow">http://bit.ly/WmFy8o</a> ) My final version needed a 741 op-amp & a 555 timer, but it actually worked over a wide range of frequencies, and cost about one hundredth the price of a real Kenwood function generator! Of course, back in those days there was no YC or startup culture, and that was the end of that device... :( But just thinking about that one hardware device gives me more pleasure than all the software I've written since graduating.
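The trick described above (writing 8-bit samples to the parallel port's data pins and letting a DAC reconstruct the analog wave) boils down to precomputing lookup tables of values in the 0-255 range. The original was QuickBASIC; this is a rough modern Python sketch, with the sample count and scaling being my assumptions:

```python
import math

SAMPLES = 256  # samples per period; an assumed table size


def square(n=SAMPLES):
    # 8-bit square wave: 255 for the first half-period, 0 for the second
    return [255 if i < n // 2 else 0 for i in range(n)]


def sawtooth(n=SAMPLES):
    # linear ramp from 0 up to 255 over one period
    return [i * 255 // (n - 1) for i in range(n)]


def sine(n=SAMPLES):
    # sine centered at 127.5, scaled into the 0-255 range of an 8-bit DAC
    return [round(127.5 + 127.5 * math.sin(2 * math.pi * i / n))
            for i in range(n)]


if __name__ == "__main__":
    for name, table in (("square", square()),
                        ("sine", sine()),
                        ("sawtooth", sawtooth())):
        # every sample must fit in the 8 data pins (pins 2-9)
        assert all(0 <= v <= 255 for v in table)
        print(name, min(table), max(table))
```

In the original setup, each table value would be written to the port in a loop; the write rate sets the output frequency, which is presumably where the 555 timer came in.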
> The first generation can recognise around 20 gestures, some as subtle as the tap of a finger<p>I wonder if it's able to distinguish between fingers. If so, it would be possible to create some sort of chorded keyboard, without the keyboard of course. Combine with Project Glass and you have awesome sauce.
I've worked extensively with sensored gloves in my job, both designing the gloves and developing software to do cool things with them. By far the most difficult thing is making a compelling application that effectively utilizes the capabilities of the hardware. This is a very cool piece of hardware, but its success is going to be defined by whether they can make it better than existing input devices, which tend to be quite good already.
I'm terribly excited about this interface. I've been hoping that this is the right path towards a Rainbows End (<a href="http://en.wikipedia.org/wiki/Rainbows_End" rel="nofollow">http://en.wikipedia.org/wiki/Rainbows_End</a>) kind of interface.<p>I was worried that this kind of approach would languish inside MSFT research. <a href="http://www.extremetech.com/extreme/133732-microsoft-demos-muscle-computer-interface-air-guitar-hero-now-a-reality" rel="nofollow">http://www.extremetech.com/extreme/133732-microsoft-demos-mu...</a><p>Are you worried about the MSFT patent?<p>Having your arms hanging loosely, or with elbow support while gesturing, seems to be the right way to take advantage of human fine motor control while manipulating a digital interface. It is a good workaround for Gorilla arm.<p>I suspect that the 20-year interface Gabe Newell is expecting will use this kind of input.<p><a href="http://www.theregister.co.uk/2012/07/26/valve_considered_tongue_controlled_interface/" rel="nofollow">http://www.theregister.co.uk/2012/07/26/valve_considered_ton...</a>
This is cool, but I couldn't believe their video included the skiing example. As an avid skier, I can't think of anything more life-threatening than messing with menus to share a video while you're sliding down a mountain.<p>I get that it has nothing to do with the armband itself; it just seems like a very poor choice. I cringed.
I haven't been so excited about a startup's product in a very long time. I'm constantly tapping out rhythms with my fingers - often quite pleasant and complex ones - and for whatever reason I've never been able to do the same on any kind of MIDI device - a keyboard or a set of drum pads or whatever. I would <i>love</i> to give this a try as a MIDI controller!
OK, this is what I think we will see a lot of. No talking, no vision, no touching, just some simple gestures.<p>Dave Rosenthal (former MIT/Sun/nVidia, not the archivist) and I talked a bit about gesture control when the first MEMS accelerometers came out. They allowed for a wide range of controls, but they drifted horribly, so it wasn't possible to do a sort of "hold for fast forward" kind of motion. You had to have a move to start and a move to end.<p>To some extent this is the same problem as the Leap, which can know where your fingers are, but you can't hold a gesture and move your hand (afaict, I've not seen the Leap SDK in action yet)
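The drift problem above can be sketched numerically: even a tiny constant sensor bias, integrated twice (acceleration → velocity → position), grows without bound, so a perfectly still hand appears to wander. A toy Python illustration; the bias and sample rate are hypothetical values, not from any real sensor:

```python
BIAS = 0.02   # m/s^2, assumed constant accelerometer bias (hypothetical)
DT = 0.01     # s, i.e. a 100 Hz sample rate (also assumed)


def drift_after(seconds, bias=BIAS, dt=DT):
    """Estimated position of a *stationary* hand after naive double integration."""
    velocity = position = 0.0
    for _ in range(int(seconds / dt)):
        velocity += bias * dt      # integrate (biased) acceleration -> velocity
        position += velocity * dt  # integrate velocity -> position
    return position


# The error grows roughly quadratically with hold time, which is why a
# "hold this pose" gesture needed an explicit start move and end move instead.
for t in (1, 5, 10):
    print(f"{t:>2} s held still -> {drift_after(t):.3f} m of apparent motion")
```

After 10 seconds the estimate has drifted by about a metre despite zero real motion, matching the closed form ½·bias·t².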
First thing someone would do with this: integrate it with Google Glass. While nodding your head or tapping a touchpad on the side by your temple might be awkward, discreetly swiping up and down with your hand would be more natural and less obvious.
Just a thought in this field of motion controls - it could be cool to use finger taps and points of contact as part of the motion/gesture control aspect - e.g. tapping my thumb and forefinger together is one gesture, thumb and middle finger another, thumb and ring finger another, etc. So on top of using motion and speed, you also detect commands when points connect together.
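The chording idea above amounts to a lookup from "which fingertips are touching the thumb" to a discrete command. A minimal Python sketch; the gesture names and command assignments are entirely hypothetical:

```python
# Map each combination of fingers touching the thumb to a command.
# Using frozenset keys so the order the fingers are reported in doesn't matter.
CHORDS = {
    frozenset({"index"}):           "select",
    frozenset({"middle"}):          "back",
    frozenset({"ring"}):            "next",
    frozenset({"index", "middle"}): "play/pause",
}


def gesture_for(touching):
    """Return the command for the set of fingers currently touching the thumb."""
    return CHORDS.get(frozenset(touching), "none")


print(gesture_for({"index"}))            # -> select
print(gesture_for({"middle", "index"}))  # -> play/pause (order-insensitive)
print(gesture_for({"pinky"}))            # -> none (unmapped chord)
```

With four fingers there are 15 non-empty combinations, so even this simple scheme gives a fairly rich discrete command vocabulary on top of continuous motion.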
I wonder how good this can be at spatial accuracy. With optical motion sensing, you can relate gestures to physical space: you can point <i>at</i> something.<p>I love the concept here, but I wonder how it can go beyond "the user is wagging their finger" to "the user is pointing at an object on the screen"
Depending on the sensitivity of this, the potential is endless. There are many multi-finger gestures that would work very well in three-dimensional space that have not yet been exploited with current technology. I'm excited to see what will come of this.
Just preordered one. If it's as good as it looks in the video, the possibilities for building amazing things are endless. As stated in the New Scientist article, combine this with Glass and some home control equipment. Oh, the future is now! :)
Wow, not only is the product amazing, but so is the naming.<p>The product uses EMG, as in electro<i>myo</i>graphy.
And it's a personal armband, as in My 'O'.
Wow. This is the future of human/machine interface (until we get direct brain/machine interfaces (BMI) working with fidelity and no/little invasiveness). I really hope the actual product works as well and as smoothly as in those demos.
Can you ensure that common activities such as typing, writing, stretching etc. won't be registered as intended triggers for actions? Anyway, amazing product! I can totally see myself using and developing with one!
I wonder if they've looked into adding an NFC chip into MYO. It may seem crazy for a gesture control armband but it could add potential functionality to the armband and also improve communication among the masses.
Very cool technology!<p>Honest question : Is this area a patent minefield? Or are the Myo guys busily sewing it all up? (I'm glancing nervously at the may-be-more-tricky-than-anticipated 3d printing field, for instance).
`k people, I wanna see a coupla-three dev teams getting these patent specs sewn up before Myo flips behind Apple's iwatch wristband wall.<p>1. Permute & Enumerate Discrete Associative Layer (PEDAL) Group. Enumerate all normative ergo kinesiologic mappings of discrete hapt events onto an associative array we can MTF onto task expectation frequencies.<p>2. Dynamic Adaptive Nascent Cognitive Evolution (DANCE) Group. Get past the AA (Acknowledgment Annunciator), transparently and simultaneously entrain both user and device handshake onto evolving MTF surfacing situ-specific command completion.<p>3. Comparative Haptics Integral Nuance Advancement (CHINA) Team. Collide Leap, Myo, Kinect, Oculi; onto mobile, glass, watch.<p>That's it. Git'er done like yesterday, fellas. Now!<p>Out!
Very cool.<p>Just a heads up: At the bottom of <a href="https://getmyo.com/faq" rel="nofollow">https://getmyo.com/faq</a>, the @ThalmicDeveloper <a> tag is being rendered as text.
Does it suffer from 'Gorilla Arm' like the others? Is it necessary to keep one's arm aloft for hours to get in a day's work? It certainly looks exciting for momentary input.