FTA: "When Picard and el Kaliouby were calibrating their prototype, they were surprised to find that the average person only managed to interpret, correctly, 54 per cent of Baron-Cohen's expressions on real, non-acted faces. This suggested to them that most people - not just those with autism - could use some help sensing the mood of people they are talking to. "People are just not that good at it," says Picard. The software, by contrast, correctly identifies 64 per cent of the expressions."<p>I wonder how they did this testing. Much of the information we get about a person's mood comes from context, far more than from what's on their face. There are people who cry out of joy, but if I saw a picture of one of them, I'm sure it would look like sadness to me.<p>Further, I don't like the sound of a future where people stop talking to other people because a light starts blinking.<p>However, for the purposes of autism research and development, this is good. Better than good - this is excellent. I really hope there will be more research in this area, for the purpose of helping those who can't communicate well. (I suppose that contradicts what I said above. Perhaps there is a scale on which to rank how much aid someone needs to communicate?)
Being a certified autist with the additional bonus of having very bad eyesight, I can't stress enough how important and useful developments like these can be.<p>I am extremely poor at having face-to-face conversations, mostly because any kind of body language completely escapes me. This has caused numerous instances of miscommunication and generally makes any kind of meaningful interaction with a human being unreliable at best.<p>Having access to an aid like this will certainly help improve matters for me and for anyone trying to have a conversation with me.
Only at the mecca of social-awkwardness (MIT) could something like this be developed.<p>Kidding aside, MIT never ceases to amaze. Boston's most valuable resource by a mile, IMHO.
<a href="http://www.affectiva.com/q-sensor/" rel="nofollow">http://www.affectiva.com/q-sensor/</a> like fitbit, but for emotion. Imagine the ecosystem explosion given access to a decent API.