Steve Mann: Old-fashioned welding helmets use darkened glass for this. More modern ones use electronic shutters. Either way, the person welding merely gets a uniformly filtered view. The arc still looks uncomfortably bright, and the surrounding areas remain frustratingly dim.<p>Me: Hasn't this guy ever heard of HDR? He could have just used a couple of video cameras with some processing.<p>SM: A few years before this, I had returned to my original inspiration—better welding helmets—and built some that incorporated vision-enhancing technology. [...] These helmets exploit an image-processing technique I invented that is now commonly used to produce HDR (high-dynamic-range) photos.<p>Me: Oh. Right.
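The core HDR trick Mann alludes to is combining differently exposed frames so the bright arc and the dim surroundings are both readable. A minimal exposure-fusion sketch (my own toy weighting, not Mann's actual comparametric algorithm): each pixel is a weighted average across exposures, with weights favoring well-exposed values near mid-gray.

```python
import math

def fuse_exposures(frames):
    """frames: list of equally sized grayscale images (lists of rows, values 0-255).
    Returns one fused image whose pixels favor the best-exposed source frame."""
    def weight(v):
        # Gaussian centered on mid-gray: blown-out and near-black pixels get tiny weight
        return math.exp(-((v / 255.0 - 0.5) ** 2) / (2 * 0.2 ** 2)) + 1e-8
    h, w = len(frames[0]), len(frames[0][0])
    fused = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [f[y][x] for f in frames]
            ws = [weight(v) for v in vals]
            fused[y][x] = round(sum(wt * v for wt, v in zip(ws, vals)) / sum(ws))
    return fused

short_exp = [[200, 10], [15, 5]]     # short exposure: the arc is readable, background crushed
long_exp  = [[255, 120], [130, 90]]  # long exposure: background readable, arc blown out
hdr = fuse_exposures([short_exp, long_exp])
```

Each fused pixel lands between the source values, leaning toward whichever exposure captured it best.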
I don't understand why Google hasn't hired Steve Mann yet, at least as a consultant on the project. Seems like hubris to me: this guy has been testing wearable systems for 20 years or so and knows more about the experience than anyone on the planet.
To me, Google Glass is worth buying for just one reason: sousveillance (or inverse surveillance).<p>When somebody crashes into your house, or you witness a crime in progress, or need to recall an important detail from a business dealing, it may well be worth the 1500 dollars you spent on a Google Glass.<p>Other applications of Google Glass may provide utility on a daily basis. I can imagine getting 10 dollars' worth of useful service from Google Glass every day, plus 5 dollars in security benefit for the surrounding society. Multiply that by 365 days and you get 5475 USD of economic benefit every year. Not to mention high-value recordings, such as records of criminal activity or abuse of authority by cops.<p>(Of course, if you're too poor, then Google Glass isn't worth 1500 USD, even if it may someday deliver 1500 USD of value to you.)
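The arithmetic behind that estimate checks out (using the parent's assumed figures):

```python
# Back-of-envelope check of the estimate above; the dollar figures are
# the parent comment's assumptions, not measured values.
daily_utility = 10    # USD of useful service per day
daily_security = 5    # USD of security benefit to the surrounding society per day
annual_value = (daily_utility + daily_security) * 365
print(annual_value)   # 5475
```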
>The impact and fall injured my leg and also broke my wearable computing system, which normally overwrites its memory buffers and doesn’t permanently record images. But as a result of the damage, it retained pictures of the car’s license plate and driver, who was later identified and arrested thanks to this record of the incident.<p>This image retention also happened when he was attacked at a McDonald's in Paris last summer: <a href="http://www.huffingtonpost.com/2012/07/17/steve-mann-attacked-paris-mcdonalds-digital-eye-glass-photos_n_1680263.html" rel="nofollow">http://www.huffingtonpost.com/2012/07/17/steve-mann-attacked...</a><p>Here's his account: <a href="http://eyetap.blogspot.com/2012/07/physical-assault-by-mcdonalds-for.html" rel="nofollow">http://eyetap.blogspot.com/2012/07/physical-assault-by-mcdon...</a><p>As far as I know, nothing ever came of it (i.e. there were no charges or settlements).
I really wish someone would make an early-adopter version of his EyeTap v4/v5 and sell it.<p>Would I pay cost plus 50% for one? Absolutely. Up to and including car-level prices. This is HN, a startup, anyone?
<i>The second issue, the eyestrain from trying to focus both eyes at different distances, is also one I overcame—more than 20 years ago! The trick is to arrange things so that the eye behind the mirror can focus at any distance and still see the display clearly.</i><p>Sounds like he has some great insights here. He's also known as 'the world's first cyborg' (<a href="http://en.wikipedia.org/wiki/Steve_Mann" rel="nofollow">http://en.wikipedia.org/wiki/Steve_Mann</a>), and the lonely trail he seemed to be on is now shifting to the mainstream.
I think eyestrain would be a factor if you used these systems as an always-on augmented display, but I don't think the point of these devices is to be looked at constantly; rather, you engage them as needed and otherwise let them get out of the way.<p>His devices in the pictures sit between the eye and the external world, whereas Glass puts the screen up and out of your line of sight.<p>If your wearable tech display is always on and continuously visible, it will be a problem: battery life will suffer, and the device will distract you constantly.
<i>As I went to speak with the driver, he threw the car into reverse and sped off, striking me and running over my right foot as I fell to the ground.</i><p>Probably because the driver was like "AAHHHHH!!!! A FREAKIN' CYBORG!!!!"
All I really want from computerized eyewear is telemetry for sports.<p>Current time/pace/distance/route/whatever when I'm running.<p>Current speed, next corner severity/distance when I'm longboarding. (<a href="http://swizec.com/blog/ifihadglass-the-app-i-want-to-build/swizec/6035" rel="nofollow">http://swizec.com/blog/ifihadglass-the-app-i-want-to-build/s...</a>)<p>That alone would be worth the money to me. Such things already exist for skiing goggles, but those aren't extensible and only really fit one sport. So that's no good.
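The telemetry wished for above is mostly straightforward math over GPS fixes. A sketch (function names are mine, not from any existing Glass API): great-circle distance between fixes via the haversine formula, then overall pace from elapsed time.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes (haversine formula)."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def pace_min_per_km(track):
    """track: list of (t_seconds, lat, lon) samples; returns overall pace in min/km."""
    dist_m = sum(haversine_m(a[1], a[2], b[1], b[2]) for a, b in zip(track, track[1:]))
    elapsed_s = track[-1][0] - track[0][0]
    return (elapsed_s / 60.0) / (dist_m / 1000.0)

# 0.009 degrees of latitude is roughly 1 km; covered in 5 minutes -> ~5:00 min/km
pace = pace_min_per_km([(0, 0.0, 0.0), (300, 0.009, 0.0)])
```

Current speed and "next corner" alerts would be the same computation over a shorter window plus a preloaded route.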
Visual perception is truly an amazing thing! The author's anecdotes about vision alteration and the brain's ability to adapt were very interesting to me. I have nystagmus: my eyes move back and forth quickly all the time. I've often wondered what it looks like to see without the movement; however, that <i>is</i> how I see: I don't notice the movement at all. My vision with contacts doesn't get much better than 20/40, so I do experience the effects of the movement. I tend to think of my vision as if it's an example of two-point wave interference: <a href="http://en.wikipedia.org/wiki/Interference_(wave_propagation)" rel="nofollow">http://en.wikipedia.org/wiki/Interference_(wave_propagation)</a> The further away an image is from my focal points, the more the interference from movement affects my brain's ability to piece it all together; it's similar to tunnel vision, but instead of darkness on the periphery, it's progressively more blur. To see most clearly, I have to tilt my head to the side, to my "null point" where my eyes move the least. Not to mention my head moves often in some sort of sync with my eyes, especially while reading; once in school, a substitute teacher raised his voice angrily, thinking I was shaking my head at his work on the board!<p>I'm curious how Google and other developers of high-tech eyewear will account for us with out-of-the-ordinary eye conditions. If the glasses or certain apps rely on eye movements for communication, we probably couldn't use them.
> when the computer is damaged, e.g. by falling and hitting the ground (or by a physical assault), buffered pictures for processing remain in its memory, and are not overwritten with new ones by the then non-functioning computer vision system.<p>Fragile design or, say, accelerometer-controlled backup memory?
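The accelerometer idea is easy to sketch: a rolling buffer that normally overwrites itself, but locks when a reading exceeds an impact threshold, deliberately preserving the frames leading up to a crash. This is my illustration of the parent's suggestion, not Mann's design; the threshold value is made up.

```python
from collections import deque

class CrashBuffer:
    """Rolling frame buffer that freezes on impact, preserving the frames
    recorded just before (and including) the moment of the crash."""
    def __init__(self, capacity=30, impact_threshold=3.0):
        self.frames = deque(maxlen=capacity)   # old frames fall off automatically
        self.impact_threshold = impact_threshold  # in g; hypothetical value
        self.frozen = False

    def record(self, frame, accel_g):
        if self.frozen:
            return  # buffer locked: nothing gets overwritten after impact
        self.frames.append(frame)
        if accel_g > self.impact_threshold:
            self.frozen = True  # impact detected: stop accepting new frames

buf = CrashBuffer(capacity=3)
for i, g in enumerate([1.0, 1.0, 1.0, 9.8, 1.0]):
    buf.record(f"frame{i}", g)
# frames 1-3 (including the impact frame) survive; frame 4 never overwrites them
```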
Relevant: Steve Mann wrote "Physical assault by McDonald's for wearing Digital Eye Glass" [1], back in July 2012. Speaks to the stigma Google Glass is likely to face.<p>[1] <a href="http://eyetap.blogspot.ca/2012/07/physical-assault-by-mcdonalds-for.html" rel="nofollow">http://eyetap.blogspot.ca/2012/07/physical-assault-by-mcdona...</a>
I wonder how much wearing a contraption on his head contributed to his getting hit by the car. Even state-of-the-art high-end viewfinders with millions of pixels have frustratingly long lag, enough to easily take away the reaction-speed edge gained from millions of years of evolution.
(In the far future) if Google Glass could automatically upload video of what you're seeing to your cloud storage, you could have a searchable log of your entire life.<p>Reminds me of these projects:<p><a href="http://en.wikipedia.org/wiki/MyLifeBits" rel="nofollow">http://en.wikipedia.org/wiki/MyLifeBits</a><p><a href="http://en.wikipedia.org/wiki/Microsoft_SenseCam" rel="nofollow">http://en.wikipedia.org/wiki/Microsoft_SenseCam</a><p>Maybe a V1 of this could have Google Glass take a photo every minute. You could upload it automatically to Evernote or your private G+ photo feed. Then, you could occasionally review and "star" the important moments of your life (and maybe even delete/summarize chunks that are less important).
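The V1 described above is little more than a timer loop handing snapshots to an uploader. A minimal sketch, where `capture` and `upload` are stand-ins for whatever the device and cloud service would actually expose (neither is a real Glass or Evernote API):

```python
import time

CAPTURE_INTERVAL_S = 60  # one photo per minute, as in the V1 idea

def lifelog_loop(capture, upload, clock=time.time, sleep=time.sleep, max_shots=None):
    """Take a snapshot at a fixed interval and hand it to an uploader.
    Entries carry a 'starred' flag for the later review/curation step."""
    shots = 0
    while max_shots is None or shots < max_shots:
        frame = capture()
        upload({"timestamp": clock(), "image": frame, "starred": False})
        shots += 1
        sleep(CAPTURE_INTERVAL_S)

# Dry run with stubs instead of a real camera or cloud service:
log = []
lifelog_loop(capture=lambda: "jpeg-bytes",
             upload=log.append,
             sleep=lambda s: None,
             max_shots=3)
```

The clock and sleep hooks keep the loop testable; a real version would swap in the camera and an authenticated upload client.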
Couldn't the fashion problem be solved by making hats cool again? Some hipsters are already wearing them, and a hat would provide plenty of space for hiding computing gear. Perhaps even the projector for the eye could be hidden in the brim of the hat?
EyeTap, and this: <a href="http://bits.blogs.nytimes.com/2013/02/27/scientists-uncover-invisible-motion-in-video/" rel="nofollow">http://bits.blogs.nytimes.com/2013/02/27/scientists-uncover-...</a><p>I would pay for this. I would pay a lot.
The pinhole aremac idea is so elegant! Infinite depth of focus and no need to measure the eye's lens. I wonder if video games and head-mounted displays designed for gaming will one day take advantage of it.
Apart from the potential physiological issues, the author briefly touches on the sociological impact this may have. In my mind that will be even more profound.<p>If you haven't seen it, 'Black Mirror' on TV here in the UK has an excellent episode where nearly everyone (voluntarily) has an implant which records everything they see.<p>Well worth a watch*: <a href="http://www.channel4.com/programmes/black-mirror/4od#3327868" rel="nofollow">http://www.channel4.com/programmes/black-mirror/4od#3327868</a><p>* Not sure how available this is outside the UK. It's called 'The Entire History of You'.
This guy sounds amazing, though he can hardly lament that lessons were not learned if he did not participate in the commercialization of the technology.<p>Why is this guy not consulting for Google? And I'm not sure whether I'm more astounded or thankful that he hasn't patented his research.
I spent several thousand dollars testing a few "eyesight for the blind" products: taking video on a head-mounted camera and encoding one image per second as an audio file transmitted to the ears. I was actually able to get it to work as advertised, and I believe that with ten hours a day of practice for a month, you could develop a sense of depth perception through your auditory cortex and make out enough attributes of your environment to walk around slowly without bumping into things. I had a blind friend test the best I could do, and although it was a technological marvel, he didn't like it, because it made people ostracize him even MORE than being blind did. He can already move around slowly without bumping into things, much more fashionably, with the system he has: a stick, good hearing, touch, and memory.<p>So a few insights:<p>1. If you put something in front of your eyes, or on your hat brim, that looks like a hacked-together bunch of cameras and wires, and you wear it in public, millions of years of evolution cause people to ostracize you. It's so bad that a blind person told me: "The ostracism from wearing it is worse than the ostracism from them realizing you're blind."<p>2. You think you're confident and can handle it? You aren't: inside you are millions of years of evolution pushing you to remove whatever is causing the ostracism. If you are the kind of person who could choose to remain single and lonely for life while burning with passion for the opposite sex, then you have the kind of mettle it takes to wear cameras and wires on your head in public.<p>3. The experience of converting visual to audio and using my auditory cortex was tremendous. For example, objects that "popped out" at me during audio-vision were completely different from those in normal vision.
Take a brick wall, for instance: I could hear that the spacing between the bricks (the cement) was smaller in one spot and larger in another, because of an anomalous blip in the audio file. Looking at it visually, you think "meh, it's just a brick wall." With the audio file, the odd brick leaps out as an anomaly, exposing the differences in data structure and algorithm between the visual cortex and the auditory cortex.<p>Converting vision to audio makes you an infant again: the tiniest changes in things leap out as fascinating. This experience could probably be sold to people bored to tears with life. A billion-dollar idea! Be an infant again.
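A crude sketch of that kind of vision-to-sound encoding, in the spirit of systems like The vOICe rather than the exact product described above: scan a grayscale image left to right over one second, mapping each row to a sine tone (higher rows get higher pitch) whose loudness follows pixel brightness. A vertical feature, like an odd brick, becomes a distinct burst in the sweep.

```python
import math

SAMPLE_RATE = 8000
SWEEP_SECONDS = 1.0  # one image per second, as in the setup described above

def image_to_sweep(image):
    """image: grayscale as a list of rows (values 0-255).
    Returns one second of mono audio samples scanning the image left to right."""
    rows, cols = len(image), len(image[0])
    samples_per_col = int(SAMPLE_RATE * SWEEP_SECONDS / cols)
    # One tone per row, spanning an octave; bottom row lowest, top row highest
    freqs = [400.0 * (2 ** (r / rows)) for r in range(rows)]
    out = []
    for c in range(cols):
        for n in range(samples_per_col):
            t = n / SAMPLE_RATE
            s = sum((image[rows - 1 - r][c] / 255.0) * math.sin(2 * math.pi * freqs[r] * t)
                    for r in range(rows))
            out.append(s / rows)  # normalize the mix to roughly [-1, 1]
    return out

# A bright vertical bar in an otherwise dark 4x4 image becomes a loud burst
# in the middle of the sweep and near-silence elsewhere.
img = [[0, 0, 255, 0]] * 4
audio = image_to_sweep(img)
```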
For the record, this guy has been wearing actual computers.<p>Google Glass is an accessory: essentially a Bluetooth headset, display, and camera built into glasses. The intelligence lives on the servers, and Glass needs a Bluetooth or Wi-Fi connection to talk to the net.<p>I think Google is engaging in a bit of a PR swindle by letting people think Google Glass is like an iPhone. It isn't; it <i>needs</i> an iPhone or Android phone to connect to the net.<p>Consequently it can't replace a smartphone.<p>I'm also pretty dubious about the battery life it will get, even without having to run a local CPU.