
If you're interested in eye-tracking, I'm interested in funding you

374 points by pg over 1 year ago

60 comments

Componica over 1 year ago
My three partners and I have been developing and selling multi-camera arrays specifically for eye tracking, as well as measuring other physiological features, for several years now. Our main customers are a couple of university research groups, a human factors group at Lockheed, and, just recently, the US Air Force. In fact, we just returned from a trip to Wright-Patterson installing an array in a hypobaric chamber to perform gaze tracking and pupil-response measurement for pilots under hypoxic conditions. Phase two will be a custom gaze tracker for their centrifuge. Our main features are accurate eye and face tracking up to a meter from the array, minimal calibration per subject (about 10 seconds staring at a dot), and pupil response for measuring fatigue and other things; plus we can adapt the array for the client, ranging from a cockpit to a large flat-screen TV. We've looked into medical usage such as ALS, but we're bootstrapped, based in Iowa, and found the military niche a more direct way to generate cash flow. It's a shame we can't apply this work toward people with medical needs, but we don't have the funds nor the clients to make such a pivot.
AndrewKemendo over 1 year ago
I responded to the thread, but Senseye has been working on this for a while now. Originally they were working with the US Air Force to help improve pilot training (fatigue, etc.) via inference from retinal reading.

https://senseye.co/

They have generally struggled to find funding for their eye-tracking-focused work, and have recently had to pivot away from the really exciting but hard-to-fund stuff into PTSD screening (which is important too).

I can connect you with the founder if desired, via the email in my bio.
justinlloyd over 1 year ago
I do hardware. I do software. I do computer vision. I built some software that ran on a cellphone, used by LEOs (law enforcement officers) to determine if the person they are quizzing is inebriated or impaired by controlled substances, by examining the person's eyes and having them focus on images displayed on the phone screen. I've done eye tracking using fully custom solutions and also through a few of the off-the-shelf SDKs, such as GazeSense from Eyeware and a few others.

The problem is not the eye tracking; it is reasonably easy to build robust systems that do that, even with custom hardware under all sorts of lighting conditions. The hard part is the UX, if you are trying to build something that isn't hampered by current UI paradigms.

Rapid typing and menus of custom actions with just eye movement, though fatiguing, shouldn't be hard to solve, and then you can render the output however you want: text, text-to-speech, commands issued to a machine, etc. Making a usable user interface to do anything else, that's where the rubber hits the road.

@pg, which software is your friend using? If it is anything like I've looked into in the past, it's over-priced accessibility crap with a UI straight out of the 1990s.
dewarrn1 over 1 year ago
EEG recording is an alternative that would outlast the potential disease-related degradation of eye movements. Manny Donchin gave a brown bag at UIUC many years ago about the possibilities of using this approach to support communication by ALS patients. It's clever: they use the P300 marker to index attention/intention. I do not recall whether he and his colleagues ever commercialized the tech. I believe this publication is representative: https://doi.org/10.1016/j.clinph.2005.06.027
blackguardx over 1 year ago
I worked on eye tracking hardware for Microsoft HoloLens. Several AR headsets offer decent eye tracking, including HoloLens 2 and Magic Leap's ML2. I think Tobii's eye tracking glasses are probably better as a stand-alone solution, though: https://www.tobii.com/products/eye-trackers/wearables/tobii-pro-glasses-3
sam_goody over 1 year ago
So, the guy who has deployed eye-scanning machines all over Africa, and has found that many of them have been hacked and are giving incorrect responses, suddenly has a friend with ALS and is willing to fund better-quality eye tracking?

Either:

- Part of the whole world-coin thing was privately trying to get the data to help his friend
- He doesn't want to say "looking to develop eye tracking tech for my world-coin scam", since most devs won't touch that thing. Conveniently found a "friend" with ALS.

Saying, on behalf of a friend, that he doesn't believe PG.
musesum over 1 year ago
I've been working on the menuing side [1], based on crossing Fitts's Law with Huffman trees. But I don't know the constraints for ALS.

Hopefully, whoever takes this on doesn't take the standard accessibility approach, which is adding an extra layer of complexity on an existing UI.

A good friend, Gordon Fuller, found out he was going blind. So he co-founded one of the first VR startups in the '90s. Why? For wayfinding.

What we came up with is a concept of universal design: start over from first principles. Seeing Gordon use an accessible UI is painful to watch; it takes three times as many steps to navigate and confirm. So what is the factor? 0.3x?

Imagine if we could refactor all apps with an LLM, and then couple it with an auto-complete menu. Within that menu is a personal history of all your past traversals.

What would be the result? A 10x? Would my sister in a wheelchair be able to use it? Would love to find out!

[1] https://github.com/musesum/DeepMenu
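The Huffman-tree idea can be sketched quickly: weight menu items by usage frequency and let the tree give frequent items shorter selection paths. A minimal sketch; the `huffman_depths` helper and the usage counts are illustrative, not taken from DeepMenu:

```python
import heapq

def huffman_depths(freqs):
    """Build a Huffman tree over usage frequencies and return the
    selection depth (number of binary choices) for each menu item."""
    # Each heap entry: (weight, unique tiebreak, {item: depth so far})
    heap = [(w, i, {item: 0}) for i, (item, w) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every item in them one level deeper.
        merged = {k: v + 1 for k, v in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

# Frequently used commands end up fewer "hops" from the menu root.
usage = {"copy": 40, "paste": 35, "undo": 15, "settings": 5, "about": 5}
depths = huffman_depths(usage)  # copy: 1 hop, about: 4 hops
```

With Fitts's Law governing the cost of each hop, minimizing expected depth is exactly what Huffman coding optimizes.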
kubi07 over 1 year ago
There is a Turkish ALS patient who has a YouTube channel; he creates YouTube videos, podcasts, and streams on Twitch thanks to an eye tracker.

He uses the Tobii eye tracker. There is a video he made about the eye tracker. It's in Turkish, but you can see how he uses it.

https://www.youtube.com/watch?v=pzSXyiWN_uw

Here is an article about him in English: https://www.dexerto.com/entertainment/twitch-streamer-with-als-beats-the-odds-by-using-eye-tracker-to-make-content-2073988/
fartjetpack over 1 year ago
Another route might be sub-vocalization [1], like TTS for your thoughts. I recently picked up some cheap toys to start trying to emulate the results [2].

1. https://www.nasa.gov/centers/ames/news/releases/2004/subvocal/subvocal.html

2. https://github.com/kitschpatrol/Brain
fastball over 1 year ago
Is the lack of mention of Apple deliberate? It seems like they've already poured a lot of R&D into this for the Vision Pro, which might be exactly the kind of thing the friend needs.
readyplayernull over 1 year ago
> A friend of mine has ALS and can only move his eyes. He has an eye-controlled keyboard, but it's not very good. Can you make him a better one?

When I worked for one of the big game engines, I was contacted by the makers of the tech that Stephen Hawking used to communicate, which includes an eye tracker:

https://www.businessinsider.com/an-eye-tracking-interface-helps-als-patients-use-computers-2015-9
lostdog over 1 year ago
I would love to hear pg's analysis of the business case for this company.

By my math, 5k people in the US are diagnosed per year, and if your keyboard costs $1k, then your ARR is $5M, and maybe the company valuation is $50M. Numerically, this is pretty far from the goal of a typical YC company.

I hate to be so cold-hearted about the calculations, but I've had a few friends get really passionate about assistive tech and then get crushed by the financial realities. Just from the comments, you can see how many startups went either the military route or got acquired into VR programs.

The worst I've seen, btw, is trying to build a better powered wheelchair. All the tech is out there to make powered wheelchairs less bulky and more functional, but the cost of getting one approved for health insurance reimbursement, combined with any possible risk of it tipping over, combined with the tiny market you are addressing, makes it nearly impossible to develop and ship an improvement. I do hope we reach a tipping point in the near future where a new wheelchair makes sense to build, because something more nimble would be a big improvement to people's lives.
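The back-of-envelope sizing in that comment, spelled out; every input below is the commenter's own assumption, not verified market data:

```python
# Market sizing from the comment's stated assumptions (illustrative only).
diagnoses_per_year = 5_000            # claimed US ALS diagnoses per year
keyboard_price = 1_000                # assumed price per keyboard, $
arr = diagnoses_per_year * keyboard_price   # annual recurring revenue, $
valuation = 10 * arr                        # assumed ~10x revenue multiple
print(f"ARR ${arr:,}, valuation ${valuation:,}")
```

Even doubling every input keeps the valuation an order of magnitude below typical venture targets, which is the comment's point.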
caspar over 1 year ago
As someone who suffered some severe mobility impairment a few years ago and relied extensively on eye tracking for just over a year: https://precisiongazemouse.org/ (Windows) and https://talonvoice.com/ (multiplatform) are great. In my experience the hardware is already surprisingly good, in that you get accuracy to within an inch or half an inch depending on your training. Rather, it's all about the UX wrapped around it, as a few other comments have raised.

IMO Talon wins* for that by supporting voice recognition and mouth noises (think lip popping), which are less fatiguing than one-eye blinks for common actions like clicking. The creator is active here sometimes.

(* An alternative is to roll your own sort of thing with https://github.com/dictation-toolbox/dragonfly and other tools, as I did, but it's a lot more effort.)
splatcollision over 1 year ago
This was designed for a graffiti artist with ALS:

https://en.wikipedia.org/wiki/EyeWriter

https://github.com/eyewriter/eyewriter

https://www.instructables.com/The-EyeWriter-20/

https://www.moma.org/collection/works/145518
arketyp over 1 year ago
Tobii have been doing eye tracking since 2001 and have a product for this: https://www.tobiidynavox.com/
bacon_waffle over 1 year ago
Edit: Check out Dasher for a much better interface for entering text with a cursor, compared to a virtual keyboard: https://dasher.acecentre.net/ , source at https://github.com/dasher-project/dasher

---

I remember seeing a program years ago which used the mouse cursor in a really neat way to enter text. It seemed far better than clicking on keys of a virtual keyboard, but I couldn't remember the name of the program nor find it...

I'll probably get some of this wrong, but just in case it rings a bell (or someone wants to reinvent it; it wouldn't be hard):

The interface felt like side-scrolling through a map of characters. Moving left and right controlled speed through the characters; for instance, moving to the left extent would backspace, and moving further to the right would enter more characters per unit time.

Up and down would select the next character. In my memory these are presented as a stack of map-coloured boxes, where each box held a letter (or group of letters?), say 'a' to 'z' top-to-bottom, plus a few punctuation marks. The height of each box was proportional to the likelihood that letter would be the next you'd want, so the most likely targets would be easier and quicker to navigate to. Navigating into a box for a character would "type" it. IIRC, at any instant you could see a couple of levels of letters, so if you had entered c-o, maybe 'o' and 'u' would be particularly large, and inside the 'o' box you might see that 'l' and 'k' are bigger, so it's easy to write "cool" or "cook".

(I do hardware+firmware in Rust and regularly reference Richard Hamming, Fred Brooks, Donald Norman, Tufte. Could be up for a change.)
maccard over 1 year ago
Huh. I wrote a paper for my undergraduate dissertation on eye tracking using a laptop camera; it ended up published, and I won a scholarship award (for €150, imagine that). I wonder if it's time to dust off that project.
gwurldz over 1 year ago
This seems like it would fit:

https://thinksmartbox.com/products/eye-gaze/

I once interviewed at this company. Unfortunately I didn't get the job, but I was very impressed nonetheless.
modeless over 1 year ago
I agree, eye tracking is going to have really broad applications. I've been interested in eye tracking for over a decade, and in fact built my own eye tracker, joined a startup, and got acquired by Google [1]. But there's way more to do. We've barely scratched the surface of what's possible with eye tracking, and I'd love to take a second crack at it.

[1] https://techcrunch.com/2016/10/24/google-buys-eyefluence-eye-tracking-startup/
sprocket over 1 year ago
I used this software when my mom was battling ALS:

https://www.optikey.org/

which ran on a < $1k computer.

At the time, the other options were much more expensive (> $10-15k), which was sadly out of our budget.
sailplease over 1 year ago
AdHawk (adhawk.io) has the only all-day ultralight eye tracking wearable I'm aware of: all MEMS-based, with ultra-high scan rates (500Hz+) and research-grade accuracy. For ALS you likely need something light and frictionless; wearing a hot and heavy headset all day probably doesn't work.
yyyk over 1 year ago
I suspect the real moneymakers for such startups have very little to do with ALS. ALS demand is fortunately small, and can't produce the growth curve VCs desire. Imagine instead using this in a classroom to ensure the kids pay attention. Or making sure you see the advertisement.
acyou over 1 year ago
Yes, the ALS/disability angle is noble. Viewed another way, the entire human race is afflicted by the disability of not having access to eye-tracking (and other) technologies. Paul Graham and co. are also invested in companies that will be highly enabled and boosted by the growth of eye-tracking and related technologies. I don't view his statement of motivation related to ALS as insincere; I just also notice that it's accessible, easily understandable, and in line with other aspects of Paul's motivation (and that's a good thing).

I would also recommend Jean-Dominique Bauby's Le Scaphandre et le Papillon to anyone interested in this topic. Typing using eye movements was used in that book in a slow, inefficient manner. In the book's case, the question one should ask is: was his UI paced at exactly the correct speed? I was and still am deeply moved by what the author was able to accomplish and convey. I am unsure whether a faster keyboard would have made a meaningful, positive difference to the author's quality of life in that particular case. I'll need to give that book another read with that question in mind.

Happily, I expect eye tracking to find fascinating, novel, and unexpected applications. As others have stated, UI/UX design is an interesting part of this puzzle. For example, imagine asking an LLM to output short branches of text and having a writer look at the words he wants to convey. It definitely blurs the line between reading and writing. Finding writing to be a tactile exercise myself, I think emotional state comes into play. That's what I'm interested in: can you literally read someone's eyes and tell what they are thinking?
ZeroCool2u over 1 year ago
I literally just bought this last night. It works with just a webcam and is shockingly accurate: https://beam.eyeware.tech/
zefzefzef over 1 year ago
I'm very interested in eye tracking and see a lot of potential in this tech.

For inspiration, check out the Vocal Eyes Becker Communication System: https://jasonbecker.com/archive/eye_communication.html

A system invented for ALS patient Jason Becker by his dad: https://www.youtube.com/watch?v=wGFDWTC8B8g

Also, as already mentioned in here, EyeWriter (https://en.wikipedia.org/wiki/EyeWriter) and Dasher (https://en.wikipedia.org/wiki/Dasher_(software)) are two interesting projects to look into.
anupamchugh over 1 year ago
Does something like a blink-and-wink tracker for swiping UIs pique your interest? I built a PoC a while back: https://github.com/anupamchugh/BlinkPoseAndSwipeiOSMLKit
Schwolop over 1 year ago
I'm consulting with an Australian group called Control Bionics. They have a US company and office, with the CTO and sales team in Ohio, but software engineering is done in AU. Their primary product is an electromyography-and-accelerometer hardware device that detects muscle activations or movements; it's most commonly used as a mouse-click substitute in conjunction with third-party eye-gaze hardware providing the cursor. (I've also designed them an autonomous wheelchair module, but that's another story...)

@pg: If your friend has not tried adding a mouse click via something he can activate other than eye gaze, this would be worth a shot. We have a lot of MND patients who use our combination to great success. If they can twitch an eyebrow, wiggle a toe or a finger, or even flex their abdomen, we can put electrodes there and give them a way forward.

Also, my contact details are in my profile. I'd be happy to put you in touch with our CEO, and I'm confident that offers of funding would be of interest. The company is listed on the Australian stock exchange, but could likely go much further with a direct injection of capital to bolster the engineering team.

Cheers, Tom
mhb over 1 year ago
In the responses, there seem to be dozens of experts and companies already doing this. Where does he think they fall short of meeting his friend's needs?
mercurialsolo over 1 year ago
Eye tracking is essentially a model of visual attention. Visual attention is part of the overall attention space, and big companies and use cases are built around visual attention. Today we track attention via explicit interactions; if we can model it around implicitly observable interactions, then we have a much larger observable data space around the user.
claytongulick over 1 year ago
I did a fun project a few years ago with eye tracking.

We built a prototype for roadside sobriety checks. The idea was to take race/subjectivity out of the equation in these traffic stops.

We modified an Oculus Quest and added IR LEDs and cameras with small Pi Zeros. I wrote software for the Quest that gave instructions and ran a series of examinations, where you'd follow a 3D ball, the screen would brighten and darken, and several others, while I looked for eye jerks (saccades) and pupil dilation. The officer was able to see your pupil (enlarged) on a laptop in real time, and we'd mark suspicious times on the video timeline for review.

It was an interesting combination of video decoding, OpenCV, and real-time streams with a pretty slick UI. The Pi Zero was easily capable of handling real-time video stream decoding, OpenCV, and Node. Where I ran into performance problems, I wrote Node-to-C++ bindings.

We did it all on something silly like a 50k budget. Neat project.
archo over 1 year ago
https://archive.is/TiCun
dimask over 1 year ago
What do you think are some challenges an eye tracker has to face in this specific context? What does your friend struggle with most in the current solutions? Are there tracking-specific challenges related to ALS? Is it mostly a UI/"better prediction" interface issue?

With my group, I am developing an eye tracker for studying developmental and clinical populations, which typically present challenges to conventional eye trackers. It is a spin-off from our academic work with infants, and we already have a study almost done that uses it. We are still at the very beginning in terms of where this may lead us, but we are interested in looking into contexts where eye tracking may be more challenging for various reasons.
MasterYoda over 1 year ago
PG mentions that the solution his friend uses isn't any good. How does the best system out there today work? And what different solutions are there?
imranq over 1 year ago
I found this tool interesting to play with, and it seems to work pretty well assuming you stay in the same position: https://gazerecorder.com/

I'm guessing a combination of projection mapping, built-in lighting, and some crowdsourced data will get accuracy to very usable levels.
user3939382 over 1 year ago
How about a library that starts loading a link when you look at it with intent. Or maybe with BCI integration that detects the moment you decide you want to access it.<p>Or how about a UI that automatically adapts to your eye movement and access patterns to minimize the amount of eye movement required to complete your most common tasks by rearranging the UI elements.
ricardobayes over 1 year ago
I thought this was solved a long time ago. I wrote a program many years ago using a Kinect that tracks the center of the eye pretty precisely, using color gradients. The pupil is pretty uniform in every human being (it's black), surrounded by some color and then white. Even just a few pixels are enough to do it.
PBnFlash over 1 year ago
I suspect a foveated system is going to be a big thing in machine vision as well.
quietthrow over 1 year ago
@paulg, look into this: https://spectrum.ieee.org/brain-implant-speech
TheGuyWhoCodes over 1 year ago
https://www.eyecontrol.co.il/ was founded exactly to solve this problem.
peter_retief over 1 year ago
I would like to look at the problem more deeply. The eyes can be tracked, but what about facial movement? The more data, the better the training for machine learning.
amelius over 1 year ago
I hear the Apple Vision Pro has a good implementation. If this were Microsoft, you'd be able to find the details on the Microsoft Research website.
tmalsburg2 over 1 year ago
Solving eye-tracking keyboards is not so much a task for a company with eye-tracking expertise but for one with expertise in large language models.
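One concrete way a language model helps: fuse the noisy gaze estimate with the model's next-character prior, so ambiguous fixations between adjacent keys resolve toward likely text. A minimal sketch; the function name and every number below are illustrative assumptions:

```python
def most_likely_key(gaze_scores, lm_probs):
    """Pick the key maximizing P(gaze | key) * P_lm(key).
    Both inputs are dicts from candidate key to a score/probability;
    unknown keys get a tiny floor so they stay selectable."""
    return max(gaze_scores,
               key=lambda k: gaze_scores[k] * lm_probs.get(k, 1e-6))

# Gaze lands between adjacent keys "e" and "r" after the user typed "th".
gaze = {"e": 0.40, "r": 0.45}   # raw gaze slightly favors "r"
lm = {"e": 0.50, "r": 0.05}     # a language model strongly favors "e"
picked = most_likely_key(gaze, lm)  # → "e"
```

The same fusion generalizes from single keys to whole words, which is where large language models pull well ahead of the n-gram predictors in older assistive keyboards.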
DoingIsLearning over 1 year ago
Is there any solution out there that does not use IR + dark-pupil segmentation?

It seems like all the solutions out there are some flavour or variation of this.
6stringmerc over 1 year ago
Jason Becker is a great subject, because if you can help him compose with his eyes, the world can use his music. He's a genius.
joshm93 over 1 year ago
I could make an eye tracking keyboard with TensorFlow, if anyone is interested in this problem.

It would be great to hear from Paul about how his friend uses the keyboard and what kinds of tasks he'd love to do but can't with current solutions.

It seems like a throughput problem to me. How can you type quickly using only your eyes?

Have people explored using small phonetic alphabets or Morse-code-style encodings?

Once I got TensorFlow working, I'd start mapping different kinds of UX. Throughput is king.
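The throughput framing can be made concrete with a back-of-envelope information-rate model: each selection among n targets carries at most log2(n) bits, and a word costs some number of bits to specify. Every figure below (2.3 bits/char, 5 chars/word, one dwell every 2 seconds) is an assumption for illustration, not a measurement:

```python
import math

def est_words_per_minute(n_targets, selections_per_min,
                         bits_per_char=2.3, chars_per_word=5):
    """Upper-bound typing rate: selections/min * log2(n_targets) bits
    per selection, divided by the bits needed per word."""
    bits_per_min = selections_per_min * math.log2(n_targets)
    return bits_per_min / (bits_per_char * chars_per_word)

# 32 on-screen targets, one dwell selection every two seconds:
wpm = est_words_per_minute(n_targets=32, selections_per_min=30)  # ≈ 13 wpm
```

The model makes the design space explicit: you can raise throughput by adding targets (more bits per selection, harder gaze precision), selecting faster (more fatigue), or lowering the effective bits per character with prediction, which is where Morse-style encodings and language models compete.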
jjbcb over 1 year ago
I’m sorry this happened to your friend. I hope we can do something to help them.
frakkingcylons over 1 year ago
I hope this effort bears fruit. My uncle passed from ALS eight years ago.
kken over 1 year ago
In case anyone is interested: there are plenty of companies around. Both Apple and Facebook acquired eye tracking companies to kickstart their own development.

Here are some top-N lists:

https://imotions.com/blog/insights/trend/top-eye-tracking-hardware-companies/

https://valentinazezelj.medium.com/top-10-eye-tracking-companies-on-the-market-today-3b96ef131ab5

It's also an active research field; this is one of the bigger conferences: https://etra.acm.org/2023/
quickthrower2 over 1 year ago
What about brain waves to control the keyboard?
jacquesm over 1 year ago
Whatever happened to GazeHawk and their crew?
hardwaregeek over 1 year ago
Aw, that's nice of pg to want something better for his friend. As cynical as we are about technology, new developments can be so fantastic for accessibility and better quality of life.
rsync over 1 year ago
So you're saying there's a final frontier in the mapping of intention and the discernment of preferences ... and you'd like some bright young things to explore that possibility space under the guise of (unimpeachable good cause)?

Go to hell.

Unless, of course, you'd like to commit the funded work to the free commons, unencumbered by patents and copyrights, and free for any entity to use for any purpose.

That's what we'd do for ALS, right?
aaron695 over 1 year ago
https://nitter.net/paulg/status/1695596853864321055 to see more of the thread if not logged in.

It'd be good to know what rate we need to beat, and some other metrics.
atleastoptimal over 1 year ago
I bet Apple Vision Pro eye tracking is already SOTA, and it will remain that way for years due to their crazy R&D spending.
dennis_jeeves1 over 1 year ago
And now if only someone funded an ALS cure...

As far as I know, mainstream medicine isn't close to solving _any_ chronic condition, beyond managing it.
turnsout over 1 year ago
Someone should do this, but for the love of god, DO NOT take any venture capital to do it. No matter how well-intentioned the VCs are at the start, eventually your eye tracking startup will 100% be used for advertising as investors in your Series D need to make an exit or take you public.
soligern over 1 year ago
It feels like Apple has solved this problem with the Vision Pro? So just wait for a couple more months?
morkalork over 1 year ago
"Please look directly at the screen to continue"

Whilst it plays an unskippable and unblockable ad (thanks weiapi!)
FlamingMoe over 1 year ago
Not really a huge PG fan, but this is what billionaires should be doing: see where a need exists and put some of your insane wealth toward making an improvement. This is why I respect Elon even though I don't really like him; he puts his money to use, in a very public manner.
adamnemecek over 1 year ago
I have a new approach to doing ML, where autodiff is replaced with something better. Magically, a lot of things fall into place. This approach should make problems like this relatively straightforward.

Interested in hearing more?