Navigation for the Visually Impaired Using a Google Tango RGB-D Tablet

18 points by DanAndersen about 10 years ago

2 comments

DanAndersen, about 10 years ago

This is a project I've been working on for the past couple of months, using the Project Tango tablet for a navigation system for people with visual disabilities.

It uses pose estimation and point cloud data to (1) build a chunk-based voxel environment of the user's surroundings, (2) render a set of depth maps surrounding the user, and (3) use the depth map and OpenAL to generate 3D audio that gives indications of where mapped obstacles are.

I don't have it at a state where folks can try it out, but I did do a writeup of my approach and wanted to share it.

Demonstration video (with quiet audio) here: https://youtu.be/EnNuDiJazBs
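[Editor's note: for illustration only, not the project's actual code. A minimal C++ sketch of step (3) above: placing an OpenAL source at an obstacle position taken from the depth map, so panning and distance attenuation convey its direction to the listener. The Obstacle struct, the makeObstacleSource helper, and the pre-loaded beepBuffer are hypothetical names; an OpenAL device and context are assumed to already be set up.]

    #include <AL/al.h>

    // Obstacle position relative to the user, in meters
    // (e.g. taken from the nearest hit in one of the rendered depth maps).
    struct Obstacle { float x, y, z; };

    // Create a looping, spatialized source at the obstacle's position.
    ALuint makeObstacleSource(ALuint beepBuffer, const Obstacle& obs) {
        ALuint source;
        alGenSources(1, &source);
        alSourcei(source, AL_BUFFER, beepBuffer);
        // Position the source where the mapped obstacle is; OpenAL's
        // spatialization then indicates its direction and distance.
        alSource3f(source, AL_POSITION, obs.x, obs.y, obs.z);
        alSourcei(source, AL_LOOPING, AL_TRUE);
        alSourcef(source, AL_ROLLOFF_FACTOR, 1.0f);  // quieter when farther away
        alSourcePlay(source);
        return source;
    }
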
dm2, about 10 years ago

This is awesome, great job, I think you've just given humans echolocation.

If someone was given a similar device at an early age that was semi-permanently attached to them, would their brain possibly be able to create a map of the room?

There have been previous attempts but the Tango device didn't exist then, so the hardware was bulky and usually required a backpack.
Comment #9557572 not loaded
Comment #9559501 not loaded