Google, OpenAI, and Meta Could Revolutionize Smart Glasses

7 points by prestonlau 12 months ago

1 comment

prestonlau 12 months ago
Google's recent demo of Project Astra, an early preview of a future AI-powered universal assistant, has reignited interest in the potential of smart glasses. Near the end of the demo, a Google employee seamlessly continues conversing with the assistant while wearing a pair of glasses, hinting at the possibility of a game-changing feature for the next generation of Google Glass.

Picture yourself wearing smart glasses with a heads-up display (HUD) advanced enough to show you all the information you need. This could be the shakeup the world of personal computing needs, allowing smartphones to remain in our pockets while we interact with our digital lives through our glasses.

By having a highly versatile, universal AI assistant with you at all times, you could handle every phone-related task without constantly juggling multiple devices.

To achieve this, OpenAI, Google, and Meta may need to create a new kind of multimodal foundation model that can understand both voice and typed commands. This model would serve as a subroutine that communicates with the main AI assistant and executes commands to control your phone apps. By residing on your phone, this model could address privacy concerns and eliminate the need for cloud processing.
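
For illustration only, here is a minimal Python sketch of the arrangement described above: a hypothetical on-device command model that the main assistant hands voice or typed commands to, and that drives phone apps locally. Every class and function name here is invented for the example; none of it reflects a real Google, OpenAI, or Meta API.

    from dataclasses import dataclass
    from typing import Callable, Dict

    @dataclass
    class Command:
        intent: str               # e.g. "set_reminder"
        arguments: Dict[str, str]

    class OnDeviceCommandModel:
        """Stand-in for the multimodal model running on the phone.

        Parsing and app control both happen locally, so no user data
        has to leave the device (the privacy point made above).
        """
        def __init__(self) -> None:
            self._handlers: Dict[str, Callable[[Dict[str, str]], str]] = {}

        def register_app_action(self, intent: str,
                                handler: Callable[[Dict[str, str]], str]) -> None:
            # Each phone app exposes its actions to the model.
            self._handlers[intent] = handler

        def parse(self, utterance: str) -> Command:
            # Placeholder for on-device multimodal inference.
            if utterance.lower().startswith("remind me"):
                return Command("set_reminder", {"text": utterance})
            return Command("unknown", {"text": utterance})

        def execute(self, command: Command) -> str:
            handler = self._handlers.get(command.intent)
            if handler is None:
                return f"No app registered for intent '{command.intent}'"
            return handler(command.arguments)

    # The main assistant (say, in the glasses' HUD) forwards the utterance;
    # everything below runs on the phone.
    model = OnDeviceCommandModel()
    model.register_app_action("set_reminder",
                              lambda args: f"Reminder created: {args['text']}")
    print(model.execute(model.parse("Remind me to call Alex at 5pm")))

The point of the sketch is the division of labor: the assistant handles conversation, while the small local model only maps commands to app actions, which is what lets it stay on the phone.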