Show HN: RoBart – open-source LLM-powered robot using iPhone+ARKit for compute

1 point | by trzy | 6 months ago
A robot that uses an iPhone Pro for its compute and sensor stack, controlled entirely by Claude 3.5 Sonnet or GPT-4. The mobile base is a salvaged hoverboard with its electronics replaced.

Mobile phones are incredibly capable -- loads of compute (with neural network acceleration), great connectivity (LTE, BLE, WiFi), front and rear-facing cameras, LiDAR, microphones, and a screen and speakers built in for output. AR frameworks provide very sophisticated perception capabilities out of the box that are incredibly useful for navigation. RoBart uses ARKit's scene meshing to identify navigable areas and communicates them to the LLM by annotating camera images with numbered landmarks.

I'd love to see more projects leveraging phones for robotics prototypes, as I've found the development cycle to be much more pleasant than working with something like ROS and a Jetson board.
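For anyone curious how ARKit's scene mesh might be turned into "navigable area" information for the LLM, here is a minimal Swift sketch. This is not RoBart's actual code: the class name NavigableAreaSampler is hypothetical, and counting floor-classified mesh faces per anchor is just one illustrative way to extract walkable surface from ARKit's reconstruction before projecting it into the numbered landmarks the post describes.

  import ARKit

  // Hypothetical sketch: run ARKit scene reconstruction with per-face
  // classification and count the faces labelled as floor in each mesh
  // anchor. A real pipeline would project those faces into navigable
  // regions and numbered landmarks for the LLM prompt.
  final class NavigableAreaSampler: NSObject, ARSessionDelegate {
      let session = ARSession()

      func start() {
          let config = ARWorldTrackingConfiguration()
          // Scene reconstruction requires a LiDAR-equipped device such as an iPhone Pro.
          if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
              config.sceneReconstruction = .meshWithClassification
          }
          session.delegate = self
          session.run(config)
      }

      func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
          for case let meshAnchor as ARMeshAnchor in anchors {
              let geometry = meshAnchor.geometry
              guard let classification = geometry.classification else { continue }
              var floorFaces = 0
              // The classification buffer holds one UInt8 label per face.
              for faceIndex in 0..<geometry.faces.count {
                  let ptr = classification.buffer.contents()
                      .advanced(by: classification.offset + classification.stride * faceIndex)
                  let label = ptr.assumingMemoryBound(to: UInt8.self).pointee
                  if ARMeshClassification(rawValue: Int(label)) == .floor {
                      floorFaces += 1
                  }
              }
              print("anchor \(meshAnchor.identifier): \(floorFaces) floor-classified faces")
          }
      }
  }

The second step the post mentions, overlaying numbered landmark labels on the camera image before it goes into the Claude or GPT-4 prompt, could be done with Core Graphics or UIGraphicsImageRenderer on the captured ARFrame; the exact annotation scheme here is an assumption, not a description of RoBart's implementation.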

No comments yet