
Spatial – Collaborate from anywhere in AR

124 points by yurisagalov over 6 years ago

13 comments

zawerf over 6 years ago

> Join a Spatial meeting from HoloLens, MagicLeap, VR, PC or Phone

I think supporting cross "platform" is a really cool viral feature.

At first it's just going to be one guy leading the meeting with real full body tracking (think meetings with architects or mech engineers where you might want to discuss and move around 3d models).

Anyone else working remotely can facechat in with their phone (w/accelerometer & gyro or ARKit/ARCore) or skype in on desktop.

But every time they want to discuss something that's outside a non-VR/AR user's fixed field of view, the dude leading the meeting will have to rotate the model for them. They will feel left out and eventually will want to buy their own AR/VR setup.

(or more realistically they will preshare their CAD files first ... but the above still sounds like a plausible future)
comex over 6 years ago

Looks like a serious attempt at an ambitious concept. I love how the video showcases what seems to be an actual working prototype, with its capabilities and imperfections – as opposed to the usual trend of AR 'demo' videos that are actually just mockups, and completely unrealistic ones at that (as seen with Google Glass, Pokemon Go, and Magic Leap, just from memory).

I'm curious about the hardware. Is there a base station or two hiding in a corner somewhere? If there is, why isn't head tracking accurate enough to prevent "sway" of virtual objects with respect to real-world ones, which seems to be visible in the video? But if not, how does it capture the position and pose of the user's hands? In any case, what kinds of sensors are being used?

Unfortunately, when watching the video, what really stands out is that the 3D "ghosts" of the other participants have juddery, unsmooth motion. Surely it couldn't hurt to add a bit of interpolation? It would increase latency, but not by much, given that a user's view of a different user's body pose is not especially latency sensitive.

Edit: On second look, it seems like the HMD is just a Magic Leap One, though that doesn't answer the question of whether there's other hardware in the room.
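On the interpolation point: below is a minimal sketch of what smoothing a remote participant's pose could look like, assuming each update arrives as a position vector plus a unit quaternion and the client renders between the two most recent samples. The function names and data layout are illustrative, not Spatial's actual implementation.

    import math

    def lerp(a, b, t):
        # Linear interpolation between two 3-vectors (positions).
        return [a_i + (b_i - a_i) * t for a_i, b_i in zip(a, b)]

    def slerp(q0, q1, t):
        # Spherical linear interpolation between unit quaternions (w, x, y, z).
        dot = sum(c0 * c1 for c0, c1 in zip(q0, q1))
        if dot < 0.0:                      # take the shorter arc
            q1, dot = [-c for c in q1], -dot
        if dot > 0.9995:                   # nearly identical: lerp, then renormalise
            q = [c0 + (c1 - c0) * t for c0, c1 in zip(q0, q1)]
            n = math.sqrt(sum(c * c for c in q))
            return [c / n for c in q]
        theta = math.acos(dot)
        s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
        s1 = math.sin(t * theta) / math.sin(theta)
        return [c0 * s0 + c1 * s1 for c0, c1 in zip(q0, q1)]

    def smoothed_pose(prev, latest, render_time):
        # prev and latest are (timestamp, position, quaternion) network samples.
        # Rendering between the two newest samples costs roughly one update
        # interval of extra latency but removes the judder between pose packets.
        span = max(latest[0] - prev[0], 1e-6)
        t = min(max((render_time - prev[0]) / span, 0.0), 1.0)
        return lerp(prev[1], latest[1], t), slerp(prev[2], latest[2], t)

Holding the avatar roughly one network interval behind the newest sample is the usual trade-off: a few tens of milliseconds of added latency on body pose, which, as the comment notes, is not especially latency sensitive.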
shafyy over 6 years ago

I love that people are investing time and money to make this happen. From my point of view, remote collaboration might make more sense in VR than in AR, though. AR tech is much harder than VR, and with today's AR headsets this kind of thing is not really usable. However with today's VR tech, it's fun!

What is the rationale to make this happen in AR here? Is the physical location important for remote collaboration?
gregmac over 6 years ago

It'll be interesting to see how this feels for collaboration compared to current video chat. It clearly has the advantage of allowing people to 'interact' with things in the space, but you lose the feedback aspect of people's facial expressions.

With video, especially in a group, you can see if people are following what you say, nodding or shaking heads, have confused looks, or are not engaged (or falling asleep). Compared to voice-only discussions, for example, sometimes people won't ask a question when they're confused, presumably thinking they're the only one and not wanting to waste others' time and/or embarrass themselves.

Maybe facial expressions can be emulated -- but there's a very big uncanny valley to get over to make that usable.
dang over 6 years ago

There's a writeup here: https://techcrunch.com/2018/10/24/spatial-raises-8-million-to-bring-augmented-reality-to-your-office-life/ (via https://news.ycombinator.com/item?id=18292217).
pj_mukh over 6 years ago

We've been experimenting with using VR for remote meetings and it really does work in making physical presence more relevant. I think Spatial is really onto something, but they can simplify the product even more. My main "immediate" use cases are actually

a) meetings while taking notes or dealing with Trello, and

b) pair programming.

So, all I want is to stream my desktop and some aspects of my physical self (face and hand movements are enough). There is no real need to bring aspects of individual desktop apps into immersion (as they show with notes or 3D design tools); just stream my desktop and let everyone see. For pair programming, this would need to work seamlessly for hours at a time.

P.S.: Incidentally, AR vs. VR is less of an issue here; however, if it was real MR (i.e. I can see my real laptop screen exactly where it is) and I don't have to stream my own laptop, that would be great.
Animats over 6 years ago
Has anyone actually seen this? The only video available is clearly a fake demo. Remember Magic Leap.
stcredzero over 6 years ago

A powerful use of low-cost AR/VR, which I don't see a lot of people pursuing, would be to 3D render building plans while displaying feeds from smartphone cameras as "viewports" inside the rendered building. This could be used to very quickly get information from people onsite to experts and decision makers, with more easily displayed and digested contextual information. This could also be used in the building of large machines.

A lot of the challenge in this sort of technical communication is conveying the point of view of the person onsite. We have the technology now, so why not just render it?
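A minimal sketch of that viewport idea, assuming the onsite phone reports a world-space pose (e.g. from ARKit/ARCore anchored to the building model) and a known camera field of view. The helper names and coordinate conventions below are illustrative, not an existing API; the returned corners are where a quad textured with the live camera frame would sit inside the rendered plans.

    import math

    def cross(a, b):
        # 3D cross product.
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]

    def viewport_quad(position, forward, up, fov_h_deg, fov_v_deg, distance=1.0):
        # Corners of a quad 'distance' metres in front of the phone, sized to
        # match its camera frustum, onto which the live video frame would be
        # texture-mapped inside the 3D building model.
        # 'forward' and 'up' are assumed to be unit-length and orthogonal.
        right = cross(forward, up)
        half_w = distance * math.tan(math.radians(fov_h_deg) / 2.0)
        half_h = distance * math.tan(math.radians(fov_v_deg) / 2.0)
        center = [p + f * distance for p, f in zip(position, forward)]
        corners = []
        for sx, sy in [(-1, 1), (1, 1), (1, -1), (-1, -1)]:  # walk around the quad
            corners.append([c + sx * half_w * r + sy * half_h * u
                            for c, r, u in zip(center, right, up)])
        return corners

    # Example: a phone at (4.0, 1.5, 2.0) in the building's coordinate frame,
    # looking along +x, with a typical ~63 x 50 degree phone camera field of view.
    quad = viewport_quad([4.0, 1.5, 2.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], 63.0, 50.0)

The expert's client would simply draw this quad (plus a small marker for the phone itself) inside the shared building model and update it as the onsite person moves, which conveys their point of view directly.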
meritt over 6 years ago

I'm having a bit of trouble grokking how this experience works for each participant.

I get the AR piece: people who are physically in a location see their remote peers magically floating in the room with them while they can interact with virtual objects. That's cool.

But what do the remote people see? They don't have the benefit of AR, so I'd imagine they don't experience the real-world environment. Are they sitting at a desk with a headset on, or just viewing a 3D world like they're playing The Sims?
joshumax over 6 years ago

It's quite an interesting and ambitious concept, but I really hope they fix the avatar designs. I talked to some people around the room and the general agreement is that it's stuck in a sort of "uncanny valley" where users look like eerie ghosts with only half of a torso and no legs.
smcameron over 6 years ago
Anywhere in AR? Will this work in my cabin in Arkansas with no electricity?
poorman over 6 years ago
SecondLife 2 coming soon to a basement near you.
will_crusher over 6 years ago

Spatial developer here. You too can play with these demos.

https://spatialsys.github.io/res/shots/webapp/

https://spatialsys.github.io/res/shots/webapp/room_mars_terrain