
The HTTP of VR

53 points by rvkennedy, over 3 years ago

15 comments

Animats, over 3 years ago
Since I'm writing a new client, in Rust, for Second Life/Open Simulator, I'm very aware of these issues.

A metaverse client for a high-detail virtual world has most of the problems of an MMO client plus many of the problems of a web browser. First, much of what you're doing is time-sensitive. You have a stream of high-priority events in each direction that have to be dealt with quickly but don't have a high data volume. Then you have a lot of stuff that's less time critical.

The event stream is usually over UDP in the game world. Since you might lose a packet, that's a problem. Most games have "unreliable" packets, which, if lost, are superseded by later packets. ("Where is avatar now" is a typical use.) You'd like to have that stream on a higher quality of service than the others, if only ISPs and routers actually paid attention to that.

Then you have the less-critical stuff, which needs reliability. ("Object X enters world" is a typical use.) I'd use TCP for that, but Second Life has its own not very good UDP-based protocol, with a fixed retransmit timer. Reliable delivery, in-order delivery, no head-of-line blocking - pick two. TCP chooses the first two, SL's protocol chooses the first and third. Out-of-order delivery after a retransmit can cause avatars to lose clothing items, because the child item arrived before the parent item.

Then you have asset fetching. In Second Life/Open Simulator this is straight HTTP/1. But there are some unusual tricks. Textures are stored in progressive JPEG 2000. It's possible to open a connection and just read a few hundred bytes to get a low-rez version. Then the client can stop reading for a while, put the low-rez version on screen, and wait to see if there's a need to keep reading, or just close the connection because a higher-rez version is not needed. The poor server has to tolerate a large number of stalled connections. Worse, the actual asset servers on AWS are front-ended by Akamai, which is optimized for browser-type behavior. Requesting an asset from an Akamai cache results in fetching the entire asset from AWS, even if only part of it is needed. There's a suspicion that large numbers of partial reads and stalled reads from clients sometimes cause Akamai's anti-DDOS detection to trip and throttle the data flow.

So those are just some of the issues "the HTTP of VR" must handle. Most are known to MMO designers. The big difference in virtual worlds is there's far more dynamic asset loading. How well that's managed has a strong influence on how consistent the world looks. It has to be constantly re-prioritized as the viewpoint moves.

(Demo, from my own work: https://vimeo.com/user28693218 This shows the client frantically trying to load the textures from the network before the camera gets close. Not all the tricks to make that look good are in this demo.)

It's not an overwhelmingly hard problem, but botch it and you will be laughed off Steam.
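To make the partial-read trick concrete, here is a minimal sketch in Python: open a plain HTTP connection, read just the first few hundred bytes of a progressive texture, and let the caller decide whether to keep reading or close. The host, path and chunk size are illustrative assumptions, not real asset-server details.

    import http.client

    # Sketch of reading only the start of a progressive JPEG 2000 texture.
    # Host and path are placeholders, not real asset-server URLs.
    def fetch_lowres(host, path, first_chunk=512):
        conn = http.client.HTTPConnection(host)
        conn.request("GET", path)
        resp = conn.getresponse()
        preview = resp.read(first_chunk)  # enough bytes for a low-rez decode
        return conn, resp, preview        # caller may keep reading or close

    conn, resp, preview = fetch_lowres("assets.example.com", "/textures/1234.j2k")
    # ...decode and display the low-rez preview; if no sharper version is
    # needed, abandon the rest of the transfer:
    conn.close()

The "stalled connections" problem falls straight out of this pattern: every client that pauses after the first read is holding a server-side connection open.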
gfxgirl, over 3 years ago
I think there is a different problem that needs to be solved, and it's probably impossible.

I've dreamed of the metaverse since Snow Crash and maybe before (Tron?) but ... when it comes to actually making it, let's assume unlimited CPU/GPU power and unlimited memory.

Ideally, I want the Metaverse to allow people to run their own code. Whether it's VR or AR, it's a shared 3D space. So I want my Nintendo "Nintendogs" to be able to run around my "Ikea furniture" with my "Google/Apple/OSM maps" showing me navigation directions and my "FB Messenger/Discord/iOS Messenger" letting me connect to people inside. In a webpage, each of these things runs in an IFRAME isolated from the others, and browsers go to great lengths to disallow one spying on another.

But in this 3D space my Nintendogs can't run through the space unless they can "sense the space". They need to know where the fire hydrants are, where the sidewalk is, what things they're allowed to climb/chew, etc. But to do that effectively means they need enough info to spy on me.

Same for all the other apps. I can use messaging apps on my phone with GPS off and full network access off so that the app can't know my location, but in order for different apps in the Metaverse to do something similar they'll need to know at least the virtual location of themselves and the stuff around them, which is enough to track/fingerprint.

You can maybe get around some of this with a massive walled garden, but that arguably is not the metaverse.
bborud, over 3 years ago
Nothing in the blog posting suggests to me you can't use HTTP and Websockets for VR. The understanding of HTTP in the blog posting seems to be rooted in the early 2000s. I don't think the author has much experience in protocol design (it is harder than it looks).

It would be more productive to define a layer on top of HTTP/2 so we can leverage a lot of code that already works, rather than having to spend 10-15 years creating a new spec and codebases that need maturing.

And if you're not happy with websockets for low latency bidirectional communication: it would make more sense to improve websockets rather than reinvent the wheel.
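As a rough illustration of "a layer on top of what already works", a small typed-message convention over a plain websocket covers a lot of the bidirectional use cases; the message types, fields and URI below are invented for the example (it uses the third-party `websockets` package).

    import asyncio
    import json
    import websockets  # pip install websockets

    # Hypothetical framing: every message is JSON with a "type" field, so
    # application semantics live above the transport.
    async def client(uri="ws://localhost:8765"):
        async with websockets.connect(uri) as ws:
            await ws.send(json.dumps({"type": "hello", "client": "vr-demo"}))
            async for raw in ws:
                msg = json.loads(raw)
                if msg.get("type") == "pose":
                    print("remote pose update:", msg["position"])
                elif msg.get("type") == "bye":
                    break

    asyncio.run(client())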
Dirak, over 3 years ago
Networking for multiplayer games is a super interesting problem space, since games tend to be more sensitive to latency, packet loss, and the accuracy of game state between clients. The problems are even more pronounced in VR, where noticeable latency or artifacts can cause motion sickness.

In modern fighting games, the industry seems to be tending toward predictive lockstep networking. This is a type of networking where, if the client doesn't receive the inputs of other clients from the server, it will "predict" those inputs (usually by replaying the last received input) to give the illusion of zero-latency gameplay. The drawback is that you need to implement rollback for the case where the predicted input doesn't match the real received input. When poorly executed, this can look like jittery player movement, with entities rubber-banding and teleporting; when done properly it is mostly unnoticeable.

If you're interested in this domain, I recommend checking out https://www.ggpo.net/ which is the library used in many modern fighting games (notably Skullgirls). It also comes with an in-depth explanation of how to implement predictive networking with rollback on your own: https://drive.google.com/file/d/1cV0fY8e_SC1hIFF5E1rT8XRVRzPjU8W9/view
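A bare-bones sketch of the predict-and-rollback loop described above, in Python. This is not GGPO; the state and input model are invented for illustration, and `simulate` stands in for whatever deterministic game-step function the game provides.

    import copy

    class RollbackSession:
        """Toy predict-and-rollback loop; the input/state model is illustrative."""

        def __init__(self, initial_state):
            self.frame = 0
            self.states = {0: copy.deepcopy(initial_state)}  # saved state per frame
            self.local_inputs = {}    # frame -> local input
            self.remote_inputs = {}   # frame -> confirmed remote input
            self.predicted = {}       # frame -> remote input we guessed

        def _remote_input_for(self, frame):
            if frame in self.remote_inputs:
                return self.remote_inputs[frame]
            # Predict by replaying the most recent confirmed input (or a default).
            last = max(self.remote_inputs) if self.remote_inputs else None
            guess = self.remote_inputs[last] if last is not None else {"move": 0}
            self.predicted[frame] = guess
            return guess

        def advance(self, local_input, simulate):
            """Run one frame with the local input and a confirmed or predicted remote input."""
            self.local_inputs[self.frame] = local_input
            remote = self._remote_input_for(self.frame)
            new_state = simulate(self.states[self.frame], local_input, remote)
            self.frame += 1
            self.states[self.frame] = copy.deepcopy(new_state)
            return new_state

        def confirm_remote(self, frame, real_input, simulate):
            """An authoritative input arrived late; roll back and re-simulate if we mispredicted."""
            self.remote_inputs[frame] = real_input
            if self.predicted.get(frame) in (None, real_input):
                return  # never simulated that frame yet, or the guess was right
            # Rewind to the saved state at `frame`, then replay up to the present.
            for f in range(frame, self.frame):
                remote = self._remote_input_for(f)
                self.states[f + 1] = copy.deepcopy(
                    simulate(self.states[f], self.local_inputs[f], remote)
                )

The mispredicted-then-corrected replay is exactly where the rubber-banding comes from: the re-simulated present can differ from what was already shown on screen.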
Mizza, over 3 years ago
I don't want to have to strap a fucking telephone to my face to go to some shitty fake job. Please don't build this world.
Ono-Sendai, over 3 years ago
I'm building something similar for metaverses, although with less emphasis on VR currently. See https://substrata.info/about_substrata

Currently it's a relatively simple bidirectional protocol over TLS. It's not fully documented yet, but you can get an idea of it by looking at an example bot client in Python: https://github.com/glaretechnologies/substrata-example-bot-python/blob/main/substrata_chatbot_demo.py
jayd16, over 3 years ago
This is pretty silly. We can't throw away HTTP, because HTTP solves problems that VR does not make go away.

> A real-time, dynamic, stateful two-way client-server protocol. As such, it will be if not fully RTP then close to it.

Why didn't we always have this, if all we needed to do was ask? So... realizing we still have the internet of today, what we actually need to rethink is HTML and the concept of the web as documents alone.

I would be interested to see some work on hyper-objects. As in, hypertext beyond text. The article should be "HTML for VR" and we should be musing about how to find, load, interact with, and link web-based virtual objects.
raidicy, over 3 years ago
A-Frame comes to mind. You can have full VR experiences that link, just like a link in HTML, to other VR experiences.

https://aframe.io/examples/
binarynate, over 3 years ago
> By far the greatest reason to look beyond HTML and HTTP for spatial computing is simply this: these technologies will continue to develop, and will always be driven by their primary purpose: to deliver webpages, websites and static, or marginally dynamic content.

This is a valid point, but I believe there's still enormous potential to innovate on top of WebXR. Since browser engines are open source, it's possible for upstart XR browser apps to add additional features to Gecko or Chromium that push WebXR forward.
bullen, over 3 years ago
The HTTP of VR is HTTP!

http://fuse.rupy.se/about.html

You also need a P2P protocol (probably some binary UDP thing) for tick-based data like limb positions if you want body language.

But really, VR is much less important for immersion than action MMO = Mario/Zelda with 1000+ players.
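To make the "binary UDP thing" idea slightly more concrete, here is one way a tick-based limb-position datagram could be laid out; the joint set, field order and port are assumptions for illustration, not any existing protocol.

    import socket
    import struct

    JOINTS = ["head", "left_hand", "right_hand"]  # assumed joint set
    # Invented layout: 4-byte tick number, then x, y, z floats per joint.
    TICK_PACKET = struct.Struct("!I" + "fff" * len(JOINTS))

    def send_pose(sock, addr, tick, pose):
        """pose maps joint name -> (x, y, z); missing joints default to the origin."""
        flat = []
        for joint in JOINTS:
            flat.extend(pose.get(joint, (0.0, 0.0, 0.0)))
        sock.sendto(TICK_PACKET.pack(tick, *flat), addr)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_pose(sock, ("127.0.0.1", 9000), tick=42,
              pose={"head": (0.0, 1.7, 0.0), "left_hand": (-0.3, 1.2, 0.2)})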
unwind, over 3 years ago
Oh, how this reminded me of the Verse protocol and Uni-Verse! To be young again, and so on. :)

[1]: https://en.m.wikipedia.org/wiki/Verse_protocol
sxp, over 3 years ago
tl;dr: "So at Simul, for the past few years we've been building this protocol: it's called Teleport VR. Let's see what we can make with it!"

An alternative view would be that HTTP(S) would be "the HTTP of VR". With WebXR and standard JS APIs for HTTPS, async fetching, WebRTC, etc., all the items listed in "Imagine an application-layer protocol for VR with the following characteristics..." are satisfied. And the stack can use battle-tested web technologies, so that it can leverage standard CDNs, cloud servers, etc.

VR has some extra constraints over 2D webpages due to tighter frames-per-second and latency tolerances, but most of the web protocols can get you 90% of the way there.
schmorptron, over 3 years ago
Kinda off-topic, but if anyone is looking to play around with building VR spaces or games, I recently found out about LÖVR [0], which is a simple Lua-based open source VR "framework". Haven't had a chance to play with it, but it seems other people like it!

[0] https://lovr.org/
douglaswlance, over 3 years ago
Latency is incredibly important in VR. If everything is streaming from a remote server, even if it's a straight fiber connection, it'll still be too much latency.
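A back-of-the-envelope check of that claim; the figures are rough assumptions (light travels through fiber at roughly two-thirds of c, and ~20 ms motion-to-photon latency is a commonly cited VR comfort target), and real round trips add serialization, queuing and processing on top of pure propagation.

    SPEED_IN_FIBER_KM_S = 200_000      # ~2/3 of c; rough assumption
    MOTION_TO_PHOTON_BUDGET_MS = 20    # commonly cited comfort target

    def propagation_rtt_ms(distance_km):
        return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

    for distance_km in (50, 500, 2000):
        rtt = propagation_rtt_ms(distance_km)
        left = MOTION_TO_PHOTON_BUDGET_MS - rtt
        print(f"{distance_km:5d} km: propagation alone ~{rtt:4.1f} ms RTT, "
              f"{left:5.1f} ms left for rendering, encoding and queuing")

At 2000 km the propagation delay alone eats the whole budget, while at metro distances (tens of km) there is still headroom, which is roughly the argument for edge servers rather than distant ones.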
usrbinbash, over 3 years ago
What exactly is the "metaverse" supposed to be, other than a marketing term to sell a more expensive class of IO devices?

People will not switch over in droves to do their text/image/video editing in VR all of a sudden, because other than a few special design applications, there is no point in doing so... it's slower, clumsier, and the input devices are much less precise than mouse & keyboard.

Another supposed target demographic, people in IT, won't switch either. I see no point in virtually grabbing a glowing code-ball and throwing it into the "deploy-tube", or navigating a codebase using haptic gestures with the huge meat-styluses at the end of my arms, when I can simply type `git push` or `/myAwesomeStruct`.

I also have a hard time imagining management sitting in meetings for 3h while wearing a 400g headset. Or companies being willing to cough up $350+ for every employee just so they can join meetings, when Zoom is basically free.

So, what else is there? Gaming and maybe some "recreational apps" (aka. also gaming, only less interactive). And since not all games will take place in the same unified MMORPG-ish permanent universe (yes, people want to play in sessions, and people want to play single player, and people want to play while not connected to the internet), this will not be a paradigm shift, but rather a new toy in an already large collection of other toys.