Analysis of the OnLive "cloud console" system

2 points by zemaj about 16 years ago

2 comments

zemaj about 16 years ago
Wrote this article last night. It's really got me thinking about the implications for web apps. If they really have got sub-100ms response times, it could revolutionise web app delivery. It solves massive issues surrounding virtual worlds too. All graphics processing is done at the server. Amazing stuff if it really does work.
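
To make the sub-100ms figure concrete, here is a minimal back-of-the-envelope sketch of where the time in one round trip could go when all rendering happens server-side. Every stage name and number below is an illustrative assumption, not a figure from the article or from OnLive:

    # Rough, illustrative round-trip latency budget for a cloud-rendered app.
    # All stage names and numbers are assumptions for the sake of the
    # exercise, not figures from the article or from OnLive.

    BUDGET_MS = 100  # the "sub-100ms" target mentioned above

    # Hypothetical per-stage costs, in milliseconds.
    stages = {
        "input capture + uplink": 15,
        "server-side rendering": 16,   # roughly one 60 fps frame
        "video encode/compress": 5,
        "downlink (video stream)": 25,
        "client decode": 8,
        "display refresh": 16,         # worst case on a 60 Hz panel
    }

    total = sum(stages.values())
    for name, ms in stages.items():
        print(f"{name:<28}{ms:>4} ms")
    print(f"{'total':<28}{total:>4} ms  (budget {BUDGET_MS} ms, "
          f"headroom {BUDGET_MS - total} ms)")

Under these made-up numbers the round trip lands at 85 ms, which is why the claim is plausible only if every stage, including the network, stays tightly bounded.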
[Comment #532679 not loaded]
John_Galt about 16 years ago
Hardware seems to get cheaper/better faster than networks do. Also, the trend in networking is bigger bandwidth, not lower latency. Someday cloud-based low-latency apps will work, but I doubt this will happen anytime soon.

The real news here is that compression card they have created. The article doesn't go into detail, but large-scale 1ms compression?!

What it would take to make this work: a unified, or at least more unified, internet topology. Latency occurs at the gateways/modems/MUXes where you are translating from one type of packet to another. Perhaps you could get around enough of that with peering... hmmm. Ultimately I just don't see a lot of applications requiring this kind of latency, so OnLive will be swimming upstream.
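
The point about latency accruing at gateways can be sketched with a toy model: a fixed propagation delay plus a per-hop processing cost, compared across a typical many-hop path and a flatter, well-peered one. The hop counts and per-hop costs here are assumptions chosen purely for illustration, not measurements:

    # Toy model of how per-hop delay adds up along a path, to illustrate the
    # point about gateways/modems/MUXes. Hop counts and per-hop costs are
    # assumptions chosen for illustration, not measurements.

    def one_way_delay_ms(hops, per_hop_ms, propagation_ms):
        """Total one-way delay: fixed propagation plus per-hop processing."""
        return propagation_ms + hops * per_hop_ms

    propagation = 10  # ms of raw fiber delay to a nearby data center (assumed)

    # A typical path through many gateways vs. a flatter, well-peered path.
    typical = one_way_delay_ms(hops=14, per_hop_ms=2.5, propagation_ms=propagation)
    peered  = one_way_delay_ms(hops=6,  per_hop_ms=2.5, propagation_ms=propagation)

    print(f"typical path: {typical:.0f} ms one way, {2 * typical:.0f} ms round trip")
    print(f"peered path:  {peered:.0f} ms one way, {2 * peered:.0f} ms round trip")

With these assumed numbers the peered path roughly halves the round trip, which is the shape of the argument for peering above.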