OK so someone has noticed that you "only" need 15 Mb/s for a 4K stream. Jolly good.<p>Now let's address latency - the entire article mentions the word only once, and it looks like it was by accident.<p>Latency isn't cool, but it is why your Teams/Zoom/whatever call sounds a bit odd. You are probably used to the weird over-talking episodes you get when an audio stream drifts out of sync. You put up with it, but with modern gear and connectivity you really shouldn't have to.<p>A decent-quality audio stream consumes roughly 256 kB/s (yes: a quarter of a megabyte per second - not much), but if latency strays much beyond 30ms you'll notice it, and when it approaches a second it will really get on your nerves. To be honest, half a second is already quite annoying.<p>I can easily measure path latency to a random external system with ping to get a baseline for my internet connection; here it is about 10ms to Quad9. I am on wifi, and my connection goes through two switches, a router, and a DSL FTTC modem. That leaves at least 20ms (which is luxury) for processing.
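If you want to script that baseline measurement, here's a minimal sketch in Python. It assumes a Unix-like `ping` whose output includes "time=... ms", and uses Quad9's 9.9.9.9 as above; the 30 ms budget is the figure from the comment, not a standard:

```python
# Minimal sketch: measure baseline path latency with a few pings.
# Assumes a Unix-like system with `ping` on the PATH.
import re
import subprocess

def avg_rtt_ms(host: str, count: int = 5) -> float:
    """Ping `host` and return the mean round-trip time in milliseconds."""
    out = subprocess.run(
        ["ping", "-c", str(count), host],
        capture_output=True, text=True, check=True,
    ).stdout
    times = [float(m) for m in re.findall(r"time=([\d.]+) ms", out)]
    return sum(times) / len(times)  # raises if no samples parsed

if __name__ == "__main__":
    rtt = avg_rtt_ms("9.9.9.9")  # Quad9, as in the comment above
    print(f"mean RTT: {rtt:.1f} ms; headroom under a 30 ms budget: {30 - rtt:.1f} ms")
```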
> Could maximum data speeds—on mobile devices, at home, at work—be approaching “fast enough” for most people for most purposes?<p>That seems to be the theme across consumer electronics generally. For the average person, mid-range phones are good enough, bargain-bin laptops are good enough, and almost any TV you can buy today is good enough. People may of course desire higher quality, and specific segments will have higher needs, but things being "enough" may be a problem for tech and infra companies in the next decade.
>Regulators may also have to consider whether fewer operators may be better for a country, with perhaps only a single underlying fixed and mobile network in many places—just as utilities for electricity, water, gas, and the like are often structured around single (or a limited set of) operators.<p>There are no words to describe how stupid this is.
> Of course, sophisticated representations of entire 3D scenes for large groups of users interacting with one another in-world could conceivably push bandwidth requirements up. But at this point, we’re getting into Matrix-like imagined technologies without any solid evidence to suggest a good 4G or 5G connection wouldn’t meet the tech’s bandwidth demands.<p>Open-world games such as Cyberpunk 2077 already have hours-long downloads for some users.
That's when you load the whole world as one download. Doing it incrementally is worse. Microsoft Flight Simulator 2024 can pull 100 to 200 Mb/sec from the asset servers.<p>They're just flying over the world, without much ground level detail.
Metaverse clients go further. My Second Life client, Sharpview, will download 400Mb/s of content, sustained, if you get on a motorcycle and go zooming around Second Life. The content is coming from AWS via Akamai caches, which can deliver content at such rates.
If less bandwidth is available, things are blurry, but it still works. The level of asset detail is such that you can stop driving, go into a convenience store, and read the labels on the items.<p>GTA 6 multiplayer is coming. That's going to need bandwidth.<p>The Unreal Engine 5 demo, "The Matrix Awakens", is a download of more than a terabyte. That's before decompression.<p>The CEO of Intel, during the metaverse boom, said that about 1000x more compute and bandwidth was needed to do a Ready Player One / Matrix quality metaverse. It's not <i>quite</i> that bad.
We live in a rural location, so we have redundant 5G/Starlink.<p>It's getting pretty reasonable these days, with download speeds reaching 0.5 Gbit/s per link, and latency is acceptable at ~20ms.<p>The main challenge is upload speed; pretty much all the ISPs allocate much more spectrum for download than upload. If we could improve one thing with future wireless tech, I think upload would be a great candidate.
With 5G, I have to downgrade to LTE constantly to avoid packet loss in urban canyons. Given the even higher frequencies proposed for 6G, I suspect it will be mostly useless.<p>Now, it's possible that raw Gb/s with unobstructed LoS is the underlying optimization metric driving these standards, but I would assume it's something else (e.g. tower capex per connected user).
This article misses the forest for the trees.<p>I can grant that typical usage of wireless bandwidth doesn't require more than 10Mbps. So, what does "even faster" buy you?<p>The answer is actually pretty simple: at any given slice of spectrum, there is a limited amount of data that can be transmitted. The more people chatting to a tower, the less bandwidth is available to each. By having a transmission standard with theoretical capacities of gigabits, tens of gigabits, or more, you make it possible to serve 10, 100, or 1000 times as many customers their 10Mbps of content. It makes the rollout cheaper for the carrier and gives a better experience for the end users.
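Back-of-the-envelope, with purely illustrative cell-capacity numbers (not measured figures):

```python
# How many concurrent users a single cell can serve at a given
# per-user rate. All capacity numbers below are illustrative assumptions.
def users_served(cell_capacity_mbps: float, per_user_mbps: float = 10.0) -> int:
    return int(cell_capacity_mbps // per_user_mbps)

for label, capacity in [("4G-era cell, ~150 Mb/s", 150),
                        ("5G cell, ~2 Gb/s", 2000),
                        ("hypothetical 6G cell, ~20 Gb/s", 20000)]:
    print(f"{label}: ~{users_served(capacity)} users at 10 Mb/s each")
```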
> Transmitting high-end 4K video today requires 15 Mb/s, according to Netflix. Home broadband upgrades from, say, hundreds of Mb/s to 1,000 Mb/s (or 1 Gb/s) typically make little to no noticeable difference for the average end user.<p>What I find fascinating is that in a lot of situations, mobile phones are now way faster than wired internet for lots of people. My parents never upgraded their home internet despite fiber being available; they have 80 Mbit/s via DSL. Their phones, however, thanks to regular upgrades, now have unlimited 5G and are almost 10 times as fast as their home internet.
Higher bandwidths are good to have. They're great for rare, exceptional circumstances.<p>10G internet doesn't make your streaming better, but downloads the latest game much faster. It makes for much less painful transfer of a VM image from a remote datacenter to a local machine.<p>Which is good and bad. The good part is that it makes it easier for the ISPs to provide -- most people won't be filling that 10G pipe, so you can offer 10G without it raising bandwidth usage much at all. You're just making remote workers really happy when they have to download a terabyte of data on a single, very rare occasion instead of it taking all day.<p>The bad part is that this comfort is harder to justify. Providing 10G to make life more comfortable the 1% of the time it comes into play still costs money.
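The arithmetic for that terabyte, ignoring protocol overhead and server-side limits (a rough sketch, not a measurement):

```python
# Rough transfer times for a 1 TB download at different link rates.
def hours_to_transfer(size_tb: float, link_gbps: float) -> float:
    bits = size_tb * 8e12          # 1 TB = 8e12 bits (decimal units)
    return bits / (link_gbps * 1e9) / 3600

for gbps in (0.1, 1, 10):
    print(f"{gbps:>4} Gb/s: {hours_to_transfer(1, gbps):5.1f} h for 1 TB")
# ~22 h at 100 Mb/s ("all day"), ~2.2 h at 1 Gb/s, ~13 min at 10 Gb/s.
```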
I got a 5G capable phone a few months back, and I can't say I've noticed a difference from my old one. (Aside from the new phone being more expensive, worse UI, slower, heavier, unwieldy, filled with ads, and constantly prompting me to create a "Samsung account".)
> Is that such a foregone conclusion, though? Many technologies have had phases where customers eagerly embrace every improvement in some parameter—until a saturation point is reached and improvements are ultimately met with a collective shrug.<p>> Consider a very brief history of airspeed in commercial air travel. Passenger aircraft today fly at around 900 kilometers per hour—and have continued to traverse the skies at the same airspeed range for the past five decades. Although supersonic passenger aircraft found a niche from the 1970s through the early 2000s with the Concorde, commercial supersonic transport is no longer available for the mainstream consumer marketplace today.<p>OK, "Bad Analogy Award of the Year" for that one. Traveling at supersonic speeds had fundamental problems: the energy required at those speeds is far greater than for subsonic aircraft, so the price was much higher, and sonic booms meant supersonic flight was forbidden over land. When the Concorde was in service, London to NYC flights were 10-20x more expensive on the Concorde than economy class on a conventional jet, meaning the ~4 hours of saved flight time was only worth it for the richest (and folks just seeking the novelty of it). There are plenty of people who would still LOVE to fly the Concorde if the price were much cheaper.<p>That is, the fundamental variable cost of supersonic travel is much higher than for conventional jets (though that may be changing - I saw that pg posted recently that Boom has found a way to keep the sonic boom from reaching the ground over land), while that's not true for next-gen mobile tech, where it's primarily just the upfront investment cost that needs to be recouped.
Note that existing bandwidth usage has been driven by the digitization of existing media formats, for which there was already a technology and an industry - first print, then print+images, then audio, then video. People have been producing HD-quality video since the beginning of Technicolor in the 1930s, and while digital technology has greatly affected the production and consumption of video, people still consume video at a rate of one second (and about 30 frames) per second.<p>There are plenty of things that *could* require more bandwidth than video, but it's not clear that a large number of people want to use any of them.
I was hoping to see some mention of latency. I agree with the premise that most consumer applications don't need much more wireless throughput, but latency still seems way worse than it was over Ethernet back in my college days.
What we really need is pervasive low data rate backup systems for text messaging and low fidelity emergency calls that don't kill handset batteries. If this means "Starlink" and/or lower frequency bands (<400 MHz): the more options, the merrier for safety. Perhaps there may come a time where no one needs an EPIRB/ELT because that functionality is totally subsumed by smartphones offering equal or superior performance.
Is there a reason we keep trying to use higher frequencies in every new wireless standard (Wi-Fi, 5G, now 6G) instead of trying to increase the maximum possible bitrate at lower frequencies? Have we already reached the physical limits of the amount of data that can be encoded at a particular frequency?<p>Lower frequencies have the advantage of longer range and better penetration through obstructions. I suppose the limited bandwidth available down there, and the number of devices that have to coexist, are the limiting factors.
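For reference, the hard ceiling here is the Shannon–Hartley limit, C = B·log2(1 + SNR): capacity scales with channel width, and the wide channels are up at higher frequencies. A quick sketch (the SNR and channel widths are illustrative assumptions):

```python
# Shannon–Hartley: the hard ceiling on bits/s for a channel of
# bandwidth B at a given signal-to-noise ratio.
import math

def shannon_capacity_mbps(bandwidth_mhz: float, snr_db: float) -> float:
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)  # MHz * b/s/Hz = Mb/s

# An LTE-style 20 MHz channel vs a 400 MHz mmWave channel, both at 20 dB SNR:
for b in (20, 400):
    print(f"{b} MHz @ 20 dB SNR: ~{shannon_capacity_mbps(b, 20):.0f} Mb/s ceiling")
```

No amount of cleverness beats that ceiling; a narrow low-band channel can only get faster by raising SNR, which grows capacity logarithmically.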
We need the freedom to do more on our mainstream pocketable devices. My hotspot for my laptop will always be throttled down to 3G, as if to say:<p>“Hey, this isn’t actually supposed to be used to get work done. Keep doing simple phone stuff!”
The most data-intensive thing you can stream to a device is video, for two reasons:<p>1. The whole point of video is that you need all of the data, in order.<p>2. Video is the only thing, really, that requires that much data per second of sensory input.<p>Mobile devices aren't really getting more pixels. And hardware isn't improving enough that we have spare cycles for decoding/decrypting/rendering 16K video or whatever - and even if it did, the batteries couldn't handle it. It's obvious to me that we don't need more bandwidth. My phone would be instantly dead if I were downloading and processing data at a gigabit per second.<p>Other than downloading very large files (why?), I don't think we'll invent any new use cases that can make use of more bits per second.
I find the premise of the article completely out of touch with the growth environment of the last 40 years. Of course the median consumer doesn't have an application for more than what the current infra can offer - that has consistently been the case since broadband's inception. The reason is simple: no one will build applications aimed at the median consumer with requirements higher than what the infrastructure can deliver, as that is guaranteed to fail. I thought it was clear that this is a typical supply-driven market. Reading this article, I'm really afraid we will shoot ourselves in the foot if we forget this. A bit like what is happening with CPUs.
Wireless networks' latency problems are almost entirely caused by contention, buffering, and vast over-booking of bandwidth, where raw bandwidth number competition has been over-valued relative to actual network application performance.<p>L4S is on its way, and may finally be the beginning of the end for bufferbloat and congestion, and vendors of mobile devices are in an almost unique position of being able to roll out network stack changes en masse. And just for once, consumer incentives, vendor incentives and network operator incentives all align - and it's incremental, and lacking in incentives for bad actors.<p>See this blog entry: <a href="https://www.ietf.org/blog/banishing-bufferbloat/" rel="nofollow">https://www.ietf.org/blog/banishing-bufferbloat/</a> for more on L4S and bufferbloat. And this: <a href="https://datatracker.ietf.org/meeting/105/materials/slides-105-tsvwg-sessa-5-l4s-presentation-00" rel="nofollow">https://datatracker.ietf.org/meeting/105/materials/slides-10...</a> for a proper technical deep dive.<p>The development of L4S has been a pincer operation across all levels of the network stack, integrating everything previously understood about latency and congestion in real networks, and one of the most impressive bits of network engineering I've seen in the history of the Internet.
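The core bufferbloat arithmetic is simple enough to show: a full FIFO buffer adds queuing delay equal to its size divided by the link rate, regardless of how fast the radio is. The buffer and link values below are illustrative:

```python
# Worst-case queuing delay added by a full FIFO buffer:
# size (bits) / link rate (bits per second), expressed in ms.
def queuing_delay_ms(buffer_bytes: float, link_mbps: float) -> float:
    return buffer_bytes * 8 / (link_mbps * 1e6) * 1e3

print(f"1 MB buffer on a 10 Mb/s link: {queuing_delay_ms(1e6, 10):.0f} ms of queue")
print(f"same buffer on a 1 Gb/s link:  {queuing_delay_ms(1e6, 1000):.0f} ms")
```

This is why AQM schemes like L4S attack the queue itself: adding raw bandwidth shrinks the delay, but keeping the queue short removes it.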
My personal bugbear is the network coverage.
Context: London / UK (EE).
Yes, I have 5G at home, but it's just one bar and sometimes even this one bar will disappear. Yes, there is 5G/4G all around the city, but you can't hold an uninterrupted conversation over FaceTime Audio while on the overground train or driving. I'll not even discuss the underground.
However, uninterrupted, low-latency, average-bandwidth service is a hard market to sell and even harder to design for.
I follow what the operators tell Wall St, because the executives have large personal financial stakes in getting it right (modulo fraud, but let's trust for now).<p>For example, at the last investor day that AT&T held, they indicated [0:pdf] that their growth plans are in broadband fiber, not building more 5G capacity to serve a surge in traffic. Reading their charts, and knowing how AT&T traditionally worked, I believe they are going to try to cut the expense of running the 5G network via various optimizations and redirect capital heavily to go after the fiber broadband market instead, using convergence (ahem: price bundling) to win subscribers.<p>(I bet it really stuck in their craw that Comcast mastered the bundle so well that they even built a successful MVNO on the back of their Xfinity customer base and parked their tanks on AT&T's lawn while the latter was futzing about with HBO.)<p>However, bundling is a price-compression game, not an ARPU-growth game. If you start at $100 a month and then begin chipping away with autopay discounts, mobile convergence, free Visa gift cards and all that nonsense, pretty soon you are selling broadband for $35 a month and can't make it work except with truly cruddy service, which leads to high churn rates. So we'll see how this turns out.<p>[0:pdf] <a href="https://investors.att.com/~/media/Files/A/ATT-IR-V2/reports-and-presentations/2024-analyst-day-with-notes.pdf" rel="nofollow">https://investors.att.com/~/media/Files/A/ATT-IR-V2/reports-...</a>
The issue is that they never offer the actual listed speed. Not anywhere close to it. The mobile network is usually worthless in my experience these days in most busy cities. I might wait a good 2 minutes for a page load if the tower is heavily saturated; that is most of my experience, in fact. Websites like the mobile version of reddit are way too heavy and don't work at all. HN works, but the articles it links to usually don't, unless it's a plain HTML blog post. Social media like Instagram buffers terribly on crappy connections. Even the old guard YouTube no longer lets you cache a large portion of a video before playing.<p>I guess it's a nice quality filter, but still. I can't be the only one who feels the mobile web experience has taken a nosedive over the last 10 years, as we gained so many more mobile data users and mobile web developers forgot how to write for slow connections.
I have been arguing since 2020 (that is, before the majority of 5G was deployed) that we need 6G not as another 10-1000x bandwidth/capacity improvement - we have enough within the 5G roadmap already. 6G should make things cheaper for both devices and carriers, more efficient, and simpler, because current 5G is overly complex.<p>For those criticising 5G: most mobile networks have yet to move to a Stand-Alone (SA) network with NR. After that, they will have to refarm all the 4G spectrum to 5G. They could also replace equipment that was 4G/5G with 5G+/5G-A, allowing cheaper massive MIMO and hence much higher capacity - all while continuing to tune the network front and back. Five years into 5G, we are still behind in many aspects, most likely due to COVID.<p>But it seems some networks have <i>NO</i> interest in further improving current tech. I assume they will invest the minimum once they force everyone onto 5G.
This article is just a mess. It constantly switches contexts between speed and data usage. In the paragraph about White House task force projections it implies they were wrong but switches between speed/data usage context to justify the claim. It also bounces between mobile and wired, which are obviously very different.<p>It's obvious that mobile data usage is constrained by available user time in mobile usage contexts. People use their mobile devices when they aren't busy doing something else and that time is limited. Most people are at the limit. Previous growth was driven both by adding more per user hours AND by adding new users. The author only showing data for overall bandwidth consumption masks the real drivers of the trends being discussed.<p>Mobile speed requirements are constrained by being a ~6-inch hand-held screen. You don't need 4K or ultra-res textures on a 6-inch screen.
Relatedly, Nokia announced a new CEO this week with a datacentre/enterprise background.<p><a href="https://www.reuters.com/business/media-telecom/nokia-ceo-steps-down-2025-02-10/" rel="nofollow">https://www.reuters.com/business/media-telecom/nokia-ceo-ste...</a>
There is not a single 5G provider with unlimited high-speed access. <i>Not one.</i><p>Perhaps this has something to do with limited mobile bandwidth?<p>Now imagine we add more bandwidth: what would happen to Comcast and other fiber monopolists if people started replacing fiber with 5G?
For VoIP applications, one-way mouth-to-ear delay should ideally be less than 150 ms for natural conversations [1].<p>Other factors, such as jitter, transmission delay, queuing delay, etc., also impact quality. However, if the delay occurs mid-transmission (e.g., due to network congestion or routing inefficiencies), there’s little that can be done beyond optimizing at the endpoints.<p>[1] <a href="https://www.wikiwand.com/en/articles/Latency_(audio)#Telephone_calls" rel="nofollow">https://www.wikiwand.com/en/articles/Latency_(audio)#Telepho...</a>
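A rough budget against that 150 ms target; the component values below are illustrative assumptions, not measurements:

```python
# Illustrative mouth-to-ear delay budget vs the ~150 ms target
# from ITU-T G.114 (cited above). All component values are assumptions.
budget_ms = {
    "codec framing + lookahead":  25,  # e.g. 20 ms frames plus lookahead
    "packetization + OS stacks":  10,
    "access/network propagation": 40,
    "jitter buffer":              40,
    "playout/device audio path":  20,
}
total = sum(budget_ms.values())
print(f"total: {total} ms ({'OK' if total <= 150 else 'over'} vs 150 ms target)")
```

Note how little of the budget is actual network propagation: the endpoints (codec, jitter buffer, audio path) eat most of it, which is why endpoint optimization matters so much.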
> Consider a very brief history of airspeed in commercial air travel. Passenger aircraft today fly at around 900 kilometers per hour—and have continued to traverse the skies at the same airspeed range for the past five decades<p>This is a terrible example.<p>If supersonic mass travel could be provided safely and cheaply, demand would be bonkers.<p>If I could get to Tokyo in an hour for $50, I would visit every weekend.<p>Overall, the article is sound.<p>But what a terrible example of demand not filling supply.<p>If I had a terabit per second, indeed I probably wouldn't use it.<p>But you cannot make travel fast enough.
Introducing such new technologies is ALWAYS about rendering existing hardware "obsolete" so that new devices can be forced upon the consumer (who most often does not need them).
The providers have already fixed this (coming soon near you): my fixed-network operator, instead of finally forcing the fiber supplier to honor their contract and fix the final 5m of cable, is pushing me to use a 5G box for fixed internet.<p>So you want me to trade my horrible 100mbit line with OK latency for anything between 10 and 120mbit depending on the time of day, with a latency of over 50ms guaranteed? And you also get a CGNAT IP and thus no incoming connections, because who needs IPv6!
In the metropolitan area where I live, it's not the theoretical maximum speed that's the issue - 30 Mbps would be more than enough for almost anything. It's the contention in high-traffic areas. The phone just doesn't work near mainline stations in London.
I think what matters a lot more is the peak data rate per cell, not per device.<p>Once users stop demanding higher speeds, you can still get a lot more growth by giving them higher data caps or even unlimited plans.<p>This will make 5G home routers a very attractive option, particularly when the only other options you have available are cable and/or DSL.<p>I think wireless providers competing in the same playing field as traditional ISPs will turn out to be an extremely good thing for consumers.
My biggest issue isn't the raw speed, it's the quality of the connection. What does it matter if it's theoretically so fast if you can never get a decent connection? I live in a decent size city with supposedly great 5G coverage according to 3rd party maps. Yet, I often struggle to load simple images with a brand new phone and an unlimited data plan.<p>It's metro, which is lower-priority, but still.
> Instead, the number of homes with sufficient connectivity and percentage of the country covered by 10 Mb/s mobile may be better metrics to pursue as policy goals.<p>Hard pass.<p>5G is far from ubiquitous as it is. Though how would we even know? I feel like my phone is always lying about what type of network it's connected to, and carriers shave the truth with shit like "5Ge" and the like.<p>I have not, ever, really thought "yeah, my phone's internet is perfect as-is". I have low-signal areas in my house; if the power goes out, the towers are sometimes unusable due to the increased load; etc. I do everything in my power to never use cellular, because I find it incredibly frustrating and unreliable.<p>Cell service has practically unlimited headroom to improve (technologically and in business use-cases). Maybe we need more 5G and that would fix the problems and we don't need 6G - or maybe this article is a gift to fat and lazy telecoms who are masters at "coasting" and "only doing maintenance".
The problem I find is with the backhaul rather than the 5G signal. I can still get unreliable internet with a strong 5G signal. Even today I find that for general internet browsing, good 3G is OK and can even support a video stream. My understanding is that 4G/5G allow for better network design, and the faster headline speeds are a positive for marketing as much as anything else.
I wonder why the poster blatantly changed the original title; the result is very misleading.<p>Currently 5G doesn't even meet its own three objectives, namely:<p>1) Enhanced Mobile Broadband (eMBB),
2) Massive Machine-Type Communication (mMTC),
3) Ultra-Reliable Low-Latency Communication (URLLC)<p>The article focuses only on the first, which arguably can be met by the prior standard, 4G LTE+, if all we care about is the ~1 Gbps bandwidth of 5G's lower-range microwave frequency bands (FR1).<p>In most parts of the world, 5G's higher-range millimeter-wave (mmWave) bands (FR2) are not widely deployed. They can deliver bandwidth much higher than 1 Gbps, but their transmission ranges and distances are severely limited compared to microwave RF.<p>One notable effort by 3GPP (the 5G standards consortium) toward objectives 2 and 3 is DECT NR+ [1]. It's the first non-cellular 5G standard, and it can support local wireless mesh networks, but its availability is very limited for now, even though no base-station modifications are required for its use - it's non-cellular, and backhaul to base stations can run over existing 5G connectivity. I'd have imagined it would be inside most phones by now, since the standard has been out for several years, but it seems only Nordic is interested.<p>The upcoming 6G standards should probably carry on and focus even more on objectives 2 and 3, since machine-to-machine (M2M) traffic is expected to surpass conventional human-to-human (H2H), and even the newer human-to-machine (H2M), with the rise of IoT and AI-based systems - for example, intelligent-transport vehicle-to-everything (V2X) systems.<p>[1] DECT NR+: A technical dive into non-cellular 5G (30 comments):<p><a href="https://news.ycombinator.com/item?id=39905644">https://news.ycombinator.com/item?id=39905644</a>
For me 3G is still 100% sufficient for my usage.
Also, latency has not been a problem for me, but jitter can be. Fortunately, latency is generally pretty stable, so jitter (which is the variation in latency) stays minimal.
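For what it's worth, RTP estimates jitter as a smoothed average of successive differences in packet transit time (RFC 3550 §6.4.1). A sketch with made-up latency samples:

```python
# RTP's running jitter estimate (RFC 3550 §6.4.1): exponentially
# smoothed mean of successive transit-time differences.
def rtp_jitter(transit_times_ms):
    j, prev = 0.0, None
    for t in transit_times_ms:
        if prev is not None:
            j += (abs(t - prev) - j) / 16  # gain of 1/16 per the RFC
        prev = t
    return j

# Made-up samples: stable ~40 ms latency with small wobble.
print(f"jitter ≈ {rtp_jitter([40, 41, 40, 43, 39, 40, 42]):.2f} ms")
```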
I already don't understand what 4G added for me, let alone 5G.
Without 5G SA, the standard is an awful battery sucker. My Pixel 6 loses almost 1/3 of its battery capacity when stuck on 5G NSA.<p>With my Pixel 8 and 5G SA activated (Telefonica Germany), everything is back to normal.
I am actually using less mobile data since I got fiber. Of course, I can also get gigabit speeds on my mobile thanks to 5G... (This country can be pretty nice, although I will deny it under torture.)
Someone I know at a mixed-signal company many of whose chips go to 5G deployments said their revenue really slowed down last year due to 5G deployment uptake decreasing significantly.
Um, the airspeed analogy used up front is remarkably silly.<p>Nobody would shrug at being able to fly 2x faster.
The reason it stopped is that it made lots of noise and was expensive,
not that it wasn't needed.<p>I think you'd be hard-pressed to find anyone who would not want faster flights if they could be done at a reasonable cost.
> for those with good 4G connectivity, 5G makes much less of an improvement on the mobile experience than advertisers like to claim<p>Complete BS. This misses the critical and common issue of user contention on 4G, which is the main relief 5G brings.<p>If all you do is compare peak throughput on paper, you will miss the real-world performance issues. In reality you will never see "good" 4G speeds; there just isn't enough bandwidth to go around in practice, due to the lower frequency bands it operates in.<p>I've got both a 4G and a 5G router, both Cat20, which means in theory they can operate at up to 2 Gbit/s. Yet in practice, with any of the carriers in the UK, the 4G modem will scrape 40 Mbit/s at best in the wee hours and drop as low as 3 Mbit/s at peak time. The 5G one will give me 600 Mbit/s all day long... because there is enough bandwidth to go around for all users. That is the key difference.
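A sketch of why contention, not peak rate, dominates. The carrier capacities are rough assumptions (one 20 MHz 4G carrier vs one 100 MHz mid-band 5G carrier), not measurements:

```python
# Per-user throughput when a cell's capacity is split evenly among
# active users. Capacities below are rough illustrative assumptions.
def per_user_mbps(cell_mbps: float, active_users: int) -> float:
    return cell_mbps / max(active_users, 1)

for users in (1, 10, 50):
    lte = per_user_mbps(150, users)    # ~20 MHz 4G carrier (assumed)
    nr = per_user_mbps(1500, users)    # ~100 MHz 5G carrier (assumed)
    print(f"{users:>3} users: 4G ~{lte:6.1f} Mb/s, 5G ~{nr:7.1f} Mb/s each")
```

With 50 active users, the 4G figure lands right around the 3 Mbit/s peak-time floor described above, while the 5G cell still has plenty to go around.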
My need for speed is a long way from saturated. 360° virtual reality calls at a resolution better than my physical senses is just one thing my unimaginative mind can come up with on the spot.
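Napkin math for such a stream; the resolution, frame rate, and compression ratio are all assumptions for illustration:

```python
# Rough bitrate for a 360° stream at "retina"-class quality.
width, height = 15360, 7680       # hypothetical 16K-class equirect frame
fps, bits_per_px = 90, 24
raw_gbps = width * height * fps * bits_per_px / 1e9
compressed_gbps = raw_gbps / 200  # optimistic modern-codec ratio (assumed)
print(f"raw: {raw_gbps:.0f} Gb/s, compressed: ~{compressed_gbps:.2f} Gb/s")
```

Even with generous compression, that single stream lands above a gigabit per second - well past "fast enough".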
After reading this, I can't help but point out the misinformation about 5G coverage. At least in America, 5G coverage is at best around 50% of the country. I distrust the carrier claims and maps about their coverage because even in covered areas I'm not able to get advertised speeds.<p>It could be that mobile applications have hesitated to rollout features like 4K streaming video because most users don't have true 5G coverage. We might not see mobile data growing because we've failed in our 5G infrastructure goals. The political atmosphere might not support building out more 5G networks either.
Honestly, after the big investment in 5G (and the drawbacks it introduced), I don't think telecoms have much appetite for another massive investment in such a short term.
It's rare that I come across something so myopic, unimaginative, and laughable. This will age as well as Paul Krugman's prediction that "the Internet's impact on the economy has been no greater than the fax machine's".
We have not even begun to explore the Uber/Airbnb applications of a 6G+ world. And the VR bandwidth ceiling lazy thought is an extension of the limited mindset of this author.
The "fast enough" illusion comes from having consumers adjust for years to flaky wireless connections after we had a good period of steady broadband performance improvements. If the latter had continued at the same rate, we could now rely on home users having at least 10-100 Gbit/s bandwidth and could build entirely different kidns of applications.<p>IOW: Apps could definitely use high bandwidth & low latency networking (think AI acceleration, remote storage, volumetric telepresence etc) if we had it reliably, but our due to wireless transition apps are adopted to stagnating speeds.
Are there really use cases for faster chips? I can run all models I want on an H100 pod.
No models exist that I can't run with at least 64 H100s. NVIDIA should just stop.
I’m part of the early test team for tachyon-enabled 7G. Obviously it’s only early alpha but I think non-Googlers can get Pixel 10s with it at the Mountain View Google store. If you upgrade the firmware to version 0.3.75A sometimes short text messages arrive before you type them.
Calling the author an idiot would be a kindness. Data rates compared to an obsolete 1970s technology? The author is too old to write articles on the "newfangled internet". IEEE Spectrum needs to kick this guy to the curb, hard, for writing crap and cross-posting his book.