I just want to say I am glad pro gaming took over. Back in the day it was only Quake players advocating for 120 FPS (for various reasons, including Q3 physics being somewhat broken), 125 Hz mice, and so on. I am talking 20 years ago.

The number of lost souls parroting the old "the human eye can only see 30 fps" line has gone down considerably over the years. The last 10 years were fantastic in that regard, despite the whole RGB craze.

Even CS servers run at 100 Hz tick rates these days. Of course, by the time we get 1 kHz displays I'll be too old to enjoy it myself, but it will still likely put a bittersweet smile on my face.
> The cause of Bufferbloat was that previously, congestion control relied on packet loss, which was a signal that the buffer is full.

I couldn't quite follow "congestion control relied on packet loss". Could somebody explain? Thanks!

Does it mean "congestion control is triggered by the packet loss event, which is a signal that the buffer is full"?
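If it helps to pin down what I'm asking: here is the toy picture I have in my head of loss-based congestion control (AIMD, roughly Reno-style). All names and numbers are mine, just to illustrate the idea, not from the article:

    # Toy sketch of loss-based congestion control (AIMD, roughly Reno-style).
    # Names and numbers are mine, just to illustrate the idea.
    cwnd = 1.0  # congestion window, in packets

    def on_ack():
        # No loss seen: grow slowly (additive increase, ~+1 packet per RTT).
        global cwnd
        cwnd += 1.0 / cwnd

    def on_loss():
        # Packet loss read as "a buffer along the path overflowed":
        # back off hard (multiplicative decrease).
        global cwnd
        cwnd = max(1.0, cwnd / 2.0)

So is the point that the sender only slows down once a queue actually overflows somewhere, and with oversized buffers that signal arrives late, which is the bufferbloat problem?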
Great read. I have one nit-pick recommendation for clarity: the article makes no mention of "input latency" anywhere. Saying just "latency" is confusing since the term applies to many areas of a game, and in multiplayer games it will typically be read as network latency.
I usually get 10 ms ping on CSGO... they must have something better? (I have 5 ms right now on a Comcast cable link.) As much as I hate having to call Comcast for any issue, when it works, it is pretty good.
The elephant in the room here is that you can pay to win in any game by buying a monitor with a higher refresh rate and using a larger GPU that draws more electricity, giving you 2x more time to react.

Fortunately for us humans, that seems to stop at 120 Hz, because most games can't even hold that at a steady rate with a 3090.

Now, whether a 300+ W gaming device is interesting in the long run will be answered this year by your electricity bill!
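For scale, a quick back-of-the-envelope on what refresh rate alone buys you (this only counts the display's refresh interval, ignoring input, game logic, render time, and network latency):

    # Rough frame-time numbers per refresh rate; display interval only,
    # ignoring input, game logic, GPU render time, and network latency.
    for hz in (60, 120, 144, 240, 360):
        frame_ms = 1000 / hz
        print(f"{hz:>4} Hz -> {frame_ms:5.2f} ms per frame, "
              f"~{frame_ms / 2:.2f} ms average wait for the next refresh")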