The 2011 <a href="https://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking" rel="nofollow">https://techreport.com/review/21516/inside-the-second-a-new-...</a> article similarly started a new era for computer gaming, beginning a larger, more technically informed discussion. Things got better in 2013 <a href="https://www.guru3d.com/articles-pages/fcat-benchmarking-review,1.html" rel="nofollow">https://www.guru3d.com/articles-pages/fcat-benchmarking-revi...</a> and culminated with G-Sync/Adaptive-Sync/FreeSync. Everybody working with high-end hardware knew about the problem (manufacturers, hardware testers, power users, technical forums), but it was widely ignored. Things got especially weird/bad with the introduction of Alternate Frame Rendering by ATI (Rage Fury MAXX) in 1999, with Nvidia following suit in 2004 (SLI): great benchmark results paired with a really unpleasant hands-on experience due to micro-stuttering. It took 13 years for someone to finally say it out loud <a href="https://www.anandtech.com/show/7195/amd-frame-pacing-explorer-cat138/2" rel="nofollow">https://www.anandtech.com/show/7195/amd-frame-pacing-explore...</a><p>Things aren't perfect even today, with some dev studios especially bad when it comes to frame pacing. The Xbox library is full of games released in 2019 with "locked" 30/60 fps but jittery gameplay.
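<p>The core of the micro-stuttering complaint is that average FPS hides frame pacing. A minimal sketch (with made-up frame times, not real benchmark data) showing how two runs with identical average FPS can feel completely different:

```python
# Hypothetical frame times in milliseconds for two runs.
smooth  = [16.7, 16.7, 16.7, 16.7, 16.7, 16.7]   # steady ~60 fps
jittery = [8.0, 25.4, 8.0, 25.4, 8.0, 25.4]      # AFR-style alternation, same average

def avg_fps(frame_times_ms):
    """Average FPS as usually reported by benchmarks."""
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def worst_frame_ms(frame_times_ms):
    """Longest single frame, a crude proxy for perceived stutter."""
    return max(frame_times_ms)

print(f"smooth:  {avg_fps(smooth):.1f} fps, worst frame {worst_frame_ms(smooth):.1f} ms")
print(f"jittery: {avg_fps(jittery):.1f} fps, worst frame {worst_frame_ms(jittery):.1f} ms")
```

Both runs report ~59.9 average FPS, but the jittery one spends every other frame at 25+ ms, which is exactly what frame-time percentile plots (Tech Report) and FCAT (Guru3D) were built to expose.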