
Typometer: A tool to measure and analyze the visual latency of text editors

162 points by mrzool almost 5 years ago

8 comments

userbinator almost 5 years ago
The companion article at https://pavelfatin.com/typing-with-pleasure/ has some example results, showing Atom, the only Electron-based editor in the list, as having several times more latency than the others. I wonder how VSCode compares.

It's unfortunate that software seems to feel slower the newer it is, including developer tools such as text editors/IDEs. I suspect this is because most people - including younger developers - have never seen how fast things can be, but are instead constantly subjected to delays in the applications they use, to the extent that they think it is an absolutely normal speed [1], and then propagate that notion to the software they themselves create.

Also related: https://danluu.com/input-lag/

[1] For example, everything that uses UWP in Windows. Someone who had only ever used the Settings app in Windows 10, or (even worse) the Calculator, might not realise how absurdly slow they are in comparison to the previous versions, which would open nearly instantaneously.
formerly_proven almost 5 years ago
> Because sampling rate is fast enough to misinterpret contact bounce as keystrokes, keyboard control processor perform so-called debouncing of the signals by aggregating them across time to produce reliable output. Such a filtering introduces additional delay, which varies depending on microcontroller firmware. As manufacturers generally don't disclose their firmware internals, let's consider typical debouncing algorithms and assume that filtering adds ~7 ms delay, so that maximum total "debounce time" is about 12 ms, and average total debounce time is ~8.5 ms.

Debouncing in software is one of those things that 99% of developers get wrong, and is something even hardware manufacturers get wrong all the time.

A lot of hardware debounces in a dumb and naive way: on the first state change, it waits 5-10 ms and samples the switch again to figure out whether it was pressed or not. So you get an inherent "debounce" delay, which is entirely unnecessary.

Debouncing keys correctly works like this: if the switch generates an edge, send the key down/up event immediately and ignore further switch transitions for 5-10 ms. *There is no point in waiting for the switch to surely have finished bouncing before reporting the event, because if it is bouncing it MUST have been pressed/released, and you know which one it is because you know the prior state of the switch.*

---

Compositor delay due to VSync

Obviously compositors are using double-buffered vsync precisely because they intend to limit FPS to $smallValue in order to save power and prevent the 3D hardware from entering a high-performance power state. They really *should* be using triple-buffered vsync, but only start to render if something changed, resulting in much lower latency without constantly running at 1000 FPS. There should be a way for the compositor to be notified of changes in client areas, since things like the X11 damage protocol and RDP exist.
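A minimal sketch of that immediate-report approach in C (illustrative only; read_switch(), send_event(), and millis() are hypothetical helpers, and the 8 ms lockout is just an assumed value in the 5-10 ms range mentioned above):

    /* Immediate-report debounce sketch (illustrative only).
     * Assumes hypothetical helpers: read_switch(), send_event(), millis(). */
    #include <stdbool.h>
    #include <stdint.h>

    #define LOCKOUT_MS 8u  /* ignore further transitions for ~5-10 ms */

    extern bool read_switch(void);          /* raw switch level, true = pressed */
    extern void send_event(bool pressed);   /* report key down/up to the host */
    extern uint32_t millis(void);           /* free-running millisecond counter */

    void poll_key(void)
    {
        static bool reported_state = false;   /* last state we reported */
        static uint32_t lockout_until = 0;    /* ignore edges until this time */

        uint32_t now = millis();
        bool raw = read_switch();

        /* During the lockout window the switch may still be bouncing;
         * we already reported the new state, so just ignore it. */
        if ((int32_t)(now - lockout_until) < 0)
            return;

        if (raw != reported_state) {
            /* First edge: report immediately, then suppress the bounce. */
            reported_state = raw;
            send_event(raw);
            lockout_until = now + LOCKOUT_MS;
        }
    }

The key point is that the event is sent on the first edge; the lockout only suppresses the subsequent bounce, so it adds no latency to the report itself.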
arexxbifs almost 5 years ago
Seeing as how this and similar articles regularly crop up here, I'm curious as to when this started and why nothing is done about it. I still regularly use my 14 MHz Amiga 1200 for recreational programming, and I _never_ experience input lag on that machine.

Thinking back, the first time I noticed input lag was on a Mac LC III running Word in the mid-90s. Then for a long, long time I didn't come across any particularly noticeable latency, except on really crappy websites and in really crappy Java apps. Then Microsoft bought Skype and started working their magic on it, and that seemed to open some kind of floodgate of high-latency crap. That's not even a decade ago.

After that, little by little, everything seemed to slow down noticeably. We've now reached a level where this is seemingly normal. Even programmer colleagues who are my age and older look at me like I'm a curiosity when I complain about the latency. I'd say something toxic about Electron here, but the problem is prevalent in native programs as well.

Have things really gotten so much more complex since 2010 that we can no longer put a character on screen in a timely fashion?
评论 #23955473 未加载
PragmaticPulp almost 5 years ago
I'm all for faster software, but input lag measurements need to be put in perspective. Just because an editor can process an input in 3ms doesn't mean the pixels will change on your screen in 3ms.

If you're using a 60Hz monitor, you're only going to see a new frame at best every ~17ms (1 second / 60Hz). Your graphics pipeline might have some additional buffering, adding tens of milliseconds of lag. Your monitor likely has some input processing as well, adding anywhere from 10-20ms before it sends the frame to the physical display. Add a few milliseconds here and there for input processing and even the response time of the physical pixels, and you're looking at something like 50-60ms minimum of total display latency *before* you factor in the software.

Using 144Hz FreeSync or G-Sync monitors can shorten that update time, but you're still looking at 30ms end-to-end latency in even the fastest setups, and that's before you account for software processing lag.

The difference between Sublime Text responding in 11.4ms on average and Atom responding in 28.4ms on average is a difference of almost exactly one frame of latency for a typical 60Hz monitor. Add up all of the other sources of lag (buffering, monitor input lag) and you're looking at something like a 5-frame latency instead of a 4-frame latency. Still less than the blink of an eye (literally).

From another perspective: if you really believe you're sensitive enough to feel the difference between something like Sublime Text's 11ms processing latency and Atom's 28ms latency, then you might want to invest in a proper 144Hz gaming monitor with low input lag, as it would improve your experience by the same margins. A gaming-specific keyboard might also help, as average keyboards can have 10-20ms of input lag before the keypress registers with the OS (source: https://pavelfatin.com/typing-with-pleasure/ ). Realistically, though, I doubt many people could A/B test the difference between a 1ms and a 30ms latency editor under ideal conditions, let alone while typing out some code.
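As a back-of-the-envelope check on those numbers, here is a small C sketch that adds up the components listed above; the individual values are rough assumptions taken from this comment, not measurements:

    /* Back-of-the-envelope latency budget (illustrative; the component
     * values are rough figures from the comment above, not measurements). */
    #include <stdio.h>

    int main(void)
    {
        const double frame_ms      = 1000.0 / 60.0;  /* 60 Hz refresh: ~16.7 ms */
        const double keyboard_ms   = 15.0;           /* keyboard scan/report to the OS */
        const double gpu_buffer_ms = 17.0;           /* ~1 frame of buffering/vsync */
        const double monitor_ms    = 15.0;           /* monitor input processing */
        const double pixel_ms      = 5.0;            /* panel response time */

        /* Average of half a frame of waiting for the next refresh. */
        double baseline = keyboard_ms + gpu_buffer_ms + monitor_ms + pixel_ms + frame_ms / 2.0;
        printf("baseline (non-software) latency: ~%.0f ms\n", baseline);

        /* Editor processing times from the benchmark cited above. */
        printf("Sublime Text total: ~%.0f ms\n", baseline + 11.4);
        printf("Atom total:         ~%.0f ms\n", baseline + 28.4);
        printf("difference:          %.1f ms (~%.1f frames at 60 Hz)\n",
               28.4 - 11.4, (28.4 - 11.4) / frame_ms);
        return 0;
    }

With these assumed figures the non-software budget comes out around 60 ms, and the 17 ms gap between the two editors is almost exactly one 60 Hz frame, which matches the framing in the comment.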
vardump almost 5 years ago
Correct me if I'm wrong, but this doesn't seem to measure input-stack latency, nor how long it takes for the pixels to actually become visible on the display after all of the compositing delays.

All this seems to measure is the time from an injected keyboard event until pixels change on some bitmap/surface in memory. Message passing, in other words.
fxtentacle almost 5 years ago
I would love to see XCode on that benchmark, as I regularly manage to out-type it.
galaxyLogic almost 5 years ago
I've been working on a browser-based editor, and it feels slow. So does this input field on the Hacker News website. Is it something about the browsers?
29athrowaway almost 5 years ago
Using Java to measure latency... Does not sound very accurate.