Tangentially relevant, but I've been part of several communities that have discovered a lot of interesting things about some video games using high-speed cameras.<p>Super Smash Bros. Melee, a game where input lag matters a lot (it's the reason we still play on CRTs), not only has several frames of input lag, but the lag isn't even a constant number of frames: it can range from 2 to <i>5</i> frames.<p>Tetris: The Grand Master series: another series where input lag has a huge impact. It not only has several frames of input lag (people have resorted to all sorts of ways to reduce it, including AHK scripts that constantly move the mouse on Windows XP), but the first two games <i>don't</i> run at 60Hz. TGM1 runs at a slightly lower rate, which isn't really significant, but TGM2+ runs at 62.68Hz, which is quite significant and makes some of the challenges a tad harder.<p>Both of these communities took the measuring a few steps further than what can be done with isitsnappy. They wired LEDs to the buttons so it was easier to tell precisely in which frame a button was pressed.<p>Someone in the Melee community also placed photoresistors close to the screen and used an oscilloscope to know exactly when the brightness changed.<p>Not exactly the most relevant anecdotes, but I felt like sharing.
Nice. That reminds me, Carmack tweeted that he could send an IP packet to Europe faster than he could send a pixel to the screen[1], and this is unfortunately true on modern hardware. It's nice that phones are putting the kind of high-speed cameras that can measure this latency into the hands of consumers. Maybe this will allow gamers to put some pressure on hardware manufacturers to reduce the latency that they add to the systems.<p>I tried to measure the latency of my own system a little while back. I used a digital camera to record video at 240 fps and measured the time it took for a button press on a DS4 connected over Bluetooth to be reflected on a Mega Drive emulator running the 240p test suite. I can't remember the exact latency, but I think it was around 80ms, which is okay, though there is definitely room for improvement.<p>[1]: <a href="https://twitter.com/id_aa_carmack/status/193480622533120001" rel="nofollow">https://twitter.com/id_aa_carmack/status/193480622533120001</a>
> This one’s bizarre: the onboard keyboards on both the 2015 Macbook Pro and Macbook Air have worse latency than an external keyboard. Plugging in an external wireless keyboard improves response latency by 40 ms. What are you doing, Apple?<p>I noticed this myself when I upgraded to a wireless Magic Keyboard a few weeks ago. I hadn't attributed the difference between the two keyboards to latency, and assumed that the new one felt snappier due to its shorter travel.
"The response latency of the real world is zero. When you pick up an object, your fingers and the object move in lockstep."<p>This is a misleading and mostly false statement.<p>Human consciousness requires time for the brain to combine discrete sensory data into what we perceive.<p>Perception is NOT instantaneous in real life. There are multiple forms of biological latency, from the finite propagation speed of electrical impulses through neurons to the "compile time" the brain needs to assemble and modify information into a cohesive output for internal human consumption.<p>It may seem pointless to point this out, but many sources indicate that the delay between real life and perception is ~80 milliseconds, roughly a tenth of a second.<p>These speeds are highly relevant to a discussion of software lag: a program running 60 times per second updates several times during that 80ms of "brain lag".<p>(Oh, and the reason you can overcome that lag and have your hand move in lockstep with reality is a variety of compensation mechanisms, like proprioception, which allow you to estimate where your hand will be in relation to reality and achieve it successfully.)<p><a href="https://blogs.scientificamerican.com/observations/time-on-the-brain-how-you-are-always-living-in-the-past-and-other-quirks-of-perception/" rel="nofollow">https://blogs.scientificamerican.com/observations/time-on-th...</a>
My use for the timing capabilities of the 240Hz camera has been in measuring the shutter speeds of antique film cameras, which tend to get slower with age. So I find myself using a 240Hz camera from 2016 to calibrate and use a film camera from 1941.
Great work! I've already measured 4 of my gaming systems. The NDS outperformed them all: Rhythm Paradise reacts in 33ms from tap to image change B-)<p>I can imagine screenshot test results from your app becoming a standard way to show the latency of a certain game setup in internet forums. Ideally, an easily shareable result would have to show several things all in one image:<p>- 1 frame before you pressed the button<p>- the frame when you pressed the button<p>- 1 frame before something changed on the screen<p>- the frame when something changed on the screen<p>- plus timing data, obviously.<p>Since changes are sometimes quite small (like when Mario jumps but he's kinda in the background because you have the joypad in the same shot), one would have to be able to zoom into a part of the screen for each of the 4 images. And maybe mark a part of the screen with a circle.<p>If you could make these changes, you'd have a certified hit on your hands! Keep up the great work!
<i>Ubuntu latency is better than Windows 10 on this hardware?! Way to go!</i><p>Better check your video card driver's settings. For Nvidia on Windows, you can for example alter the number of pre-rendered frames, which IIRC defaults to at least 3. Setting that to 0 and turning off a bunch of other things listed in the settings, decent Nvidia cards on decent hardware can get a latency of 2 frames, which is probably the bare minimum. At 60Hz, with a screen which doesn't add additional latency, that is 33.3ms between the software issuing a render call and the actual output on the screen.<p>At work we measure latency using an oscilloscope, a photodiode pointed at the (top left of the) screen, and some input event, e.g. from a fast digital output card (i.e. less than a millisecond to get an output high from software). In software we set an output high in the piece of code of interest, then just measure the time between that rising edge and the photodiode firing. Using a camera is a neat though somewhat more manual process.
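For the camera-based version of the same measurement, finding the frame where the screen actually changes can be automated instead of scrubbed by eye. A minimal sketch in Python (the helper name and the synthetic clip are my own; a real clip would come from a video decoder such as OpenCV's VideoCapture):

```python
import numpy as np

def first_changed_frame(frames, roi, threshold=10.0):
    """Return the index of the first frame whose mean brightness inside
    roi = (y0, y1, x0, x1) deviates from the first frame's baseline by
    more than `threshold` grey levels, or None if nothing changes."""
    y0, y1, x0, x1 = roi
    baseline = frames[0, y0:y1, x0:x1].mean()
    for i, frame in enumerate(frames):
        if abs(frame[y0:y1, x0:x1].mean() - baseline) > threshold:
            return i
    return None

# Synthetic 240 fps clip: a screen region brightens at frame 12.
frames = np.zeros((30, 64, 64), dtype=np.float64)
frames[12:, 16:48, 16:48] = 255.0

idx = first_changed_frame(frames, roi=(16, 48, 16, 48))
print(idx, idx * 1000.0 / 240.0)  # frame index and its time in ms
```

The same region-of-interest trick stands in for the photodiode: instead of a voltage edge on the scope, you look for the first video frame whose brightness crosses a threshold.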
Love these high speed cameras on phones. I used my Pixel's 240 fps camera to expose a 120 Hz flicker on subpar LED bulbs: <a href="https://www.youtube.com/watch?v=QbenId_F2RQ" rel="nofollow">https://www.youtube.com/watch?v=QbenId_F2RQ</a>
Only slightly related, but your phone camera is also useful for checking whether your television remote is working.<p>The otherwise-invisible IR LED in the remote lights up when viewed through a phone camera as you press buttons on the remote.<p>Handy if you suspect the remote is broken, but aren't sure.
Our current benchmarks tell us what's happening after the OS sends a keystroke to the text editor and before we hand off rendering to the OS. This tool has the potential to tell us what's happening outside those bounds.<p>I would like to see a visualization of the recorded sound, so that I can set the "input" frame to exactly when I hit the keyboard. I'd even pay to have this accurately aligned.<p>Good job so far!
Last night I was going to use my video camera to measure the startup latency of a vibration motor by comparing it to a "Power On" LED. So this HN post is great <i>timing</i>.
This is awesome! I work in the cash register software development world, and our customers are sometimes quite picky about the latency of our systems as a whole, i.e. the time between a scan of an item happening (indicated by the scanner beeping and lighting an LED) and the output becoming visible on the display. I found myself measuring this total latency using an iPhone (at the time a 4, without the high-speed camera, but nevertheless sufficient to see whether we were below the critical limit of, at that time, 200ms), and it worked out very well, although I had to load all the videos into external software to count the frames accurately, because the phones' own players weren't sufficient and there were no apps like this.<p>I will definitely give this tool a try, it could vastly improve future measurements using the same technique :D
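The frame-counting arithmetic behind this technique is simple enough to sketch (a hypothetical helper of my own, not anything from the app): note the frame where the event happens and the frame where the response appears, and the difference divided by the frame rate is the latency, with a resolution of one frame period.

```python
def latency_ms(event_frame, response_frame, fps):
    """Latency between two frame indices of the same clip, in ms.
    Resolution is one frame period, i.e. 1000/fps milliseconds."""
    return (response_frame - event_frame) * 1000.0 / fps

# e.g. scanner LED lights at frame 51, display updates at frame 99
# of a 240 fps clip -> right at a 200 ms budget
print(latency_ms(51, 99, 240))
```

At 30 fps (the iPhone 4's camera) each frame is ~33 ms, which is still enough resolution to check a 200 ms budget.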
End-to-end latency is definitely something that deserves attention. It'd be interesting to modify a mouse &amp; keyboard by adding LEDs in series with the switches. That way you'd get a good visual indicator of when the event happened, without any added latency. Combine that with a 144Hz monitor and an actual high-speed camera (I hear good things about the Chronos camera), and you could do very accurate software latency measurements.<p>This MSR video from a few years back is also pretty cool wrt touch interface latency: <a href="https://www.youtube.com/watch?v=vOvQCPLkPt4" rel="nofollow">https://www.youtube.com/watch?v=vOvQCPLkPt4</a>
Re: the gotchas, it's possible that this is due to how the app is architected in combination with how the phone and OS are architected. The input rate of 3D Touch, for example, is tightly coupled to the display output frame rate, which, in turn, is tightly coupled to the things you do on the main thread. I don't know off-hand if the camera input is coupled to anything in this way, but it's something to look into, and Apple's documentation should indicate this sort of behavior.
This is great! Measuring latency is the first step to fixing it, and it's a step most never take. I've previously worked on a method of measuring latency in an automated way without cameras, which may also be of interest: <a href="https://google.github.io/latency-benchmark/" rel="nofollow">https://google.github.io/latency-benchmark/</a>
Unrelated to the article but coincidental post about the iPhone camera's speed on Reddit yesterday:<p><a href="https://www.reddit.com/r/apple/comments/63117n/the_speed_of_iphone_7s_camera/" rel="nofollow">https://www.reddit.com/r/apple/comments/63117n/the_speed_of_...</a>
> It is interesting that Atom’s latency gets worse by two frames when Haskell syntax highlighting is enabled.<p>Heh, that's not so surprising to me. There are several vim plugins for Haskell, and one of them had some <i>very</i> slow features that made the latency go into the "many seconds" range :D<p>The Macbook keyboards one is ridiculous though.
What a crazy cool idea!<p>Question: what is the standard human physiological "click" time?<p>How long does it take to make a click? A mouse click takes longer than a screen tap, no?<p>Sometimes one will hold a mouse click if they are unsure.<p>Might a better way be to record a scenario and measure response time on the replay?<p>Cool idea regardless
Oh, you could use this for treadmill calibration:<p><a href="http://fellrnr.com/wiki/Treadmill_Calibration" rel="nofollow">http://fellrnr.com/wiki/Treadmill_Calibration</a>
Surprised that the author is quoting editor latencies to 0.1ms, when the resolution of the camera is only ~4ms per frame. Are they really repeating these tests 40+ times?
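As a back-of-envelope check (my own model, not the article's: it assumes the per-measurement error is uniform over one 240fps frame period), the standard error of the mean shrinks with the square root of the number of repetitions, so quoting 0.1ms would take well over a hundred samples:

```python
import math

FRAME_MS = 1000.0 / 240.0  # ~4.17 ms per frame at 240 fps

def samples_for_precision(target_se_ms, frame_ms=FRAME_MS):
    """Repetitions needed for the standard error of the mean to reach
    target_se_ms, modelling each measurement's quantization error as
    uniform over one frame period (std dev = frame_ms / sqrt(12))."""
    sigma = frame_ms / math.sqrt(12.0)
    return math.ceil((sigma / target_se_ms) ** 2)

print(samples_for_precision(0.1))  # repetitions for 0.1 ms precision
```

That works out to roughly 145 repetitions per configuration under this model, so 0.1ms really does imply a lot of measurements, or some other error model.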
Awesome! This is also very visible in touch-based interfaces, especially in scrolling. I made a video with a high-speed camera to show how laggy real interfaces are: <a href="https://www.youtube.com/watch?v=8EhJo2OPR44" rel="nofollow">https://www.youtube.com/watch?v=8EhJo2OPR44</a>
I'm building a similar app for measuring UI latencies, but as a desktop app (for regression checks on CI). Your app seems much more universal and platform-agnostic, really cool idea!
I don't know if it's accurate but it also works on iPad Air 2 which supposedly only does 120 FPS.<p>The app crashes when attempting to delete captures though.
The next level would be a desktop app that syncs up with your phone and flashes the screen at a precise time while you record video. Then monitor lag could be tested without expensive hardware.<p>But maybe UNIX time isn't precise enough for that sort of thing? I actually don't know.
I used a similar trick when evaluating mice and monitors for gaming. I only had 60fps, but used some tricks (a separate LCD displaying a microsecond timer in frame, some other timing hacks) to discover that my trusty old CRT outperformed even the best flat panel in both latency and black level. I still use it for gaming, and I'm scrounging for another for when it finally goes. Other discoveries: the PS/2 keyboard polling rate can be varied (although the correct way seems broken in Linux), and even with a higher polling rate on USB, the PS/2 interrupts were handled faster. The mouse ended up better as USB due to throughput issues with serial. Take care to find a mouse with uncorrected sensor output, to avoid microjitter and variable response time. Only a few sensors do this, so picking mice by sensor is how I approached it.