> Many laptop screens are in fact 6-bit panels performing dithering to fake an 8-bit output. This includes even high-priced workstation replacements, like the HP ZBook Fury 15 G7 and its 6-bit LCD panel, that I sit in front of right now.

This made me double-check the date of the article (it's from 2023).

The author's laptop appears to have launched in 2020. I'm astounded any manufacturer would think this is even remotely acceptable in this day and age, much less on such a high-end device. (Going off the current generation, these laptops start around $3k.)

Is this actually that common, or did the author just get unlucky?
This can also be done temporally, not just spatially:
https://www.shadertoy.com/view/tslfz4
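For illustration, here's a minimal sketch (mine, not the shadertoy's code) of combined spatial + temporal ordered dithering: a 4x4 Bayer threshold matrix is shifted every frame, so the quantization error averages out over time as well as across neighboring pixels.

    import numpy as np

    # 4x4 Bayer matrix, normalized to thresholds in [0, 1).
    BAYER4 = np.array([[ 0,  8,  2, 10],
                       [12,  4, 14,  6],
                       [ 3, 11,  1,  9],
                       [15,  7, 13,  5]]) / 16.0

    def dither_to_6bit(frame_8bit, frame_index):
        """Quantize an 8-bit grayscale frame to 6 bits, shifting the
        threshold pattern every frame (the temporal part)."""
        h, w = frame_8bit.shape
        shifted = np.roll(BAYER4, frame_index, axis=(0, 1))
        thresholds = np.tile(shifted, (h // 4 + 1, w // 4 + 1))[:h, :w]
        levels = frame_8bit / 255.0 * 63.0  # rescale to the 6-bit range
        return np.floor(levels + thresholds).astype(np.uint8)

    # A flat mid-gray: each frame shows a dither pattern, but the average
    # over frames sits close to the true 6-bit level (128/255*63 ≈ 31.62).
    gray = np.full((8, 8), 128, dtype=np.float64)
    frames = [dither_to_6bit(gray, i) for i in range(4)]
    print(np.mean(frames))  # ≈ 31.6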
Apple UI blur & vibrancy[1] look smooth without having to introduce noise. They have the advantage of owning the entire pipeline with phones & laptops, but the effect is decent even on budget external displays.

1: https://developer.apple.com/design/human-interface-guidelines/materials
This is great... but it's at the wrong level.

The LCD itself should ideally be a nice HDR LCD, but if it isn't, it should apply time-dependent error diffusion dithering. I.e., if you took a photograph, you would see a 6-bit (or whatever) error-diffused image, but that dither pattern changes at 60 fps. That should happen entirely within the firmware of the screen, and it is actually easy to do (some types of dithering require no RAM; others require a one-line pixel buffer) — see the sketch below.
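A minimal sketch of that idea, assuming per-scanline 1D error diffusion (one of the no-frame-buffer variants alluded to above; the function names are mine): quantize 8-bit input down to a 6-bit panel while carrying the rounding error along the line, and alternate the scan direction per frame for the temporal part.

    import numpy as np

    def diffuse_row(row_8bit, reverse=False):
        """Quantize one scanline from 8 to 6 bits, carrying the rounding
        error to the next pixel -- needs only one running float, no RAM."""
        pixels = row_8bit[::-1] if reverse else row_8bit
        out = np.empty(len(pixels), dtype=np.uint8)
        err = 0.0
        for i, p in enumerate(pixels):
            target = p / 255.0 * 63.0 + err  # ideal 6-bit level + carried error
            q = min(63, max(0, round(target)))
            out[i] = q
            err = target - q                 # propagate what was lost
        return out[::-1] if reverse else out

    def dither_frame(frame_8bit, frame_index):
        reverse = bool(frame_index % 2)      # flip direction each frame
        return np.array([diffuse_row(r, reverse) for r in frame_8bit])

    gray = np.full((4, 16), 130, dtype=np.float64)
    print(dither_frame(gray, 0))  # mostly 32s with the occasional 33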
Just dropping this here, because there is more to this issue in real-world scenarios:

https://loopit.dk/banding_in_games.pdf

(Not my talk, but I found it enlightening.)
My recommendation is to do the equivalent of FRC in your shaders. I use one of the dither functions from https://developer.oculus.com/blog/tech-note-shader-snippets-for-efficient-2d-dithering/ to break up banding in the geometry rasterization shaders I use for rendering all my UI elements. It looks good in static images and fantastic in motion, since the dithering pattern shifts every frame.

One thing to watch out for: if you're alpha blending, you need to be fairly careful about how you dither and multiply. Premultiplied alpha does not get along with dithering, so the output from your shaders needs to be dithered but NOT premultiplied, with premultiplication performed by the GPU's ROPs (which typically seem to operate at more than 8 bits of precision). If you don't do this, you will get really nasty banding on alpha fades even though you dither. Also, counter-intuitively, you may not want to dither the alpha channel (test it yourself; YMMV). See the sketch below for the ordering.

When dealing with sRGB and linear space, it can also matter whether you dither before or after the color-space conversion.
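Roughly, the ordering looks like this (my sketch, not the Oculus snippet itself; the hash is a stand-in for whatever shader dither function you use):

    import numpy as np

    def dither(x, y, frame):
        """Cheap screen-space noise in [-0.5, 0.5), shifted per frame so
        the pattern animates. Any shader hash works here."""
        return ((x * 171 + y * 231 + frame * 97) % 256) / 256.0 - 0.5

    def shade_pixel(rgb, alpha, x, y, frame, bits=8):
        """Dither the *straight-alpha* color only; the output is NOT
        premultiplied -- the blend hardware premultiplies afterwards at
        its higher internal precision."""
        scale = (1 << bits) - 1
        rgb_q = np.clip(np.round(np.asarray(rgb) * scale
                                 + dither(x, y, frame)), 0, scale) / scale
        # Alpha is passed through undithered, per the caveat above.
        return rgb_q, alpha

    color, a = shade_pixel((0.5019, 0.25, 0.1), 0.3, x=10, y=20, frame=0)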
Are there any debanding methods that add a point to the color channels?

E.g., a delta of +(1, 1, 1) in RGB space would have six intermediary (but not perceptually evenly spaced) values: (1,0,0), (0,0,1), (0,1,0), (1,0,1), (0,1,1), and (1,1,0).

This might be something dithering already does (and I just don't understand it).
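For what it's worth, this is effectively what per-channel dithering does when the noise is decorrelated across channels — each channel rounds up independently, so the mixed intermediaries all appear. A tiny sketch (mine) to illustrate:

    import numpy as np

    # Quantize a flat color to 1 bit per channel with *independent*
    # noise per channel; the result lands on exactly those mixed values.
    rng = np.random.default_rng(0)
    target = np.full((64, 64, 3), 0.4)    # flat color between 0 and 1
    noise = rng.random(target.shape)      # decorrelated per channel
    quantized = np.floor(target + noise).astype(int)

    colors, counts = np.unique(quantized.reshape(-1, 3), axis=0,
                               return_counts=True)
    print(colors)            # includes (1,0,0), (0,1,1), and the rest
    print(quantized.mean())  # ≈ 0.4, so the mean is preserved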