I hate HDMI and wish it would die in favor of DisplayPort for everything. HDMI licensing even gives manufacturers a discount on royalties for printing the HDMI logo on the front of their displays. That's why your new $3500 monitor says "HDMI" on the front of it for no reason: to save ten fucking cents on licensing HDMI for the unit. Meanwhile, DisplayPort is royalty-free.<p>HDMI must die.
The article asks: <i>Which is the better trade off? More color range per pixel, or more pixels with color channels?</i><p>The answer is more color range per pixel.<p>Chroma is subsampled relative to luma because that's how human vision works: the eye's spatial acuity for color is far lower than for brightness (the retina has far more rods, which sense brightness, than cones, which sense color). Increasing the color resolution on a TV[1] to match the luma resolution is literally a waste of bits[2]. On the other hand, the color receptors are very sensitive to the exact shade of a color[3], so increasing color bit depth is not a waste.<p>[1] Increased color resolution on a computer monitor <i>is</i> useful, because you sometimes closely examine a small section of the screen -- and possibly lean in to examine it better.<p>[2] The old analog NTSC system used the same principle: color was encoded with less bandwidth than luma.<p>[3] As the author mentions, this is best observed in a gradient, where the TV must present the gradations in sufficiently small steps to fool the eye into seeing a continuous gradation.
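To make the trade-off concrete, here's a minimal numpy sketch of 4:2:0 chroma subsampling (my own illustration, not from the article): luma keeps every sample, while each 2x2 block of pixels shares one Cb and one Cr value, cutting storage from 3 samples per pixel to 1.5.

    import numpy as np

    def subsample_420(y, cb, cr):
        """Keep luma at full resolution; average each 2x2 block of chroma."""
        def pool(c):
            h, w = c.shape
            c = c[:h // 2 * 2, :w // 2 * 2]          # crop to even dimensions
            return c.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        return y, pool(cb), pool(cr)

    def upsample_420(y, cb, cr):
        """Nearest-neighbour reconstruction: one chroma sample covers 4 pixels."""
        return y, cb.repeat(2, 0).repeat(2, 1), cr.repeat(2, 0).repeat(2, 1)

(Real encoders use better resampling filters than a box average, but the bit accounting is the same.)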
I'm not sure why this is important. Obviously the author is much more knowledgeable than I am, so perhaps I can be enlightened.<p>As noted in the article, deep color is important so that quantization errors do not accumulate while images are filtered, processed, and combined. But a television or monitor is the final step; it should be performing very little image processing. ("Should" is the operative word here; that's not necessarily true.)<p>If you used HDMI to connect cameras to recorders, or to connect effects processors, this would be important. Does anybody do this for 4K?
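For anyone who hasn't seen the accumulation effect, here's a toy illustration (mine, not the author's): round-tripping through an 8-bit buffer between processing steps permanently throws away levels, while a higher-precision intermediate survives the same operations.

    import numpy as np

    def store_8bit(v):
        """Quantize to 8 bits, as an intermediate 8-bit buffer would."""
        return np.round(np.clip(v, 0.0, 1.0) * 255) / 255

    x = np.linspace(0.0, 1.0, 1920)      # a smooth gradient: 256 usable levels

    # 8-bit intermediate: darken, store, brighten back -- half the codes are gone.
    lo = store_8bit(store_8bit(x * 0.5) * 2.0)
    # Float (or 12/16-bit) intermediate: the same round trip is lossless.
    hi = store_8bit((x * 0.5) * 2.0)

    print(len(np.unique(lo)), "levels vs", len(np.unique(hi)))   # 129 vs 256

That 129-level result is exactly the kind of banding the article's gradient images show.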
I wonder why we don't use the CIE XYZ color space for monitors -- in practice, human vision is much less sensitive to blue than to red or green, so dedicating the same subpixel area and bit depth to each channel seems like a waste. I'm not a panel manufacturer, so I imagine it's hard to architect a panel with subpixels of varying signaling depth, right?<p>Also, are there even 12-16 bit color panels in production? I like to hope 4K/8K means the end of the pixel race, because we really need adaptive vblank (from DisplayPort) at 100 Hz or more before we keep pushing pixel density. Hell, most of my family can't tell the difference between 480p and 1080p already, because the colors are so bad on most consumer TV panels.
Awesome, highly detailed write-up. Thank you.<p>I agree that deep color is fascinating, especially in light of the fact that deep color for end users has been just over the horizon for so long (I've ranted previously about my disappointment that 24-bit color has been the pinnacle of desktop computing for well over a decade). I personally find 24-bit banding quite annoying, especially with animation and fade effects, which visually exaggerate the limitation.<p>I would absolutely love a 50" concave OLED display with high DPI, high refresh rate, and deep color. For the time being, in the real world of compromises, I'd be happy with either HDMI 2.0 or DisplayPort 1.2+, since I am presently dealing with 30 Hz at 4K. Given that GPUs available today support 4K at higher refresh rates over DisplayPort, my current very slight preference is DisplayPort over HDMI 2.0 (which as far as I know is not supported by any GPU I can buy today).
Filled with wrong information.<p>First and easiest: anything you can display over DVI can be displayed on a device that implements DVI over HDMI, so DVI doesn't look better than HDMI for any technical reason. This is how all those non-standard resolutions over HDMI work, and it allows color combinations that are not part of the standard.<p>With that established: 4K doesn't actually require HDMI 2.0, because any combination of resolution and color that fits within the "speed" of the wire and can be communicated in the "handshake" between devices can work now.<p>Unlike most standards, HDMI's certification "requirements" are lower limits, not upper limits.<p>So 4K at 24 fps with 4:2:0 could be negotiated over an HDMI 1.1-compliant cable.<p>The author should read less Wikipedia. When I was at Microsoft Mediaroom I had to have IT block Wikipedia because engineers were getting so much wrong information from its articles. Read the standards; talk to device manufacturers. Wikipedia is not a place for deep tech, it is a place for a quick answer. If it isn't a discrete stat like a transfer speed, assume someone like this author wrote it, and it will be inaccurate.
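For what it's worth, the raw-bandwidth half of that 4:2:0 claim checks out on the back of an envelope (my arithmetic, using the CEA-861 timing for 3840x2160p24; HDMI 1.0-1.2 top out at a 165 MHz TMDS clock):

    # Back-of-the-envelope check: does 4K24 4:2:0 fit in HDMI 1.1 bandwidth?
    H_TOTAL, V_TOTAL, FPS = 5500, 2250, 24      # 3840x2160 active + blanking
    pixel_clock = H_TOTAL * V_TOTAL * FPS       # 297,000,000 Hz

    # 4:2:0 carries two pixels of luma plus one Cb/Cr pair per TMDS clock,
    # so the link runs at half the pixel clock.
    tmds_420 = pixel_clock / 2                  # 148.5 MHz

    HDMI_1_1_MAX_TMDS = 165e6                   # HDMI 1.0-1.2 ceiling
    print(tmds_420 <= HDMI_1_1_MAX_TMDS)        # True: 148.5 MHz < 165 MHz

The caveat is that 4:2:0 signaling was only formally defined in HDMI 2.0, so both endpoints would have to agree to it outside the letter of the older spec; fitting on the wire is necessary but not sufficient.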
Using 12 bits per color would seem to give more precision within that axis, but doesn't say anything about the dynamic range covered. sRGB is only a subset of what human vision is capable of (compare the CIE XYZ color space).<p>Seems like the ideal display would:<p>1) match the dynamic range of the real world in luma and chroma
2) have resolution high enough that seeing individual pixels is very hard
3) offer enough precision/gradation within that dynamic range to avoid banding (rough arithmetic below)
4) support high framerates
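On point 3, a rough estimate of the bit depth needed (my assumptions, not settled numbers: a just-noticeable luminance step of ~1%, the classic Weber fraction, across an assumed 10,000:1 display contrast range):

    import math

    contrast = 10_000      # assumed display dynamic range (10,000:1)
    weber = 0.01           # ~1% steps are roughly the visibility threshold

    steps = math.log(contrast) / math.log(1 + weber)   # ~926 distinguishable levels
    print(math.ceil(math.log2(steps)))                 # ~10 bits, perceptually coded

So 10-12 bits per channel is plausible if codes are spaced perceptually (gamma/log); a linear-light encoding wastes codes at the bright end and starves the dark end, so it needs several bits more.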
People cannot see 4K resolution at normal couch viewing distances unless they have really huge displays.
<a href="http://s3.carltonbale.com/resolution_chart.html" rel="nofollow">http://s3.carltonbale.com/resolution_chart.html</a>
<a href="http://carltonbale.com/does-4k-resolution-matter/" rel="nofollow">http://carltonbale.com/does-4k-resolution-matter/</a>
Video formats can be weird.<p>I think there is (at least) one factual error in the blog post; the 16-235 range was not chosen because of CRTs: <a href="http://en.wikipedia.org/wiki/Rec._709#Digital_representation" rel="nofollow">http://en.wikipedia.org/wiki/Rec._709#Digital_representation</a><p>In that light, I'd also claim that the expansion to the full range of values is probably one of the smallest advantages of xvYCC. The wider gamut is significantly more important.<p>Also, if you notice when watching video files that blacks do not look very black, you should change your video player. As the aforementioned Wikipedia article says, the value 16 is intended to represent pure black in the Rec. 709 colorspace.
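For reference, the "studio swing" expansion a correct player performs is just a linear remap (a minimal sketch of the standard 8-bit formula; my code, not the article's):

    import numpy as np

    def limited_to_full(y):
        """Expand 16-235 limited-range 8-bit luma to 0-255 full range."""
        y = y.astype(np.float64)
        return np.clip((y - 16.0) * 255.0 / 219.0, 0, 255).round().astype(np.uint8)

A player that skips this step maps code 16 to a dark gray instead of black, which is exactly the washed-out look described.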
It's kind of interesting, but damn, this guy cannot communicate effectively. Was the point that more than 8 bits of colour over HDMI is a good idea? That modern displays can't show them (I don't think that's true)? Is his example image really suffering from 8-bit colour? It looks like it was exaggerated by making the quantizer steps really big. There's also no real discussion or example of what chroma subsampling looks like.
Very nice detailed write-up, thanks!<p>Chroma subsampling goes a long way to explain the issues I was seeing with the Seiki 4K TV at <a href="http://hardforum.com/showthread.php?p=1040546819#post1040546819" rel="nofollow">http://hardforum.com/showthread.php?p=1040546819#post1040546...</a>.
"That’s why we GameDevs prefer formats like PNG that don’t subsample."<p>Can anyone explain why we have this disconnect between video/monitor standards and bitmap/png standards? What would be the problem with doing 8/10/12 bit RGB front to back, or YCbCr front to back?
>All this video signal butchering does beg the question: Which is the better trade off? More color range per pixel, or more pixels with color channels?<p>I would tend to say higher resolution is more worthwhile. I obviously don't have a way to test this directly, but my guess is that 8k at 180 fps and 2-bit, using spatial and temporal dithering, will look better than 4k at 60 fps and 12-bit, even though those two signals would contain the same amount of information.<p>The reason is that the dithering allows the same color depth to be presented, but the discreteness is kept well beyond the boundaries of what your eye can pick up.<p>Obviously there are plenty of <i>technical</i> reasons that this isn't exactly practical.
Good thing to point out. Though, as zokier notes, I think it's less important than the accuracy and breadth of the color gamut. As for "the blacks don’t look very black," a lack of dynamic range is also somewhat to blame. Luckily, OLED displays are here to address both issues!<p>4K has little value at this point, but the increased resolution will be nice once the rest of the quality catches up.