A Pixel Is Not a Little Square (1995) [pdf]

30 points by justin_ about 2 months ago

9 comments

codeflo about 2 months ago
This classic article is wrong, BTW; there's no nicer way to put it. It applies the wrong theory. It was already wrong in 1995 when monitors were CRTs, and it's way wrong in 2025 in the LCD/OLED era, where pixels are truly discrete.

*Audio* samples are point samples (usually). This is nice, because there's a whole theory on how to upsample point samples without loss of information. But more importantly, this theory works because it matches how your playback hardware functions (for both analog and digital reasons that I won't go into).

Pixels, however, are actually displayed by the hardware as little physical rectangles. Take a magnifying glass and check. Treating them as points is a bad approximation that can only result in unnecessarily blurry images.

I have no idea why this article is quoted so often. Maybe "everybody is doing it wrong" is just a popular article genre. Maybe not everyone is familiar enough with sampling theory to know exactly *why* it works in audio (and to see why those reasons don't apply to graphics).
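
A minimal sketch of the point-sample claim above, in Julia (the same language as the snippet further down the thread). The sample rate, signal frequency, and window length are arbitrary illustrative choices, and the finite sum only approximates the ideal reconstruction:

    # Band-limited reconstruction from point samples (Whittaker-Shannon style).
    fs = 8.0                                    # hypothetical sample rate
    f  = 1.0                                    # signal frequency, well below fs/2
    n  = 0:31                                   # sample indices
    samples = sin.(2π * f .* n ./ fs)           # point samples of the signal

    # Evaluate the reconstructed signal at an arbitrary time t between samples.
    reconstruct(t) = sum(samples[k + 1] * sinc(fs * t - k) for k in n)

    t = 16.35 / fs                              # not on the sample grid
    println((reconstruct(t), sin(2π * f * t)))  # nearly equal; small truncation error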

mkl about 2 months ago
Never really convinced me; I've done lots of graphics stuff, and I find thinking about pixels as squares works fine. Under magnification, LCD pixels are usually square blocks of rectangular RGB segments (OLED and phone screens can have stranger geometries), and camera sensors are usually made of square(ish) pixel sensor blocks in a Bayer colour array pattern. They're not point sources or point samples; they emit or sense light over an area. Maybe I'm missing something.

Lots of past discussions:

https://news.ycombinator.com/item?id=35076487 74 points, 2 years ago, 69 comments
https://news.ycombinator.com/item?id=26950455 81 points, 4 years ago, 70 comments
https://news.ycombinator.com/item?id=20535984 143 points, 6 years ago, 79 comments
https://news.ycombinator.com/item?id=8614159 118 points, 10 years ago, 64 comments
https://news.ycombinator.com/item?id=1472175 46 points, 15 years ago, 20 comments

mordae about 2 months ago
There are pixels and pixels.

Screen pixels are (nowadays) usually three vertical rectangles that occupy a square spot on the grid that forms the screen. This is sometimes exploited for sub-pixel font smoothing purposes.

Digital photography pixels are reconstructed from sensors that each perceive a cone of incoming light in a certain frequency band, arranged in a Bayer grid.

Rendered 3D scene pixels are point samples, unless they approximate cones by sampling the neighborhood of the pixel center.

In any case, Nyquist will tear your head off and spit into your neck hole as soon as you come close to any kind of pixel. Square or point.
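
A tiny 1D illustration of the Nyquist warning, as a sketch (the frequencies and sample count are made up; the same effect shows up in 2D as moiré and jaggies):

    # Sampled at 8 samples per unit, a 7-cycle sine aliases onto a 1-cycle sine.
    fs = 8
    n  = 0:15
    s_low  = sin.(2π .* 1 .* n ./ fs)           # below Nyquist (fs/2 = 4)
    s_high = sin.(2π .* 7 .* n ./ fs)           # above Nyquist
    println(maximum(abs.(s_high .+ s_low)))     # ~1e-15: identical samples up to sign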

captainmuon about 2 months ago
Except pixels are little squares. Sure, if you look under a microscope, they have funny shapes, but they are always laid out in a rectangular grid. I've never seen any system where the logical pixels are staggered like a hex grid, for example. No matter how the actual light emitters are arranged, the abstraction offered to the programmer is a rectangular grid.

If you light up pixels in a row, you get a line - a long thin rectangle - and not a chain of blobs. If you light them up diagonally, you get a jagged line. For me that is proof that they are squares - at least close enough to squares. Heck, even on old displays that don't have a square pixel ratio, they are squished squares ;-). And you have to treat them like little squares if you want to understand antialiasing, or why you sometimes have to add (0.5, 0.5) to get sharp lines.

(And a counterpoint: the signal-theoretical view that they are point samples is useful if you want to understand the role of gamma in anti-aliasing, or if you want to do things like superresolution with RGB sub-pixels.)
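
The "add (0.5, 0.5)" point can be made concrete with a small coverage calculation; this is a sketch under the little-square model, with made-up helper names:

    # Coverage of a 1-pixel-wide vertical line over square pixels [i, i+1),
    # for two choices of the line's centre x-coordinate cx.
    coverage(i, cx) = max(0.0, min(i + 1, cx + 0.5) - max(i, cx - 0.5))

    # Centre on an integer coordinate: the line straddles two columns at 50% each,
    # so an antialiasing rasteriser draws a blurry two-pixel-wide line.
    println([coverage(i, 3.0) for i in 0:6])    # 0.5 in columns 2 and 3

    # Centre shifted by 0.5 onto a pixel centre: one column, fully covered, sharp.
    println([coverage(i, 3.5) for i in 0:6])    # 1.0 in column 3 only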

GrantMoyer about 2 months ago
People get caught up on display technology, but how pixels are displayed on a screen is irrelevant. From a typical viewing distance and with imperfect human lenses, a point impulse and a little square are barely distinguishable. The important part is that thinking about pixels as little squares instead of points makes all the math you do with them harder, for no benefit.

Consider the Direct3D rasterization rules [1], which offset each sample point by 0.5 on each axis to sample "at the pixel center". Why are the "pixel centers" even at half-integer coordinates in the first place? Because if you think of pixels as little squares, it's tempting to align the "corners" with integer coordinates, like graph paper. If instead the specifiers had thought of pixels as a lattice of sample points, it would have been natural to align the sample points with integer coordinates. "Little square" pixels resulted in an unneeded complication to sampling, an extra translation by a fractional distance, so now every use of the API for pixel-perfect rendering must apply the inverse transform.

[1]: https://learn.microsoft.com/en-us/windows/win32/direct3d11/d3d10-graphics-programming-guide-rasterizer-stage-rules
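
A sketch of the half-pixel bookkeeping described above (the helper names and render-target width are illustrative; this is not the Direct3D API itself, just the arithmetic it implies):

    width = 8                                   # hypothetical render-target width

    # Under the little-square convention, pixel i is sampled at its centre.
    sample_x(i) = i + 0.5

    # Clip-space x in [-1, 1] for that sample. Pixel-perfect rendering has to
    # bake the +0.5 into its transform, which is the extra fractional translation
    # the comment complains about; a point-sample convention with samples on the
    # integer lattice would map i directly.
    clip_x(i) = 2 * sample_x(i) / width - 1

    println([clip_x(i) for i in 0:width-1])     # -0.875, -0.625, ..., 0.875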

gitroom about 2 months ago
I think I've argued with friends over this exact thing. Like, once you zoom in, does it even matter what shape the pixel is, or is it just about how we use it? Do you think treating pixels as points or little squares actually changes decisions when making art or code?

gomijacogeo about 1 month ago
The paper would be a lot less infamous if the title had more accurately been "A Texel is Not a Little Square".

turtleyacht about 2 months ago
See also: *Pixel is a unit of length and area* - https://news.ycombinator.com/item?id=43769478 - 1 hour ago (11 points, 20 comments)

roflmaostc about 2 months ago
Mathematically speaking, the paper is correct.

I think it actually depends on what you define as a "pixel". Sure, the pixel on your screen emits light over a tiny square into space. And sure, a sensor pixel measures the intensity over a tiny square.

But let's say I calculate something like:

    # samples from 0, 0.1, ..., 1
    x = range(0, 1, 11)
    # evaluate the sin function at each point
    y = sin.(x)

Then each pixel (or entry in the array) is not a tiny square. It represents the value of sin at this specific location. A real pixelated detector would instead have integrated sin over each cell, `y[u] = int_{u}^{u + 0.1} sin(x) dx`, which is entirely different from the point-wise evaluation above.

So for me that's the main difference to understand.
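
A short continuation of the snippet above, comparing the point samples with the per-cell averages a box-integrating detector would report (the comment's integral divided by the cell width); purely illustrative:

    dx = 0.1
    x  = range(0, 1, 11)                        # 11 point-sample locations
    y_point = sin.(x)                           # value *at* each location

    # Analytic cell averages: (1/dx) * integral of sin over [u, u + dx].
    y_cell = [(cos(u) - cos(u + dx)) / dx for u in 0:dx:(1 - dx)]   # 10 cells

    # The point value at a cell's left edge and the cell average differ by O(dx).
    println(maximum(abs.(y_point[1:10] .- y_cell)))   # about 0.05 here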