I prefer Mac to Windows. This is a subjective thing: Mac and Windows are optimized for different use cases. Mac is optimized for WYSIWYG display for desktop publishing -- OS X displays fonts <i>exactly</i> as they'd appear on a page. Windows adjusts character positions to align them to the pixel grid. This results in clearer text, but at the expense of positions being slightly off from how they'd appear in print.<p>Ultimately, different strokes for different folks. While you're not wrong that ClearType is empirically better for most people for text content, it is objectively worse, by design, for those for whom positional precision matters.<p>> This occurs because Apple didn’t dare go near any ClearType patents that Microsoft got for their rendering techniques. For decades OS X remained a very ugly baby, until in 2015 they just gave us courage HiDPI in form of Retina. This was a bid to make all hinting technology obsolete and put everyone else to shame.<p>is just wrong. Apple intentionally chooses to display fonts the way it does. This is quite an extraordinary claim -- that a company otherwise known for design would not focus on its fonts.<p>With regard to FreeType's font scaling: you claim that FreeType doesn't scale fonts smoothly, but then you also claim Windows does it correctly. Just FYI, Windows most certainly does not scale fonts linearly. OS X does, of course, given its focus on publishing. Windows ClearType effectively alters glyph shapes and spacing in order to make text readable. If linearity of font scaling is a metric by which you measure font engines, then ClearType is a failure.<p>In case anyone doesn't believe the claim above, this article (<a href="https://damieng.com/blog/2007/06/13/font-rendering-philosophies-of-windows-and-mac-os-x" rel="nofollow">https://damieng.com/blog/2007/06/13/font-rendering-philosoph...</a>) goes into it in depth.
“What they didn’t account for is that an overwhelming majority of legacy software had 96 DPI hardcoded into it, as well as the problem of mixed use of HiDPI and non-HiDPI displays where you need to somehow scale the entire UI back-and-forth when moved between displays. Since Windows never had the font rendering problems that Apple did, majority of the computer market share didn’t back the ultra-expensive 4K screens even to this day (2018). Software maintainers too didn’t buy into the idea of massive code rewrites and UI tinkering to support DPIs other than 96. As of today, 4K experience is still a mixed bag that solves one simple problem for Apple, but introduces multiple complex problems for everyone else.”<p>Uhmmm.... [citation needed]?<p>I don’t have a single Mac app that assumes anything about 96 dpi and pretty much the entire Internet is 2x now.<p>Windows on the other hand... yes. That’s a clusterf<i></i>k in HiDPI.
> <i>OS X objectively has the absolute worst font rendering. It does not use hinting at all and relies only on grayscale anti-aliasing. This makes everything appear very bold and blurry.</i><p>This almost reads like satire.<p>It's well-known that Windows prefers to distort letterforms for the sake of crispness, while Macs preserve letterforms for the sake of fidelity.<p>Saying that one is better than the other is <i>entirely</i> subjective -- there are many, many articles on the subject. There's absolutely nothing objective about it.<p>If it were so "objective", then it seems quite odd that Macs would be the predominant choice of graphic designers and type designers, who would be expected to care about this the most...
As others have noted, this is a hilariously bad post, because in practice Windows (which is so enthusiastically praised here) is incredibly inconsistent in font rendering from program to program, and on most non-HiDPI screens it looks like trash. Especially when using scaled displays (such as 1.25x or 1.5x scaling on a 4K screen), where half of all software turns into a blurry, unreadable mess.
This guy has some pretty weird opinions. Not only do I much prefer macOS's accurate font rendering aesthetically, the entire issue of subpixel anti-aliasing is moot on high-DPI screens anyway.
> Some users even disable all ClearType rendering and anti-aliasing, claiming that it reduces eyestrain and that anti-aliasing damages eyesight. It’s kind of like anti-vaxxing (hello from 2019 if you are reading this in the future).<p>I hope the anti-vaxxing comparison is only for the "eyesight damage" part, otherwise this paragraph is bullshit in its entirety. From my own personal experience, I <i>vastly</i> prefer anti-aliasing disabled, <i>assuming the fonts used have hinting that supports it</i>. Or, even better, bitmap fonts made to be clean and crisp (as opposed to just being rasterized versions of vector fonts). Of course, if you try to use fonts that disregard hinting and aren't compatible (or even tested at all) with anti-aliasing disabled -- like the font the author's site uses, which looks like total crap on my PC -- then, yeah, compared to that I prefer AA with ClearType.<p>I find it annoying that you need to mess with the registry to disable anti-aliasing in Windows 10, but at least the option is still there.
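<p>For reference, the tweak lives under the per-user desktop settings. A minimal .reg sketch -- key and value names are how I remember them, so double-check before importing; log out and back in for it to take effect:

    Windows Registry Editor Version 5.00

    ; "FontSmoothing" = "0" turns font anti-aliasing off entirely;
    ; setting it back to "2" restores the default smoothed rendering.
    [HKEY_CURRENT_USER\Control Panel\Desktop]
    "FontSmoothing"="0"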
I'm surprised there was hardly any mention of pixel-grid fitting, or even just accuracy. My favorite article on font rendering is one from Anti-Grain Geometry[1] that goes a fair bit further in improving subpixel rendering, imo.<p>Nonetheless, the author's use of the word objective is very annoying, since I greatly prefer grayscale with no hinting, even on Linux at 96 DPI. It's a bit blurry, but it consistently looks right. No kerning issues.<p>Also, it is laughable to suggest that Windows never had issues with font rendering. Just look at Microsoft Word struggling to balance accurately laying out documents against rendering the fonts with ClearType, as shown in the AGG article.<p>[1]: <a href="http://www.antigrain.com/research/font_rasterization/" rel="nofollow">http://www.antigrain.com/research/font_rasterization/</a>
<i>> The traditional way of [installing the Windows core Web fonts] is through installing ttf-mscorefonts-installer or msttcorefonts. The msttcorefonts package looks like some Shenzhen basement knockoff that renders poorly and doesn’t support Unicode. I suspect that these fonts have gone through multiple iterations in every Windows release and that the versions available through the repositories must be from around Windows 95 days.</i><p>That does seem to be the case. Per Wiki (<a href="https://en.wikipedia.org/wiki/Core_fonts_for_the_Web" rel="nofollow">https://en.wikipedia.org/wiki/Core_fonts_for_the_Web</a>):<p><i>> The latest font-versions that were available from Microsoft's Core fonts for the Web project were 2.x (e.g. 2.82 for Arial, Times New Roman and Courier New for MS Windows), published in 2000. Later versions (such as version 3 or version 5 with many new characters) were not available from this project. A Microsoft spokesman declared in 2002 that members of the open source community "will have to find different sources for updated fonts."</i><p>So while Windows and MacOS both have up-to-date versions of these fonts (Windows because Microsoft owns them, and MacOS because Apple licenses them from MS), the best Linux distributors can do is to package the last versions released before the 2002 re-licensing. (Or at least, that's the best they can do without paying Microsoft.)
There’s not enough resolution at 96 DPI to render fonts correctly; you have to make compromises one way or another. Windows snaps to pixel boundaries more strongly, while Mac does no hinting in order to keep shapes correct.<p>The author, even if his opinions should be taken lightly, is trying to get subpixel positioning into Chrome and GTK. That is something positive, and something Windows and macOS already have. Qt can do it too, but as of a few years ago KDE/Plasma was actively disabling it because of inconsistencies between the Xlib and image rendering backends.
s/font rendering/wifi/<p>s/font rendering/bluetooth/<p>s/font rendering/sleep mode/<p>s/font rendering/battery life/<p>I've been using Linux for 15 years now (started on Mandrake), but those issues haven't been fixed in a decade.<p>I really want to support free software, so I keep using it, reporting bugs, and donating, again and again. Many do. We hope that, as with Firefox, if we support it consistently it will eventually overcome the situation.<p>But damn, it's hard sometimes.
I used Windows and Linux for a very, very, very long time and never even thought, "hey, those fonts don't look quite as good as they could". Then one day I tried a Mac. Period.<p>Oh boy, ever since, I just <i>can't</i> use anything else, because of the fonts. Everything else visually hurts my eyes.
<a href="https://www.freetype.org/freetype2/docs/text-rendering-general.html" rel="nofollow">https://www.freetype.org/freetype2/docs/text-rendering-gener...</a><p>> It turns out that font rendering in the Linux ecosystem has been wrong since scalable fonts were introduced to it. Text must be rendered with linear alpha blending and gamma correction, which no toolkit or rendering library does by default on X11, even though Qt5 and Skia (as used by Google Chrome and other browsers) can do it.<p>The primary effect of no gamma correction is that light-on-dark text is too thin (and dark-on-light is thick but readable).
I used to care a great deal about font rendering, but when I got eyeglasses I had trouble learning to focus with them. To make it easier, I disabled anti-aliasing, giving the fonts maximum sharpness, and making it obvious if I was focusing correctly.<p>Surprisingly, I got used to this and never switched back. The most legible font is the one you're most used to, but if there's any objective measure of quality, it has to be sharpness. No anti-aliasing = maximum sharpness. It might look ugly at first, but you get used to it quickly. I recommend trying it.
My subjective take on font rendering, based on tweaking and hinting web fonts, is that macOS has the best font rendering (their laptops have been using Retina displays for years, with the exception of the MacBook Air; even there the rendering may look a little blurry, but the fonts look quite good and are readable); Linux has quite good font rendering (it may make letters look bigger on low-resolution displays, but things are nice and readable); and Windows font rendering is very uneven.<p>Most browsers have their own take on ClearType when rendering web fonts. While some make web fonts look quite good (Firefox, Internet Explorer/Edge), Chrome has had issues with settings that make fonts harder to read; I had to increase the weight of the font I use somewhat to compensate. ClearType, on the default settings Chrome used for a long time, is really great if you're rendering a Windows font like Calibri or Cambria. For anything else, the results are uneven. (I think Chrome finally started tweaking things on Windows to look better.)<p>In terms of the linked webpage, his comparison is unfair: he is comparing how Arial, a Microsoft font, looks in Linux to how it looks in Windows. Liberation Sans only shares Arial's metrics, not its glyph shapes, so it is not a good comparison font; he should have used something more OS-agnostic, such as Bitstream Vera Sans (or DejaVu Sans if you want more languages).
<i>OS X objectively has the absolute worst font rendering. It does not use hinting at all and relies only on grayscale anti-aliasing.</i><p>And yet OS X's font rendering being by far the best <i>for me</i> is a key thing that has kept me on the Mac despite all the other warts. I find text everywhere else horribly blocky. It's all very subjective, it seems. Also, I think the grayscale anti-aliasing is a new thing; it certainly didn't use to be the case, but maybe they switched how it worked once Retina screens became the norm.
For everyone criticizing his views on Mac OS X: try using an external display with a lower PPI. The issues aren't obvious on native laptop screens at 220+ PPI.<p>On my Dell U3415W (109 PPI), the issues he pointed out are very obvious. An equals sign (=), for example, has a much thicker and blurrier bottom bar than top bar. The H in "History" renders differently from the H in "Help" in the menu bar.
Weird. I just opened a PDF on my Linux PC and everything displayed perfectly. I haven't done anything with fonts; everything is a default Fedora install.<p>Linux seems to have a lot of detractors around these parts. This person obviously cares more about fonts than about freedom, privacy, and respect for the end user.
I'm probably less picky than the author of the article, but the issues I do notice are (in Linux -- since it's all I use, I don't know whether they exist on other OSes too):<p>- some websites render with a very strange font in which quote symbols (") are tiny -- I'm not sure whether that's actually intended or a Linux-specific problem -- and sometimes all the letters render in such an ugly way as to be hard to read<p>- symbols in mathematical formulas (some types of arrows, ...) render as colored emoji, even though according to the Unicode spec they shouldn't in that situation, since it's not a chat program<p>- Unicode characters sometimes become a square box, even with tons of fonts installed
I recently moved to NixOS with Sway for my desktop, and after trying Firefox on Wayland there, I realized I had never seen text on Linux look so sharp.<p>I read the whole post but could not tell whether what he mentions applies to both Xorg and Wayland.
Really interesting read on how fonts render on different systems. However, a huge reason why I went from Windows to macOS was the incredibly frustrating scaling issues. At work I gave up and just set my laptop display to 100%, since I'd rather deal with occasionally tiny text than with constantly blurry fonts in certain crucial apps like Outlook.<p>I do agree that macOS fonts are nearly unreadable without a Retina/4K display. However, I've never noticed any scaling issues on macOS.
There is a way to restore Infinality-like font rendering on freetype2, which improves text readability. Linux users may find these instructions helpful:<p><a href="https://gist.github.com/cryzed/e002e7057435f02cc7894b9e748c5671" rel="nofollow">https://gist.github.com/cryzed/e002e7057435f02cc7894b9e748c5...</a><p>Skip the "Removing the infinality-bundle" section if you don't currently use Infinality.
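<p>Also worth noting: FreeType 2.8+ exposes its properties through an environment variable, so -- assuming your distribution built FreeType with the v38 ("Infinality") TrueType interpreter enabled -- you may get most of the way there without patched packages:

    export FREETYPE_PROPERTIES="truetype:interpreter-version=38"

Add that to your shell profile or session environment and restart the application to see the difference.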
A related article from AGG talking about some of the subtleties of font rasterization: <a href="http://www.antigrain.com/research/font_rasterization/" rel="nofollow">http://www.antigrain.com/research/font_rasterization/</a>
I quite like the fonts on Linux with my 4K monitors, subpixel smoothing, and slight hinting. Mac and Windows are also fine on a ~200 DPI monitor. I believe this is a solved problem for those who don't obsess over fonts.
The bad part is that you need to do a s/objective/subjective/g on it. Once you realize it's subjective, it's a fine, fun, informative <i>rant</i>.
I don't know... turn on subpixel rendering with slight hinting, disable the autohinter, enable the LCD filter, and the result is pretty damn good. For monospace TrueType fonts I use grayscale instead of subpixel, which I think is cleaner in the terminal... unless, of course, I'm using a bitmapped font. It took me quite a while to get it looking good, but ultimately I guess it's a matter of taste. Do you want to preserve the look of the font? Do you want to smash it into the grid? Do you want colors on the fringes? With Linux I can choose -- a sketch of my setup is below.
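<p>In case it helps anyone, here's a fontconfig sketch of roughly that setup (my reading of the settings above, not a canonical config -- save it as ~/.config/fontconfig/fonts.conf and adjust rgba to match your panel's subpixel layout):

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <match target="font">
        <!-- subpixel rendering with slight hinting, autohinter off -->
        <edit name="antialias" mode="assign"><bool>true</bool></edit>
        <edit name="rgba" mode="assign"><const>rgb</const></edit>
        <edit name="hinting" mode="assign"><bool>true</bool></edit>
        <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
        <edit name="autohint" mode="assign"><bool>false</bool></edit>
        <edit name="lcdfilter" mode="assign"><const>lcddefault</const></edit>
      </match>
    </fontconfig>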
I came here to bitch about the claim that Mac has "objectively" the worst fonts, when I've preferred it over Windows and Linux for years. Glad to see everybody has my back.
Funny how you singled out Linux, a free/open-source OS.<p>Because just a few days ago I read that fonts, as well as UI elements in general, are a complete mess on both Mac and Windows, two paid/closed-source OSes, when you try to use them with HiDPI displays.