A few random comments:<p>• Obviously, this is typeset with TeX.<p>• Though Knuth originally created TeX for books rather than single-page articles, it's the tool he knows best, so it's unsurprising that he'd use it to just type something out. (I remember reading somewhere that Joel Spolsky, who was a PM on Excel, used Excel for everything.)<p>• To create the PDF, where most modern TeX users might just use pdftex, he seems to have first created a DVI file with tex (see the PDF's title “huang.dvi”), then gone via dvips (version 5.98, from 2009) to convert it to PostScript, then (perhaps on another computer?) used “Acrobat Distiller 19.0 (Macintosh)” to go from PS to PDF.<p>• If you find it different from the “typical” paper typeset with LaTeX, remember that Knuth doesn't use LaTeX; this is typeset in plain TeX. :-) Unlike LaTeX, which aims to be a “document preparation system” with “logical”/“structured” (“semantic”) markup rather than visual formatting, for Knuth TeX is just a tool: typically he works with pencil and paper and uses a computer/TeX only for the final typesetting, where all he needs is to control the formatting. (There's a tiny illustration of the difference at the end of this comment.)<p>• Despite being typeset with TeX, which is supposed to produce beautiful results, the document may look very poor on your computer screen (at least it did when I first viewed it on a Linux desktop; on a Mac laptop with a Retina display it looks much better, though somewhat “light”). But if you zoom in quite a bit, or print it, it looks great. The reason is that Knuth uses bitmap (raster) fonts, not vector fonts like the rest of the world. Once bitten by “advances” in font technology (his original motivation for creating TeX & METAFONT), he now prefers to use bitmap fonts and completely specify the appearance (when printed or viewed on a sufficiently high-resolution device, anyway), rather than vector fonts, where the precise rasterization is up to the PDF viewer.<p>• An extension of the same point: everything in his workflow is optimized for print, not onscreen rendering. For instance, the PDF title is left as “huang.dvi” (no one sees the title metadata in print), the characters are not copyable, etc. (All of these problems are fixable with TeX these days too.)<p>• Note what Knuth has done here: he's taken a published paper, understood it well, thought hard about it, and come up with (what he feels is) the “best” way to present this result. This has been his primary activity all his life, with <i>The Art of Computer Programming</i>, etc. Every page of TAOCP is full of results from the research literature that Knuth has often understood better than even the original authors, presented in a great and uniform style; those who say TAOCP is hard to read or boring(!) just have to compare it against the original papers to appreciate Knuth's achievement. He has basically “digested” the entire literature, passed it through his personal interestingness filter, and presented it in an engaging style, with enthusiasm to explain and share.<p>> when Knuth won the Kyoto Prize after TAOCP Volume 3, there was a faculty reception at Stanford. McCarthy congratulated Knuth and said, "You must have read 500 papers before writing it." Knuth answered, "Actually, it was 5,000." 
Ever since, I look at TAOCP and consider that each page is the witty and insightful synthesis of ten scholarly papers, with added Knuth insights and inventions.<p>(<a href="https://blog.computationalcomplexity.org/2011/10/john-mccarthy-1927-2011.html?showComment=1319546990817#c6154784930906980717" rel="nofollow">https://blog.computationalcomplexity.org/2011/10/john-mccart...</a>)<p>• I remember a lunchtime conversation with some colleagues at work a few years ago, where the topic of the Turing Award came up. Someone mentioned that Knuth won the Turing Award for writing (3 volumes of) TAOCP, and the other person did not find it plausible and said something like “The Turing Award is not given for writing textbooks; it's given for doing important research...”. But in fact Knuth did receive the award for writing TAOCP; writing up and summarizing other people's work is his way of doing research, advancing the field by unifying many disparate ideas and extending them. When he invented the Knuth-Morris-Pratt algorithm, in his mind he was “merely” applying Cook's theorem on automata to a special case; when he invented LR parsing, he was “merely” summarizing various approaches he had collected for writing his book on compilers; and so on. Even his recent volumes/fascicles of TAOCP are breaking new ground (e.g., simply by trying to write about Dancing Links as well as he can, he has come up with new applications like min-cost exact covers, etc.).<p>Sorry for the long comment, got carried away :-)
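<p>P.S. To make the “visual” vs. “semantic” markup point above a bit more concrete, here's a tiny made-up illustration (two separate fragments, obviously; this is not Knuth's actual source, just the flavour of the two styles, and the LaTeX side assumes a \newtheorem{theorem}{Theorem} declaration in the preamble):<p><pre><code>  % plain TeX: say directly what it should look like
  \centerline{\bf A sensitivity theorem}
  \medskip
  {\it Theorem.} Every nontrivial induced subgraph ...

  % LaTeX: say what it is; the document class decides how it looks
  \section*{A sensitivity theorem}
  \begin{theorem}
    Every nontrivial induced subgraph ...
  \end{theorem}
</code></pre><p>In plain TeX you reach for fonts, boxes, and glue directly; in LaTeX you're encouraged to tag the structure and let the class/style files do the visual work. Knuth is perfectly happy with the former, since he already knows exactly how he wants the page to look.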