I read it a long time ago, around 1990. I remember people talking about it a lot in the early 80s.

It's a seminal early text, but CS is a fast-changing field, so I couldn't recommend it as a primary text for somebody who wants to learn CS today; it is, however, definitely fun to flip through.

His stuff on "Searching and Sorting", for instance, just isn't relevant in an era when your language has a built-in sort() function and hashtables. Also, in some of the areas I know about in depth, such as random number generation, Knuth's book is dangerously behind the times.
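To make the "built-in sort() and hashtables" point concrete, here is a minimal sketch assuming Python as the host language; the names and data are illustrative, not taken from the comment:

```python
# The primitives Knuth analyzes at length are library one-liners today.
names = ["knuth", "dijkstra", "hoare", "lamport"]
print(sorted(names))                    # built-in sort (Timsort in CPython)

counts = {}                             # a dict, i.e. a hash table
for name in names:
    counts[name] = counts.get(name, 0) + 1
print(counts)
```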
"It's a pleasure to meet you, Professor Knuth, I've read all of your books." (Steve Jobs)<p>"You're full of shit," (Don Knuth) [1]<p>I think that no one in the world, except Knuth, has read the entire TAOCP (joke.)<p>[1] <a href="http://www.folklore.org/StoryView.py?story=Close_Encounters_of_the_Steve_Kind.txt" rel="nofollow">http://www.folklore.org/StoryView.py?story=Close_Encounters_...</a>
I've "read" most of the first three volumes if it counts to read the text and skip most of the math. I always figured it was enough to "get" the algorithm, and go back to the math if I ever needed it. So far, years later, I've never needed to go back to the math, so I'm glad I didn't spend much time on it.<p>No doubt the math is useful in a theoretical way, and I'm pretty sure I could follow through most of it if I really needed to, but as far as proving algorithms and asymptotic run times, I've yet to need more than a general understanding of big O notation in my real work.<p>How long it takes is entirely up to you. If you try to understand every detail of every proof and theorem it will probably take a very long time.<p>I'm not sure it's worth reading all the way through, to be honest. I guess I'd suggest the first volume for a solid, math heavy intro to algorithm analysis and design. But after that, there's not much point unless you're going to read it for fun, and in that case, why ask about it on HN? The reality is that few people will ever need to know that much detail about any of the algorithms covered because almost all of them are library functions now.
I've read portions of volumes 1, 3, and 4A, so I haven't finished the books.
The thing about these books is that while you can _read_ them like any other book, reference or otherwise, what sets them apart are the questions at the end, which have always been very thought-provoking.

4A was the most useful to me: combinatorial algorithms, talking about generating trees, permutations, and so on. Reading it really cleared up some concepts that relate to sorting and searching. It's almost like someone condensed all known information about those algorithms (up to the last 2-3 years) into one dense manual. (oh wait..)

There's no better time to read them than right now, IMHO. And I don't think the right way to read them is volume by volume. What I've done is to go through the table of contents, pick something that looks interesting, and chew on it for a bit. But to each their own.
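For a flavor of the permutation-generation material mentioned above, here is a sketch in the spirit of the lexicographic permutation generation that Volume 4A covers; the code and names are mine, not Knuth's presentation:

```python
def lex_permutations(items):
    """Yield all permutations of items in lexicographic order (a sketch,
    not Knuth's text): find the rightmost ascent, swap, reverse the tail."""
    a = sorted(items)
    n = len(a)
    while True:
        yield tuple(a)
        # Find the rightmost position j with a[j] < a[j+1].
        j = n - 2
        while j >= 0 and a[j] >= a[j + 1]:
            j -= 1
        if j < 0:
            return  # sequence is in descending order: last permutation
        # Find the rightmost element larger than a[j], swap, reverse the suffix.
        l = n - 1
        while a[l] <= a[j]:
            l -= 1
        a[j], a[l] = a[l], a[j]
        a[j + 1:] = reversed(a[j + 1:])

print(list(lex_permutations("abc")))
# ('a','b','c'), ('a','c','b'), ('b','a','c'), ('b','c','a'), ('c','a','b'), ('c','b','a')
```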
I haven't read _TAoCP_. I am reading them. I have been for years. Every time I open one up, I learn something. I am also always confused. That's what I love about the work. It reminds me that this stuff is hard - so hard that Knuth hasn't finished learning the subject even after 50 years spent writing a 12-chapter book about compilers.

What makes them great books is Knuth's ability to write at multiple levels of sophistication. As I learn more maths, new portions of each section become accessible. As I learn more about data structures and algorithms, a similar thing happens.

I haven't been in a rush to 'finish' them. Neither has Knuth. I read them because I enjoy reading them.
I've spent some time implementing my own multi-precision integer library, and I've used TAoCP as a reference for that, but not otherwise.

From this (admittedly limited) data point, I'd say it's a useful reference if you're working on implementing some of the tricky algorithms contained therein, but you'll need to work hard for it. The code is all assembler (MIX, a dialect Knuth invented for the book; I think it predates structured programming), so you'll need the proofs to even understand what's going on. I'm not sure how much benefit you'd derive from just reading it through without digging into the problems, but I haven't done that.
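To illustrate the kind of algorithm involved, here is a minimal Python sketch of classical digit-by-digit multi-precision addition with carry, the sort of routine Volume 2 analyzes (in MIX there); the representation and names are my own, not the commenter's library or Knuth's code:

```python
def add_multiprecision(u, v, base=2**32):
    """Add two nonnegative multi-precision integers stored as lists of
    digits in the given base, least-significant digit first."""
    result, carry = [], 0
    for i in range(max(len(u), len(v))):
        s = carry
        if i < len(u):
            s += u[i]
        if i < len(v):
            s += v[i]
        result.append(s % base)   # digit of the sum
        carry = s // base         # carry into the next position
    if carry:
        result.append(carry)
    return result

# 0xFFFFFFFF + 1 in base 2**32 -> [0, 1], i.e. 2**32
print(add_multiprecision([0xFFFFFFFF], [1]))
```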
I have read sections, more in super-reference mode. For example, there was a period when I was interested in random number generation, so that section was one. Another time, we had to implement floating point (in the pre-80x87 days), and that section was absolutely essential.

Incidentally, I have two copies; one set is the original edition. That set has something that has been deleted from later editions: the fold-out page for tape merging. It illustrates such things as _Read-backward polyphase merge_, _Read-backward oscillating sort_, and _Read-forward polyphase merge_. I guess we don't sort on tape much anymore.
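The random number generation section mentioned above is largely about linear congruential generators and how to judge them. Here is a minimal sketch, assuming the well-known multiplier and increment attributed to Knuth's MMIX generator; Volume 2 is mostly about why such constants are good or bad (period length, the spectral test), not about this loop itself:

```python
def lcg(seed, a=6364136223846793005, c=1442695040888963407, m=2**64):
    """A linear congruential generator: x_{n+1} = (a * x_n + c) mod m.
    Constants are the MMIX values attributed to Knuth, used here only
    as a plausible illustration."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(2024)
print([next(gen) % 100 for _ in range(5)])  # five pseudo-random values in [0, 100)
```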
I've been reading a little bit of it every day. The quality of the writing is top notch, so I can actually read it just for fun. I'm still on the first volume, and plan to be for a while. I definitely get lost in the mathier parts of it, but for the most part I can follow along, and I'd consider myself on the advanced side of intermediate.
I've read some parts of it very carefully, mostly the combinatorial algorithms, because I had a program to write or an algorithm to study. I've also done exercises here and there when they caught my interest (obBrag: and once got a $2.56 check out of it). I'm also using TAOCP as a bedside book, because Knuth is fun and relaxing to read.