My parents bought me this book when I was in elementary school. I was fascinated with it; I read it and reread it until I understood everything in it. It took me something like eight months, but when I was done I had an instinctive grasp of computers and mathematics that many of my college classmates still don't have. Highly recommended.
BTW, if you want to buy the Kindle version, buy it from O'Reilly instead of Amazon: five formats and no DRM: <a href="http://shop.oreilly.com/product/9780735611313.do" rel="nofollow">http://shop.oreilly.com/product/9780735611313.do</a> (though it's $1.83 more).
So, I picked up this book on a whim from Half Price Books about six months ago. I've since loaned it out to a mechanical engineering friend of mine, and he's finished it; I'll be loaning it out again soon--it's one of those books I consider essential for spreading the good word of CS/EE.<p>ACTUAL REVIEW/SPOILER CONTENT:<p>So, the book does start pretty slowly. It considers the case of signaling with flashlights. This motivates semaphores (not the concurrency type!), which in turn motivate telegraphs. The telegraph repeater and its automation are used to motivate logic gates. Around here (I forget exactly whether it's before or after) the author diverges for a chapter or two into Boolean logic and counting mechanisms. Then the idea of storage is brought in, then the idea of state machines. Calculation is brought in, and soon the author has a simple little ALU. The author talks about wiring up an interface for this, and then there is talk about interrupts, operating systems, and real embodiments of this sort of hardware--6502 or 8086 assembler is introduced and discussed.<p>The writing is geared such that someone in high school shouldn't have trouble understanding anything, and there is enough history thrown in with the light style that it isn't a chore to get through.<p>WHY THIS BOOK IS NOT A WASTE OF TIME:<p>So, I've already taken the introductory computer engineering/CS classes during my time at university. I've already read a lot in high school and a lot after college on computers and their architecture. This is not a new subject to me. Why is this still a good read?<p>Folks, our professions (whether you are a mechanical engineer black-boxing gear trains, a software engineer black-boxing Java/Scala/JVM bytecode, or a Perl hacker black-boxing the very mouth of madness) are all rooted in abstraction. 
Everything we do, everything we touch, is a slice of a pyramid of (sometimes shaky/leaky/smelly) abstractions.<p>The great thing about Code isn't what it teaches you about programming (it doesn't cover much other than assembly, if that) or computer engineering (no help soldering or designing ring buses or whatnot) or even mathematics (boolean algebra is pretty straightforward in its presentation); instead, Code focuses on bringing us to a functioning microcomputer from a flashlight in our bedroom, without ever skipping a layer of abstraction.<p>Even if you already know each slice (in broader detail than presented), seeing the entire journey is at worst enjoyable and at best extremely educational.
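The gate-level build-up the review describes (logic gates → counting → a simple ALU) is easy to replay in software. Here's a minimal Python sketch — my own illustration, not from the book, which contains no program code — that builds a one-bit full adder, and then an 8-bit adder, out of nothing but a NAND primitive:

```python
# Everything below is derived from a single NAND gate, mirroring
# the book's gate-by-gate construction of arithmetic hardware.

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    """One-bit addition: sum = a XOR b XOR cin, with carry out."""
    partial = xor(a, b)
    total = xor(partial, carry_in)
    carry_out = or_(and_(a, b), and_(partial, carry_in))
    return total, carry_out

def add8(x, y):
    """Chain eight full adders into an 8-bit ripple-carry adder."""
    carry, result = 0, 0
    for i in range(8):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result  # wraps at 256, like real 8-bit hardware

print(add8(100, 55))   # 155
print(add8(200, 100))  # 44 (overflow wraps around)
```

Chaining simple slices like this into wider and wider units is exactly the "never skip a layer" progression the review praises.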
Another good book by Charles Petzold is <i>The Annotated Turing</i>.<p><a href="http://www.amazon.com/Annotated-Turing-Through-Historic-Computability/dp/0470229055/" rel="nofollow">http://www.amazon.com/Annotated-Turing-Through-Historic-Comp...</a>
After reading Code my hacking skills increased considerably. I am never comfortable relying on abstractions, so understanding more of the programming stacks I use increased my willingness to experiment with them. It also helped me better understand pointers and memory management, as memory became a real thing to me.<p>Also, I had fun building my own simple computers in logic-gate software.
"The Elements of Computing Systems" is very good in that respect too: <a href="http://www1.idc.ac.il/tecs/plan.html" rel="nofollow">http://www1.idc.ac.il/tecs/plan.html</a>
Sounds a lot like my Digital Circuits course in university. We started with a breadboard of transistors, and after we built each component we got that component as an IC to build the next level of abstraction (flip-flop → memory → adder → processor, and so on). We ended up with a fully programmable computer that we were tasked to program as a sensor-driven traffic light controller.<p>Seeing every step from the ground up really helps demystify the machines you see around yourself every day.
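The flip-flop step of that chain is the interesting one, because it's where feedback turns pure combinational gates into memory. A rough Python sketch (my own illustration, not from any course) of a gated D latch built from two cross-coupled NOR gates:

```python
# A gated D latch: two NOR gates feeding each other's inputs.
# The feedback loop is what lets the circuit "remember" one bit.

def nor(a, b):
    return 0 if (a or b) else 1

class DLatch:
    def __init__(self):
        self.q, self.q_bar = 0, 1  # stored bit and its complement

    def step(self, data, enable):
        # Derive set/reset from the data and enable lines.
        r = 1 if (enable and not data) else 0
        s = 1 if (enable and data) else 0
        # Let the cross-coupled NOR pair settle (two passes suffice).
        for _ in range(2):
            self.q = nor(r, self.q_bar)
            self.q_bar = nor(s, self.q)
        return self.q

latch = DLatch()
latch.step(data=1, enable=1)   # write a 1
latch.step(data=0, enable=0)   # enable low: input is ignored
print(latch.q)                 # still 1 -- the latch remembers
```

Stack enough of these behind an address decoder and you have the "memory" rung of the ladder the comment describes.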
Before reading this book, I had what I thought was a fairly solid grasp of computing. I was 19 (20 now), largely self-taught from spending nights on Google instead of going to parties, and if I didn't know the details of something, I knew enough about the concept to get by. I could program well enough in a few languages, talk about hardware, security, and protocols for hours...<p>I still had no idea how 10010011000110101001010 etc etc allowed me to play World of Warcraft. There was a small part of me that hoped beyond hope that computers were magic and there was some grand conspiracy going on to cover that fact.<p>Then I read CODE.<p>My hopes were shattered, but they were replaced by something much more treasured: Understanding.
CODE is good, but there's a weird disconnect between the very slow start (really, the stuff with light switches and relays goes on for much too long) and the suddenly much harder later sections.
I am a programmer - have been earning my bread by programming for over a decade, and before that had been studying CS for another 5 years with all those courses on Turing machines, random access machines, the von Neumann computation model, logic and so on. Why should I read this book? Is there anything in it that was not covered by this standard curriculum?
I graduated w/ a degree in Electrical & Computer Engineering with a minor in CompSci. From the review, this sounds like all the stuff I learned that wasn't CompSci, distilled into an easy-to-read form. Stuff like flip-flops, RISC assembly, how memory works, etc. It sounds fantastic.<p>Having the benefit of an EE-based undergrad degree was awesome for this reason, esp. since my first job was firmware design. The downside is that I never took a compiler course and have had very little exposure to stuff like discrete mathematics and Turing machines. I didn't learn anything about OO design 'til after school, and I still haven't gotten into functional programming.<p>Sometimes I worry that I actually missed out on the good stuff. :/
I would not propose this book as one that every programmer should read. But others have posted about books that started with fundamentals and illustrated how computing systems evolved, which brings this book to mind... Henry D'Angelo "Microcomputer Structures".<p>"Microcomputer Structures" is out-of-print, but I recall the small text starting with atomic physics and building up to an introduction of the Von Neumann architecture.<p>As a freshman at Boston University in the 1980's, I took this mind-expanding course (probably because it was way above my level) with Professor D'Angelo. The final project was building an interface to a single-board computer.
Has anyone ever read The Cartoon Guide to the Computer?
<a href="http://www.amazon.com/Cartoon-Guide-Computer-Larry-Gonick/dp/0062730975" rel="nofollow">http://www.amazon.com/Cartoon-Guide-Computer-Larry-Gonick/dp...</a><p>I have given this book to 2 people who told me they wanted to know more about computers. It's very easy to read and does an excellent job of explaining basic computer concepts. Probably not too interesting for this crowd. But it's an excellent introduction to computing.
alright, meh, not convinced. we're all busy people and my bedside book stack is 12 high.<p>compare books like CODE, SICP, and related books discussed below, to books like Beautiful Code, Higher Order Perl, Mythical Man Month, Seven Languages in Seven Weeks, tons of whitepapers.<p>do I really need to read CODE? Seems like a bottom-of-stack book that will never quite make it to the top.
It sounds like the book does a lot to flesh out software engineering as a legitimate form of engineering. Programming is how we apply and manipulate electrical circuits, which arise from the natural laws of electromagnetism.
I picked up the basics of programming and how computers work from a book called "The Beginner's Computer Handbook - Understanding & Programming The Micro". Actually, this book was the reason I became a programmer. I was 8 years old when I received it as a birthday present from my dad.<p>I could not find a link to the book, but I found this blog post, which also has some pictures:<p><a href="http://www.currybet.net/cbet_blog/2004/06/the-beginners-computer-handboo.php" rel="nofollow">http://www.currybet.net/cbet_blog/2004/06/the-beginners-comp...</a>
At my university, this kind of stuff is covered in a class called Digital Logic, which is a precursor to Computer Architecture. I have a pretty good understanding of how all the stuff mentioned in the blog post works.<p>edit: It may actually be more of an electrical engineering class, but the CS program here is ABET-certified, so it's essentially an engineering degree itself; either way, this is the kind of thing that any CS graduate should be intimately familiar with.
Another book that takes this "soup to nuts" approach to the stack of computer abstractions is Zalewski's "Silence on the Wire". It's mostly focused on computer security, but it does an excellent job of peeling back software abstractions and showing how innocent decisions like blinkenlights and link-following web spiders can be exploited to compromise security. My favorite computer security book, easily.
Petzold gave a great keynote at CodeStock 2011 on analog computing, but it seems no one recorded it.<p>Here's a link to an interview of him that briefly covers some of the topics in his keynote. <a href="http://technologyandfriends.com/archive/2011/06/13/tf160.aspx" rel="nofollow">http://technologyandfriends.com/archive/2011/06/13/tf160.asp...</a>
Do CS programs not teach this? Pretty early in getting my degree I was taught about gates, timing signals, flip-flops, De Morgan's laws, etc. By the time of the Systems Architecture class it was assumed this had been covered.<p>While I agree that programmers should understand the fundamentals of how a CPU et al. works, is this book the best one?
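For anyone who did skip that class: De Morgan's laws are small enough to verify exhaustively. A quick, illustrative Python check (not from the book or any course):

```python
# De Morgan's laws, checked over every combination of boolean inputs.
# These identities are why a NAND-only (or NOR-only) circuit can
# express any logic function.
from itertools import product

for a, b in product([False, True], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))  # NOT(a AND b)
    assert (not (a or b)) == ((not a) and (not b))  # NOT(a OR b)

print("De Morgan's laws hold for all inputs")
```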
Code seems to cover similar material to MIT's 6.004: <a href="http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-004-computation-structures-spring-2009/index.htm" rel="nofollow">http://ocw.mit.edu/courses/electrical-engineering-and-comput...</a>
A good layman's guide for computer science is Danny "Connection Machine" Hillis' "<i>The Pattern On The Stone: The Simple Ideas That Make Computers Work</i>". He builds up from Boolean logic to Turing machines to quantum computing in fewer than 200 pages.
The book that influenced how I think about programming and eventually made me a better programmer is The Pragmatic Programmer.<p><a href="http://pragprog.com/the-pragmatic-programmer" rel="nofollow">http://pragprog.com/the-pragmatic-programmer</a>
While this book sounds like it would cover a lot of what someone might have missed in a CS program if they were not very inquisitive, I picked that stuff up, and continue to do so as I forget it by going back and looking things up.<p>My vote for a must-read is: "Zen and the Art of Motorcycle Maintenance" - Pirsig