One of my favorite experiences is learning that something I thought was true isn't. All the better when other people think it's true too. What are your favorite misconceptions about computers?

Some that come to mind for me are:

(1) The idea that relational databases are named "relational" because they allow you to represent relationships between tables. Actually, "relational" refers to the fact that they are modeled as relations (aka tables). A relational database would be just as relational if it were only capable of storing a single table.

(2) The idea that HTTP "packets" are the same sort of thing as TCP packets. I used to imagine HTTP headers being attached to the beginning of every TCP packet when communicating over HTTP. Of course TCP doesn't present a "packet" interface at all. It's just a stream of bytes, so an HTTP message is just a division within that stream, delimited by framing information such as the Content-Length header (or chunked transfer encoding). There's a sketch of this after the list.

(3) The idea that the C in ACID and the C in CAP refer to the same concept of consistency. Actually, ACID consistency is a single-node property, whereas CAP consistency is a multi-node property. One is about maintaining uniqueness and foreign key constraints, whereas the other is about making a multi-node system appear to behave like a single-node system.
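To make (2) concrete, here's a minimal Python sketch (hypothetical host, no error handling, and it assumes the server frames the body with Content-Length rather than chunked encoding) that reads one HTTP/1.1 response straight off a TCP socket:

```python
import socket

HOST = "example.com"  # hypothetical host, purely for illustration

sock = socket.create_connection((HOST, 80))
sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")

# TCP gives us an undifferentiated byte stream: recv() chunk boundaries
# have nothing to do with HTTP message boundaries.
buf = b""
while b"\r\n\r\n" not in buf:
    chunk = sock.recv(4096)
    if not chunk:
        break
    buf += chunk

header_blob, _, body = buf.partition(b"\r\n\r\n")
headers = {}
for line in header_blob.decode("latin-1").split("\r\n")[1:]:
    name, _, value = line.partition(":")
    headers[name.strip().lower()] = value.strip()

# The Content-Length header, not any packet structure, says where this
# message ends within the stream. (Assumes Content-Length framing.)
length = int(headers["content-length"])
while len(body) < length:
    body += sock.recv(4096)
sock.close()

print(f"read a {length}-byte body out of the stream")
```

Real code would obviously use an HTTP library, which also handles chunked transfer encoding, keep-alive, and so on; the point is just that the message boundary is something you parse out of the bytes yourself.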
Heyyyyy did someone call tech support? I have some common misconceptions about computers from people who don't work with computers :P
Biggest one is just not knowing names for stuff. I spend way too much time describing icons because there's not really anyone teaching adults this stuff.

Another one is having disparate expectations for what a computer can/can't do. We can do sooooo much on our phones, but then a Chromebook or a 5-year-old laptop will still run all slow and not work with lots of games, and there's a big disconnect for people with that. I think part of it is because we grew up with home phones and then gradually improving cell phones, but the computers were always significantly more powerful. Now that a very low-end computer and a mid-to-high-end phone appear much closer in function to the average user, I've seen a lot of people who are very frustrated that their mini laptops aren't better. (And to be fair, when I was in college for elementary education I did the same thing and bought a mini laptop to save money.)
That AI is about human intelligence

That ML includes an analogue to human learning

That the Turing test is real and that roboticists will apply Asimov's laws of robotics
Seymour Cray's million chickens quip has been hugely misunderstood. Now that we're all multicore, a lot of the big-iron koans don't make sense any more.

I'm not sure a modern RISC instruction set is as simple as we thought it would be. Some CISC complexity might have turned out to be useful?

People seriously misapprehend how many discrete computers exist inside their computer. The keyboard has one, and the CPU often has two or three adjunct processors to bootstrap it and implement trusted zones.

People misunderstand what is actually signed by RSA: it is very rarely the entirety of the message. RSA signs a fixed-size hash (digest) of the message; similarly, on the encryption side, RSA typically wraps an ephemeral symmetric key, which in turn encrypts the actual data.
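A minimal sketch of that last point using Python's cryptography package (key size and padding choices are just for illustration): no matter how large the message, the RSA private-key operation only ever sees the fixed-size SHA-256 digest.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa, utils

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

message = b"x" * 10_000_000  # 10 MB of data; never touches RSA directly

# Hash the message down to a fixed-size digest first...
hasher = hashes.Hash(hashes.SHA256())
hasher.update(message)
digest = hasher.finalize()  # 32 bytes

# ...then the RSA private-key operation is applied only to that digest.
signature = private_key.sign(
    digest,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    utils.Prehashed(hashes.SHA256()),
)

# Verification recomputes the digest and checks it against the signature;
# verify() raises InvalidSignature on failure.
private_key.public_key().verify(
    signature,
    digest,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    utils.Prehashed(hashes.SHA256()),
)
print("signature verified")
```

The non-prehashed sign()/verify() calls do the hashing for you internally; the prehashed form just makes visible that only the digest goes through RSA.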