I like the first part: entropy is essentially a measure of the degeneracy of a state (the elementary definition is the log of the number of microstates, multiplied by a unitful constant). Technically, to a supposed super-intelligence that is good at remembering detail, every state of a system can be distinguishable and thus have small entropy. For example, consider a finite number of legos in a room. A computer could potentially remember where every lego is placed; a person can't do as well, but can distinguish a state where the legos are strewn about from one where they're built into a castle. So the person lumps all the disordered configurations into a single macrostate (the "mess" state) and assigns it a high entropy compared to the few organized states (castles or ships made out of the legos).<p>I guess I always knew this, and it's sort of what we mean when we say "high entropy," but it's fun to spell it out explicitly like this. Most of the others seem to conflate the strict applicability of a theory with its practical limits. For instance, QM doesn't really mean "small" or (equivalently) high energy; it means you're near the Heisenberg uncertainty limit for the observations you're making, which in most cases means small.<p>Allow me to add another one: "Special relativity means when you go faster, time slows down for you!"<p>And another (maybe controversial): "In Schrödinger's cat, the cat is both dead and alive at the same time!"
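<p>The lego point can be made concrete with a toy count. This is just an illustrative sketch; the grid size, number of legos, and which arrangements count as a "castle" are all invented numbers:

```python
import math
from itertools import product

# Toy model: 3 distinguishable legos, each sitting in one of 4 cells.
CELLS = 4
LEGOS = 3

# Every microstate is a placement of all legos: 4**3 = 64 of them.
microstates = list(product(range(CELLS), repeat=LEGOS))

# Suppose exactly two arrangements are recognizable as a "castle";
# the observer lumps everything else into a single "mess" macrostate.
castle = {(0, 1, 2), (2, 1, 0)}
mess = [s for s in microstates if s not in castle]

# Boltzmann entropy S = k * ln(Omega), taking k = 1 for simplicity.
S_castle = math.log(len(castle))
S_mess = math.log(len(mess))

print(f"Omega(castle) = {len(castle):2d}, S = {S_castle:.2f}")
print(f"Omega(mess)   = {len(mess):2d}, S = {S_mess:.2f}")
```

The "mess" macrostate covers 62 of the 64 microstates, so its entropy (ln 62 ≈ 4.13) dwarfs the castle's (ln 2 ≈ 0.69), even though each individual microstate is equally likely. A memory-perfect observer who tracks every placement assigns each microstate its own macrostate of Ω = 1, i.e. zero entropy.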