> When we (humans using Hindu-Arabic numerals) represent numbers, we write the most significant value first, and continue in descending order.

Funny trivia: while numbers in Arabic look exactly like ours on the page, they were actually written in the opposite order, least significant digit first, continuing in ascending order, because Arabic script runs right-to-left. We Europeans borrowed the notation as it looked when finished being written, which had the unfortunate side effect of switching to the unnatural big-endian order.
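The byte order inside a machine word tells the same story; here is a tiny sketch, assuming a little-endian machine such as x86, where the stored order is the "Arabic" one (least significant first) even though we print the number big-endian:

```c
/* On a little-endian machine the least significant byte is stored first,
 * while the printed form puts the most significant digit first. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void)
{
    uint32_t n = 0x12345678;        /* printed most significant first */
    unsigned char bytes[sizeof n];
    memcpy(bytes, &n, sizeof n);    /* inspect the bytes as stored */

    printf("printed: 0x%08X\nstored:  ", n);
    for (size_t i = 0; i < sizeof n; i++)
        printf("%02X ", bytes[i]);  /* 78 56 34 12 on little-endian */
    printf("\n");
    return 0;
}
```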
So back in the 1970s I had a job working on an IBM mainframe computer. Not multicore, no cache or instruction pipeline that I was aware of. A coworker had the idea of writing a timer-interrupt handler to see what kind of instruction was executing at the instant of the interrupt; eventually it accumulated tens of thousands of samples. Which kind of instruction was most frequent: addition, multiplication, bit logic, or what? If I remember correctly, at least 90% of the samples were moving data from one place to another (memory to memory, register to memory, memory to register, register to register). Computers don't "calculate" most of the time the way I had previously thought. They mostly just rearrange bits, copying this to that.
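For anyone curious what that experiment looks like today: below is a minimal sketch of the same idea using a POSIX profiling timer rather than hooking the hardware timer interrupt as the mainframe version did. It assumes x86-64 Linux with glibc (for REG_RIP) and only records program-counter samples; mapping them back to instruction types would need a disassembler or addr2line afterwards.

```c
/* Sample the program counter on each SIGPROF tick (x86-64 Linux/glibc). */
#define _GNU_SOURCE
#include <signal.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/time.h>
#include <ucontext.h>

#define MAX_SAMPLES 100000

static volatile sig_atomic_t nsamples;
static uintptr_t samples[MAX_SAMPLES];

static void on_prof(int sig, siginfo_t *info, void *ctx)
{
    (void)sig; (void)info;
    ucontext_t *uc = ctx;
    if (nsamples < MAX_SAMPLES)        /* record the interrupted PC */
        samples[nsamples++] = (uintptr_t)uc->uc_mcontext.gregs[REG_RIP];
}

int main(void)
{
    struct sigaction sa;
    memset(&sa, 0, sizeof sa);
    sa.sa_sigaction = on_prof;
    sa.sa_flags = SA_SIGINFO;
    sigemptyset(&sa.sa_mask);
    sigaction(SIGPROF, &sa, NULL);

    /* Fire SIGPROF every 1 ms of CPU time consumed by this process. */
    struct itimerval it = { .it_interval = { 0, 1000 }, .it_value = { 0, 1000 } };
    setitimer(ITIMER_PROF, &it, NULL);

    /* Busy work to be sampled. */
    volatile double x = 0.0;
    for (long i = 0; i < 200000000 && nsamples < MAX_SAMPLES; i++)
        x += (double)i * 1e-9;

    printf("collected %d PC samples; first: %#lx\n",
           (int)nsamples, nsamples ? (unsigned long)samples[0] : 0UL);
    return 0;
}
```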