Because the International System of Units isn't binary? That's why we use GiB in computing.<p><a href="https://en.wikipedia.org/wiki/International_System_of_Units" rel="nofollow">https://en.wikipedia.org/wiki/International_System_of_Units</a>
Gigabytes were renamed to gibibytes (2^30), and a new "gigabyte" (10^9) replaced the old meaning of the word.<p>It is always sad to see an 8TB SSD come back with only 7.2TiB
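For the curious, a minimal sketch of where that figure comes from (assuming the drive is sold as exactly 8 × 10^12 bytes):<p><pre><code># An "8 TB" SSD reported in binary units: a rough back-of-the-envelope check.
advertised_bytes = 8 * 10**12    # 8 TB as sold (decimal terabytes)
tib = advertised_bytes / 2**40   # the same bytes expressed in binary tebibytes
print(f"{tib:.2f} TiB")          # ~7.28 TiB, which tools show as roughly 7.2-7.3
</code></pre>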
Because the marketing department hijacked computing's lingo for a quick buck.<p>There should never have been any confusion, because SI doesn't belong in computing: computing is not on a continuum. It therefore has a natural multiplier, typically 2, making 10 completely arbitrary and capricious.
Things in the everyday world usually scale one unit at a time. You can have 1 apple, 3 apples, or 10 apples, so shortcuts for numbers of digits matter.<p>In the digital world, a lot of things scale by powers of two. You can't have 2 isolated bits; you have 1 byte (that is, 8 bits), the addressing scheme is based on powers of two (1 byte distinguishes 256 addresses, because that is 2^8, then 2, 4, and more bytes), and a lot of conventions for packing information followed or required that (like storage, with 512 or 2048 bytes per sector). The kilobyte was then the power of two closest to 1000 (2^10 = 1024), and from there it became the standard for scaling: a kilobyte of kilobytes is 1 MB (2^20 bytes), and a kilobyte of megabytes is 1 GB (2^30 bytes).<p>With networking it is a bit different, because down at layer 1 you don't transfer whole bytes, but bits. And you have things like stop bits, parity bits, frames, retries and so on that muddy things when you switch between individual bits and the higher-level abstraction of bytes, but since it is close enough for big numbers, you can usually assume that transferring 1 byte costs about 10 bits on the wire, and more or less match the power-of-10 network speed with the amount of (byte-packed) information you transferred.<p>In any case, switching from units that scale in binary (bytes and up) to units that count individual bits is usually done for marketing rather than technical reasons. You get the next bigger number just by dropping a letter from the unit name that most people don't notice.
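As a minimal sketch of that 10-bits-per-byte rule of thumb (the 100 Mbit/s link is just a hypothetical example; real overhead varies by protocol):<p><pre><code># Rough payload estimate for a decimal-rated network link.
link_bps = 100 * 10**6         # links are rated in decimal bits/s, e.g. 100 Mbit/s
ideal = link_bps / 8           # 12.5 MB/s if every wire bit were payload
rule_of_thumb = link_bps / 10  # 10.0 MB/s assuming ~10 wire bits per payload byte
print(f"ideal: {ideal/1e6:.1f} MB/s, rule of thumb: {rule_of_thumb/1e6:.1f} MB/s")
</code></pre>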
It must be mentioned that early PC hard drives actually did have (often even slightly more than) the correct binary size. The "10MB" model that was widely used actually had a capacity of 10653696 bytes.<p>Don't get me started on the -i prefixes, they sound absolutely stupid.
See also perhaps "Timeline of binary prefixes":<p>> <i>This timeline of binary prefixes lists events in the history of the evolution, development, and use of units of measure which are germane to the definition of the binary prefixes by the International Electrotechnical Commission (IEC) in 1998,[1] used primarily with units of information such as the bit and the byte.</i><p>* <a href="https://en.wikipedia.org/wiki/Timeline_of_binary_prefixes" rel="nofollow">https://en.wikipedia.org/wiki/Timeline_of_binary_prefixes</a><p>* <a href="https://en.wikipedia.org/wiki/Binary_prefix#History" rel="nofollow">https://en.wikipedia.org/wiki/Binary_prefix#History</a>
I refuse to say mebibyte or whatever alternative unit. 1024 bytes is one kilobyte, and 1000 kilobytes is not a useful unit (and so on). As far as being a conspiracy by hard drive manufacturers, Western Digital did settle the case rather than win: <a href="https://arstechnica.com/uncategorized/2006/06/7174-2/" rel="nofollow">https://arstechnica.com/uncategorized/2006/06/7174-2/</a>
Ugh...<p>I used to have this conversation on a monthly basis with co-workers. Many people seem to incorrectly believe that network speeds are measured in base-2 (binary), when in fact they are measured in base-10, just like a hard drive. And, just as with hard drive capacity, network bandwidth is misunderstood to be greater than it actually is... because people perceive bandwidth in the same base as the thing they send through the wire... a thing usually measured in 8-bit bytes, which are NOT 10-bit bytes. The correction of 2 bits per byte quickly magnifies into a pretty significant error. That error is something like 1GbE actually being capable of handling 119 MiB per second, rather than 125 MiB per second.
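A quick sanity check of those 1 GbE figures, as a rough sketch (ignoring framing and protocol overhead, which push the real number lower still):<p><pre><code># 1 GbE is rated in decimal bits per second, not binary.
line_rate = 1_000_000_000        # 10^9 bits/s
bytes_per_s = line_rate / 8      # 125,000,000 bytes/s = 125 MB/s (decimal)
mib_per_s = bytes_per_s / 2**20  # ~119.2 MiB/s (binary), before any overhead
print(f"{bytes_per_s/1e6:.0f} MB/s == {mib_per_s:.1f} MiB/s")
</code></pre>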
What is the difference between a Software Engineer and a Mechanical Engineer?<p>Mechanical Engineer thinks that 1KB = 1000 Bytes and Software Engineer thinks that 1km = 1024 meters. :-)
Because disk manufacturers can ship fewer bytes per advertised TB if they simply change the measurement.
Not to mention they have a whole army of useful idiots who are willing to fight for their profits.
This author apparently isn't aware that a while back, units of storage were changed to include another category specifically for the 2^n way of measuring. His article is wrong as a result. Technically a gibibyte is 2^30 bytes and a gigabyte is 10^9.
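For reference, a small sketch of how far each binary (IEC) prefix drifts from its decimal (SI) counterpart:<p><pre><code># The gap between IEC and SI prefixes grows with each step.
for i, p in enumerate(["Ki", "Mi", "Gi", "Ti"], start=1):
    gap = (2**(10 * i) / 10**(3 * i) - 1) * 100
    print(f"1 {p}B is {gap:.1f}% larger than 1 {p[0]}B")
# prints 2.4%, 4.9%, 7.4%, 10.0%
</code></pre>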