> Entropy, an Information Theory term coined by Claude Shannon in 1948, describes the minimum number of bits, on average, needed to encode a dataset.

Shannon didn't coin the term entropy. He borrowed it from the analogous definition in thermodynamics.
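
For concreteness, here's a minimal sketch of the "minimum bits per symbol, on average" idea from the quoted passage, computing Shannon entropy over an empirical symbol distribution (the `shannon_entropy` helper name is just for illustration):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Average bits per symbol for a source with this empirical distribution."""
    counts = Counter(data)
    total = len(data)
    # H = -sum(p * log2(p)) over the observed symbol probabilities
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A fair coin needs 1 bit per flip on average; a heavily biased one needs less.
print(shannon_entropy("HTHTHTHT"))  # 1.0
print(shannon_entropy("HHHHHHHT"))  # ~0.544
```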