If your input is "perfectly" random, the codebreaker has no way of telling when he/she has successfully decrypted it (regardless of how weak your encryption is). So the more random your unencrypted data, the harder it is to decrypt once encrypted; it follows that less random data will be easier to decrypt. I can't prove the shape of the relationship in between (might there be a sweet spot in the middle? "Common sense" suggests it would be monotonic, but common sense is often wrong).<p>Incidentally, it's also clear that if the encryption key is large relative to the compressed data then it will be harder to decrypt (of course if the key is public, e.g. a public key, then this is decidedly not true).<p>So, none of this should be terribly surprising. What it does suggest is that you should <i>compress</i> data before (or while) encrypting it: the better your compression algorithm, the more entropy per byte your data will have, and the more secure your encryption.