This is interesting because it's one of those cases where <i>insecurity</i> can turn out to be a good thing - had those cybercriminals been more careful with their systems and made them more secure, this might never have been possible. Then again, the malware might not have been able to do this in the first place if the users' systems had been more secure. How that could be accomplished is also worth considering. One school of thought suggests taking control away from users and disallowing anything that some entity (corporate or government) does not approve of, on the assumption that users will always make mistakes (e.g. Trusted Computing), but this also means a loss of freedom - as the saying goes, "freedom is not worth having if it does not include the freedom to make mistakes."<p>If, on the other hand, we allow users freedom, and thus accept that mistakes (such as being infected with malware like this) will happen, then a means of recovery should be available - which is exactly what "perfect" security forbids. To use an analogy, people who have lost their keys or had them stolen should still be able to get into their house. In the physical world, perfect security is nearly impossible; with digital data, it is achievable. Locking an item in a safe means it can still be retrieved if the key is lost - in the worst case, by cutting open the safe, no matter how physically strong it is. Encrypting data with a long enough key and a sufficiently strong algorithm means that, without the key, it is practically <i>destroyed</i>. This point - that encryption can be really, really, <i>really</i> unrecoverably strong - needs to be more widely understood as we come to rely on it more.<p>It would be particularly ironic if this recovery was made possible by exploiting the malware servers with something like Heartbleed...
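<p>For a sense of the scale involved, here's a rough back-of-the-envelope sketch of exhausting a 256-bit keyspace. The trial rate is an arbitrary assumption of mine (10^18 keys per second, far beyond any real hardware), purely to illustrate why "practically destroyed" is not an exaggeration:

```python
# Back-of-the-envelope brute-force estimate for a 256-bit key.
# ASSUMPTION: 10**18 key trials per second - an absurdly generous
# rate, well beyond any hardware that exists today.
KEY_BITS = 256
TRIALS_PER_SECOND = 10**18
SECONDS_PER_YEAR = 31_557_600  # Julian year

keyspace = 2**KEY_BITS
years = keyspace // TRIALS_PER_SECOND // SECONDS_PER_YEAR
print(f"{years:.2e} years to exhaust the keyspace")
```

Even at that fantasy rate, the answer comes out on the order of 10^51 years - vastly longer than the age of the universe - which is the sense in which correctly encrypted data is simply gone without the key.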