I thought about this for a while, and here are my thoughts about having the source code:<p>With the older YubiKey NEO devices, the applet source was available and I could freely upload an applet. This was great for a few reasons. I could modify or upgrade the app (of course, doing so would cause me to lose existing keys, which makes sense from a security PoV). (I actually did this on my old YubiKey.) I could also, in principle, audit the app. And, if I trusted Yubico to get their security right, I would trust that my freshly-arrived-in-the-mail device was secure. Moreover, if I trusted Yubico not to act maliciously, then the applet on the device I got in the mail would match the firmware on GitHub, and I could trust that it did what I thought it did.<p>There were, of course, problems. The GlobalPlatform framework is awkward to use, the toolchain is terrible, and the key management is awkward at best.<p>I could not trust that a key I installed in the OpenPGP applet while my computer was compromised was secure.<p>With the new locked-down NEO devices, I can't change out the applets, and the bad guys would also have trouble doing so. As before, if I trusted Yubico not to act maliciously, then the applet on the device I got in the mail would match the firmware on GitHub, and I could trust that it did what I thought it did. Also, as before, I could not trust that a key I installed in the OpenPGP applet while my computer was compromised was secure (because an attacker would simply export it before uploading rather than swapping out the whole applet).<p>Enter the YubiKey 4. If I use one, I am completely at the mercy of Yubico and their third-party audits. I cannot audit the code myself. Even if I trust Yubico not to act maliciously, I have to take them entirely at their word that they didn't accidentally mess up.
And, of course, I cannot trust that a key I installed in the OpenPGP applet while my computer was compromised is secure.<p>In other words, there's a big difference between source-available and source-not-available, even if I can't personally verify that the source I think I'm running is the source I'm running.<p>As an aside:<p>> There is an inverse relationship between making a chip open and achieving security certifications, such as Common Criteria. In order to achieve these higher levels of certifications, certain requirements are put on the final products and their use and available modes.<p>This may well be true, but, if so, it's a sad statement about Common Criteria and their misguided rules. Publicly disclosing the source code of an EAL5+ device should not reduce its supposed security level.<p>With SGX, Intel had the chance to offer a widely available security token (built into every new CPU!) that anyone could freely program and use for their own security purposes. They blew it when they created their "launch control" policy, which essentially says that developers who don't sign lots of contracts (which you can't even <i>read</i> without an NDA, AFAICT) can write an applet but can't run it. The Linux community, at least, is pushing back <i>hard</i>, and this just might change in the next generation of CPUs, or maybe even sooner. Fingers crossed.<p>This inspires a challenge to Yubico: give me a hardware token that runs applets. Let the token attest to the hash of a running applet, but let it run any applet whatsoever. If I want to verify that I'm running the bona fide Yubico OpenPGP applet, I can check the hash myself. If I want to replace it, I can, but then the hash will change. It'll be hard: you'll have to figure out a real isolated execution environment. It's definitely doable, though.
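To make the challenge concrete, here is a toy sketch of the verification flow I have in mind. Everything in it (the `Token` class, the key handling) is hypothetical; in particular, the HMAC stands in for a real asymmetric attestation signature, where a genuine token would sign with a device key certified by the vendor and the verifier would hold only the public half, never the device secret.

```python
import hashlib
import hmac
import os

class Token:
    """Toy model of a token that attests to the hash of its running applet."""

    def __init__(self, applet_binary: bytes):
        self._applet = applet_binary
        self._device_key = os.urandom(32)  # provisioned at manufacture

    @property
    def device_key(self) -> bytes:
        # Simplification: a real verifier would use a vendor-certified
        # public key, not the device secret itself.
        return self._device_key

    def attest(self, nonce: bytes) -> tuple[bytes, bytes]:
        """Return (applet_hash, MAC over nonce || applet_hash)."""
        applet_hash = hashlib.sha256(self._applet).digest()
        tag = hmac.new(self._device_key, nonce + applet_hash,
                       hashlib.sha256).digest()
        return applet_hash, tag

def verify(token: Token, expected_applet: bytes) -> bool:
    """Check that the token is running the applet build we audited or built."""
    nonce = os.urandom(16)  # freshness: prevents replaying old attestations
    reported_hash, tag = token.attest(nonce)
    # 1. The attestation must be authentic and bound to our nonce.
    expected_tag = hmac.new(token.device_key, nonce + reported_hash,
                            hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected_tag):
        return False
    # 2. The reported hash must match the build we expect.
    return reported_hash == hashlib.sha256(expected_applet).digest()

official_applet = b"...bona fide OpenPGP applet bytes..."
token = Token(official_applet)
print(verify(token, official_applet))             # True
print(verify(token, b"...a modified applet..."))  # False
```

The point of the design: the token never refuses to run a modified applet; it just makes the modification visible, because the reported hash no longer matches the one you computed yourself from the published source.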