This page is absolutely correct that you should not use 32-bit key IDs, ever.<p>However, some of its information about GnuPG is out of date. As of version 1.4.17 (released in June), GnuPG no longer blindly accepts responses from key servers:<p><a href="http://git.gnupg.org/cgi-bin/gitweb.cgi?p=gnupg.git;a=commit;h=5230304349490f31aa64ee2b69a8a2bc06bf7816" rel="nofollow">http://git.gnupg.org/cgi-bin/gitweb.cgi?p=gnupg.git;a=commit...</a><p><a href="http://bugs.g10code.com/gnupg/issue1579" rel="nofollow">http://bugs.g10code.com/gnupg/issue1579</a><p>The fix was backported to wheezy-security as well:<p><a href="https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=725411" rel="nofollow">https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=725411</a>
I love gpg, but OpenPGP tooling is just an ancient piece of crap from a CLI perspective.<p>- running gpg by itself just gives you a cryptic “gpg: Go ahead and type your message ...” without any further guidance.<p>- `gpg --list-keys`, apart from being hard to remember*, prints a table of misaligned values without any headers.<p>- I still can’t figure out how to make use of gpg-agent (but see the edit below).<p>* I always tend to write --keys, and I get another unhelpful error: “gpg: Option “--keys” is ambiguous”.
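Edit: since posting this I pieced together roughly the following GnuPG 1.4-era setup from the docs. Treat it as a sketch, not gospel; the pinentry path in particular varies by distro and is just an example:

    # ~/.gnupg/gpg.conf -- make gpg talk to the agent at all
    use-agent

    # ~/.gnupg/gpg-agent.conf -- cache passphrases for 10 minutes
    default-cache-ttl 600
    # illustrative path; use whichever pinentry your distro ships
    pinentry-program /usr/bin/pinentry-curses

    # start the agent and export GPG_AGENT_INFO into the current shell
    eval $(gpg-agent --daemon)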
See also: Trolling the Web of Trust[1]<p>The slides from Micah's original OHM2013 presentation are in trollwot.pdf<p>0xdeadbeef keys[2] have been around for quite a while, too. Generating a custom 32-bit key ID using a simple unoptimized brute-force program takes under an hour on a midrange PC (see the sketch below).<p>[1] <a href="https://github.com/micahflee/trollwot" rel="nofollow">https://github.com/micahflee/trollwot</a><p>[2] <a href="http://pgp.mit.edu/pks/lookup?search=0xdeadbeef&op=index" rel="nofollow">http://pgp.mit.edu/pks/lookup?search=0xdeadbeef&op=index</a>
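If I understand the trick correctly, you don't regenerate keys at all: you keep the key material fixed and iterate the creation timestamp, because a v4 fingerprint is just SHA-1 over the public key packet (RFC 4880, section 12.2) and the short key ID is its low 32 bits. A minimal Python sketch, where algo and key_material (the key's MPIs, already serialized) are assumed inputs:

    import hashlib, struct

    def v4_fingerprint(body):
        # RFC 4880 12.2: SHA-1 over 0x99, a two-byte body length, and the packet body
        return hashlib.sha1(b"\x99" + struct.pack(">H", len(body)) + body).digest()

    def key_body(ts, algo, key_material):
        # v4 public key packet body: version, 4-byte creation time, algorithm id, key MPIs
        return b"\x04" + struct.pack(">I", ts) + bytes([algo]) + key_material

    def find_short_keyid(target, algo, key_material, start_ts):
        # walk the creation time backwards until the low 32 bits of the
        # fingerprint, i.e. the short key ID, match the target
        for ts in range(start_ts, 0, -1):
            if v4_fingerprint(key_body(ts, algo, key_material))[-4:] == target:
                return ts
        return None

    # hypothetical usage, with rsa_mpis produced elsewhere:
    # find_short_keyid(bytes.fromhex("deadbeef"), 1, rsa_mpis, 1400000000)

The expected cost is roughly 2^32 SHA-1 hashes of a small buffer, which lines up with the under-an-hour figure.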
Interesting, this article from 2011 shows the Debian community moving from short 8-character IDs to 16-character ones.<p>Now I'm really confused, and I have to ask:<p>Why are we calling an 8-hexadecimal-character word 32-bit when it requires 64 bits to encode as ASCII, or a 16-hexadecimal-character word 64-bit when it takes 128 bits?
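(Thinking about it more, I believe the answer is that the label refers to the width of the value, not of its textual encoding: each hex digit carries 4 bits, so 8 hex digits identify a 32-bit value even though those 8 ASCII characters occupy 64 bits as text. A quick Python check:)

    key_id = "DEADBEEF"                     # an 8-hex-digit short key ID
    print(len(key_id) * 4)                  # 32 -- value bits: 4 per hex digit
    print(len(key_id.encode("ascii")) * 8)  # 64 -- storage bits as ASCII text
    print(int(key_id, 16).bit_length())     # 32 -- the value itself fits in 32 bits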