> "The symbol’s modern obscurity ended in 1971"<p>It's missing a step or two. How did it get into ASCII if it were obscure? Model 33 teletype, after all, was one of the first machines to use ASCII, so there must be a reason why Tomlinson could look down and see the key in the first place.<p>A quick check of typewriter keyboards from the 1950s (eg <a href="http://mytypewriter.com/smith-coronaelectraportablec1958.aspx" rel="nofollow">http://mytypewriter.com/smith-coronaelectraportablec1958.asp...</a> ) shows that @, ¢, ½, and ¼ were all common in that era, so it isn't like other options weren't available.<p>And we know from <a href="http://trafficways.org/ascii/ascii.pdf" rel="nofollow">http://trafficways.org/ascii/ascii.pdf</a> , which was recently posted here on HN, that:<p>> The September 14-15, 1961 meeting of X3.2 saw further revisions of the printing characters of the code and the most elaborate plans so far for the arrangement of the control characters. The angular tilde (), multiplication sign (×), and vertical line (|) were deleted and replaced by an at sign (@) and less-than-or-equal-to (≤) and greater-than-or-equal-to (≥) operators.<p>On page 22 it does say that '@' was the "softest" of characters, and "more readily subject to replacement" and on page 23 quotes Mackenzie "it was forecast that, in the French national variant of the ISO 7-Bit Code, @ would be replaced by à", for why the "@" was placed before "A".<p>But that doesn't explain why an otherwise "obscure" characters is prominent enough to include over other possibilities, like ±, √, or ° from figure 35.<p>Going back to the Smithsonian article:<p>> "Tomlinson chose @—“probably saving it from going the way of the ‘cent’ sign on computer keyboards,”"<p>Cent wasn't in ASCII. ASCII was standard by 1971. There's no way '@' would have been removed from ASCII, and I don't see why it would have been removed from the keyboard had, say, '#' been used for email instead of '@'.