The deification of Alan Turing

67 points · by mbellotti · about 3 years ago

14 comments

rsc · about 3 years ago

> In terms of actual influence the much more obvious choice for father of computer science is Charles Babbage, whose work was both influential during his lifetime and who pioneers like von Neumann and Aiken referenced in their development of early working computers.

This is a strange thing to say.

George Dyson in Turing's Cathedral wrote:

> "Von Neumann was well aware of the fundamental importance of Turing's paper of 1936 'On computable numbers …' which describes in principle the 'Universal Computer' of which every modern computer (perhaps not ENIAC as first completed but certainly all later ones) is a realization," Stanley Frankel explains. "Von Neumann introduced me to that paper and at his urging I studied it with care.… He firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing."

Re-quoting Randell, On Alan Turing and the Origins of Digital Computers [1].

Dyson also came and gave a talk at Google as part of the book tour [2], and the talk consisted of him telling stories about the book and showing pictures of interesting artifacts. At 18:51 he shows a picture of Turing's paper as stored in the IAS library, and this is what he has to say:

> And so it irritated me how all these historians still are arguing about "What did Von Neumann take from Turing? Why didn't he give Turing credit? Did he read Turing's paper?" People say "oh no he didn't read Turing's paper." So I decided I would go look in Von Neumann's library.

> Turing's paper was published in the Proceedings of the London Mathematical Society. That's the volume it was in. When you go to the Institute [for Advanced Study] library, all the volumes are absolutely untouched, mint, hardly been opened except volume 42. And it's clearly, you know, it's been read so many times it's completely fallen apart. They didn't have Xerox machines so they just... Yeah, so the engineers I spoke with, the few that were left said "Yeah... we all had to read that paper. That's what we were doing was building a Turing machine." So I think that answers the question.

[1] https://eprints.ncl.ac.uk/159999

[2] https://youtu.be/_FibuHyIHnU?t=1131

lou1306 · about 3 years ago

This article just goes to show that when people hear "computer science", they think it refers only to actual hardware computers. Case in point, the reference to "many [...] innovations around computer architecture".

Computer science is about the mechanized manipulation of symbols. Turing proposed a fundamental model to reason about this. He wasn't trying to glorify tabulating machines into mathematical ones; rather, he was solving the Entscheidungsproblem, like a lot of other people. In the circle of logicians, philosophers and mathematicians who cared about this issue (the proto-computer scientists, in a sense), he was never "obscure".

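To make "mechanized manipulation of symbols" concrete, here is a minimal sketch of a Turing machine as nothing more than a transition table and a loop. The machine below, which inverts a binary string, is an invented toy example, not anything from Turing's paper:

```python
# A Turing machine reduced to its essentials: a finite transition
# table mechanically rewriting symbols on an unbounded tape.
def run_tm(tape, rules, state="start", blank="_"):
    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
    pos = 0
    while state != "halt":
        symbol = cells.get(pos, blank)
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Rules: (state, symbol read) -> (symbol to write, head move, next state)
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_tm("10110", invert))  # -> 01001
```

The point of the model is that everything above is symbol shuffling: no arithmetic, no hardware assumptions, just a table and a head position.
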
Animats · about 3 years ago

Turing has definitely increased in visibility over the last few decades. Von Neumann was considered the "father of digital computing", because he set down in detail how a general purpose stored-program digital computer ought to work. One was built, and it worked. A few billion von Neumann architecture machines later...

Turing's automata theory work was obscure and not very usable. Turing's code breaking work was very specialized. The real theoretician of cryptanalysis was Friedman, who gave the field a theoretical basis, along with breaking the Japanese Purple cypher and founding the National Security Agency.

"Colossus", the electronic codebreaking machine at Bletchley, was not a general purpose computer. It was a key-tester, like a Bitcoin miner. Its predecessors, the electromechanical "bombes", were also key-testers.

Almost forgotten today are Eckert and Mauchly. They were the architects of the ENIAC, which was a semi-general-purpose computer programmed with plugboards and knobs. This was a rush job during WWII, when naval gunnery and navigation tables were needed in a hurry. It did the job it was supposed to do. After the war, they formed the Eckert-Mauchly Computer Corporation and produced the BINAC.[1] This was the minimum viable product for a commercial electronic digital computer. All the essential subsystems were there: CPU, memory, magnetic tape drive, printer. Everything was duplicated for checking purposes. That got them acquired by Remington Rand, and their next machine was the famous UNIVAC I, with more memory, better tape drives, a full set of peripherals, and profitable sales. Eckert had a long career with Remington Rand/UNIVAC/Unisys. Mauchly stayed for a few years and then did another startup. More like a good Silicon Valley career.

[1] http://archive.computerhistory.org/resources/text/Eckert_Mauchly/EckertMauchly.BINAC.1949.102646200.pdf

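The "key-tester" idea above can be sketched in a few lines. This is a toy illustration only, using an invented XOR cipher and `test_keys` routine, nothing like what Colossus actually ran: the machine has one fixed job, grinding through candidate keys and scoring each trial decryption against a known plaintext fragment.

```python
from itertools import product

def xor_apply(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so this both encrypts and decrypts
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def test_keys(ciphertext: bytes, crib: bytes, key_len: int = 2):
    # Exhaustively test every key, accepting one whose decryption
    # contains the expected fragment (the "crib")
    for key in product(range(256), repeat=key_len):
        if crib in xor_apply(ciphertext, bytes(key)):
            return bytes(key)
    return None

ciphertext = xor_apply(b"ATTACK AT DAWN", b"\x13\x37")
print(test_keys(ciphertext, b"ATTACK"))  # recovers the key b'\x13\x37'
```

Nothing in the loop is programmable in the stored-program sense; to test a different cipher you would rebuild the machine, which is exactly the contrast with a general purpose computer.
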
sn41 · about 3 years ago

Very slanted article. Turing was hardly unknown to logicians. In fact, even though undecidability in the lambda calculus was proven by Church in 1936 [1], Gödel remained unconvinced that it was a complete model of mechanical computation.

It was Turing's paper, in which, among many other things, he showed that Turing machines were equivalent to the lambda calculus, that convinced Gödel that both Turing machines, and hence the lambda calculus, were the right models of mechanical computation.

This was circa 1936, before Turing came to Princeton. So he was hardly unknown to mathematicians even then.

The Turing test and his codebreaking work are, of course, well known. And there are other achievements of Turing, including a powerful version of the central limit theorem [2] and "The Chemical Basis of Morphogenesis" [3], that show he was hardly incapable or obscure. Turing was an original genius, with a wide variety of original views that were later found to be far-reaching.

[1] https://www.jstor.org/stable/2371045?origin=crossref&seq=1#metadata_info_tab_contents

[2] https://www.jstor.org/stable/2974762?seq=1#metadata_info_tab_contents

[3] https://en.wikipedia.org/wiki/The_Chemical_Basis_of_Morphogenesis

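For a taste of the lambda-calculus side of that equivalence, here is a minimal sketch using Church numerals. The encoding is the standard one; the Python rendering is purely illustrative:

```python
# Church numerals: numbers and arithmetic built from nothing but
# functions -- the kind of "mechanical computation" Church's lambda
# calculus captures, and which Turing proved equivalent to his machines.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul  = lambda m: lambda n: lambda f: m(n(f))

def to_int(n):
    # Decode a Church numeral by counting how many times f is applied
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # -> 5
print(to_int(mul(two)(three)))  # -> 6
```
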
groby_b · about 3 years ago

This is... somewhat untethered from reality?

Of course there's a link between mathematics and computing, and it well predates Turing or Berkeley, starting somewhere with Leibniz's "stepped reckoner" and stumbling further along with Babbage, Lovelace, and the many actuarial computers.

And that link was *very* obvious by the time Hilbert & Ackermann formulated the Entscheidungsproblem. Turing's biggest contribution to computing was three-fold:

First, he created a formal (theoretical) machine that had behavior equivalent to first-order logic. Second, he formally proved that equivalence, probably the most important part here. That formally proven equivalence means that all problems decidable by first-order logic are decidable by a machine. Third, he used that to formally prove the Entscheidungsproblem isn't generally solvable, and so proved the limits of first-order logic.

That's the fundamental breakthrough: proving that a machine can solve an entire class of problems, and that there are limits to what that machine can solve.

It's not that he somehow shaped what computers should look like, but that he formally proved their capabilities and limitations.

I'm not surprised by the article given the background of the author: if you value practicality over theory (i.e. you favor software engineering over computer science), Aiken and Berkeley *are* more relevant. But the ACM has always cared about a theoretical foundation, and so their admiration of Turing makes sense.

The fact that the author doesn't bother to even acknowledge that distinction is a bit surprising, though.

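The limit result mentioned above (the third point) can be sketched as Turing's diagonal argument in runnable form. The names `paradox_maker` and `always_loops` are invented for the illustration; the construction itself is the standard one:

```python
def paradox_maker(halts):
    # Build the diagonal program for a claimed decider halts(prog, arg),
    # which is supposed to answer whether prog(arg) ever returns.
    def paradox(prog):
        if halts(prog, prog):
            while True:        # decider says "halts": loop forever
                pass
        return "halted"        # decider says "loops": halt immediately
    return paradox

# Any concrete decider is defeated on the program built from itself.
# Here, one that always answers "loops":
always_loops = lambda prog, arg: False
p = paradox_maker(always_loops)
print(p(p))  # prints "halted", contradicting always_loops(p, p)
```

Since every candidate decider fails on its own diagonal program, no total halting decider exists, and with it the Entscheidungsproblem falls.
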
rubatuga · about 3 years ago

Strangely negative. The ACM award isn't the only reason people celebrate Turing...

r0s · about 3 years ago

Side bar: as far as I can tell, Turing was the first to use the term "assertion" in the context of software testing.

> How can one check a large routine in the sense of making sure that it's right? In order that the man who checks may not have too difficult a task, the programmer should make a number of definite assertions which can be checked individually, and from which the correctness of the whole program easily follows.

https://turingarchive.kings.cam.ac.uk/publications-lectures-and-talks-amtb/amt-b-8

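Turing's 1949 worked example was a routine computing n! by repeated addition, checked via assertions at fixed points in its flow diagram. A modern sketch in that spirit (an illustration, not Turing's original) might look like:

```python
# Factorial by repeated addition, with "definite assertions which can
# be checked individually" at each step, per Turing's 1949 note.
def factorial(n: int) -> int:
    assert n >= 0, "precondition: n must be non-negative"
    result = 1
    for k in range(1, n + 1):
        # invariant entering this step: result == (k-1)!
        acc = 0
        for _ in range(k):        # multiply by repeated addition,
            acc += result         # as in Turing's original routine
        result = acc
        # invariant leaving this step: result == k!
    assert n == 0 or result >= n  # cheap sanity check on the output
    return result

assert factorial(5) == 120
```
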
ggm · about 3 years ago

I suggest people look for Bowden's "Faster Than Thought" to put Turing in the context of his peers and of work that was contemporary and current at the time. He writes in it about his own work, alongside Wilkes and others of his contemporaries in the field. Yes, it's very UK-specific.

https://archive.org/details/faster-than-thought-b.-v.-bowden

SassyGrapefruit · about 3 years ago

> It was only later, when the young Association for Computing Machinery (ACM) needed to establish computer science as a legitimate field of study that the history got edited to suggest a smooth evolution from theoretical mathematics to computing

This is a bit conspiratorial. I don't think anyone believes the evolution of computing started with us stumbling around in Plato's cave until Turing opened the door, and then, boom, the next day we had MacBooks. Of course practical and theoretical computing evolved together. This is hardly unique to computer science. What Turing represents is a person who asked (and answered) the big questions. Take a parallel in physics: Lorentz and Minkowski explored obscure mathematical tools, but Einstein shattered our thinking. Turing did something similar.

Why take it back to Babbage? The abacus was in use in the 11th century BCE. This practical device was a tool to solve a problem. Individuals were not asking "Why does the abacus work?", "What does it mean?", "What can or can't I compute with this thing?", "What other applications are there for calculating machines?". We spent 2,000 years using this tool to count things and not much more. Babbage added some mechanization, but he wasn't exactly trying to bridge the abacus with consciousness.

Turing was thinking broadly. When we celebrate Turing we are celebrating the formalization. He gave us a deterministic framework for reasoning about computers. He allowed us to consider their limitations, their philosophical implications, and the opportunities they represent. The huge leaps from the 1960s on would not have been possible without his work.

tbrownaw · about 3 years ago

> *It was only later, when the young Association for Computing Machinery (ACM) needed to establish computer science as a legitimate field of study that the history got edited to suggest a smooth evolution from theoretical mathematics to computing. To sell that message ACM needed founding figures and they settled on a deceased British mathematician named Alan Turing.*

> *Scientists active in ACM — specifically John W. Carr III and Saul Gorn — began connecting Turing's 1936 paper "On Computable Numbers" to a broader vision of computer science in 1955.*

Hm. I wonder what this writer would think of the Church-Turing thesis.

fouc · about 3 years ago

I've always understood "Father of <some industry>" to be fluid, and usually there's more than one possible figure. Turing, or Babbage, or Shannon, etc. could all be called that. One organization thinks Turing is the father. So? Who really cares?

xigoi · about 3 years ago

Scribe link: https://scribe.rip/the-deification-of-alan-turing-ffcfda28fa55

kragen · about 3 years ago

It seems like the author doesn't understand, in a fundamental sense, what a computer is. She has been led astray by surface appearances.

Digital "computers" are called that because they developed as higher-precision, lower-speed versions of "analog computers", which integrated systems of ordinary differential equations in real time (but faster). Examples included Bush's mechanical differential analyzer, the MONIAC hydraulic computer, electronic differential analyzers built out of op-amps, and, earlier, Michelson's harmonic analyzer and various kinds of planimeters. Reconfiguring these devices to solve different "programs" of equations involved reconnecting their parts in new ways.†

The thing that makes digital computers special, fundamentally different from both the analog "computers" they were named after and Shannon's pioneering digital-logic circuits, is that they are *universal*; instead of reconnecting the pieces physically to carry out a different "program", you can leave them connected according to a "universal program", which runs a *stored* program *made out of data* in some kind of data storage medium, such as a loop of movie film with holes punched in it, a player piano roll, a mercury delay line, or a DRAM chip. It can even run a program that interprets *programs for a different computer*, a so-called "simulator" or "emulator". So all such computers are, in a certain sense, equivalent; one may be faster than another, or happen to be connected to different I/O devices at some time, but there's no feature you can add to one of them that enables it to do *computations* that another one can't.

That's why we can use the same digital computer not only to numerically integrate systems of differential equations but to play card games, edit text, control lathes, symbolically integrate algebraic expressions, decode radio transmissions, and encrypt and decrypt. And it's why we can run Linux on an 8-bit AVR microcontroller.⁜

Because the designers of ENIAC lacked this insight when the design was frozen in 01943, at first ENIAC was programmed by reconnecting its parts with a plugboard, like an analog computer. It wasn't modified to be programmable with data storage media until 01948, three years after von Neumann's *First Draft* in 01945, in which he (and his colleagues) proposed keeping programs in RAM.

The Harvard Mark I (built in 01944) and Konrad Zuse's Z3 (designed in 01935, built in 01941) could run stored programs from tape, like Babbage's later designs and unlike pre-01948 ENIAC. But they were not designed around this insight of universality, and neither was well-suited to emulating more complex machines, lacking for instance jumps. The Z3 was proven to be accidentally Turing-complete, but not until 01998, and not in a practical way.

That insight into the protean, infinitely adaptable nature of digital computers was not enunciated by Babbage, by Lovelace, or even by the brilliant Zuse. It was discovered by Turing; it is the central notion of his 01936 paper, from which Dyson tells us von Neumann was working, as Russ Cox points out in https://news.ycombinator.com/item?id=30623248.

And that is why Alan Turing is the creator of the discipline that later became known as computer science: it was he who discovered what we now call, simply, "computers".

______

† "Program" is used to mean "configure by connecting" up to the present day in analog electronics; an LM317 is a "programmable voltage regulator" not because its output voltage is controlled by software but because you can change its output voltage by hooking a resistor up to it.

⁜ Though Linux on an AVR isn't very practical: https://dmitry.gr/index.php?proj=07.+Linux+on+8bit&r=05.Projects

Turing's concept of computational universality permits an amazing economy of hardware; it is the reason that machines like the LGP-30, the Intel 4004, the PDP-8/S, or the HP 9100A could be so much smaller and simpler than the ENIAC, despite being able to handle enormously more complex problems. The ENIAC contained 18000 vacuum tubes, 1500 relays, and 7200 (non-thermionic) diodes; the LGP-30 had 113 vacuum tubes and 1450 diodes; the 4004 had 2300 transistors (not including RAM); the PDP-8/S had 519 logic gates (not including RAM, which was magnetic cores; https://www.ricomputermuseum.org/collections-gallery/equipment/pdp-8s says the CPU contains 1001 transistors, and I'm guessing about 1500 diodes); the HP 9100A had 2208 bits of read-write core, 29 toroids of read-only core (holding 1856 bits), 32768 bits of ROM fabricated as a printed circuit board with no components, and what looks like a couple hundred transistors from https://www.hpmuseum.org/tech9100.htm, many of which are in the 40 J-K flip-flops mentioned in https://hpmemoryproject.org/news/9100/hp9100_hpj_02.htm or https://worldradiohistory.com/Archive-Company-Publications/HP-Journal/60s/HPJ-1968-09.pdf.

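The "universal program" idea above can be sketched minimally: one fixed fetch-decode-execute loop whose behavior is determined entirely by a program stored as data. The three-instruction machine below is an invented toy, not any historical instruction set:

```python
def run(program, acc):
    # One fixed interpreter loop; which "machine" it behaves as is
    # determined entirely by the program held in memory as data.
    pc = 0
    while True:
        op, arg = program[pc]
        if op == "halt":
            return acc
        if op == "add":
            acc += arg
        elif op == "mul":
            acc *= arg
        elif op == "jnz" and acc != 0:
            pc = arg             # conditional jump: taken
            continue
        pc += 1                  # fall through to the next instruction

# Changing behavior means changing data, not rewiring hardware:
double_plus_one = [("mul", 2), ("add", 1), ("halt", 0)]
print(run(double_plus_one, 20))  # -> 41
```

Given a rich enough instruction set, one of these stored programs can itself be an interpreter for another machine, which is exactly the emulation move the comment describes.
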
cromwellian · about 3 years ago

To me the real deity of CS is Donald Knuth, not for theoretical contributions (ok, but TeX now), but for assembling a large swath of computer science into a biblical form which educated a great many people. His books were indispensable for me in the late 80s and early 90s, before you could just go Google anything.