If all you have is a hammer, everything looks like a nail...<p>I'm not familiar with Togelius' other writings, so I don't know how fast and loose the man is with his words, but in isolation, these "arguments" read like a compilation of YouTube comments. Normally I ignore them, but some days I take the bait.<p>The question at issue is made out to be "who's more intelligent, computers or human beings?" when the real question is "what is intelligence in the first place?". To merely assume some definition because it suits the author's position is nothing short of question begging.<p>There are also powerful arguments against strong AI and computationalism. Searle's Chinese room argument is perhaps the best known, but by all appearances it is often unappreciated or misunderstood. The essential point he makes is that computers are syntactic machines, i.e., machines that transform strings of symbols (which are intrinsically meaningless) according to syntactic rules. Human minds, however, contain semantics (concepts). Because computers are syntactic machines only, they necessarily do not and cannot possess semantics. They can simulate semantics when a human being formalizes them by producing syntactic rules for the simulation, but no amount of syntax ever results in semantics, any more than skillfully adding clay to a sculpture can ever produce a human being. Remember, a computer here is anything that implements something equivalent to a Turing machine (a formalization of effective method).<p>Aristotle, on the other hand, makes a much deeper argument about the nature of the intellect that can reinforce a restricted form of Searle's argument, viz., his arguments can be used to explain why computers lack semantics by showing that matter per se cannot possess "concepts" as such, apart from particular instances. This argument is difficult to appreciate without an understanding of Aristotle's broader metaphysics. However, the outline of the argument is as follows:<p>1. 
Matter is particular/concrete (e.g., "<i>this</i> tree/<i>that</i> rose").<p>2. Concepts are abstract (e.g., "<i>Tree</i> as a class/<i>Redness</i> as such").<p>3. The intellect, the faculty that abstracts concepts from particular instances, holds concepts.<p>4. Therefore, the intellect is not material. QED.<p>...adding my own minor premise and conclusion...<p>5. Computers are purely material.<p>6. Therefore, computers cannot be intelligent.<p>Note that "intellect" is not a synonym for "mind". Aristotle distinguishes such things as imagination (phantasm) from the intellect, and the former, he argues, is material. To better see how concepts are immaterial, consider the word "tree". You may imagine a tree, or even a number of trees, but the image is always particular: it is always an image of a particular tree, whether real or not. None of these, however, is the concept "tree", which is not particular (if it were, it could apply to only one tree). You can repeat the same reflection with anything: every triangle you imagine will be isosceles, scalene, or right-angled, and of some particular color, and indeed something <i>triangular</i> rather than a triangle as such.<p>The general problem here can be related to the problem of qualia (and intentionality) and thus to the mind-body problem introduced by Descartes' metaphysics, which has haunted much of philosophical discourse since (even when the mind is dropped and the body is endowed with the powers once attributed to the mind). Note that Aristotle's immaterial intellect is NOT Descartes' mind.<p>Others who have argued against computationalist or materialist conceptions of the mind include Kripke and Popper, and there are many in-depth treatments of the subject that address the claims and objections raised by the computationalists. That being said, I find "AI" (arguably a misnomer) to be a very interesting field.
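<p>Incidentally, Searle's syntax-vs-semantics point can be made concrete with a toy sketch (my own illustration, not anything from the article): a program that "answers" questions purely by looking up strings of symbols in a rulebook, with no access to what any symbol means. The rule table and phrases here are invented for the example.

```python
# Toy "Chinese room": the program maps input symbol strings to output
# symbol strings by purely syntactic lookup. Nothing in the machinery
# refers to what the symbols mean; whatever "semantics" there is lives
# entirely in the human who wrote the rulebook.

RULEBOOK = {
    # If the input matches this string of symbols...
    "你好吗": "我很好",          # ...emit this string of symbols.
    "天空是什么颜色": "蓝色",
}

def room(symbols: str) -> str:
    """Transform an input string via rule lookup -- syntax only."""
    return RULEBOOK.get(symbols, "不懂")  # default: yet another opaque token

# The room "converses" without understanding a single symbol:
print(room("你好吗"))
print(room("天空是什么颜色"))
```

However elaborate the rulebook becomes (pattern matching, internal state, learned weights), the operations remain transformations over intrinsically meaningless tokens, which is exactly the shape of Searle's claim; whether scaling such machinery could ever yield understanding is, of course, the disputed point.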