I think this guy has mathematics totally wrong. Maths today just doesn't work the way it did in Newton's time, and even back then people weren't satisfied with Newton's proofs; they simply lacked an alternative, so they used them anyway. If the author had his way, we would have refused to accept Newtonian physics for two centuries! Can you imagine the damage that would have done?

*> [A]ny realistic mathematical proof will leave out a great many steps, which are considered to be the "required background knowledge"*

Computer science papers are different how? Computer science != programs!

*> [T]he inference rules are not always specified accurately, or are not believable if they are. This is why you will sometimes read a mathematical proof and say "I understand the reasoning, but I just don't buy it"*

I think this guy just isn't too hot at mathematics. Omitting a trivial step (or domain-specific knowledge) is *not* a lack of rigour, but a courtesy to the reader. The details can always be filled in cleanly. If you ever see a modern mathematical proof that is accepted by all mathematicians but you don't "buy it", then I can assure you that it's *you* who is at fault, not the proof.

Oh, and computer science papers never leave out trivial steps or assume domain knowledge? Not the papers I've read. Once again, CS != programming!

*> This is reminiscent of Whitehead and Russell's Principia Mathematica where hundreds of pages pass until the authors can prove that 1 + 1 = 2.*

Surely *Principia* is a *reductio ad absurdum* of the argument that everyone should always spell out all the steps!
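For what it's worth, the "hundreds of pages" in *Principia* go into building arithmetic up from bare logic, not into everyday rigour. In a modern proof assistant the statement itself is a one-liner; here's a minimal Lean 4 sketch (my own illustration, not anything from the article):

    -- 1 + 1 = 2 over the natural numbers: the kernel verifies it by computation.
    example : 1 + 1 = 2 := rfl

    -- The same fact via the built-in decision procedure.
    example : 1 + 1 = 2 := by decide

All the Whitehead-and-Russell heavy lifting now lives in the library and the kernel, so the example cuts both ways in this argument.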
Quick note: many of you are saying that this guy is totally out of touch with how math works, and that formal derivation of proofs is just fantasy. It's true that that's an extreme view, but it is in fact one that is held by some top mathematicians, even if a small minority. There are people who have built up research programs out of trying to make it a reality. See, for instance, Doron Zeilberger, who apparently describes himself as an "ultrafinitist":

http://en.wikipedia.org/wiki/Doron_Zeilberger

http://www.math.rutgers.edu/~zeilberg/OPINIONS.html

I'm not saying I agree with that, just that it's not lunacy. Carry on.
This disdain for fields without rigor reminds me of the general attitude towards people using Excel macros or VB. I understand it's fun, and I used to feel the same way. But at some point you just have to stop playing with your tools and get some real work done.
A compiled computer program is certainly a rigorous description of something. God knows what, though; the programmer certainly doesn't.

This is just funny:

*"I think computer science has a tremendous amount to offer the fields of logic and mathematics. Specifically, I think that requiring all formulas to be executable by a finite, deterministic system (a computer program) could lead to a great increase in the level of rigor of these fields, would make it easier for students to learn existing results, would make it easier for practitioners to develop new results, and might possibly suggest whole new approaches..."*

Also, he keeps referring to computers as finite. I can only assume he means in a physical sense, that there is a non-infinite number of atoms making up his CPU.

I suppose I should critique his argument seriously, but it's just too far removed from any actual reality.
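For concreteness, here's roughly the sort of thing "requiring all formulas to be executable" presumably means, sketched in Python (my own toy example, not anything from the article; Euler's n^2 + n + 41 polynomial is just a convenient stand-in):

    # An executable "formula": is n^2 + n + 41 prime? (Euler's classic polynomial.)
    def is_prime(k: int) -> bool:
        if k < 2:
            return False
        i = 2
        while i * i <= k:
            if k % i == 0:
                return False
            i += 1
        return True

    def euler_value_is_prime(n: int) -> bool:
        return is_prime(n * n + n + 41)

    # A finite, deterministic check: the formula yields a prime for every n from 0 to 39...
    print(all(euler_value_is_prime(n) for n in range(40)))  # True

    # ...and then fails at n = 40, since 40^2 + 40 + 41 = 41^2.
    print(euler_value_is_prime(40))  # False

Running a check like this is rigorous about exactly the cases it covers, and it only ever covers finitely many of them, so whatever "executable formulas" buy you, it isn't a substitute for a proof.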
"I really loathe vagueness in science (or, for that matter, in philosophy and argumentation in general)."<p>Is he a computer, true or false no other answer? I think he's fooling himself, what happens if quantum computing becomes common and a result can be a 1, 0 or 'maybe'?