I think this guy has mathematics totally wrong. Maths just doesn't work today the way it did in Newton's time, and even back then people weren't satisfied with Newton's proofs; lacking an alternative, they used them anyway. The calculus wasn't put on a rigorous footing until Cauchy and Weierstrass in the nineteenth century. If the author had his way, we would have refused to accept Newtonian physics for those two centuries! Could you imagine the damage that would have done?

> *[A]ny realistic mathematical proof will leave out a great many steps, which are considered to be the "required background knowledge"*

And computer science papers are different how? Computer science != programs!

> *[T]he inference rules are not always specified accurately, or are not believable if they are. This is why you will sometimes read a mathematical proof and say "I understand the reasoning, but I just don't buy it"*

I think this guy just isn't too hot at mathematics. Omitting a trivial step (or domain-specific knowledge) is *not* a lack of rigour; it's a courtesy to the reader. The details can always be filled in cleanly. If you ever see a modern mathematical proof that is accepted by all mathematicians but you don't "buy it", then I can assure you that it's *you* who's at fault, not the proof.

Oh, and computer science papers never leave out trivial steps or assume domain knowledge? Not the papers I've read. Once again, CS != programming!

> *This is reminiscent of Whitehead and Russell's Principia Mathematica where hundreds of pages pass until the authors can prove that 1 + 1 = 2.*

Surely *Principia* is a *reductio ad absurdum* of the argument that everyone should always spell out all the steps!
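For what it's worth, "the details can always be filled in cleanly" isn't just rhetoric. Here's an illustrative sketch in Lean 4 (my example, not anything from the article): the fully formal proof of 1 + 1 = 2 is a one-liner, because the kernel checks it by computation rather than by hundreds of pages of *Principia*-style derivation.

    -- Both sides reduce to the same normal form, Nat.succ (Nat.succ Nat.zero),
    -- so reflexivity closes the goal.
    example : 1 + 1 = 2 := rfl

    -- The same fact with the reduction step made explicit:
    example : 1 + 1 = 2 := by
      show Nat.succ 1 = 2  -- 1 + 1 unfolds, by the definition of Nat.add, to succ 1
      rfl                  -- and succ 1 is definitionally 2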