I agree with this article, but I think that by singling out the social sciences it fails to reveal the bigger picture: all scientific fields are subject to these same mistakes.<p>I just started working in a research lab at a hospital that creates finite element cardiac models based on data taken during heart surgeries on sheep, along with MRI images taken at various intervals pre- and post-surgery. Although I'm still new to this position, it seems that our own methods are subject to just as much self-deception. We basically want our models to match the actual heartbeat at only two exact moments: the beginning of contraction and the beginning of relaxation. If I've understood what's been done before, modeling these two brief periods during a single heartbeat is all that's needed for publication.<p>I bring this up not to criticize my lab; obviously our work is meant to be a progression toward more and more accurate models. I just think it shows that even something considered hard science is subject to many of the same faults as anything else. There are so many parameters and considerations to take into account that I don't think the end goal is to build a comprehensive theory that explains computational modeling of physiological function the way Newton's laws predict the motions of the planets. The goal is simply to create a model that works for the purpose of helping diagnose and treat people more accurately.<p>It seems that a crisis is imminent in the coming age of computational, statistical, and mathematical applications to all fields, where researchers are not properly taught to distinguish between doing data science and building actual theories. Just as there is a humongous gap between using a computer and actually coding, there is an equivalent gap between being able to collect and analyse data with a computer and being able to build a substantial theory that can describe a vast number of phenomena and results.