It's an interesting point of view, for sure.

> There are cases of non-neural computer vision algorithms based on Bayesian inference principles,22 though it has been challenging to develop such models that can be trained as easily as deep learning networks.

Maybe. The paper he cites, "Human-level concept learning through probabilistic program induction," says right in its abstract: "On a challenging one-shot classification task, the model achieves human-level performance while outperforming recent deep learning approaches." One-shot sounds like exactly what "easy to train" means.

Maybe he meant *accelerated* training, but that gets to the core of what's flawed about this analysis. There's no economic incentive to build specialized, accelerated hardware for one of Tenenbaum's research projects, until there *is*. There's enormous economic incentive to build video cards for games, which is what all those deep learning advances were predicated on, and it remains to be seen whether there's any economic incentive for TPUs or whatever specialized hardware he's imagining, of whatever architecture.

Looking at computing architectures, like CPUs versus GPUs, the way he does is post-hoc analysis, and nobody except those deep in the research community and paying attention to NVIDIA papers could have anticipated how GPUs would affect research.

There both is and isn't an architectural difference between CPUs and GPUs that matters. He's picking and choosing which architectural differences matter in a way that favors R&D he concedes has problems with "falsifiability."

If anything, we still need better, cheaper CPUs! They're still very useful for R&D. I'd rather get a slow supercomputer built today than nothing built tomorrow, if Tenenbaum is telling me he's chasing something now and needs it now.

Conversely, why should we listen to R&D people about problems that are fundamentally economic? It would be a bad bet. We're getting low-power CPUs now not because the world is "data oriented" or whatever he's saying, but because the iPhone is so fucking popular that there's immense demand for them. Ask Paul Otellini, an expert on the CPU business, what he thinks about that! So maybe we should actually be doubling down on the needs of consumer electronics manufacturers?