> For example, humans are much better at learning languages compared to even simple mathematics. Even those who are "good at math" are objectively much better at languages, in terms of how much complexity they can absorb.<p>Is this actually the general consensus? I think it's wrong. I am mediocre at math, but I can write it down, verify that it is correct, and then it is quite abstractable and transferable. Conventional languages handle fuzzy input better, but they do so by allowing ambiguous parsing.<p>> A14: Deterministic beats heuristic.<p>> Heuristics make it hard to give precise safety guarantees, so you should only use them when you don't guarantee much.<p>I think this is overly broad. If you can run a heuristic and then validate the result in a way that provides a guarantee, that can often be just as good, right? You still get the bound; you just had to wait until after the computation to get it. Sometimes you'll get a validation failure instead, but that should be rare if the heuristic is good. In any case, all our programs rely on the heuristic "hardware usually works."<p>I mean, nobody loves heuristics, but we keep coming up with them, so that must be a sign, right?
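<p>A sketch of the "run a heuristic, then validate" pattern (a hypothetical example, not from the original: Newton's method stands in for the fast heuristic, a residual check is the validator, and bisection is the deterministic fallback that makes the overall guarantee unconditional):

```python
def heuristic_sqrt(x, iters=50):
    """Newton's method: usually very fast, but with no a-priori guarantee here."""
    guess = max(x, 1.0)
    for _ in range(iters):
        guess = 0.5 * (guess + x / guess)
    return guess

def bisection_sqrt(x, tol=1e-9):
    """Deterministic fallback with a provable error bound from the interval width."""
    lo, hi = 0.0, max(x, 1.0)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mid * mid < x:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def guaranteed_sqrt(x, tol=1e-9):
    candidate = heuristic_sqrt(x)
    # The post-hoc check, not the heuristic, is what supplies the guarantee.
    if abs(candidate * candidate - x) <= tol:
        return candidate
    # Rare path: validation failed, fall back to the deterministic method.
    return bisection_sqrt(x, tol)
```

The caller gets the same bound either way; the heuristic only decides how fast you get it.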