Seems to me that life is fundamentally different from y = f(x). It's more like an imperative x = f(x), where the output keeps getting fed back into the input. That inherently produces more interesting behavior, even when f is relatively simple.

And the thing that is missing in AI today is the back-and-forth conversion of correlations into causal relationships, which, yes, humans are not especially good at and generally don't do consciously, but do have a built-in mechanism for.
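The x = f(x) point can be sketched concretely. A minimal illustration (my choice of example, not from the comment above) is the logistic map: f is a one-line quadratic, yet iterating it in the chaotic regime makes trajectories from nearly identical starting points diverge completely.

```python
def iterate(f, x0, n):
    """Repeatedly feed the output of f back into its input: x = f(x)."""
    x = x0
    trajectory = [x]
    for _ in range(n):
        x = f(x)
        trajectory.append(x)
    return trajectory

r = 3.9  # a value in the chaotic regime of the logistic map
f = lambda x: r * x * (1 - x)  # f itself is trivially simple

a = iterate(f, 0.2, 50)
b = iterate(f, 0.2000001, 50)  # tiny perturbation of the starting point

# Despite the simplicity of f, the feedback loop amplifies the
# perturbation until the two trajectories bear no resemblance.
print(max(abs(x - y) for x, y in zip(a, b)))
```

Evaluating f once on a fixed input (the y = f(x) view) tells you almost nothing about this behavior; it only emerges from the feedback loop.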