Suppose you created a machine which, in the style of Solomonoff induction (roughly), was designed to find as short or as "elegant" (in whatever sense of that word) a computation-based description of the world as it can: it builds a model of the inputs it receives, and it receives many inputs, including from the internet, etc.

Further, pretend that it has arbitrarily large computational resources.

Even if it could completely predict the inputs it receives (modulo randomness from quantum mechanics), would you expect its internal model of the world to include some description of consciousness, of internal experience?

I do not mean: would it have some sort of model of agents, entities that act in ways that tend to optimize for some things or other.

Rather, I specifically mean the internal experience of things, not just the ability to predict that people would say they have experiences.

I do not think that it would come up with a model of internal experience.
And yet, I experience, and so do you.

So, I think an understanding of reality based only on computation about physical objects is incomplete.
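For concreteness, the textbook formulation of the kind of predictor I have in mind is roughly Solomonoff's universal prior (the notation below is just the standard one, not anything specific to my setup):

M(x) = \sum_{p \,:\, U(p) \text{ begins with } x} 2^{-|p|}, \qquad P(b \mid x) = \frac{M(xb)}{M(x)}

where U is a fixed universal (monotone) machine, the sum ranges over programs p whose output begins with the observation string x, and the next-symbol prediction is the ratio of prior weights. Shorter programs get exponentially more weight, which is the "as short or as elegant a description as possible" part; my question is whether any of those short, perfectly predictive programs would need to describe internal experience at all, rather than just behavior.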