This was interesting to me, and overlaps a lot with things that have been on my mind the last few years.

The other day, for example, I was talking about something related with my daughter, who was learning about rounding in elementary school. We ended up discussing accuracy in calculations versus "number of operations" (very loosely, in elementary-school terms), the tradeoff between the two, and how in practice you're always rounding at some level, so that tradeoff always exists at some level.

I also do research in information theory, and the topic Tao discusses seems related to that. In that area there's always some potential or actual loss of information due to informational and computational constraints: things are always being discretized, and every representation has some information cost. What Tao is talking about is also an information cost, but cast in terms of numerical accuracy rather than stochastic terms.

This is all very vague in my head, but it seems like there is some path from stochastic information costs of representation to deterministic ones, along the lines of approximations and limits. People use probabilistic arguments in deterministic proofs, for instance, and there are pseudorandom numbers; I imagine you could treat both what Tao is talking about and more traditional information theory problems in the same framework.
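
To put the tradeoff from that first anecdote in slightly more concrete terms, here's a toy sketch (my own illustration in plain Python, not anything from Tao's post): keep only a fixed number of decimal digits after every addition, and the accumulated error grows with the number of operations performed at that precision.

    def running_sum(term: float, n: int, digits: int) -> float:
        """Add `term` to an accumulator n times, rounding to `digits`
        decimal places after every single addition."""
        total = 0.0
        for _ in range(n):
            total = round(total + term, digits)
        return total

    n = 10_000
    term = 1 / 3
    exact = n * term  # reference value, about 3333.33

    for digits in (1, 3, 6):
        approx = running_sum(term, n, digits)
        print(f"keep {digits} decimals per step: {approx:10.4f}   "
              f"accumulated error: {abs(approx - exact):.4f}")

Rounding once at the end would cost almost nothing here; rounding at every step makes the per-operation error compound, which is the "more operations at a fixed precision means more accumulated error" point in crude form.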