See also "the SNAFU principle" which posits that centralized data processing <i>cannot</i> receive the data it would actually need to function effectively because of the noise introduced by the various necessary disintermediation and translation interface layers.<p>A hardware analogy: we have Gbit Ethernet common and cheap now. how hard would it be for a CPU to bit bang signals like that onto a wire? we have the layers of translators and information losses through them: the hardware connected to the Ethernet plug knows things about the cable characteristics, electrical performance of the link, etc that we hardly ever want. but a real "top down" view of traffic performance in order to make a "ETA" dialog on a file transfer <i>Accurate</i> might require that much data <i>and more</i>. So we make do with estimates, then decide that the estimates are too much trouble and just write an animation to a timer.