Bringing in IT is a form of optimization, and optimization only pays off on hot spots. If you make something 1000X faster, but it only took 1% of the overall time in the system, then the gain is around 1%.<p>When you bring in IT, there are also new inefficiencies. Things that were done without IT now have to be done with IT, so there are steps to prepare input for the machines, manage the information, keep backups, and all the rest. Some of those activities are new: they don't replace an existing process, or don't replace it with anything less time-consuming.<p>A clerk armed with a room full of filing cabinets won't necessarily be able to handle many more requests in a day if the cabinets are replaced by a terminal and a database. Suppose that during the average request, he spends 7 minutes talking with someone on the phone, of which one minute is spent with the filing cabinets. If we replace that minute with 30 seconds of work at the terminal, maybe the average drops to 6.5 minutes. Or not: if the cabinet or database shuffling is done in parallel with chatting on the phone, it may make hardly any difference at all to the duration of the service episode.<p>You will not get anywhere near the theoretical gains from technology, with its blinding speed, until all the human steps are replaced. If there is a "for each request x do ..." loop in the operation, and each iteration has steps done by a human, you will not speed it up until you get rid of those steps. But at that point, the efficiency gain will no longer be attributed to a working human as a productivity increase.<p>For instance, there is no question that the web has greatly increased the productivity of corporations in many customer service areas. The farm of servers works unattended, serving up all sorts of information to thousands of concurrent users. It does something that was impossible prior to the advent of the internet.
Because no humans are actually sitting there serving the web requests, there is no human productivity measure.
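<p>The arithmetic in the examples above is just Amdahl's law. A minimal sketch in Python, using the numbers from this comment (the function name is mine, not from any library):

```python
def amdahl_overall_speedup(fraction, speedup):
    """Overall speedup when only `fraction` of total time gets sped up by `speedup`."""
    return 1.0 / ((1.0 - fraction) + fraction / speedup)

# A 1000x speedup on a step that is only 1% of total time:
print(amdahl_overall_speedup(0.01, 1000))  # ~1.0101, i.e. about a 1% gain

# The clerk: 7-minute calls, with 1 minute of filing replaced by 30 seconds
# at the terminal. The serial-case gain is modest; if the filing overlaps
# the phone conversation anyway, the gain is zero.
old_call = 7.0                    # minutes per request
new_call = old_call - 1.0 + 0.5   # 6.5 minutes per request
print(old_call / new_call)        # ~1.08
```

The point survives the arithmetic: as long as the human steps dominate the loop, the fraction available to speed up stays small, and the overall gain stays near 1x no matter how fast the machine step gets.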