> These changes are happening because application requirements have changed dramatically in recent years. Only a few years ago a large application had tens of servers, seconds of response time, hours of offline maintenance and gigabytes of data. Today applications are deployed on everything from mobile devices to cloud-based clusters running thousands of multi-core processors. Users expect millisecond response times and 100% uptime. Data is measured in Petabytes. Today's demands are simply not met by yesterday's software architectures.

Is this true? There are surely more companies than before, but is most programming really done for applications like this? I've always assumed that at any given time the vast majority (~70%) of developers are working on internal applications with user bases ranging from a few to a few thousand. Is there any hard data on this?