There's a theory in economics, going back to Ronald Coase, about the optimal size of a firm. How big should a company be? The optimal size could be infinite, where all of society acts as one firm. Think Communist-style command economy. Or it could be one, where everyone acts as a network of individual contractors or single-owner businesses. Think anarcho-capitalism. But in reality it's neither extreme and falls somewhere in the middle; why?<p>It turns out that the optimal size depends on the balance between the overhead costs of allocating resources within one firm and the transaction costs of two firms doing business with each other. Overhead costs are higher in large firms because there are more internal resources, including people, to allocate. On the other hand, transaction costs are higher for small firms because each firm does less itself, so it needs to transact more with others to accomplish its goals.<p>As the relative costs vary over time, the optimal size varies too, and firms in an industry grow and shrink. If the optimal size increases, you'll see mergers and acquisitions produce larger firms. If it decreases, you'll see firms splitting up or small startups disrupting their lumbering competition.<p>I suspect a similar thing happens in software, where there's an optimal service size. It could be infinite, where it makes sense to build large monoliths to reduce the cost of two systems communicating. Or it could be one, where it's optimal to break the system up at as fine a granularity as possible (function level?).<p>The optimal size depends on the balance of costs. All else being equal, drawing a service boundary between two bits of functionality shrinks the services on either side, but it increases the number of services and adds communication costs for them to exchange data and commands.<p>How these costs balance out depends on the technology, and there are competing forces at work.
As languages, libraries and frameworks improve, we can manage larger systems at lower cost. That tends to increase the optimal service size. As platforms, protocols and infrastructure tools improve, the cost of running large numbers of services decreases. That tends to decrease the optimal service size.<p>The microservices movement, and to an extent the serverless movement, assume that in the medium and long term these technological improvements will tip the scales sharply in favour of small services. I agree that's likely the case, but we're not there yet, except in some specialized situations such as large distributed organizations (Conway's law). It's going to be at least a few years before it's worthwhile to build most software systems in a microservice architecture.
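<p>The trade-off can be sketched as a toy cost model. Everything here is invented for illustration (the cost functions, exponents and constants are assumptions, not measurements): internal overhead grows superlinearly with service size, while communication cost grows with the number of service-to-service links, so the total is minimized at some intermediate granularity.

```python
def total_cost(n_services, total_functionality=100.0,
               overhead_exp=1.5, link_cost=0.5):
    """Toy model: total cost of splitting a fixed amount of
    functionality across n_services services. All parameters
    are made up to illustrate the shape of the trade-off."""
    size = total_functionality / n_services
    # Internal overhead per service grows superlinearly with its size,
    # mirroring the claim that large firms/services cost more to manage.
    internal = n_services * size ** overhead_exp
    # Communication cost grows with the number of pairwise links
    # between services (worst case: every service talks to every other).
    communication = link_cost * n_services * (n_services - 1) / 2
    return internal + communication

# The optimum is neither a single monolith (n=1) nor maximal
# fragmentation (n=100); it falls somewhere in the middle.
best = min(range(1, 101), key=total_cost)
```

Improving frameworks corresponds to lowering `overhead_exp` (pushing the optimum toward fewer, bigger services); improving infrastructure corresponds to lowering `link_cost` (pushing it toward more, smaller ones).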