I can see that virtualization has driven a lot of architectural decisions that have paid off in terms of OS imaging and configuration agents, but at the end of the day I'm stuck with the same question I've had for nigh-on four years: does virtualization really make a difference?<p>That story is fine as a "hey, the market is growing!" sentiment, but if power isn't being saved, or some other tangible gain realized, then it's just moving beans (or bottlenecks) around. Imaging advances have obvious benefits outside of virtualization; as long as five or six years ago you could send an image to Dell and they would slap it on however many computers you ordered. Virtualization has always seemed tantamount to this: a non-sneaker-net way of doing the same thing. Nice, as far as it goes, but you save overhead (less and less every day, to be fair) by declining virtualization. I think in the long run virtualization will be the province of large budgets and of bad or subperformant code (which benefits from the stop/start speed of images), and in classical terms it introduces another single point of failure.