Ok, so 20 years ago, we all had to rely on lowly, pathetic servers. But <i>now</i>, thanks to <i>THE CLOUUUUDD</i> (read: hourly pricing on servers) there is a nebulous sludge of buzzwords running your applications instead.<p>Seriously, come on. People still have to think about their servers. AWS is nice, yes, <i>but it still uses servers</i>, and developers and ops people <i>still need to think about those servers</i>.<p>It's not like I press the magic button and my distributed app suddenly becomes available for millions to use simultaneously. I have to think (really hard) about what services I need, how they interact, how they scale in relation to each other, how to deal with failure scenarios, etc. None of this has gone away with the advent of "cloud" computing, and there is no way to invisibly scale your app (save maybe Heroku-like services, but they break down at high load).<p>Is it becoming easier to deploy scalable applications? Yes. But let's not lose sight of the fact that hourly pricing on VPS instances is still just running your app on a VPS, no matter how many of them there are. You still need to know your fair share of unix commands, and you still need to be smart about the different pieces and how they interact. <i>This is nothing new</i>.<p>Let's also not forget what the cloud really is: an annoyingly overused buzzword thrown around by people who have never actually logged into a server in their lives.