(As much as I hate to post twice on a thread, I've had some thoughts that I haven't seen expressed yet.)<p>I think a lot of folks are missing some important perspective on the origin and rise of "agile" and the current state of process in the software industry. So let's take a quick trip back to the bad old days before agile to get a sense of what software development was often like back then.<p>First off, a lot of software was developed "out of house," so to speak, as works for hire by consultancy-type companies. There would be extensive negotiations between the customer and representatives of the development company; the end goal of this process was typically a specification and a timetable, mutually agreed upon in a legally binding contract, for what would be built.<p>Next, the development company would take that specification and come up with a design intended to satisfy the spec. That design would then be sliced up and handed out to various teams and eventually down to individual devs. After the individual coding was done it'd be integrated together and compiled into a working product. The product would then be passed off to QA to test, to make sure it didn't have any bugs and that it matched the spec. After that the product would be handed over to the customer to use.<p>And everyone lived happily ever after.<p>Of course, there are a few problematic aspects to this process. Namely: everything. Usually the spec and/or the design would turn out to be unrealistic or impossible to implement, and wouldn't be what the customer actually wanted regardless. Often this was handled by acrimonious bouts of negotiation, frequently involving lawyers. Meanwhile, attempting to build anything that worked at all with this sort of process was a nightmare. When the software was integrated only at the last minute, a million new problems were always discovered. With such "big-bang" integrations, a lot of effort was spent just getting the software to build and run at all. And because QA was the last step before handing the product off to the customer, defects, especially design defects, had had the greatest opportunity to fester and took the most effort to remove. And, of course, the chance that the project would chew up many staff-years of effort without producing anything of use whatsoever was quite high with this model, since actually building software was a fairly late step.<p>In the face of all these very fundamental problems with the venerable "waterfall" process model, a lot of new process ideas started to gain traction, more or less culminating in the "agile manifesto". The core idea of agility is to use iterative development, continuous integration, and open lines of communication to keep development on track. The "customer" (or "stakeholders") can see the direction of the product mid-stream and have many chances to correct communication errors, or even errors in their original conception of what they wanted. The software is always being built and always being tested, so integration overhead is much less severe and defects are spotted much closer to where they are introduced, making them easier to fix. The software is routinely in a "shippable" state with a continually evolving subset of the "final" featureset; this lets the customer see how closely the developers are tracking the schedule and also dramatically reduces the risk of not shipping anything at all.
The developers can always time-box the release and ship <i>something</i> of value, even if it isn't everything that was originally intended.<p>And so on.<p>The thing is, today we live in a fundamentally agile world of development. Waterfall is so far from the norm it's essentially extinct. The very idea of futzing around with nothing but requirements and specs for months or <i>years</i> before bothering to write a line of code is so anathema to current standards of software development that it seems ridiculous. Everyone knows you start with a skeleton and flesh it out iteratively. The idea that you'd leave your code base in an unbuildable state, let alone an unusable product state, for more than a few hours or days at most seems similarly preposterous.<p>But the fact that the basic principles of agility are now so ubiquitous, infecting every nook and cranny of the software industry like a mold, is still unsatisfactory for a lot of folks. Management wants a process they can sink their teeth into. They want something that requires no effort on their part but seems like a silver bullet that can solve any problem. They want gadgets and tools. They want a process they can leverage to justify all of their bad behaviors while removing accountability from themselves. And that's what agile-the-noun has become. Not agility, but an excuse for micro-management. A way to plan without planning. A justification for short-sightedness and disengagement. A convenient rolodex of excuses for why everyone but management is at fault when the product is late, broken, or something no one wants.<p>You either die a hero or you live long enough to see yourself become the villain.