An interesting article, and we need more like it, but I feel the author didn't go far enough in understanding the root causes of complexity.<p>For example, they cite both schedule/budget pressure and insufficient docs. The <i>incomplete</i> docs were already "thousands" of pages. Does anyone really believe even half the team would read "complete" documentation running to thousands upon thousands of pages? Would complete docs speed up development? Would they still speed it up once you account for the cost of producing and consuming them? Is the size of the module truly essential complexity, or is part of the problem that they're building on legacy code?<p>The author mentions interop with other teams and third-party software as a large source of friction. Why are other modules so large and complex that they're maintained by separate teams? Is that essential complexity, or was it caused by previous attempts to patch their way to release? Would the modules be more manageable, and hence require smaller teams, if they used other practices/languages/tools?<p>Access to test hardware, and managing the test personnel, was another source of friction. What is the cost of buying more test hardware? In previous companies where I've worked, with manual QA departments and under-funded test hardware budgets, doubling the hardware budget would have allowed halving the QA salary budget (primarily because you no longer need to hot-swap equipment several times a day) and shortened test cycles. Is that the case here?<p>Further, how much manual testing is truly necessary, and how much is caused by the developers not writing automated tests?