Sounds like every company to anyone who has spent time in the software world at a large enough organization.<p>Examples of some of the stuff I've seen at various places:<p>- Team A self-DDoSed Team B, taking down a critical platform and stopping the entire company from processing orders.<p>- A bug led to everyone's password being reset to a QA value, and restoring from backups would have taken too long. Any customer who complained about access was told that support had just reset their password to test123, when in reality every customer's password had already been reset to that value.<p>- Another case where some QA values were run against a production environment; because they got applied to a third-party service, settings and configs had to be restored manually. Halted nearly the whole company.<p>- A bug miscalculated how to charge users and basically drained most people's bank accounts.<p>- Accidental DNS rebinding exposed CDN traffic to another company and took them down.<p>- A dev mistake bypassed the CDN's caching and throttling rules for some endpoints. Nearly a $1M mistake before anyone even noticed.<p>- Accidentally reusing one S3 bucket's template for another bucket without noticing it had versioning enabled. A program that dumped its state to this bucket for restore purposes ended up with 450+ TB of versioned state. It only cared about the current state for restoring, not the history. Noticed only by accident while cleaning some things up.<p>And many other stories. Things happen, you learn, you improve, and in the end it is really a risk-vs-reward calculation on the business side.
Apart from Reddit, I remember a site where all these horror stories were submitted (anonymously, of course) and published.<p>Kinda like WikiLeaks for tech horror stories. Anyone remember it?