I work in AppSec for a Very Large Company, and I've worked in large companies before. These are not new trends. We have programmers who do F# and other functional programming. I would think the bigger inhibitor to functional programming is that most existing apps are Java or .NET, so unless you are building a brand-new team, you reuse the skills and technology you already have working for you.<p>Our devs use plenty of small open source projects. We [Security] like to recommend software we are comfortable with, but we leave any determination of "stacks" to the actual software engineering teams. If something is pretty bad, not updated, or constantly having problems, we might ban it... but what's the case for using poorly engineered software when alternatives exist?<p>Not sure if "mom and pop" is supposed to mean commercial but not OSS? OSS we can patch and modify if necessary. We can even PR patches back based on what our "scanners" and manual testing find.<p>Generally, we don't care about language; most issues are in the implementation, not the language, and even less so in more modern languages whose creators have heard about security.<p>The basic type of code scanning needed for PCI and other compliance is a commodity offering, and its cost is manageable compared to the marketing and relationship-management costs of pursuing big clients.<p>I am not sure the OP understands what a SOC2 report says/does. It covers pretty high-level controls and practices. You certify an app/service, not a stack. If you scan and fix your bugs and have proactive security training, it doesn't care how you do it. There is no golden stack that will help you pass a SOC2. You may be able to make your life easier with certain services/SaaS, but the issues come up in your practices and in the actual code as implemented. Whether your bugs are in procedural or functional code, it's the same problem from this perspective.<p>Vendor due diligence?
Some companies have their own questionnaires, and there are also agreed standards that some companies opt into. I am not sure why a big company should risk its bottom line on something unproven or not ready for prime time. It's like getting an inspection when you buy a house. In the same way, your org can improve to meet those expectations. This is no different than adding features some customer wants in order to win their business.<p>I don't understand the ultimate point: that people who build functional apps shouldn't have to care about security? It's just another non-functional requirement that helps you win a broad audience. It's the same argument as saying government shouldn't regulate this or that, or that financial advisers shouldn't need to act as fiduciaries. Security is the cost of doing business.<p>Maybe the OP had some weird experience where auditors jumped on functional programming as an issue to justify not doing more work or to make their lives easier, but I don't think this is a commonly held belief across audit and security (if people even know what functional programming is).