IMO this isn't a real problem most users face. As developers we can be hypersensitive about permissions and such, but most regular people... just don't care. They value convenience over security. Apple's solution for this was to vet incoming apps more strictly. Google's OEM business model meant that wasn't practical, so we just saw a lot of malware instead, but nonetheless Android is everywhere and popular across the world.<p>Eventually we got fine-grained permissions on both systems, but they're still just popups that users click through unthinkingly.<p>I don't know that there is a magical solution to this. As with most of human interaction and commerce, the solution isn't perfect transparency and trust; it's the threat of later retribution -- by chargeback, by lawsuit, by intimidation, whatever -- that really keeps bad actors in check.<p>When you download a Meta app, there's no way to audit its entire source code. The deal with the devil you make there is that Meta is going to be somewhat responsible with your stuff. Of course, they're not. And when they're caught, they pay some minor fine and move on. And yet billions of people still use their apps. It's just not a big deal to most people.<p>And developers aren't magically immune to this either. NPM is wildly popular and utterly insecure. Most of the web is built on someone else's unvetted source code, itself built on ten other organizations' unvetted source code, all the way down.<p>The trust systems we have are more marketing devices than protective measures. They give users the illusion of security without sacrificing these companies' profits. The simplest fix for most of this is to disable internet access for apps by default, and then optionally let users examine each plaintext packet prior to transmission (with the OS encrypting it before sending it out, if needed). But they don't want that, because that'd be the end of ad dollars and tracking.<p>No part of your system is secure or private.
They're designed in huge profit-seeking corporations with embedded government agents. They're manufactured with parts from fifty different countries, each with its own levels of corruption and government interference. They're made by engineers who mostly just focus on their little spheres of concern and go home afterward. These systems are too big, too complex, for anyone to fully secure, so it's always a losing arms race against orgs like the NSO Group.<p>So, given that these aren't really secure to begin with, can't really be secured even with extensive effort and billions of dollars, can't be proven secure even if they were, and are actively made less secure because the companies that make them have a vested interest in lowering your privacy and security for ad dollars... there's no way to win. You just decide it's not a big enough deal, accept it, and use it anyway. Or don't, and become one of those people who insist on third-party hardware and their own Signal build, whom none of their friends bother texting anymore. But most people just don't care, and never will...