Is there a solution to the trust problem when installing apps?

6 points | by PrimaryAlibi | over 1 year ago
It seems like there is no great solution. You can install from the Google Play Store, but then you have to trust Google, which answers to the US government, which has proven to be corrupt and criminal and constantly lying to us.

You can install from a different repository of apps, such as F-Droid or any other FOSS repository. But that means you must trust the managers of that repository, who build the apps from source code with their own keys, which means they have the opportunity to modify them in secret, maybe adding some module, and we wouldn't know. So we would have the same problem as with the Google Play Store, just with the trust problem shifted to this other repository's developers.

You can use another frontend to the Play Store, but we will still have to trust Google; it just means we don't need a Google account. We also need to trust the Aurora Store if we install it from an APK file, because the devs could add some malicious module before building the APK, and we won't see that by looking at the source code they have published. And if we install the Aurora Store from F-Droid, then we have to trust F-Droid instead. Same trust problem no matter what.

We can also choose to just trust the developers of each app, installing apps directly from the devs by downloading the APK file, but then the trust problem moves to each developer instead. We don't know if they are adding malicious code before building.

So no matter how we install apps, we will never really know; there's always a trust problem. Maybe the best thing to do is to isolate each app as much as possible: create a different user profile for each app, and then decide on a case-by-case basis who is most trustworthy for each method of installing the app. But no matter what, there will be a trust problem.

Or do I have something wrong, or am I missing something?
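One way to shrink (not solve) the trust problem when sideloading straight from a developer is to verify the downloaded APK against a digest the developer publishes over a separate channel. A minimal Python sketch of that check, assuming such a published digest exists; the file name and expected value here are hypothetical placeholders:

    import hashlib
    import sys

    # Hypothetical values: the developer would publish this digest somewhere
    # separate from the download itself (website, signed release notes, etc.).
    EXPECTED_SHA256 = "replace-with-the-published-digest"
    APK_PATH = "app-release.apk"

    def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
        """Stream the file so large APKs don't need to fit in memory."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    actual = sha256_of(APK_PATH)
    if actual == EXPECTED_SHA256.lower():
        print("Digest matches the published value.")
    else:
        print(f"MISMATCH: got {actual}; do not install.")
        sys.exit(1)

Of course, this only relocates the trust again: you now trust whichever channel published the digest. Tools like apksigner can additionally verify the APK's signing certificate, which catches tampering after the developer signed the build, but not malicious code added before signing.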

4 comments

mikewarot | over 1 year ago
I'm an old man; here's an old man answer, *that might seem silly, but gives you a direction you might be able to pursue with more modern hardware*, let's say a USB-bootable computer. (It's too bad you can't truly write-protect a USB drive, though.)

== old man answer begins ==

The most secure model of computing that I'm aware of is to use a non-networked computer without any permanent mass storage, booting off of "trusted" read-only media, such as an IBM PC/XT with dual floppy drives.

We used to buy software on "shareware" diskettes at user groups, coming home with dozens of disks of totally untrusted software, then spend the next month trying things out, repeating this every month, for years.

We did it in almost perfect safety, because we had a simple way to copy our boot disks and our data disks that was proven, reliable, and easy to grok.

You ALWAYS knew which disks were at risk, because they were the ones in the floppy drive without write-protect tabs on them.

If you had any sense at all, you had multiple copies of your critical data and boot disks. You could always recover to a known state in a few minutes' time, no matter what. The hardware didn't have any place large enough to store viruses, etc.

== old man answer ends ==

If you have to be super-paranoid, get a stack of USB disks and a "known good" boot disk (thumb drive), and a computer that can boot from USB with no persistent storage. Make a stack of copies of your boot media once you get your configuration settled. Use them only once, like pens used in public during Covid.

Use a second disk for your data.

Treat them all like we used to treat floppy disks, but with the additional caveat that *you CAN NOT EVER trust any "write protection" of USB sticks.*

As for OS choice, Edward Snowden used to recommend Tails [1] as a boot OS; I'm not sure if that's still the case.

[1] https://tails.net/about/index.en.html
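Since, as this comment notes, USB write protection can't be trusted, the closest substitute is verification: before each boot, compare the stick against the known-good image you originally wrote to it. A rough Python sketch of that idea; the image and device paths are hypothetical, and reading a raw device node normally requires root:

    import hashlib
    import os

    # Hypothetical paths: the image originally written to the stick, and the
    # raw device node for the stick itself (reading it usually requires root).
    IMAGE_PATH = "known-good-boot.img"
    DEVICE_PATH = "/dev/sdb"

    def sha256_prefix(path: str, length: int) -> str:
        """SHA-256 of the first `length` bytes of a file or raw device."""
        h = hashlib.sha256()
        remaining = length
        with open(path, "rb") as f:
            while remaining > 0:
                chunk = f.read(min(1 << 20, remaining))
                if not chunk:
                    break
                h.update(chunk)
                remaining -= len(chunk)
        return h.hexdigest()

    # Sticks are almost always larger than the image written to them, so
    # compare only the first len(image) bytes of the device.
    image_len = os.path.getsize(IMAGE_PATH)
    if sha256_prefix(IMAGE_PATH, image_len) == sha256_prefix(DEVICE_PATH, image_len):
        print("Stick still matches the known-good image.")
    else:
        print("Stick has been modified; re-image it before booting.")

Hashing only the image-sized prefix means anything past that point is ignored, which is fine: those bytes were never part of the boot media.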
akkartik | over 1 year ago
Depends on your threat model. If you are concerned about nation-state interference, no, there's no solution. The power disparity is just too great. Render unto Caesar, etc.

My solution is to try to spread my exposure between a small number of uncorrelated entities that are as trustworthy as possible. For example, here's an app I publish: https://akkartik.itch.io/carousel

It depends on an app published by https://love2d.org (not the Android app store), which in turn depends on Lua. Both stages are independent, reputable projects publishing fairly parsimonious source code anyone can look at.

This is likely still exposed to nation states. But it feels as good as things can get.
bruce511 | over 1 year ago
I'm not sure I understand the Google Play Store argument. If you're running Android, Google already supplied your OS, which would be a better target than some app.

So given that you are using Google code already, the "most trustworthy" source becomes the Play Store.

But Google does not write that code, and frankly, given the volume of submissions (and the obvious problems), by "most trustworthy" I don't mean you should trust it.

Or to put it another way: once you start using code written by someone else, the -source- of the code is irrelevant. It should not be trusted.

Which is true for your phone in general. It is not a "trustworthy" device, and you should treat it as such.
solardev | over 1 year ago
IMO this isn't a real problem most users face. As developers, we can be hypersensitive about permissions and such, but most regular people... just don't care. They value convenience over security. Apple's solution was to vet incoming apps more strictly. Google's OEM business model meant that wasn't practical, so we just saw a lot of malware instead; nonetheless, Android is everywhere and popular across the world.

Eventually we got fine-grained permissions on both systems, but it's still just popups that users click through unthinkingly.

I don't know that there is a magical solution to this. As with most of human interaction and commerce, the solution isn't perfect transparency and trust; it's the threat of later retribution -- by chargeback, by lawsuit, by intimidation, whatever -- that really keeps bad actors in check.

When you download a Meta app, there's no way to audit its entire source code. The deal with the devil you make there is that Meta is going to be somewhat responsible with your stuff. Of course, they're not. And when they're caught, they pay some minor fine and move on. And yet billions of people still use their apps. It's just not a big deal to most people.

And developers aren't magically immune to this either. NPM is wildly popular and utterly insecure. Most of the web is built on someone else's unvetted source code, itself built on ten other organizations' unvetted source code, all the way down.

The trust systems we have are more marketing devices than protective ones. They give users the illusion of security without sacrificing these companies' profits. The simplest fix for most of this would be to disable internet access for apps by default, and then optionally let users examine plaintext packets prior to transmission (with the OS encrypting that same packet before sending it out, if needed). But they don't want that, because that would be the end of ad dollars and tracking.

No part of your system is secure or private. These devices are designed in huge profit-seeking corporations with embedded government agents. They're manufactured with parts from fifty different countries, each with its own levels of corruption and government interference. They're made by engineers who mostly just focus on their little spheres of concern and go home afterward. These systems are too big, too complex, for anyone to fully secure, so it's always a losing arms race against orgs like the NSO Group.

So, given that these systems aren't really secure to begin with, can't really be secured even with extensive effort and billions of dollars, can't be proven secure even if they were, and are actively made less secure because the companies that make them have a vested interest in lowering your privacy and security for ad dollars... there's no way to win. You just decide it's not a big enough deal, accept it, and use it anyway. Or don't, and become one of those people who insist on using third-party hardware and your own Signal build that none of your friends bother texting anymore. But most people just don't care, and never will...