Note that years ago, Moxie studied a similar problem: how to let users know whether their contacts use Signal without uploading whole address books, as e.g. WhatsApp does [0]. It's similar because in both cases you want to "match" users in some fashion through a centralized service while preserving their privacy.<p>He ruled out downloads of megabytes of data (something the Google/Apple proposal would imply) and couldn't find a good solution beyond trusting Intel's SGX technology, arguably not a truly good solution, but better than not adopting it at all [1].<p>You have a computation/download/privacy tradeoff here. You can lengthen the daily key interval to weeks: less to download, but devices have to compute more hashes to check whether they have been in contact with other devices. You can increase the 10-minute identifier interval to an hour: less privacy and more trackability, but also less computation.<p>My guess as to why Google/Apple didn't introduce rough location (like US state or county) into the system: to keep journalists from seizing on that detail and sensationalizing it into something it isn't (Google/Apple grabbing your data). Both companies already operate the most popular maps apps on the planet, as well as OS-level location services that phone home constantly, so they are already in possession of that data.<p>[0]: <a href="https://signal.org/blog/contact-discovery/" rel="nofollow">https://signal.org/blog/contact-discovery/</a><p>[1]: <a href="https://signal.org/blog/private-contact-discovery/" rel="nofollow">https://signal.org/blog/private-contact-discovery/</a>
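To make the tradeoff concrete, here's a rough Python sketch of the two-level key structure: one key per day, from which a new broadcast identifier is derived every ~10 minutes. This is a simplification, not the exact published scheme; the labels, the truncation to 16 bytes, and the use of plain HMAC-SHA256 in place of the spec's KDF are illustrative assumptions.

```python
import hashlib
import hmac

def daily_key(tracing_key: bytes, day_number: int) -> bytes:
    # One 16-byte key per day, derived from the device's long-term secret.
    # (Illustrative: the real spec uses a proper KDF, not bare HMAC.)
    return hmac.new(tracing_key, b"DTK" + day_number.to_bytes(4, "little"),
                    hashlib.sha256).digest()[:16]

def rolling_id(dtk: bytes, interval: int) -> bytes:
    # A fresh broadcast identifier for each ~10-minute slot (144 per day),
    # derivable by anyone who later learns the daily key.
    return hmac.new(dtk, b"RPI" + interval.to_bytes(1, "little"),
                    hashlib.sha256).digest()[:16]
```

Publishing one 16-byte daily key lets any phone re-derive all 144 of that day's identifiers and compare them against what it overheard; stretching the daily key to a weekly one shrinks the download roughly 7x but multiplies the identifiers each phone must re-derive and check.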
Regardless of the technical issues with this, I think the "prank" issue Moxie brings up is much more serious. We've already seen the phenomenon of "Zoom bombing", and I can imagine "tracer bombing" would be a much more serious issue. The only way I could see this working is if, when you enter a positive result, you have to enter some sort of secret key from the testing authority, but that's untenable given that a lot of (most?) testing these days is done by private providers.
> So first obvious caveat is that this is "private" (or at least not worse than BTLE), <i>until</i> the moment you test positive.
> At that point all of your BTLE mac addrs over the previous period become linkable.<p>Linkable over a period of 14 days. Or even only within a single day: each day means a new key, so linking across days could only be attempted on the basis of behavioral correlations.<p>What would anyone do with such data? Microanalysis of customer behavior? It can't be used for future customer profiling, because the history can't be matched to identifiers after the infection is reported. This data is practically worthless.
Let's just answer these:<p>* Use stationary beacons to track someone’s travel path<p>Doesn't work, because there's no externally visible correlation between reported identifiers until after the user chooses to report their test result.<p>* Increased hit rate of stationary / marketing beacons<p>Doesn't work, because those attacks depend on a stable identifier, and the identifiers roll every 10 or so minutes. Presumably any rolling of the Bluetooth MAC also rolls the reported identifier.<p>* Leakage of information when someone isn’t sick<p>The requests for data simply tell you someone is using an app, which you can already tell if they're using the app.<p>The system can encourage someone to get tested; if your app wants to tell people to get tested, then fair play to that app (though good luck in the US).<p>* Fraud resistance<p>Not a privacy/tracking concern, though I'm sure devs will have to do something to limit spam/DoS.
Again, this solution _cannot_ work, and it _threatens_ a permanent loss of privacy.<p>This is the government and the adtech companies sleeping in the same bed, with no opposing power in the balance.<p>1) The "solution" is created by a duopoly of two American private corporations.<p>2) It can only work reliably if everyone carries an (Apple or Android) phone at all times and consents to sharing data.<p>3) You are not necessarily infected if you pass an infected person in the street at 5 meters. This will produce too many false positives and give people fuzzy information.<p>4) It doesn't help people who are infected and _dying_.<p>It just _doesn't make sense_. To me, it looks like electronic voting, but worse: no one can understand how it works besides experts.<p>Today it is reviewed, but then the app will be forgotten and updated in the background with "new features" for adtech.<p>We are forgetting what we are fighting: a biological virus. All effort should go toward understanding the biological machinery of the virus and its hosts, in order to _cure_ the disease. We should be 3D-printing ventilators, analysing DNA sequences, building nanorobots, and synthesising new molecules.
Is there an official document somewhere?<p>Also, how does it compare to DP-3T? (<a href="https://github.com/DP-3T/documents" rel="nofollow">https://github.com/DP-3T/documents</a>) (<a href="https://ncase.me/contact-tracing/" rel="nofollow">https://ncase.me/contact-tracing/</a>)<p>Edit: Apple's preliminary specification was linked in another HN comment. (<a href="https://covid19-static.cdn-apple.com/applications/covid19/current/static/contact-tracing/pdf/ContactTracing-CryptographySpecification.pdf" rel="nofollow">https://covid19-static.cdn-apple.com/applications/covid19/cu...</a>)
What is it with people making long, split-up Twitter threads like this? They're cumbersome and hard to read. Be an adult: write and publish an article on your blog.<p>It feels weird having to criticize Marlinspike over this, but stupid practices are stupid no matter how prestigious the person doing them is.
The system doesn't need to ship every key to every phone; much more compact structures like Bloom filters could be used instead. If we assume about 1000 positives per day, each uploading 14 days of keys at 4 keys per hour, that's a bit over 1.3 million keys per day. A Bloom filter with a false-positive rate of 1/1000 could store that in a couple of megabytes. The phone downloads the filter each day and checks its observed keys against it, and only needs to download the actual keys if there's a potential match.
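The sizing arithmetic can be checked with the standard Bloom filter formulas; the key volume below just restates the assumptions above (1000 positives/day, 14 days of keys each, 4 keys per hour):

```python
import math

n = 1000 * 14 * 24 * 4           # ~1.34 million keys per day
p = 1e-3                          # target false-positive rate of 1/1000

# Optimal filter size in bits, and optimal number of hash functions,
# from the standard Bloom filter sizing formulas.
m_bits = math.ceil(-n * math.log(p) / math.log(2) ** 2)
k = round((m_bits / n) * math.log(2))

print(f"{m_bits / 8 / 2**20:.1f} MiB with {k} hash functions")
```

Under these assumptions the daily filter comes out around 2.3 MiB with 10 hash functions: larger than a single megabyte, but still far smaller than shipping the raw keys (1.34M × 16 bytes ≈ 21 MB).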
> Published keys are 16 bytes, one for each day. If moderate numbers of smartphone users are infected in any given week, that's 100s of MBs for all phones to DL.<p>Seems like a use case for Bloom filters or k-anonymity.
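A k-anonymity range query in the style popularized by Have I Been Pwned might look like this; it's a sketch, and the prefix length and function shapes are illustrative assumptions, not any real API:

```python
import hashlib

PREFIX_LEN = 4  # hex chars revealed to the server; shorter = larger anonymity set

def bucket_of(key: bytes) -> str:
    # Client-side: reveal only a short prefix of the hashed key, so the
    # server learns which bucket was queried, not which key.
    return hashlib.sha256(key).hexdigest()[:PREFIX_LEN]

def server_bucket(positive_keys: list[bytes], prefix: str) -> set[str]:
    # Server-side: return the suffix of every published key whose hash
    # falls in the requested bucket.
    digests = (hashlib.sha256(k).hexdigest() for k in positive_keys)
    return {h[PREFIX_LEN:] for h in digests if h.startswith(prefix)}

def seen_positive(key: bytes, suffixes: set[str]) -> bool:
    # Client-side: the final match happens entirely on-device.
    return hashlib.sha256(key).hexdigest()[PREFIX_LEN:] in suffixes
```

The client asks for one bucket per locally observed key; with a 4-hex-char prefix, each query hides the key among 2^16 possible buckets, and the phone only downloads the handful of suffixes in each bucket rather than the whole key list.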
Yikes, this is prep for Big Brother-style guilt by association. I wouldn't want to test positive for anything the state can track (radical ideas? you're now a positive in this system). Opt out.
This is the new electronic voting: making easy stuff more complicated and dangerous.<p>The problem is not a technological problem, it's a political one.
No clue who/what a Moxie is (presumably some guy), and it makes this thread's title seem even more absurd.<p>OP feeling like we all need to know what Moxie thinks about this reminds me of this Chappelle's Show skit (<a href="https://www.youtube.com/watch?v=Mo-ddYhXAZc" rel="nofollow">https://www.youtube.com/watch?v=Mo-ddYhXAZc</a>) about getting Ja Rule's hot take on current events.
Of course Google promises [1]:<p>"adhering to our stringent privacy protocols and protecting people's privacy. No personally identifiable information, such as an individual's location, contacts or movement, will be made available at any point."<p>[1] <a href="https://turnto10.com/news/local/privacy-advocates-raise-concerns-about-googles-mobility-report" rel="nofollow">https://turnto10.com/news/local/privacy-advocates-raise-conc...</a>
Finally, a decent use case for blockchain, and nobody is paying attention. It seems to make a lot more sense to reconcile location and proximity from a shared, user-controlled, anonymous ledger.
A modest proposal: since almost everyone is going to get this and a much smaller percentage is vulnerable, perhaps we should just use this system to track those who choose to register as vulnerable.