For the depressing truth on the crypto wars:
<a href="https://news.ycombinator.com/item?id=7757978" rel="nofollow">https://news.ycombinator.com/item?id=7757978</a> (Crypto won't save you either [PDF])<p>...or to paraphrase Jeff Atwood: "I love crypto, it tells me what part of the system not to bother attacking"
The fake URL in a JavaScript comment in the JavaScript URI is a hilarious and neat trick.<p><pre><code> javascript://bishopfox.com/research?%0d%0aalert(1)
</code></pre>
gets interpreted as:<p><pre><code> //bishopfox.com/research?
alert(1)
</code></pre>
Fortunately, most browsers prevent you from pasting JavaScript URIs into the URL bar these days.<p>It's a little surprising Apple overlooked not one but two fairly obvious major holes: allowing JavaScript URIs, and the lack of a same-origin policy. I wonder how many other applications are similarly vulnerable.
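For anyone who wants the decoding step spelled out, here is a minimal sketch in plain JavaScript (nothing beyond the standard decodeURIComponent) showing how the percent-encoded CR/LF turns the rest of the "URL" into a comment line followed by an executable line:<p><pre><code> const uri = "javascript://bishopfox.com/research?%0d%0aalert(1)";
 // Drop the scheme, then decode %0d%0a into a real carriage return + line feed.
 const body = decodeURIComponent(uri.slice("javascript:".length));
 console.log(body);
 // Prints:
 // //bishopfox.com/research?
 // alert(1)
</code></pre>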
This is the article that years ago convinced me it's not worth obsessing about my own technological privacy: <a href="http://www.gaudior.net/alma/johnny.pdf" rel="nofollow">http://www.gaudior.net/alma/johnny.pdf</a><p>I despise the "if you have nothing to hide..." argument for the surveillance state. And I argue against it every chance I get.<p>But, practically speaking, I <i>don't</i> have much to hide. I also realized that one can draw <i>more</i> attention to oneself by taking drastic measures to preserve one's own privacy.<p>I know, citation needed... I believe FB (or a related party) released some research about detecting "holes in the social network". Browser fingerprinting is another front on which I've probably made myself more unique to trackers.
Man, that's depressing. It's fairly easy to prevent this particular kind of injection: you just have to add a Content Security Policy to the HTML page. The appropriate value for web pages running from file://, with no expectation of downloading and executing remote JavaScript, is `script-src 'self';`<p>Really sad to see that Apple is using embedded web views without these sorts of basic protections. I bet worse exploits than this are possible, given that they probably expose parts of the Objective-C layer through the JavaScriptCore bridge.
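As a rough sketch (assuming the web view renders a local HTML file and therefore can't rely on HTTP response headers), the policy can be declared directly in the page with a standard meta tag; since 'unsafe-inline' is absent, injected inline script and javascript: navigation should be refused:<p><pre><code> <!-- Hypothetical local page loaded by the embedded web view -->
 <meta http-equiv="Content-Security-Policy" content="script-src 'self'">
</code></pre>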
It looks like the code was pulled from GitHub:<p><a href="https://github.com/BishopFox/cve-2016-1764" rel="nofollow">https://github.com/BishopFox/cve-2016-1764</a>
In the case of Android, all you need is an application that can read notifications (i.e. one that has been granted notification access / accessibility permissions). E.g. all WhatsApp messages go through it...
I had a similar thought with WhatsApp's Signal announcement. I believe that on iOS, by default all WhatsApp messages are backed up to iCloud Drive. So that would seem to be an easier attack vector.