https://www.washingtonpost.com/national-security/2025/03/26/trump-signal-chat-war-plan-texts-released/<p>By Alex Horton and Missy Ryan<p>"the conversation that occurred over an unsecure, commercially available messaging platform."<p>My understanding has been that Signal is actually well out ahead of other platforms in terms of respecting user privacy, so this seems confusing to me. Has Signal failed an audit that I'm unaware of?
Unsecure in terms of being vulnerable to state spying on cell phones. Not in terms of network interception, but rather compromised phones, where a foreign adversary can read all of your phone's data.<p>From this perspective, <i>all</i> phones are insecure. Classified government stuff isn't ever supposed to be on commercial smartphones in the first place.<p>The kind of security Signal provides is sufficient for people who aren't active targets of foreign states.
"Secure", particularly when used in the casual general public sense, is a pretty overloaded term. All real security is in the context of a specific threat profiles, and makes tradeoffs vs other required functionality. Signal is definitely "secure" in the sense of its core cryptography and design, and it's aimed to be of practical value to the global general public. But that requires being able to scale massively, making authentication more convenient and leaving more up to the users, who won't tend to have their own sophisticated centralized auth system, IT support, and constant life/safety critical stuff being thrown around. Signal provides tools that can be used for better assurance in who you're talking to but it doesn't simply take that out of users' hands entirely because for its use case that simply isn't feasible.<p>For small vetted group top secret conversations by a sophisticated organization, it makes more sense to have something where inviting anyone who hasn't already been brought into the magic circle with physical interaction is simply impossible. If technically unsophisticated users are important, ideally one would have fully vetted tech support who will be monitoring all participants and doing the verification work for them. All managed via central systems and heavily walled off with multiple layers from crossing between high and low sides. If they want to talk to the general public, they should use physically different devices. Worse scaling, far more friction, but that's OK for top levels of a big organization in the context of extremely sensitive information.<p>Signal is a tool and a decent one, but no tool is good for absolutely everything and trying to use a hammer as a saw isn't a defect in the hammer it's a problem with the user/organization trying to do something so foolish.
For the threat profile of top leadership of the US government, yes, Signal is not secure. Signal runs on phones and phones can be compromised or lost, which can grant non-authorized individuals the ability to read the messages.<p>Spyware like Pegasus [0] has been able to use zero-click exploits to penetrate target phones and read messages as though they were the phone's owner.<p>The US has the best SigInt capacity in the world. The leaders of the US government know that phones are not secure against sophisticated adversaries and they know that we have very sophisticated adversaries. It's deeply troubling that so many of our leaders were so comfortable discussing Secret level plans in such a reckless and illegal way, and it's extremely likely that hostile adversaries have fly-on-the-wall level access to extremely sensitive US planning.<p>[0] <a href="https://en.wikipedia.org/wiki/Pegasus_(spyware)" rel="nofollow">https://en.wikipedia.org/wiki/Pegasus_(spyware)</a>
When you work for most public corporations, you aren't allowed to bring personal devices linked to company servers to specific countries. You need to bring a burner device instead, because you are perceived as a target for corporate espionage.<p>This is like that, except the government, and the kind of people on this list, are even bigger targets for espionage on their personal devices. The government has strict rules about secrecy and communication for military operations, and strong punishments for not following those protocols, because violations can lead to loss of life.<p>This is a different sort of "unsecure". The platform itself may be "secure", but the device, being in public where someone could take a picture of military secrets, etc., isn't.
Just guessing but perhaps what they meant was that Signal allows one to invite anyone into a chat thread whereas their actual in-house classified comms will not permit that without going through a massive chain of approvals and being assigned custom hardware.
We publicly know about tools like Pegasus and its competitors Predator and Hermit, and I would confidently assume there are hundreds of other tools that don't publicly advertise themselves (they all might be using the same handful of 0days for all we know).<p>There are multiple public price lists for 0days; Crowdfense currently lists an iOS full zero-click full chain at $5m-$7m.<p>And that's a long way of saying: that's correct, it's insecure. For the price of $7m, any adversary of the US (or friendly country, who cares) can read all these government messages (who knows how many more Signal groups exist without the Atlantic editor in them).<p>That would be the cheapest way to get US confidential information in the history of spy agencies. The NSA budget is $10B per year.<p>The assumption anyone should make is: everything on my iPhone or Android phone can be read for $7m, and the conversations I'm having in front of my iPhone can be recorded for $7m. Then the only question left is whether the information is worth more than that.<p>If the answer is yes, assume your phone is compromised, and only talk near it, or message using it, information you accept will become public.
End to end encryption doesn't make the <i>ends</i> secure, just the channel between them.<p>Not something the average Jane needs to worry about, but people discussing military action should.<p>Edit: if Jane's phone gets hacked, they're going to swipe her credit cards and send messages to all her whatsapp contacts asking to borrow money urgently and here's a convenient Revolut link*. Not exfiltrate her Signal messages.<p>* whatsapp thing is for real, the latest scam making the news around where I am.
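A minimal sketch of that channel-versus-ends distinction, using a generic cipher from the third-party cryptography package rather than Signal's actual protocol (assumptions: the package is installed, and a pre-shared key stands in for the real key agreement):

```python
# Illustration only: encryption protects the message in transit, but both
# endpoints necessarily hold the readable plaintext, so a compromised phone
# sees everything regardless of how strong the channel crypto is.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()                 # shared by the two "ends"
on_the_wire = Fernet(key).encrypt(b"meeting moved to 14:00")

print(on_the_wire)                          # what a network eavesdropper sees: opaque bytes
print(Fernet(key).decrypt(on_the_wire))     # what either endpoint (or malware on it) sees
```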
They're not saying it's not secure for normal conversation, but that it's not up to the national-security standards required for such conversations. It not being the proper tool for the job is what makes it "unsecure".
I think the "unsecure" is relative - instead of something in-house, locally hosted, and up to the required standards for classified information
The sentence applies the "unsecure" adjective directly to Signal as a "messaging platform", not to the phone itself or the wider context. Signal by itself is secure. No need to mince words here, the Washington Post is simply wrong.
Well, unsecure in the sense that a reporter was mistakenly added to a group chat they certainly should not have been in. A secure app in this context would prevent random people from being added to secure areas.
The threat model for using Signal is wider than what Signal can audit internally.<p>Audits of a Signal deployment, as opposed to the Signal software at some point in time, cover not just the app but also how it is installed, configured, patched, operated, monitored, etc. Likewise, they cover the full system: device, OS, network.<p>This stuff is supposed to run managed, especially at the level of the VP and SecDef. Ex: Are they running Signal patched this week or six months ago, such that a network attacker can use a software exploit to work around the crypto? Ex: Was an attack payload sent through one of the chats while the device of one of the people talking to the VP and SecDef was in Russia?<p>With the unmonitored auto-deletion, on who knows what device and network, the audit trails for external and internal crimes are being intentionally, recklessly, and illegally deleted. Managed detection and response, and post-crime investigations, are hard when you can't see.
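As a toy illustration of the "is the app even patched?" part of managed operation (the inventory, version numbers, and policy baseline below are all made up; this is not a real MDM API or real Signal release data):

```python
# Hypothetical fleet-compliance check: flag devices whose messaging client is
# older than a mandated baseline version.  Everything here is invented for
# illustration; a real managed deployment would pull this from its MDM.

MIN_VERSION = (7, 5, 0)  # assumed policy baseline, not an actual Signal release

def parse(version: str) -> tuple[int, ...]:
    return tuple(int(part) for part in version.split("."))

fleet = [
    {"device": "principal-phone", "signal_version": "7.5.2"},
    {"device": "staffer-phone", "signal_version": "6.41.0"},  # months behind
]

for entry in fleet:
    if parse(entry["signal_version"]) < MIN_VERSION:
        print(f'{entry["device"]}: out of policy ({entry["signal_version"]})')
```

Unmanaged personal phones have no equivalent of this, which is part of what "unsecure" means at that level.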
It is certainly insecure compared to the normal channels for sharing classified information. The US government maintains a network that is cut off from the rest of the internet (more or less; there are some nuances). The only way to access it is through a SCIF. So they aren't just using encryption for sending data, they are also using physical-layer security. You can't hack what you can't reach, after all.
My understanding - largely based on this person's blog - is that Signal is the best secure messaging app that exists today: <a href="https://soatok.blog/2024/07/31/what-does-it-mean-to-be-a-signal-competitor/" rel="nofollow">https://soatok.blog/2024/07/31/what-does-it-mean-to-be-a-sig...</a>
No, they're wrong. Signal is considered extremely secure, which is why journalists and governments use it. Some people like to criticize anything Trump does, right or wrong.<p>That being said, the Signal non-profit entity is located in the US, so it is probably subject to the same risks as WhatsApp and Messenger; namely, US courts compelling them to share data.
Signal responded to this directly on X: <a href="https://x.com/signalapp/status/1904666111989166408" rel="nofollow">https://x.com/signalapp/status/1904666111989166408</a>
In this case, assuming you are using Signal on iOS, the app could very well decide to send all the deciphered messages of targeted users (users that say a certain thing, or users with a certain name) to a 3rd party server.
If they wanted to be undetected in all cases, they could leak data via the timing of the network packets.<p>And they could do all that without even knowing it, just by using a compromised toolchain.<p>Long story short, unless the SW (the app, the OS, the toolchains) and the HW have been audited, you have no idea what's going on.
Just to reinforce what others are saying, security isn't a binary yes or no thing, it's on a continuum that has tradeoffs with usability, and where you want to be on that continuum depends on risk. There are things you could do to be more secure than Signal, but they would also be more difficult to use, and many of those things aren't about Signal itself, but the hardware and networks it is on.
If it is known that secret agencies are using Signal, then it is almost certain that other agencies are working to exploit that.<p>An obvious attack on Signal is to get one of your people a job working there, or to bribe/blackmail an existing employee, and have them install a backdoor or other exploitable code (maybe a secret weakening of the encryption?).
Nothing stops you from opening Signal in a bar and having a guy from the KGB sitting behind you reading the texts. Or, say, adding a rando to the group. In their context, that means it's unsecure.<p>The cryptography of Signal is not the issue.
Stupid question, if anyone still reads this thread:<p>Why do these oh-so-secure offerings allow any idiot to add you to a group chat without asking you if you approve?
Soatok wrote a good blog post about this that was discussed yesterday:
<a href="https://news.ycombinator.com/item?id=43471223">https://news.ycombinator.com/item?id=43471223</a> <i>The Practical Limitations of End-to-End Encryption</i> (41 points, 42 comments)<p>The gist is that there are potential threats that any end-to-end encryption cannot fully protect against. Signal is a good provider of that encryption, but there are other considerations to protect highly confidential data, and Signal often lures non-technical users into disregarding those.
I wouldn't trust any form of symmetric encryption to secure anything.<p>And I would bet that there used to be people in the govt that could have told you why.
It's missing the point of the story to focus on this aspect. The characters involved in this event were not using Signal because they thought it was secure. They used Signal because they intended to break the law and knew they were breaking it.
How could anyone know, unless they have contributed to Signal's repo?<p>Presumably within Signal, there are plenty of weak points. And certainly Signal's ability to modify their app as they please doesn't fit within the OPSEC guidelines.<p>The question is: why would one of the most powerful militaries on the planet use a consumer app, regardless of its reputation?<p>And the answer is: because the Trump administration is compromised.
come on dude.<p>"unsecured" as in "not a secure comms system managed and approved by the NSA", which for the US government is normally considered <i>a bad thing</i>.<p>for normal people who <i>don't</i> want the NSA to be managing their comms then Signal is approximately the best possible choice, along with not being a fucking idiot while using it.
1) Alex Horton and Missy Ryan do owe an apology to the Signal project. Their publication was incorrect and spread misinformation about one of the most secure platforms on Earth.<p>2) As for the government officials: I understand they used Signal 1) on government-issued devices, without a doubt running an NSA-built OS; 2) with a preinstalled Signal app, without a doubt audited by the NSA line by line; and 3) for tactical op information with a very near expiry date.<p>3) That "journalist", IMO, is guilty of high treason. They should have immediately notified the group of their presence, and they should not have published any of the secrets they accidentally became privy to. Even more, from a professional POV, the journalist's actions were deeply unethical. I dare say un-American, and definitely not something any US citizen should be expected to do.<p>4) The "deep state" is furious because they can't leak Signal chat messages. IMO, it's a good choice. They (the administration) just need to carefully audit the groups and distribution lists. That was a very bad call.<p>I personally will _continue_ using Signal, now with even more confidence.