The other members of the five eyes had better be careful about what they share with the U.S. while this is going on.
Public-key encryption, like the kind Signal uses, offers good security for most purposes; it's fantastic for credit card transactions, for example. The problem with using it for transmitting state secrets is that you can't rely on it for long-term secrecy. Even if you avoid MITM or other attacks, a message sent via Signal today could be archived in ciphertext and attacked ten years from now with the hardware and algorithms of ten years in the future. Maybe Signal's encryption will remain strong in ten years. Maybe it will be trivial to crack. If the secrets contained in that message are still sensitive ten years from now, you have a problem.
Anything sent with Signal needs to be treated as published with an unknown delay. If you're sharing intelligence with the U.S., you probably shouldn't find that acceptable.
Signal’s encryption algorithm is fine. The problem is the environment in which it runs: a consumer device connected to the general internet (and it’s hard to believe that someone who does this installs patches promptly). He’s one zero day or unwise click away from an adversary getting access to those messages and potentially being able to send them. Signal’s disappearing message feature at least helps with the former risk but runs afoul of government records laws.
The reason why the policies restrict access to government systems isn’t because anyone thinks that those systems are magically immune to security bugs, but that there are entire teams of actually-qualified professionals monitoring them and proactively securing them. His phone is at risk to, say, a dodgy SMS/MMS message sent by anyone in the world who can get his number, potentially not needing more than a commercial spyware license, but his classified computer on a secure network can’t even receive traffic from them, has locked down configuration, and is monitored so a compromise would be detected a lot faster.
That larger context is what really matters. What they’re doing is like the owner of a bank giving his drunken golf buddy the job of running the business, and the first thing he does is start storing the ledger in his car because it’s more convenient. Even if he’s totally innocent and trying to do a good job, it’s just so much extra risk he’s not prepared to handle for no benefit to anyone else.
At least in the case of the leak the culprit was the UX, no?
Suppose a user wants the following reasonable features (as was the case here):
1. Messages to one's contacts and groups of contacts should be secure and private from outside eavesdroppers, always.
2. Particular groups should only ever contain a specific subset of contacts.
With Signal, the user can easily make the common mistake of attempting to add a contact who is already in the group. But in this case the Signal UI autosuggested a new contact, displaying initials for that new contact that matched those of a current group member.
Now the user has unwittingly added another member to the group.
Note in the case of the leak that the contact was a bona fide contact-- it's just that the user didn't want that particular contact in that particular group. IIRC Signal has no way to know which contacts are allowed to join certain groups.
I don't know much about DoD security. But I'm going to guess no journalist has ever been invited to access a SCIF because they have the same initials as a Defense Dept. employee.
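The missing feature described above — Signal not knowing which contacts are allowed in which groups — could in principle be a simple client-side check. This is a purely hypothetical sketch (the group name, numbers, and function are all invented; Signal implements nothing like this):

```python
# Hypothetical per-group allowlists; Signal has no such concept.
ALLOWLISTS: dict[str, set[str]] = {
    "planning-group": {"+1-555-0100", "+1-555-0101"},
}

def add_member(group: str, contact: str, members: set[str]) -> None:
    """Refuse to add a contact who isn't on the group's allowlist."""
    allowed = ALLOWLISTS.get(group)
    if allowed is not None and contact not in allowed:
        raise PermissionError(f"{contact} is not allowed in {group}")
    members.add(contact)
```

With a guard like this, an autosuggested lookalike contact would be rejected at add time instead of silently joining the group.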
>> The problem is the environment in which it runs
Too deep. The problem is the physical environment, the room in which the machine displays the information. Computer and technological security mean nothing if the screen displaying the information is in a room where anyone with a camera can snap a pic at any time.
What other type of encryption would you use for state secrets? You seem to be implying that governments and three-letter agencies use some vastly superior cryptographic scheme, whereas AFAIK Signal is as close to the state of the art as it gets.
Also, to be clear, Signal doesn't use public-key cryptography in the naive way (i.e. to encrypt/decrypt messages) as was/is possible with RSA. It uses asymmetric key pairs to first do a Diffie-Hellman key exchange, i.e. generate ephemeral symmetric keys, which are then used for encryption/decryption. This then also guarantees forward secrecy, see https://signal.org/blog/asynchronous-security/ . (Add to that they incorporate an additional post-quantum cryptographic scheme these days, and I'm probably omitting a lot of other details.)
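Signal's actual handshake is X3DH over Curve25519 feeding the Double Ratchet; the toy finite-field Diffie-Hellman below (stdlib only, with deliberately tiny, insecure parameters) just illustrates the shape of the idea — asymmetric key pairs used to agree on an ephemeral symmetric session key:

```python
import hashlib
import secrets

# Illustrative group parameters. This prime is far too small for real
# use; real systems use Curve25519 or RFC 3526 groups.
P = 0xFFFFFFFB  # 2**32 - 5, a prime
G = 5

def keypair() -> tuple[int, int]:
    private = secrets.randbelow(P - 2) + 1
    public = pow(G, private, P)
    return private, public

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Each side combines its own private key with the other's public key;
# (G^a)^b == (G^b)^a mod P, so both arrive at the same shared secret.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)
assert alice_shared == bob_shared

# The shared secret is then run through a KDF (here just SHA-256 as a
# stand-in) to derive the symmetric session key used for the messages.
session_key = hashlib.sha256(str(alice_shared).encode()).digest()
```

Because the session keys are ephemeral and ratcheted forward, compromising a long-term key later doesn't retroactively expose old messages — that's the forward secrecy the linked Signal blog post describes.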
> Signal is as close to the state of the art as it gets.
For their use case, which requires communication between two (or more) arbitrary users who never communicated before among millions of users, running on cheap commodity hardware over wireless connectivity to the internet.
Leaving encryption aside, looking only at the network level, the DoD is capable of using a dedicated fiber line. Or rather a parallel fiber infrastructure.
The issue isn't the encryption. It's the insecure device it's running on. Nobody has to waste time cracking Signal if they have backdoored one of the computers at the endpoints. The US government categorically doesn't use unapproved hardware for secure communications. This is something the Secretary of Defense is supposed to know about.
Poking around it seems like pre shared keys are used for the secure stuff, so no public keys, no rsa. It isn't that signal isn't state of the art, it just makes compromises for usability.
Edit: I didn't state something perhaps I should have. Symmetric-key crypto is considered more secure because public-key crypto is more complicated, so there's more room for side-channel mistakes, and the computation needed to break public keys doesn't scale as fast with key size. I am not an expert, but that is what I've read.
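The "doesn't scale as fast" point can be made concrete with the standard security-strength equivalences (figures as published in NIST SP 800-57 Part 1; quoted from memory here, so treat as approximate):

```python
# RSA modulus size estimated to match an n-bit symmetric key
# (NIST SP 800-57 Part 1 equivalences). Note the super-linear
# growth on the RSA side as symmetric strength increases.
EQUIVALENT_RSA_BITS = {
    80: 1024,
    112: 2048,
    128: 3072,
    192: 7680,
    256: 15360,
}

# Doubling symmetric strength (128 -> 256 bits) needs a 5x larger
# RSA modulus, not a 2x larger one.
assert EQUIVALENT_RSA_BITS[256] // EQUIVALENT_RSA_BITS[128] == 5
```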
I believe much of the secret government communications are accomplished using layered secret encryption algorithms. Many of these are symmetric and have physical key loading accomplished by a guy with a gun.
came here to say similar. GGP is another great example of hn people jumping in to make comments without having even a basic understanding of what they're talking about. Frustrating as it spreads misinfo about security which is the last thing we need.
Even if Signal's encryption implementation is secure, the device on which it is running probably doesn't satisfy TEMPEST requirements. Most consumer crypto is vulnerable in some way to a side-channel attack.
None of that matters if Signal is running on what is effectively a personal device connected to the internet. That device is now the weak link and is what intelligence agencies in many countries are now probably trying to get into.
> Anything sent with Signal needs to be treated as published with an unknown delay.
Oddly they have thought of that already, to the point all encryption systems in use in the gov are thought of in these terms.
All that matters are the different assumed times to publication (weeks to years), and then treating the strength of measures involved differently based on what is reasonable for the given use.
If you absolutely need something to never be published then encryption isn't the solution, and nor are computers generally.
It's the entire mandate of the NSA's Utah Data Center. Archive all the world's encrypted data until such a time as it can be decrypted when either the algorithms have been cracked or machines are powerful enough to brute-force.
Concur. This is part of why Suite A ciphers (algorithms) exist, and why robust key management practices are so important (this includes hardening of devices to prevent leakage of signals that could compromise those keys or the cryptographic processes themselves).
You shouldn't share state secrets with the US. They will be on or transferred between misconfigured cloud accounts. Some agency will eventually get authorization for analysis of them with an intention of financial espionage. The probable or confirmed loss of them will serve as a plausible deniability for the US when it misuses them.
Isn't that true for basically everything though? I'm not familiar with what other encrypted messaging systems security agencies use, but either (1) they store ciphertexts that can in theory be attacked later or (2) they delete their data after some time, but Signal has that option as well.
Obviously using signal here is a terrible opsec failure, I'm just not sure how what you are saying changes anything
I worked at a videoconferencing hardware/software company. We provided systems to US government offices like the NSA and State Department, including an input for which they gave us hardware specs but told us nothing else; the customer did the final testing on it to make sure it worked as specified. We assumed it was for some sort of encryption method, of which they revealed to us as little as possible; the hardware engineers who saw it tested only saw a large, portable black box. Otherwise our system used the standard encoding/decoding methods of the day in the 1990s.
The most secure method of communication is a one-time pad, a pre-shared private key.
"A one-time pad (OTP) is considered theoretically the most secure method of communication — when it’s implemented correctly. That means:
1. The key (pad) is truly random.
2. The key is at least as long as the message.
3. The key is used only once.
4. The key is securely shared in advance and kept completely secret.
When all these conditions are met, a one-time pad provides perfect secrecy — an eavesdropper cannot learn anything about the message, even with infinite computing power."
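The four conditions above are visible in a toy sketch (illustration only; real OTP systems live or die on key generation, distribution, and destruction, not on the XOR itself):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # Conditions 1 and 2: a truly random key exactly as long as the
    # message (secrets uses the OS CSPRNG).
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# Conditions 3 and 4 (use the key once, share it securely in advance)
# are operational, not mathematical -- they can't be shown in code.
key, ct = otp_encrypt(b"ATTACK AT DAWN")
assert otp_decrypt(key, ct) == b"ATTACK AT DAWN"
```

Since every possible plaintext of the same length corresponds to some equally likely key, the ciphertext alone carries no information about the message, which is exactly the "perfect secrecy" claim in the quote.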
There are significantly fewer concerns about symmetric encryption, and while it doesn't scale to the size or budget of a service like Signal, it's exactly the type of thing the military is good at:
Distribute a bunch of physical artifacts (smartcards) across the globe; guard a central facility (a symmetric key exchange center) extremely well etc.
The military can also afford to run its (encrypted or plaintext) communications over infrastructure it fully controls. The same isn't true for a service provided out of public clouds, on the public Internet.
Signal's crypto is quite good. The problem with it is that it has zero authorization functionality, otherwise the government could use something like Signal internally. The lack of military-grade IM solutions is a problem.
> Even if you avoid MITM or other attacks, a message sent via Signal today [...]
That's not the threat model. The threat model is that Signal is a tiny LLC making an app on behalf of a foundation and open source software project. It's a small group of human beings.
Small groups of human beings can be coerced or exploited by state-level actors in lots of ways that can't feasibly be prevented. I mean, if someone walks up to you and offers $2M (or blackmails you with whatever they found in your OneDrive, etc...[1]) to look the other way while you hand them your laptop, are you really going to say no to that keylogger? Would everyone?
At scale, there are auditing techniques to address this. The admins at e.g. github are (hopefully) not as vulnerable. But Signal is too small.
[1] Edit: Or, because let's be honest that's the scale we're playing at: straight up threatens to Novichok you or your family.
There’s a million threats. These are not particularly bright people. They are busy and not aware of or concerned with much beyond limiting their own accountability for when they inevitably get burned by their bosses.
You and I know that. So do the adversaries. The biggest issue for them is going to be not tripping over the intelligence collecting agencies (or corps) already on their devices.
> The other members of the five eyes had better be careful about what they share with the U.S. while this is going on.
Right, but this is nothing new: Hegseth is only a recent example of Trump's camp mishandling sensitive docs; I'll bet there's been an inner secret Four Eyes group since the Mar-a-Lago bathroom official-document-archive story dropped years ago.
What surprises me is that I expected Tulsi Gabbard to be the centre of mishandling allegations, not SecDef.
It would be gauche to attack Tulsi Gabbard, because you would have to start with her connections to Russia's Assadist interests and to RT & the Russian web brigades, and we have established (we voted on it) that any connection to Russia is Old News and Not A Big Deal and Hillary's Dirty Tricks. But Hegseth? Hegseth leaked something to The Atlantic. A far greater threat than Russia.
The circus has only been in town for a few months. This thing is, as scandals go, an almost comically dumb scenario - even by Trump whack pack standards.
Tulsi is by all appearances more experienced in operating under the radar. That said, I’m sure she won’t disappoint.
Even before this, Ukraine learned painfully that it shouldn't share every plan and every detail with the US. It kind of looks like a sad, self-fulfilling prophecy. Ukraine makes a plan, some details get leaked stateside, and the plan goes disastrously. Ukraine plans another operation, doesn't say anything, and the plan goes off OK. The US feels betrayed, Ukraine looks like an ungrateful ally abusing trust, and the relationship is strained. The election happens and Trump points at how they're a bad partner, yadda yadda. Ukraine is blamed for the outcome of what is originally an American problem: the US leaking like a sieve.
Signal has been used widely in US intelligence for many, many years. Nothing about this is new, though perhaps people that never paid attention are just now becoming aware of it. As for the rest of Five Eyes, they use WhatsApp the same way. I’m not sure that WhatsApp would be considered an improvement.
It is clear there is a gap between how people imagine this works, or should work in theory, and how it actually works.
They're paying attention to Signal now because Hegseth doesn't know his ass from his elbow when it comes to tech and secrecy, instead acting like someone who has watched too many action films and thinks those are just like real life. The problem is not Signal. The problem is incompetence. Plain and simple. Because he blindly added people to the group who probably didn't belong there, we now have the infamous "we have OPSEC" line, but instead of questioning why this idiot still has a job anywhere near the intelligence agencies, we're wasting our breath scrutinizing what is easily one of the best options for secure comms if the user understands how it works.
"To be clear, this is not to suggest that in similar circumstances, a person who engaged in this activity would face no consequences. To the contrary, those individuals are often subject to security or administrative sanctions. But that is not what we are deciding now."
What she did was wrong. There is no doubt about it. It needed to be investigated and dealt with accordingly. But let’s not pretend that the Secretary of State mishandling classified docs is at all similar or related to the Secretary of Defense sharing upcoming attack plans and actively circumventing information security, ESPECIALLY after the outcry and investigation of the Secretary of State.
But it’s not hypocritical of our country to want to improve our government officials and not for them to stagnate or slip backwards.
The two situations are not actually (legally) equivalent. One huge difference is that Hegseth et al. are setting communications to auto-delete, which is against record-keeping statutes (there is no evidence Clinton purged e-mails).
Also: her email was at no time intended for classified materials and would have had all the safeguards that are now being circumvented in place.
Every single sender and recipient (excluding bcc) was aware or could have been aware that she was not using a .gov email address and is somewhat complicit or tacitly ok with her using that server.
Occasionally previously unclassified materials can later be deemed classified, or there can be a data spill where a sender transmits classified information and recipients need to participate in deletion, investigations, etc.
I agree that her using an external server was bad but it was also in plain sight the whole time.
> In looking back at our investigations into mishandling or removal of classified information, we cannot find a case that would support bringing criminal charges on these facts. All the cases prosecuted involved some combination of: clearly intentional and willful mishandling of classified information; or vast quantities of materials exposed in such a way as to support an inference of intentional misconduct; or indications of disloyalty to the United States; or efforts to obstruct justice.
Whataboutism is when you bring up something about person A, then the only argument against it is something relating to person B.
For example, when you point out the call the president made to the secretary of state in Georgia begging him to "find" 11,780 votes. Then, without a great excuse, the other person brings up Biden's mental decline.
Both true, both concerning, but the reply just being blatant and desperate misdirection.
...no it isn't? Whataboutism is when you redirect attention from issue #1 to unrelated issue #2 in an attempt to change the conversation topic: "forget that, look at this!"
OP's comment was pointing out the similarities between issue #1 and issue #2. There's no dismissal.
Valid concerns about op-sec and personal responsibility aside, I think this is another example of why "security at the expense of usability comes at the expense of security". Official DoD communications equipment sucks, so people use the less secure, more usable encrypted communications platform when they feel they can get away with it.
Maybe the DoD should work on developing some internal Android and Signal forks that focus on adding additional critical security controls without impacting usability. There's an obvious desire path here.
That's possible I suppose, but do you have any evidence of that or is it just your personal biases causing you to assume the worst motivation you can imagine must be the correct one?
I know personally that given the choice I'd probably rather use Signal than whatever messaging system the DoD contractors managed to come up with. And private conversations between senior military officials over encrypted DoD communication channels probably aren't FOIAable anyway.
It's not just this. Security involves compromises and trade-offs. Humans will be stupid humans and re-use passwords, install better but insecure software, not ever update, etc. It's an old story.
In the year 2025, if communication with any other human on the globe isn't as simple as opening and app and typing, then people will find another way because there are about a thousand better ways.
So I doubt they are trying to get away with anything. They're just preferring the trivial option over the option that probably involves a physical token or slow biometrics or 15-second logout or whatever arduous security features the government comms probably have. Just like any human would.
Perhaps this will force the government COMSEC people to re-evaluate their practices.
Updated to add: I'm not defending their practices, just giving a likely explanation. Blaming the users is not always the best way to evaluate a security failure.
It's partly true. They even had Android that runs on General Dynamics OKL4 and Green Hills' INTEGRITY RTOS. Signal could be ported to that. They could fund their own separation layer if they wanted which any vendor could use.
I think big companies' influence on purchasing decisions (aka corruption) drives a lot of this.
The dude has a staff of 30 people whose whole job is to connect him to literally anyone he wants to communicate with -- you're telling me that the usability of a concierge service with more than two dozen staffers is inferior to using Signal in a building with shitty cell service?
I've wondered about this quite a bit and imagine there's got to be a "telephone" (the game of message distortion) like aspect where if some of the communication was explained, even a little push back might change the outcome. For example, a human intermediary presented with "send these details to these people" might get a "are you sure this person should have access to this?" ultimately preventing a bad/illegal action. People avoiding this kind of accountability, even just to a communications staffer, seems like it would have to be to reduce the subtle steering that happens when people are faced with conflict they don't want to, or have run out of psychological budget to, address.
If you're going to put a guy in charge who is completely unqualified and has a history of alcohol abuse you should at least make sure he's competent. It's actually very grating to see someone operating at this highest level of authority and treating it like its beneath them. It feels like we're watching history get written by the most entitled and inept among us.
What kind of tickles me is that any new poltical thriller tv series or movie that posits that matters of state in the US are conducted by serious and knowledgable people is now virtually unwatchable for me. It's virtually impossible to suspend the disbelief required to enjoy something that is so far removed from the reality of today's politicians.
(The recent cringe inducing Deniro series comes to mind)
Yeah it has ruined the old zombie movies where the government ends up the bad guy but a competent bad guy that makes unilateral decisions like quarantining a city or bombing a civilian center to contain infection. I am pretty sure they are gonna all be either 1) running around panicking with the rest of us 2) infighting and useless 3) denying the truth before their eyes if such a catastrophic crisis ever were to happen.
On the other hand, the UK Spitting Image puppet series of sketches The President's Brain is Missing holds up remarkably well, due to being about Reagan.
I think your suspicions are wrong. Yes, there are plenty of less competent people in history, but not to this level. Trump is an imbecile who just happens to be good in one direction (bullying and manipulation). Put him in an escape room on his own and he'd die in there. While I can think that many previous UK leaders have not been genius-level, I can't think of one who was anywhere near as stupid and belligerent as Trump. Other than maybe Liz Truss, who is thick as two short planks, but fortunately was ousted quickly. That won't be happening with Trump, and the US (and the larger world) will pay for these pocket-lining morons' mistakes.
Should be a disqualifier for US security clearance.
He is easily manipulable: either give him booze to pry out secrets, or exploit "booze mind" in 1-on-1 conversations. Plus the huge slip-up of using Signal #signalgate.
One skirts the official tools like this to prevent accountability from a written record. Completely sensible if you're planning to be judged for your actions.
For a High-Tech President, a Hard-Fought E-Victory
For more than two months, Mr. Obama has been waging a vigorous battle with his handlers to keep his BlackBerry, which like millions of other Americans he has relied upon for years to stay connected with friends and advisers. (And, of course, to get Chicago White Sox scores.)
He won the fight, aides disclosed Thursday, but the privilege of becoming the nation’s first e-mailing president comes with a specific set of rules.
“The president has a BlackBerry through a compromise that allows him to stay in touch with senior staff and a small group of personal friends,” said Robert Gibbs, his spokesman, “in a way that use will be limited and that the security is enhanced to ensure his ability to communicate.”
[...]
The presidency, for all the power afforded by the office, has been deprived of the tools of modern communication. George W. Bush famously sent a farewell e-mail message to his friends when he took office eight years ago.
While lawyers and the Secret Service balked at Mr. Obama’s initial requests to allow him to keep his BlackBerry, they acquiesced as long as the president - and those corresponding with him - agreed to strict rules. And he had to agree to use a specially made device, which must be approved by national security officials.
"Some of the classified emails found on former secretary of state Hillary Clinton’s home server were even more sensitive than top secret, according to an inspector general for the intelligence community."
UK comparison: https://www.bbc.co.uk/news/uk-england-london-47996907 ; a black woman MP has a canned drink on a train, and everyone treats it as a massive scandal. I think someone had a survey once where they found that one third of all hate mail and death threats directed at UK MPs was aimed at her.
Let's pretend you work for a non-US state intelligence agency. How would you find Hegseth's personal computer in his office on the public Internet? A genuine thought experiment.
Yep. The CIA uses these same techniques to track foreigners of interest (e.g. Putin's entourage) so we should assume other countries are attempting to use similar techniques on American officials.
I would make Witkoff sit on his ass in a hotel for 8 hours while my team one room over wirelessly breaks into his phone and gets into those Signal chats.
This assumes that the patsy needs to never discover that his device was compromised.
I'm guessing there are a few scenarios where they could be tortured / blackmailed into compliance, even if it meant that the DoD would know about it in a day or two, and it would still be worth it.
E.g., shortly before a real fight over Taiwan began.
I really, really hope Hegseth gets his OPSEC act together, yesterday.
You would just send him a link in a Signal message. His phone number is widely known and he has Signal installed on his desktop computer.
Signal’s protocol secures the message in transit. But their desktop app may or may not have client-side vulnerabilities. And if he clicks a link, you’re out of Signal and into the browser. If the link downloads a file, you’re into the OS.
There is a Signal social engineering vulnerability where the attacker gets people to click a link that links the attacker's device to the target's Signal account.
Compromise the device of one of his contacts and send him a juicy link via Telegram that renders "Error: Not viewable on mobile" when opened on a phone. Bonus points if the link carries a 0-day malware dropper.
I think a pretty good show would be something written like West Wing, where everyone takes themselves very seriously, but with rampant, blatant incompetence. Like, not funny at all. Nothing tongue in cheek, no winks to the audience. A drama of morons.
“Burn After Reading”: only adjacent to the halls of power, but a great portrayal of a majority of morons trying to impose themselves into a mostly-offscreen world of aghast professionals. It’s a harbinger.
a) bureaucrats' real comms setups (3 telephones, four monitors all sitting on the desk, versus mounted on arms/wall), full of clutter and sitting on an anachronism of a wood desk
and b) what you'd see in any "spy" movie: dark-mode graphics displaying fancy l33t charts on quad-monitor setups mounted on arms, probably in a low-light setting, and the bureaucrat doesn't look at the "small" monitors himself, his cronies do that; the only monitor he looks at is the single 136" on the wall used for teleconferencing with villains
I try to apply Hanlon's Razor to this administration, but it's hard not to occasionally entertain other explanations with the sheer volume of incompetence.
The same reason teenagers might use Instagram DMs to communicate about school projects - It's just the platform he's familiar with.
Or the same reason I have Whatsapp - communication in my social groups happens there, and if I don't have it I get left out.
Your explanations assume there is some deeper meaning, looking at the tradeoffs for each communication platform, and then coming to some rational conclusion. I don't think there's much evidence for that.
The people around Trump just happen to be used to using Signal to communicate, and if Pete doesn't get on board he gets left out.
I assume he copy pasted the message on his unsecured device.
How many apps had access to that text in his clipboard?
To me this isn't a technical problem with Signal, it's an opsec problem, and that's quite a lot harder to explain to people.
Also, to be clear, Signal doesn't use public-key cryptography in the naive way (i.e. to encrypt/decrypt messages) as was/is possible with RSA. It uses asymmetric key pairs to first do a Diffie-Hellman key exchange, i.e. generate ephemeral symmetric keys, which are then used for encryption/decryption. This then also guarantees forward secrecy, see https://signal.org/blog/asynchronous-security/ . (Add to that they incorporate an additional post-quantum cryptographic scheme these days, and I'm probably omitting a lot of other details.)
For their use case, which requires communication between two (or more) arbitrary users among millions who have never communicated before, running on cheap commodity hardware over wireless connections to the internet.
Leaving encryption aside, looking only at the network level, the DoD is capable of using a dedicated fiber line. Or rather a parallel fiber infrastructure.
About a month ago there was a discussion here saying Signal is preinstalled and widely used at the CIA.
https://news.ycombinator.com/item?id=43478091
It's also recommended by the government's cybersecurity agency CISA.
https://www.cisa.gov/sites/default/files/2024-12/guidance-mo...
Edit: I didn't state something I perhaps should have. Symmetric-key crypto is considered more secure because public-key crypto is more complicated, so there's more room for side-channel mistakes, and the computation needed to break public keys doesn't scale as fast with key size. I am not an expert, but that is what I've read.
Maybe it’s the servers that are the problem.
Oddly enough, they have thought of that already, to the point that all encryption systems in use in the government are evaluated in these terms.
All that matters are the different assumed times to publication (weeks to years), and then treating the strength of measures involved differently based on what is reasonable for the given use.
If you absolutely need something to never be published then encryption isn't the solution, and nor are computers generally.
https://en.wikipedia.org/wiki/Utah_Data_Center
https://en.wikipedia.org/wiki/NSA_Suite_A_Cryptography
You shouldn't share state secrets with the US. They will be on or transferred between misconfigured cloud accounts. Some agency will eventually get authorization for analysis of them with an intention of financial espionage. The probable or confirmed loss of them will serve as a plausible deniability for the US when it misuses them.
Obviously using signal here is a terrible opsec failure, I'm just not sure how what you are saying changes anything
"A one-time pad (OTP) is considered theoretically the most secure method of communication — when it’s implemented correctly. That means: 1. The key (pad) is truly random. 2. The key is at least as long as the message. 3. The key is used only once. 4. The key is securely shared in advance and kept completely secret.
When all these conditions are met, a one-time pad provides perfect secrecy — an eavesdropper cannot learn anything about the message, even with infinite computing power."
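A minimal sketch of the XOR construction behind a one-time pad, assuming the four conditions above hold (the sample message is made up):

```python
import secrets

# One-time pad sketch: XOR the message with a truly random pad of equal
# length, used exactly once. Perfect secrecy in theory; in practice the
# hard parts are generating, distributing, and never reusing the pad.
def otp_xor(data: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(data), "pad must be at least as long as the message"
    return bytes(m ^ k for m, k in zip(data, pad))

message = b"ATTACK AT DAWN"
pad = secrets.token_bytes(len(message))  # conditions 1 & 2: random, same length
ciphertext = otp_xor(message, pad)
assert otp_xor(ciphertext, pad) == message  # XOR is its own inverse
```

Which is exactly why the logistics described below (distributing physical pads, guarding a key-exchange facility) dominate the cost: the math is trivial, the key handling is not.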
Distribute a bunch of physical artifacts (smartcards) across the globe; guard a central facility (a symmetric key exchange center) extremely well etc.
The military can also afford to run its (encrypted or plaintext) communications over infrastructure it fully controls. The same isn't true for a service provided out of public clouds, on the public Internet.
That's not the threat model. The threat model is that Signal is a tiny LLC making an app on behalf of a foundation and open source software project. It's a small group of human beings.
Small groups of human beings can be coerced or exploited by state-level actors in lots of ways that can't feasibly be prevented. I mean, if someone walks up to you and offers $2M (or blackmails you with whatever they found in your OneDrive, etc...[1]) to look the other way while you hand them your laptop, are you really going to say no to that keylogger? Would everyone?
At scale, there are auditing techniques to address this. The admins at e.g. github are (hopefully) not as vulnerable. But Signal is too small.
[1] Edit: Or, because let's be honest that's the scale we're playing at: straight up threatens to Novichok you or your family.
You and I know that. So do the adversaries. The biggest issue for them is going to be not tripping over the intelligence collecting agencies (or corps) already on their devices.
Right, but this is nothing new: Hegseth is only a recent example of Trump's camp mishandling sensitive docs; I'll bet there's been an inner secret Four Eyes group since the Mar-a-Lago bathroom official-document-archive story dropped years ago.
What surprises me is that I expected Tulsi Gabbard to be the centre of mishandling allegations, not SecDef.
Tulsi is by all appearances more experienced in operating under the radar. That said, I’m sure she won’t disappoint.
Deleted Comment
It is clear there is a gap between how people imagine this works, or should work in theory, and how it actually works.
For lunch orders and office softball schedules. Not top secret information.
https://www.theguardian.com/us-news/2016/sep/02/hillary-clin...
https://www.theguardian.com/us-news/2016/jul/05/fbi-no-charg...
Also:
https://www.fbi.gov/news/press-releases/statement-by-fbi-dir...
"To be clear, this is not to suggest that in similar circumstances, a person who engaged in this activity would face no consequences. To the contrary, those individuals are often subject to security or administrative sanctions. But that is not what we are deciding now."
But it’s not hypocritical for our country to want its government officials to improve rather than stagnate or slip backwards.
The Legal Eagle channel did an analysis of the two situations, "Signal War Plans v.s. Hillary's Emails":
* https://www.youtube.com/watch?v=cw1tNTIEs-o
The two situations are not actually (legally) equivalent. One huge difference is that Hegseth et al. are setting communications to auto-delete, which violates record-keeping statutes (there is no evidence Clinton purged e-mails).
Every single sender and recipient (excluding bcc) was aware or could have been aware that she was not using a .gov email address and is somewhat complicit or tacitly ok with her using that server.
Occasionally previously unclassified materials can later be deemed classified, or there can be a data spill where a sender transmits classified information and recipients need to participate in deletion, investigations, etc.
I agree that her using an external server was bad but it was also in plain sight the whole time.
Hypocrisy indeed.
Whataboutism is when you bring up something about person A, then the only argument against it is something relating to person B.
For example, when you point out the call the president made to the secretary of state in Georgia begging him to "find" 11,780 votes. Then, without a great excuse, the other person brings up Biden's mental decline.
Both true, both concerning, but the reply just being blatant and desperate misdirection.
Dead Comment
OP's comment was pointing out the similarities between issue #1 and issue #2. There's no dismissal.
Maybe the DoD should work on developing some internal Android and Signal forks that focus on adding additional critical security controls without impacting usability. There's an obvious desire path here.
I know personally that given the choice I'd probably rather use Signal than whatever messaging system the DoD contractors managed to come up with. And private conversations between senior military officials over encrypted DoD communication channels probably aren't FOIAable anyway.
It's not just this. Security involves compromises and trade-offs. Humans will be stupid humans and re-use passwords, install better but insecure software, not ever update, etc. It's an old story.
In the year 2025, if communication with any other human on the globe isn't as simple as opening an app and typing, then people will find another way, because there are about a thousand better ways.
So I doubt they are trying to get away with anything. They're just preferring the trivial option over the option that probably involves a physical token or slow biometrics or 15-second logout or whatever arduous security features the government comms probably have. Just like any human would.
Perhaps this will force the government COMSEC people to re-evaluate their practices.
Updated to add: I'm not defending their practices, just giving a likely explanation. Blaming the users is not always the best way to evaluate a security failure.
I think big companies' influence on purchasing decisions (aka corruption) drives a lot of this.
(The recent cringe inducing Deniro series comes to mind)
I suspect this is somewhat common in history (this is not meant to excuse it), but we can’t tell because those people still wrote the narrative.
Should be a disqualifier for US security clearance.
He's easily manipulable - either give him booze to pry secrets or encourage "booze mind" in 1-on-1 conversations to pry secrets. Plus the huge slip-up of using Signal #signalgate.
One skirts the official tools like this to prevent accountability from a written record. Completely sensible if you're planning to be judged for your actions.
For a High-Tech President, a Hard-Fought E-Victory
For more than two months, Mr. Obama has been waging a vigorous battle with his handlers to keep his BlackBerry, which like millions of other Americans he has relied upon for years to stay connected with friends and advisers. (And, of course, to get Chicago White Sox scores.)
He won the fight, aides disclosed Thursday, but the privilege of becoming the nation’s first e-mailing president comes with a specific set of rules.
“The president has a BlackBerry through a compromise that allows him to stay in touch with senior staff and a small group of personal friends,” said Robert Gibbs, his spokesman, “in a way that use will be limited and that the security is enhanced to ensure his ability to communicate.”
[...]
The presidency, for all the power afforded by the office, has been deprived of the tools of modern communication. George W. Bush famously sent a farewell e-mail message to his friends when he took office eight years ago.
While lawyers and the Secret Service balked at Mr. Obama’s initial requests to allow him to keep his BlackBerry, they acquiesced as long as the president - and those corresponding with him - agreed to strict rules. And he had to agree to use a specially made device, which must be approved by national security officials.
"Some of the classified emails found on former secretary of state Hillary Clinton’s home server were even more sensitive than top secret, according to an inspector general for the intelligence community."
even Bush fooled everyone into thinking he was literate (save for the two times he held books upside down) while in office.
Deleted Comment
Deleted Comment
https://news.sky.com/story/trumps-fixer-was-made-to-wait-eig...
His personal PC? Send Big Ballz his way to do some upgrades
https://www.npr.org/2025/04/15/nx-s1-5355896/doge-nlrb-elon-...
maybe a free Starlink dish
https://www.nytimes.com/2025/03/17/us/politics/elon-musk-sta...
I'm guessing there are a few scenarios where they could be tortured / blackmailed into compliance, even if it meant that the DoD would know about it in a day or two, and it would still be worth it.
E.g., shortly before a real fight over Taiwan began.
I really, really hope Hegseth gets his OPSEC act together, yesterday.
Signal’s protocol secures the message in transit. But their desktop app may or may not have client-side vulnerabilities. And if he clicks a link, you’re out of Signal and into the browser. If the link downloads a file, you’re into the OS.
Title:”0-click deanonymization attack targeting Signal, Discord, other platforms”
Maybe not 0-click anymore, but it still applies if the user is browsing the internet.
Yes, I should have thought of that old and obvious one. It opens up a universe of possibilities.
Get me inside the minds of these freaks.
Deleted Comment
a) bureaucrats' real comms setups (3 telephones, four monitors all sitting on the desk – versus mounted on arms/wall) full of clutter and sitting on an anachronism of a wood desk
and b) what you'd see in any "spy" movie with dark-mode graphics displaying fancy l33t charts on quad-monitor setups mounted on arms, probably in a low-light setting, and the bureaucrat doesn't look at the "small" monitors himself, his cronies do that; the only monitor he looks at is the single 136" on the wall used for teleconferencing with villains
is hilarious
1) He is avoiding some sort of corrupt signals intelligence folks from knowing what he's working on.
2) He is avoiding the government catching him in some corruption by avoiding the official records act.
Anything else?
Or the same reason I have Whatsapp - communication in my social groups happens there, and if I don't have it I get left out.
Your explanations assume there is some deeper meaning, looking at the tradeoffs for each communication platform, and then coming to some rational conclusion. I don't think there's much evidence for that.
The people around trump just happen to be used to using signal to communicate, and if Pete doesn't get on board he gets left out.
We have to assume malicious intent. These people could start a nuclear war. They get zero flexibility or grace.