This document is classified U//FOUO (Unclassified//For Official Use Only). The actual abilities of the FBI/NSA and similar agencies are surely classified at some higher level.
The real question is whether you actually believe what this document says about Wickr.
Wickr is set up exactly the way you would expect a honeypot to be set up. So for the people who don't understand that, or aren't willing to, I'm wondering whether this document validates them, or whether skepticism about this document's classification level validates the idea that Wickr should be avoided for sensitive communications.
Considering that in 2021 endpoints are riddled with holes like Swiss cheese and third-party apps are all compromised, people should operate on the belief that they cannot trust anything they didn't write and build themselves.
U//FOUO is an obsolete caveat. It has been replaced by CUI (Controlled Unclassified Information).
If this information has been publicly released, I would assume that it does not comprehensively list all of the methods/sources that could be in use. Thus, I would not trust this document to be accurate.
You make some good points, but I need to point out that hacking a phone to obtain message contents is different from serving a warrant to a third party, both legally and practically.
> If you root-hack a phone you can easily get all messages the user sees after you hacked it.
This is not even necessary if you're in bed with hardware manufacturers, which could be the case with governments.
All that is needed is a system-level daemon running in the background, disguised as a service or device driver and pushed as a mandatory upgrade, with access to the underlying iron. The user employs super strong cryptography? Who cares, if we can read what is sent to the screen and tunnel it over the network to us. Passwords? Who cares, if we can sniff the touchscreen data to get the coordinates of the tapped points, which map directly to the positions of keys on the virtual keyboard.
The point is that there is zero protection against surveillance if the adversary has control over the hardware. Which is the reason why flashing a FOSS operating system onto a closed hardware/firmware platform, although indeed a good thing, isn't enough to claim victory.
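To make the touch-sniffing part concrete, here is a minimal sketch of the idea, assuming a rooted Linux/Android device where the kernel exposes touchscreen events via /dev/input/event* (the device path and everything else here is illustrative, not any vendor's actual implant):

    import struct

    EVENT_DEVICE = "/dev/input/event0"          # hypothetical touchscreen node; varies per phone
    EVENT_FORMAT = "llHHi"                      # struct input_event on 64-bit Linux:
    EVENT_SIZE = struct.calcsize(EVENT_FORMAT)  # timeval (2 longs), type, code, value

    EV_ABS = 0x03
    ABS_MT_POSITION_X = 0x35
    ABS_MT_POSITION_Y = 0x36

    with open(EVENT_DEVICE, "rb") as dev:       # needs root / system-level access
        x = y = None
        while True:
            _, _, ev_type, ev_code, ev_value = struct.unpack(EVENT_FORMAT, dev.read(EVENT_SIZE))
            if ev_type == EV_ABS and ev_code == ABS_MT_POSITION_X:
                x = ev_value
            elif ev_type == EV_ABS and ev_code == ABS_MT_POSITION_Y:
                y = ev_value
            if x is not None and y is not None:
                # Raw tap coordinates; mapping them onto the current on-screen
                # keyboard layout is just a lookup table away.
                print(f"tap at ({x}, {y})")
                x = y = None

Nothing exotic is required: with that level of access the cryptography never comes into play, which is exactly the point above.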
I'm skeptical of the accuracy of this document. Telegram is by default unencrypted and virtually public. Yet this document says the FBI can't get any message content?
> Telegram group chats are very much available to law enforcement if they can convince Telegram to hand over the data.
Telegram public channels and groups are open, including for law enforcement. They also say openly that they cooperate with everyone to take down certain illegal material from channels and open groups.
Telegram claimed as late as a few months ago - and nobody has proven otherwise in any form as far as I can see - that not a byte of their users' private data (i.e. data not in open groups or channels) has been handed over to any government.
I cannot prove this, and I'm also getting more careful with Telegram these days (this might come as a surprise to some of you who know my history of defending Telegram), but I still think:
1. if it were possible to prove otherwise, there are enough Telegram haters just on HN to make sure it got leaked
2. just to be clear, I still think it is a very good alternative for friend-to-friend communication, group communication etc.; I'm just looking for alternatives as I go forward, and I am also worried when I see police using it at work.
> if they can convince Telegram to hand over the data.
Or if they hack Telegram's servers. Or ask some other agency like the NSA (that hacks systems all the time) to do that for them.
As for the legal aspects, I'm fairly sure Telegram can be made to comply; no individual user is worth losing (tens/hundreds of) millions of customers in that particular country over. It's not like Telegram can't do that technically*: the server-side database encryption key is by definition in the RAM of the server system.
* That hasn't prevented them from actively misleading customers with their split-the-key-and-store-the-parts-under-multiple-jurisdictions scheme.
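To spell out why the split-key arrangement changes nothing, here is a toy sketch (stdlib Python, all values illustrative): however the shares are stored at rest, the server has to recombine them before it can decrypt anything, so the full key ends up in that server's RAM anyway.

    import os

    master_key = os.urandom(32)                     # the server-side database key

    # Split the key into two shares "stored in different jurisdictions"
    share_a = os.urandom(32)
    share_b = bytes(k ^ a for k, a in zip(master_key, share_a))

    # The moment the server needs to serve a chat it recombines the shares in memory...
    recombined = bytes(a ^ b for a, b in zip(share_a, share_b))
    assert recombined == master_key
    # ...and whoever controls or compromises that process can simply grab `recombined`.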
The document is about what can be easily requested from companies, not what can be hacked. Because Telegram hosts no servers in the US, the FBI can't trivially request it. They can get it in other ways, and of course by hacking, but the document isn't about what they can successfully hack or request through back channels. This is why people say that you have to trust Telegram, as opposed to fully E2EE systems (like Signal) which require (almost) zero trust.
It is because you and others got your facts wrong.
Telegram is not unencrypted. This is a lie spread by certain WhatsApp and Signal fanboys (not all - count me in with the Signal fans, I just happen to be a reasonable one who to some degree knows what I'm talking about) with the excuse that "of course we mean end-to-end-encrypted when we say encrypted".
What we see now is the resulting confusion: why doesn't law enforcement have access to it if it is virtually unencrypted? Well, the answer is that despite all the claims about how lousy the encryption is, for some weird reason[1] it doesn't seem to leak data.
Now that we have seen the confusion that stems from saying "encrypted means end-to-end-encrypted when we say it does", can we stop repeating that nonsense?
Also, can we think twice before mindlessly repeating such stuff in the future even if it was originally said by some extremely smart people that are well respected for good reasons?
Because those very smart people were the same who recommended WhatsApp for a long time until it became painfully clear to everyone that:
- WhatsApp leaks metadata to Facebook which cooperates happily with basically any government as far as I understand
- WhatsApp has uploaded unencrypted backups to Google Cloud (yes, probably over https, but Google got all your messages and it was known that they would datamine it)
- and more
PS: Some time around half a year before Telegram launched, WhatsApp actually sent data unencrypted, i.e. as plaintext. And they sent it over port 443!
PPS: Stay safe folks, opsec is probably more important than the exact messenger you use. My bets today are Signal in the short run and Matrix as soon as possible, but personally I send photos to my parents using Telegram and receive a lot more info back from various groups.
[1]: Meaning either this is a bigger honeypot than An0m and every Three Letter Agency, including both the FSB and the NSA, is in on it, or Telegram actually got something right. Or they have just been extremely lucky for 9 years in a row or something.
Because Telegram can access the messages. If the vendor can access the message data (i.e. it is not end-to-end encrypted), anyone can. That is the bar. E2E||GTFO.
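For anyone who wants the distinction spelled out, here is a minimal sketch (Python with the third-party `cryptography` package; everything here is illustrative and simplified, not Telegram's or anyone else's actual protocol):

    from cryptography.fernet import Fernet

    # Client-server ("cloud") encryption: the vendor holds the key.
    server_key = Fernet.generate_key()           # lives on the vendor's servers
    ct = Fernet(server_key).encrypt(b"hello")    # sent over TLS, stored server-side
    print(Fernet(server_key).decrypt(ct))        # the vendor, or whoever owns/hacks
                                                 # the server, reads it at will

    # End-to-end encryption: only the endpoints hold the key. (Real E2EE derives it
    # through a key-agreement handshake; a pre-shared key is used here for brevity.)
    peer_key = Fernet.generate_key()
    e2e_ct = Fernet(peer_key).encrypt(b"hello")
    # The server only ever relays e2e_ct and has nothing to decrypt it with.

In the first case the ciphertext protects you from the network but not from the vendor; in the second it protects you from both.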
I assume they are referring to Telegram "secret chats"? Also, their own website notes that all chats are "encrypted" by default. It's simply that they are not "end-to-end" encrypted by default unless you use secret chats, for instance.
That is the thousand dollar question. How would we know Telegram isn't an FSB front?
* The CEO isn't a developer
* We know practically nothing about their developers, they're all anonymous
* The server has access to the overwhelming majority of messages (among fellow CS students only ~10% said they use secret chats, and most likely even they don't do that for every 1:1 chat; furthermore, groups cannot be E2EE at all, and neither can Win/Linux desktop chats)
* Journalists who went to see Telegram's offices in Dubai found an empty office, and their office neighbors said they had never seen Telegram developers, or anyone else, enter the offices (https://www.youtube.com/watch?v=Pg8mWJUM7x4). They did speculate Telegram might be using Dubai for tax evasion.
* That being said, we know absolutely nothing about Telegram's financials; nothing official has ever been reported by the company. Yet the system manages to stay afloat year after year with 600M+ users.
I'd love to be able to give good reasons why Telegram can't possibly be an op, but hand-waved opt-in E2EE for some clients is the only one I can find, and that encryption has been most effective in online debates defending Telegram's bad security model.
It isn't "the KGB" anymore. In Russia it is now "the FSB". There are other KGBs in other countries such as Belarus but these aren't the KGB. (KGB is Russian for "Committee for State Security".)
"Along with its counterparts in Transnistria and South Ossetia,[1] it is one of the few intelligence agencies that kept the Russian name "KGB" after the dissolution of the Soviet Union, albeit it is lost in translation when written in Belarusian (becoming KDB rather than KGB)."
Telegram moved out of Russia precisely so they wouldn't be under its influence, so that's wild speculation. If the argument is that the KGB (or is it the FSB now?) might go and just put a gun to their head: they could do that to literally anyone anywhere, so it doesn't matter.
Telegram was in Berlin for a while but moved out of Germany for privacy/legal reasons as well (good call, considering that Germany is discussing trying to outlaw them) and moved to Dubai iirc.
I worked long enough in the telecom industry to know that there is no way regulators will leave major communication platforms without some sort of surveillance. They can't sleep without it, and they don't take "Oh, sorry, it's encrypted" as an answer.
I don't buy this. Maybe it's true about the FBI, but other agencies have the keys, for right or wrong reasons.
I was also in that industry and agree. We could rewrite the firmware on phones over the air, live, and this was long ago. The only reason we didn't do this for customers was the off chance we'd brick a phone and cause higher customer-support load. We could read and write anything on the phone remotely. Such capabilities would surely not be abandoned.
Since you do not [most likely] have root access to your phone, you cannot directly examine what Apple/Google has installed on _your specific_ phone. Any of these applications could have its memory examined transparently if the operating system is evil.
Yes, that actually highlights the absurdity of the government's stance on encryption.
They can still do an actual investigation on the person and try to dump data from the physical devices they find (whether or not it requires a court order first).
Of course, this is expensive, and often it is not possible to know who to go to, or whether they can access that person, or they don't yet have enough information to determine whether a crime has been committed. But that's the point. In the US, for example, the constitution was constructed specifically to be an alternative to how England and its colonies were governed.
Governments have had a small window of time where electronic communications had a combination of: existing, not being private, and being understood by law enforcement. That window is closing and is just a reversion to the mean.
Of course they are going to say "everyone using this convenient, poorly implemented system without privacy helps us greatly in investigations", but that's not the point. They have to do an actual investigation now and physically target the devices themselves, and even after all that it will only sometimes yield anything, depending on the OS version, the app used, and the app version.
Yes, and don't forget: Anyone who wants to steal your password can covertly look over your shoulder. They can also physically break into your house by smashing a window, or running a bulldozer through your wall.
Security always requires a certain amount of trust. If you can't get that trust, meet in person and keep your electronic communication vague.
It's reasonable to believe that at any point in time root exploits exist for both iOS and Android.
It's plausible that the FBI, or someone they cooperate with, has such exploits from time to time (which doesn't mean they are reliable, or cheap to use).
If you root-hack a phone you can easily get all messages the user sees after you hacked it.
Even without root hacking you might get some, in some circumstances.
EDIT: I should have read the article first; it's more about what content they get without hacking.
https://en.wikipedia.org/wiki/Expectation_of_privacy
https://en.wikipedia.org/wiki/Third-party_doctrine
https://en.wikipedia.org/wiki/Dragnet_(policing)
Telegram group chats are very much available to law enforcement if they can convince Telegram to hand over the data.
It could also be that Telegram (and any other foreign chat company) is more reluctant to (and more difficult to force to) share data with the FBI.
And Telegram most certainly cannot end-to-end encrypt group chats.
Even your private messages only require you to enter an SMS code to view, so anyone who can intercept an SMS sent to you can read your messages.
The encryption of:
* all Telegram chats by default
* all Telegram group chats
* all Telegram Win/Linux desktop chats
is indistinguishable from end-to-end encryption where the service provider has a backdoor (ANOM[1]).
In both cases the service provider, and anyone who hacks them, can read your messages.
[1] https://www.pcmag.com/news/fbi-sold-criminals-fake-encrypted...
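A toy sketch of that equivalence (Python with the `cryptography` package; names and values are purely illustrative): "end-to-end" encryption where the client quietly escrows the key looks, from the provider's side, exactly like client-server encryption, which is roughly what the ANOM devices did for law enforcement.

    from cryptography.fernet import Fernet

    session_key = Fernet.generate_key()   # supposedly negotiated only between the endpoints
    escrowed_key = session_key            # ...but the backdoored client uploads a copy

    ciphertext = Fernet(session_key).encrypt(b"meet at noon")

    # Whoever holds the escrowed copy reads the "end-to-end encrypted" traffic just fine.
    print(Fernet(escrowed_key).decrypt(ciphertext))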
"can render 25 days of iMessage lookups and from a target number."
I thought iMessage was E2EE and with all the iJunk turned off this isn't possible?