One interesting way to detect this would be to observe the sender's outgoing and the recipient's incoming ciphertexts inside the client-to-server TLS, which users can MITM on their own devices. Since the ratchet state differs, so do the keys, and thus, for the same plaintext, so do the ciphertexts. That would be a really easy way to detect a MITM.
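To make that concrete, here's a toy sketch (Python, with ChaCha20-Poly1305 standing in for whatever the ratchet actually derives; the key names and plaintext are made up for illustration):

```python
# Toy illustration: two diverged ratchet states mean two different message
# keys, so the same plaintext cannot produce the same ciphertext.
import os
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

plaintext = b"hello"
nonce = os.urandom(12)

sender_key = os.urandom(32)  # stand-in for the key the sender's ratchet derived
mitm_key = os.urandom(32)    # stand-in for the key the MITM'd session derived

ct_sent = ChaCha20Poly1305(sender_key).encrypt(nonce, plaintext, None)
ct_received = ChaCha20Poly1305(mitm_key).encrypt(nonce, plaintext, None)

# In an honest E2EE session the recipient receives the exact bytes the sender
# sent; seeing different ciphertexts inside the two TLS tunnels is the red flag.
print(ct_sent != ct_received)  # True whenever the keys differ
```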
>So I wouldn't count on its E2EE either.
This is the worst way to assess E2EE deployment. 5D-chess.
>Signal still requires a phone number and proprietary Google blobs on mobile.
Telegram also requires a phone number. If you didn't have double standards, I bet you'd have no standards.
>Many third-party Telegram clients exist
The official implementation and the default encryption are what matter. 99.99% of users just assume Telegram is secure because Putin supposedly tries to block it. They don't know it's not E2EE. And no third-party TG desktop client offers cross-platform E2EE or E2EE groups. IIRC there's exactly one desktop client that tries to offer E2EE 1:1 chats, but it's not seamless. TG has no idea how to make seamless E2EE like Signal does.
Ignoring that Signal is both open source and always E2EE, complaining about its "proprietary blobs", yet looking past TG's atrocious E2EE speaks volumes.
No wonder the great Russian firewall is struggling to keep TG at bay. Wake up.
No end-to-end encryption for groups. WhatsApp has it.
No end-to-end encryption on desktop. WhatsApp has it.
No break-in recovery (regaining security after key compromise). WhatsApp has it.
Inferring Telegram's security from the public statements of *checks notes* a former KGB officer and FSB director -- agencies that wrote the majority of the literature on maskirovka -- isn't exactly reliable, wouldn't you agree?
The client is open source. It's trivial to verify this is 100% factually happening. They have access to every group message. Every desktop message. Every message by default. If you enable secret chats for 1:1 mobile chats, you are now disclosing to Telegram you're actively trying to hide something from them, and if there ever was metadata worth it for Keith Alexander to kill someone over, it's that.
>they seem less cooperative and I never got the notion that they ever read private messages until the Macron incident
We have no way to verify Telegram isn't a Russian OP. I'd love to say Pavel Durov fled for his life into exile https://www.nytimes.com/2014/12/03/technology/once-celebrate...
But the "fugitive" has since visited Russia over SIXTY times https://kyivindependent.com/kremlingram-investigation-durov/
Thus, I wouldn't be so concerned about what they're handing to EUROPOL, but about what they're handing to the FSB/SVR.
Even if Telegram never co-operated with Russian intelligence, who here thinks the Telegram team -- which can't pull off the basic task of "make everything E2EE" that ~all of its competition has successfully done -- can harden its servers against Russian state-sponsored hackers like Fancy Bear, who obviously would never make noise about a successful breach and data exfiltration?
>How come they are able to be this exception despite not having end to end encryption by default?
They've pushed out the lie about storing cloud chats across different servers in different jurisdictions. Maybe that scared some prosecutors off. Or maybe FVEY is inside TG's servers too, and they don't like the idea of going after users, as that would incentivize deployment of usable E2EE.
Who knows. Just use Signal.
I don't really see how it's possible to mitigate client compromise. You can decrypt stuff in a secure enclave, but at some point the client has to pull it out and render it.
You could of course offload plaintext input and output, along with cryptographic operations and key management, to separate devices that interact with the networked device unidirectionally over hardware data diodes, which prevent malware from getting in or the keys from getting out.
Throw in some v3 Onion Services for p2p ciphertext routing and a decent ciphersuite, and you've successfully made it to at least three watch lists just by reading this. Anyway, here's one I made earlier: https://github.com/maqp/tfc
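The transmit-only half of that split is conceptually simple (a toy sketch, not TFC's actual code; pyserial and the port path are assumptions for illustration):

```python
# Toy sketch of the sending side behind a data diode: this machine encrypts
# and writes to the serial link, and never reads from it. On real hardware
# the receive line physically doesn't exist, so nothing can come back in.
import serial  # pyserial

def send_ciphertext(ciphertext: bytes, port: str = "/dev/ttyUSB0") -> None:
    with serial.Serial(port, baudrate=115200) as link:
        link.write(len(ciphertext).to_bytes(4, "big"))  # length-prefix the blob
        link.write(ciphertext)
        # Deliberately no link.read(): the networked relay can only receive.
```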
Private keys, probably not. WhatsApp is E2EE, meaning your device generates the private key with the OS's CSPRNG. Like I also said above, exfiltration of signing keys might allow MITM, but that's still possible to detect, e.g. if you RE the client and spot the code that does it.
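For reference, "generated with the OS's CSPRNG" boils down to something like this (a sketch with the cryptography library; X25519 here is illustrative, not a claim about WhatsApp's exact primitives):

```python
# The private scalar comes from the kernel RNG on the device itself;
# only the public half ever leaves it.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

private_key = X25519PrivateKey.generate()  # OS CSPRNG under the hood
public_key = private_key.public_key()      # this is what the server sees
```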
This DCL could be fetching some forward_to_NSA() function from a server and registering it to be called on every outgoing message. It would be trivial to hide in tcpdumps. The best approach would be tracing with Frida and looking at syscalls to try to isolate what is actually being loaded, but it is also trivial for apps to detect they are being debugged and conditionally avoid loading the incriminating code. That code would only run in environments where the interested parties are sure there is no chance of detection, which covers enough of the endpoints that even if you personally can trip the anti-tracing conditions without falling foul of whatever attestation Meta likely has going on, everyone you text will still be participating unknowingly in the dragnet anyway.
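For anyone who wants to try the tracing approach, it looks roughly like this (a sketch using the frida Python bindings on a rooted test device; the package name is a placeholder, and it assumes the loader goes through dalvik.system.DexClassLoader):

```python
# Rough sketch: hook DexClassLoader's constructor and log every artifact the
# app pulls in at runtime. Subject to the caveat above -- an app that detects
# instrumentation can simply decline to load anything incriminating.
import sys
import frida

HOOK = """
Java.perform(function () {
  var DexClassLoader = Java.use('dalvik.system.DexClassLoader');
  DexClassLoader.$init.implementation = function (dexPath, optDir, libPath, parent) {
    console.log('[DCL] loading: ' + dexPath);
    return this.$init(dexPath, optDir, libPath, parent);
  };
});
"""

device = frida.get_usb_device()
session = device.attach("com.example.messenger")  # placeholder package name
script = session.create_script(HOOK)
script.on("message", lambda message, data: print(message))
script.load()
sys.stdin.read()  # keep the hooks alive while you exercise the app
```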
https://developer.android.com/privacy-and-security/risks/dyn...
I wonder if that would deter Meta.