This info gives us an interesting opportunity to estimate the rate at which Signal is adding new users. They've been very tight-lipped (understandably) about their usage stats but anecdotally they seem to be an increasingly common presence on my friends' phones, even the non-techies.
As far as I can tell, Signal uses Twilio only to send SMS for phone number verification. Verification happens when a user registers a new number or changes the number on their existing account.
The rate at which Signal is adding new users could be calculated by:
1900 * (proportion of new registrants among SMS recipients) / (length of Twilio incident)
You could probably make some common-sense assumptions about the first variable. But I can't find any publicly available info on when Twilio was first compromised. Their press release only mentions that they discovered the intrusion on August 4, which is presumably close to the end date of the incident. Does anyone know what the estimated start of the incident might be?
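To make the back-of-the-envelope formula concrete, here's a tiny sketch - the proportion of new registrants and the incident length are pure placeholder guesses, not anything Signal or Twilio have published:

```python
# Rough estimate of Signal's new-user rate from the 1,900 affected numbers.
# Both inputs below are assumptions for illustration only.
AFFECTED_NUMBERS = 1900        # numbers Signal says may have had codes exposed
new_registrant_share = 0.5     # assumed share of codes that went to brand-new registrations
incident_length_days = 7.0     # assumed length of the Twilio compromise (not public)

new_users_per_day = AFFECTED_NUMBERS * new_registrant_share / incident_length_days
print(f"~{new_users_per_day:.0f} new Signal users per day under these assumptions")
```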
> Verification happens when a user registers a new number or changes the number on their existing account.
Doesn't verification also occur when you re-install the app? Between that and how hard Signal makes device <-> device upgrade transfers I wouldn't be surprised if most messages were for existing users.
Signal's SMS registration codes expire after a few minutes, so you wouldn't even need to know the duration of the incident. Let's be conservative and say the codes expire after 5 minutes (it's probably shorter), then Signal is registering 380 devices a minute.
My reading of the post is that they determined the "1,900 users" figure by the number of users who had requested a code during the duration of the Twilio incident, as the attacker could have accessed their SMS messages at any point during the compromise:
> During the window when an attacker had access to Twilio’s customer support systems it was possible for them to attempt to register the phone numbers they accessed to another device using the SMS verification code. The attacker no longer has this access, and the attack has been shut down by Twilio.
380 devices / minute would imply Signal is adding 547,200 users / day, or 199,728,000 users / year. That seems way too high. Granted some could be multiple devices per user, but still...
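Spelling out the arithmetic being questioned here (the 5-minute expiry window is the earlier commenter's assumption, and if the 1,900 codes actually span the whole incident rather than a single expiry window, the real rate is far lower):

```python
# Extrapolation from "1,900 codes seen within one 5-minute expiry window".
codes_seen = 1900
expiry_minutes = 5                        # assumed; not a documented Signal value

per_minute = codes_seen / expiry_minutes  # 380
per_day = per_minute * 60 * 24            # 547,200
per_year = per_day * 365                  # 199,728,000
print(per_minute, per_day, per_year)
```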
Interesting idea. The number could also be an attempt to cover up their actual intention of targeting specific users.
> Among the 1,900 phone numbers, the attacker explicitly searched for three numbers, and we’ve received a report from one of those three users that their account was re-registered.
> This info gives us an interesting opportunity to estimate the rate at which Signal is adding new users. They've been very tight-lipped (understandably) about their usage stats but anecdotally they seem to be an increasingly common presence on my friends' phones, even the non-techies.
I am assuming US or Germany.
I can't remember which thing it was exactly but there was a huge privacy scare in the US at some point which got people to switch in droves to Signal. Maybe the WhatsApp T&C change?
Germans have always been more privacy-aware (hence they have a much bigger cash payment culture than almost anywhere else in North-West Europe) and it seems like a steady trickle is switching over.
But for example here in The Netherlands, I'd say 99% of people are on WhatsApp, 10% are on Telegram, and 0.1% are on privacy-focused messaging services.
Netherlands here. 80 contacts on my phone. 20 on Signal, of which 10 are quite normal people. Almost all 80 are on Whatsapp. No idea about Telegram.
The attack Twilio suffered is almost identical to the recent attack against Cloudflare: https://blog.cloudflare.com/2022-07-sms-phishing-attacks/ (even down to the wording of the text messages, which is nearly identical). Cloudflare’s use of security keys prevented the attackers from getting access to any accounts in that case.
These attacks are sophisticated and are capable of bypassing TOTP or mobile-app-based MFA. If this is widespread, I’d be surprised if we didn’t see a massive influx of breaches soon. The vast majority of companies are not well defended against this.
> These attacks are sophisticated and are capable of bypassing TOTP or mobile-app-based MFA.
To be honest, I wish people would stop parroting that these attacks were "sophisticated". In my opinion, I'd call something like Pegasus spyware "sophisticated". I don't think these attacks were that sophisticated at all - they were just standard-issue MITM attacks using targeted text messages - and they just took advantage of what is always the weakest link in security: people. I think of myself as a generally middle-of-the-road software developer, but I think I could have easily replicated this attack myself.
To be clear, they are not able to "bypass" TOTP or mobile-app-based MFA in the way security folks think of that term. They were able to bypass humans[1], which are often the weakest link in security-related matters.
[1]: "Twilio became aware of unauthorized access to information related to a limited number of Twilio customer accounts through a sophisticated social engineering attack designed to steal employee credentials. This broad based attack against our employee base succeeded in fooling some employees into providing their credentials. The attackers then used the stolen credentials to gain access to some of our internal systems, where they were able to access certain customer data. " https://www.twilio.com/blog/august-2022-social-engineering-a...
I would consider myself "security folks" and while maybe I wouldn't choose the word "bypass" the effect is that TOTP is basically useless against phishing and always was, and I don't object to that word from lay people.
At Cloudflare, or Google, or several other places that took this seriously, "fooling some employees into providing their credentials" doesn't get you anywhere. With WebAuthn your employees don't have a way to give bad guys credentials the bad guys can use - no matter how badly they were fooled.
TOTP is effective against credential stuffing, but it does nothing for phishing.
https://www.bleepingcomputer.com/news/security/hackers-breac...
I'd contest the "sophisticated" characterisation. Phishing activity targeting banking has been doing this for ages, and it's been common against Office 365 accounts for a long time as well. And it's plainly obvious to common sense that the adversary can just proxy the verification-code check in a phishing attack.
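To make the "just proxy the code" point concrete, here is a minimal sketch using the pyotp library as a stand-in for any TOTP implementation (the scenario and names are made up): the server only checks that the six digits match the shared secret for the current time window, so a code the victim types into a phishing page and that the attacker relays within a few seconds verifies exactly like one typed into the real site. Nothing in TOTP binds the code to an origin, which is what WebAuthn adds.

```python
import pyotp

# Shared secret provisioned when the victim enrolled in TOTP MFA.
secret = pyotp.random_base32()
server_side = pyotp.TOTP(secret)

# The victim's authenticator generates the current code...
code_victim_sees = pyotp.TOTP(secret).now()

# ...the victim types it into a look-alike page, and the attacker relays it to the real site.
relayed_by_attacker = code_victim_sees

# The real server cannot tell who submitted the code or from where; it only checks the math.
# valid_window=1 allows one time step of drift / relay delay, as most servers do.
assert server_side.verify(relayed_by_attacker, valid_window=1)
print("Relayed TOTP code accepted: nothing ties the code to a site, session, or device.")
```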
I wonder if Twilio's corporate culture has an aversion to - or has outright not implemented or banned - more secure methods of MFA such as security keys because of their $$ acquisition of Authy. Anyone have any insight?
>Among the 1,900 phone numbers, the attacker explicitly searched for three numbers, and we’ve received a report from one of those three users that their account was re-registered.
I wonder if this was a curious attacker trying to see what they could do with their access, or a targeted attack.
You can btw use the password reset function on many sites to correlate it with notifications. Easy at public events.
"Holy shit are those Signal 2FA codes? That's wild"
In my head, this is something more a teenager (e.g. Lapsus) might think?
Yes. I mean why not, you've got the number(s).
The page is also quite vague about how the attacker got these 1900 phone numbers. It seems to imply that they were just the ones around when the attacker got access. But it doesn’t actually state that clearly. Were they 1900 random numbers or were they chosen somehow? The latter is of course far worse.
They also apparently have logs of the attacker searching out three specific accounts within these 1900. That seems odd. What’s the chance that, out of all Signal accounts, the three they are curious about just happen to be among the 1900 they got access to? (Perhaps Signal/Twilio don’t have logs from failed searches? That would be pretty poor logging though.)
Those 1900 phone numbers would be all the accounts that started the registration/re-registration process with Signal during the time the unauthorized access was available. That process is started on Signal's side and Twilio is only used at the midway point to send a device verification SMS.
Any Signal accounts that did not start that process during that time would not be able to be intercepted or accessed since Twilio has no means to begin it. The three specific accounts mentioned would be the cases found that the verification message was accessed through Twilio to register the account on the attacker's device.
So yes, in effect the 1900 were only the ones around when the attacker got access. Whether the specific three were targeted attacks or random messing around isn't clear though.
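For anyone trying to picture where Twilio sits in that flow, here's a rough sketch of a generic SMS verification step - not Signal's actual code; the Twilio `messages.create` call is the real Python helper, everything else is hypothetical. The takeaway is that whoever can read messages in the Twilio support console during step 1 can complete step 2 from their own device:

```python
import secrets
from twilio.rest import Client

# Hypothetical in-memory store of pending verifications: phone number -> expected code.
pending: dict = {}

def start_registration(phone_number: str, twilio: Client, from_number: str) -> None:
    """Step 1: the service generates a short-lived code and asks Twilio to text it out."""
    code = f"{secrets.randbelow(1_000_000):06d}"
    pending[phone_number] = code
    # An attacker with access to Twilio's customer support systems can read this message.
    twilio.messages.create(to=phone_number, from_=from_number,
                           body=f"Your verification code is {code}")

def finish_registration(phone_number: str, submitted_code: str) -> bool:
    """Step 2: whoever submits the matching code gets the account bound to their device."""
    return pending.get(phone_number) == submitted_code
```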
The attacker sought out 3 specific numbers. The 1,900 figure is the number of registrations that occurred during the window in which the attacker had access and could have re-registered those Signal accounts – but they likely mostly didn't.
This is a weird thread. There's a product that does secure messaging with usernames and only requires user/pass. It's called Keybase. If this is the product you want, then go use it. I don't understand why everyone wants Signal to be something it's not. I quite like Signal as they are and this "incident" demonstrates exactly what happens if a carrier gets compromised: nothing. Nothing happens. Signal decides not to trust any phone verifications from the period of compromise and requires affected numbers to reregister. All the important crypto has nothing to do with phone numbers in Signal's domain. And this is exactly why I use Signal. It lets me send secure messages to people using a tried and true UX: text messaging, but with its own secure application layer. It's really difficult to build a useable security product, and Signal has done it successfully.
Zoom acqui-hired the team in 2020: https://blog.zoom.us/zoom-acquires-keybase-and-announces-goa...
If you're looking for a Keybase replacement, check out Peergos (https://peergos.org). Peergos is a P2P E2EE global filesystem and application protocol that's:
* fully open source (including the server) and self hostable
* has a business model of charging for a hosted version
* designed so that you don't need to trust your server
* audited by Cure53
* fine-grained access control
* identity proofs with controllable visibility
* encrypted applications like calendar, chat, social media, text editor, video streamer, PDF viewer, kanban
* custom apps - you can write your own apps for it (HTML5), which run in a sandbox which you can grant various permissions
* designed with quantum resistance in mind
You can read more in our tech book (https://book.peergos.org) or source (https://github.com/peergos/peergos)
Disclaimer: co-founder here
I am aware. For one, it still works just as well as it ever has; the Zoom acquisition didn't change anything there. So if you care about features, there shouldn't be any problem. For sure it seems to be in maintenance mode, but nothing they were doing of late with Lumens was that exciting anyway (trying to become a crypto wallet like everyone and their mothers).
I would pay $/mo for a Keybase reboot with the goal of building a sustainable business like Signal did instead of taking VC money for a shot at the moon. Until someone does that, Keybase continues to work as a messaging app with usernames instead of phone numbers.
It's kept updated; we use it heavily to interact with the Chia Blockchain team, and you just can't substitute for its identity feature when you need to know who you're talking to.
> If this is the product you want, then go use it.
This is great advice if your goal is to send messages to yourself. In the real world, though, a messaging app that you're the only one using is about as useful as a bag of ice in a snowstorm. People don't need "like signal but with usernames," they need "signal with usernames (or email addresses or...)" so they can communicate with people who use signal.
This doesn't make any sense. My assertion is that Signal would not be Signal if it had usernames. The subtext that I did not state specifically is exactly the question of why more people don't use Keybase regularly. Maybe it's not the winning UX?
You don't get to look over at Signal and say "wow, what a great user base, I need to be a part of that" and then draw the conclusion that "Signal needs to support my ideological aversion to using a phone number". You're missing the possibility that Signal is the way it is because it requires users to verify their phone number.
If you can't use a phone number but need to talk to people who do, securely, then you need to convince them to use a product that accommodates your niche. Why can't you use PGP and email, or Keybase, or <insert one of the 10s of other products that let you send encrypted messages>?
Sure, signal could add support for usernames. But how do you know there'd be anyone left after they did for you to talk to? Maybe it's not what Signal's users need.
Anyway, if Signal found a way to support usernames that didn't compromise on all the reasons I use signal and also didn't open the network up for tons of spam and low quality content, I don't think I'd complain. But that's a big IF.
It's interesting to me that you used Keybase as the example. My brain doing its guessing-ahead thing assumed you were going to say Matrix. I've seen several popular instances of it, and run into people actively using it at least monthly, whereas I haven't seen anyone use Keybase in years (since the Zoom acquisition). Do you see a lot of people _actively_ using Keybase still?
I don't use Matrix a bunch so it might just be that I'm not as familiar and out of the loop. To me Keybase (despite all the drama) seems like the most isolated/pure example of a product that took the approach of username/password style accounts and applied it to application layer crypto to achieve secure messaging. Keybase later added all the network-y chat type features that make me think more of a product like Matrix. But if Matrix is good for 1:1 "chat up my contacts and groups thereof", then great. Matrix always seemed more like federated Discord or "crypto" IRC to me with the whole needing to join channels thing.
I personally use it for LOTS of stuff, both personal and commercial (as a Slack replacement). Other than a couple bugs (pinch to zoom on Android, media playback), it's fine - I don't feel like I need any more features, though I'd love it to be a bit snappier. KBFS has been excellent for stuff like secrets in CI pipelines.
Disclaimer: I'm one of the ex-Keybase, now Zoom people. I'm definitely in a bubble. The non-Keybase people I talk with are my consultancy's employees + a couple clients.
Keybase's security model is excellent in protecting you from attacks like the one described in the OP. If you can't sign your device with another one, you can only recover a username if:
- it's not in [lockdown mode](https://book.keybase.io/docs/lockdown)
- it has a verified email / phone number
- you either click a reset link in the email / SMS _or_ know the password
- _and_ the user fails to cancel the reset over many days of warnings.
And if you manage to go through all that trouble, all your contacts will get blasted with warnings about your identity. Fun!
Matrix is great. Just remember that the Matrix threat model isn't the Signal threat model: you're usually telling a Matrix instance --- or, really, anyone who can compromise or suborn that instance --- a lot more about your communication patterns than you are with Signal.
Matrix, right now, is a lot more amenable to the kinds of messaging that people on HN tend to want to do than Signal is. The problematic thing is that HN people tend to believe that their workflows are (1) the most important and (2) the ones with the most sophisticated threat models. Neither are true; (2) is very un-true.
For, like, talking to team members about a shared dev project, I'd always use Matrix in preference to Signal --- of course, for that kind of work, what I'd really do is just use Slack or Discord. Which gets at something about what HN wants from Signal.
If the phone number requirement is the only part you dislike, buy a tourist SIM card with cash. The greatest lie of online identity is that phone numbers are tied to individuals forever.
I got a 6.0.1 update for Keybase like yesterday. I agree with the sentiment, though, feels like it's in maintenance mode. But its core value prop and feature has never stopped working. Point was that it's there and it works and if it's the UX model you prefer then by all means, use it at least until someone comes and reboots the concept.
https://github.com/keybase/client/graphs/code-frequency
https://github.com/keybase/client/graphs/contributors
I don't think it's stated enough just how easy Signal is as a drop-in replacement for WhatsApp, the main communication method for a significant portion of the world. The ability to install a new app, use your phone's contact database, and be able to use the app nearly exactly the same way you used WhatsApp is an incredible feature. With almost zero effort you can significantly reduce (capitalist or nation-state) surveillance against you. It's not perfect but it's a lot of value for little effort.
All of these feature requests require less knowledgeable users to do new things or weigh alternative options which involves time spent developing onboarding. Having "one way," an opinionated way, to do a particular type of thing is a very useful engineering value especially if you have limited engineering resources. Simplicity is an extremely underrated feature.
Being 80% perfect for 20% of the work is laudable.
What's even better about Signal is that Facebook's competitive data is the list of people you know. Facebook wins every time a person adds a friend without adding their contact info to their phone. That means Facebook is the source of truth for who you know and Facebook is the intermediary for communicating with someone else. That's why, in retrospect, WhatsApp was an obvious competitor worth spending a lot of money acquiring. WhatsApp drove people to use their phone's contact list as the source of truth for who you communicate with, not Facebook's friend list.
A drop-in replacement would mean that you can still communicate with people on WhatsApp. The Matrix protocol allows you to bridge WhatsApp and many other SaaS comms platforms to a single client, truly making it a drop-in replacement for WhatsApp.
The e2e encryption protocol is the definition of "let someone who has just learned about Diffie-Hellman roll their own crypto". It's called MTProto, and version 2 mostly updates padding and uses SHA256 instead of SHA1. Yes, SHA1 was deprecated before Telegram even existed. No, version 2 is not better.
genuine question from someone oblivious to messaging advances in the last decade.
Cryptographers praise Signal because the protocol makes sense and because it's not run by someone as data-hungry as Meta or Alphabet (though I think it's hosted on AWS).
Threema is a good alternative if you want username/password, but it has fewer users (probably since it's a paid app) and fewer neat security properties (not even forward secrecy).
I agree Signal is not perfect and has never played the Open Source game very well (even under Moxie, reports from the community were largely ignored), and the MobileCoin move is weird. I also have not followed the direction the project has taken since Moxie left. However, the _entire_ code is open source (which iirc is not the case with Telegram), the protocol makes sense (and has been extensively studied), and there are a lot of eyes on the development. I remember code changes that suggested a pivot to not using phone numbers as identifiers (i.e. maybe requiring them for registration but not showing them to everyone you talk to).
I wonder whether MLS will go anywhere and actual projects will adopt it. Last time I checked it did require consensus on message ordering, which seems to make it less well-suited for non-centralized protocols like Matrix, but we'll see.
Telegram only provides e2e encryption for one-to-one conversations, and only if you specifically create a "secret chat", largely because of usability and discoverability reasons with regard to their major point of focus. It's probably better discussed in comparison to other services like IRC, Matrix, Discord, or Slack that concentrate on feature-rich group chat implementations with easy discoverability, organization, and mobility, for which encryption either does not exist or is an opt-in or bolt-on feature.
Services like Signal, Whatsapp, Keybase, or iMessage that provide e2e encryption for all chats, group or otherwise, (albeit with differing levels of implementation security) have chosen to do so at the expense of things like mobility of chat history across devices and the ability to easily discover and join new group chats and instead focus on a less organized, more ad-hoc form of messaging that's a rather different use case than Telegram's.
Telegram is more a database than just a messenger. It's a centralised, huge, server-side-searchable abyss. It is both its good and bad side. On the good side: if you're using it for public and non-sensitive things like running tech support, it's easily the best thing. Once you type a word and search, you'll get everything from the very distant past. It's very good for dev-ops activity.
On the other hand - forget privacy: a phone number ID, server-decrypted chats, e2e hidden behind two menus and not available on some platforms (like Linux).
> It's really difficult to build a useable security product, and Signal has done it successfully.
I'd argue it hasn't. Signal still has no way of backing up your chat history (with photos, etc). Lose your phone and it's all gone forever. The PIN that the app annoyingly tells you to set up does not serve as an encryption key for your backups. There are no backups.
Once again, if your phone dies (this happened to me recently), all your data in Signal is gone forever. And there is no way to prevent that.
In this day and age, I consider this unacceptable. That is not a "useable security product".
On the other hand, I consider this a feature.
I'm not saying you are wrong, but I am saying different people have different ideas and requirements about how they want things like this to work.
For me, most of my Signal chats have disappearing messages enabled, to intentionally ensure there is no long-term archive of conversations (assuming you trust the other people not to be screenshotting everything). It gets you into the habit of saving messages that may be useful later (mostly, for me, stuff like event details or addresses) somewhere else, with the benefit of making everybody in the conversation a little more inclined to treat it all as ephemeral and be somewhat more candid than you might be in SMS or email. Not _quite_ as candid as face to face in private, but closer.
There's a widely used and agreed-on signal for most of my group chats, where setting disappearing messages to 5 minutes is understood to mean "juicy gossip or legal grey area chat is about to follow" and setting it back to 8 hours or 1 week means "OK, we're done with that discussion, back to regular chat".
Not a proper solution but a hacky workaround possibility.
Save messages and media you want to keep outside of your encrypted chats...
Signal on my Android phone makes an encrypted backup every day, this includes photos and I can copy the file off my phone if I desire (plus I point the backups to my microsd card which should still be good if the phone dies).
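If you want to automate pulling that backup off the device, something like this works from a machine with adb and USB debugging set up (the backup path below is the old default Signal/Backups folder; newer Signal versions let you choose the location, so treat the path as an assumption and adjust):

```python
import subprocess
from datetime import date

# Assumed default backup location; newer Signal versions use a user-chosen folder instead.
DEVICE_BACKUP_DIR = "/sdcard/Signal/Backups"
LOCAL_DIR = f"./signal-backups-{date.today().isoformat()}"

# Requires the adb tool on PATH and the phone authorized for USB debugging.
subprocess.run(["adb", "pull", DEVICE_BACKUP_DIR, LOCAL_DIR], check=True)
print(f"Copied encrypted backup files to {LOCAL_DIR}; they're useless without the 30-digit passphrase.")
```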
> I quite like Signal as they are and this "incident" demonstrates exactly what happens if a carrier gets compromised: nothing. Nothing happens. Signal decides not to trust any phone verifications from the period of compromise and requires affected numbers to reregister.
cool, but entire carriers being compromised has never been a concern. it's state agencies forcing carriers to compromise individuals.
>I don't understand why everyone wants Signal to be something it's not
we don't. we just warn people against using it. it's not a privacy tool, it's a larp toy like a commercial VPN.
Doesn't everyone get notified when your verification status changes? Don't you need to rescan people's security numbers or whatever they call it? If this is truly a gripe you have couldn't signal also add some sort of delay to the re-verification process so that device resets take weeks to be trusted and with lots of warning and opportunities for both parties to disengage before any hostile actor takes over?
As far as I'm aware, Signal is used by plenty of people who may be targeted by state agencies. Has there been even one "High value target apprehended because Signal" headline?
"...Twilio, the company that provides Signal with phone number verification services..."
Perhaps this is why Twilio (and Twilio-issued) VoIP numbers work so well for Signal when I don't want to use the number issued by my cellular carrier? Kinda hard to SIM-swap me if you don't know my real phone number.
I spin up a number and verify it can receive SMS (which it forwards to me via email). Since I need receive-only, this is fine. No need to futz with APIs or apps.
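For anyone who would rather script that setup than click through the console, here's a rough sketch with the Twilio Python helper. The credentials and webhook URL are placeholders, and the actual SMS-to-email relay that would sit behind the webhook isn't shown:

```python
from twilio.rest import Client

client = Client("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX", "your_auth_token")  # placeholder credentials

# Find an SMS-capable US local number and buy it.
candidates = client.available_phone_numbers("US").local.list(sms_enabled=True, limit=1)
bought = client.incoming_phone_numbers.create(
    phone_number=candidates[0].phone_number,
    friendly_name="signal-registration-only",
    # Placeholder webhook: whatever sits at this URL receives inbound SMS and can relay it to email.
    sms_url="https://example.com/sms-to-email",
)
print("Receive-only number for Signal registration:", bought.phone_number)
```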
>it was possible for them to attempt to register the phone numbers they accessed to another device using the SMS verification code
That's a thing?
If my number expires and gets reassigned to someone else, and they register for Signal, I'll get locked out of my account just like that?
And they'll start getting all the messages that were addressed to me?
It's not your account any more. The new owner gets "your" SMS and phone calls too. The identity is backed by the ownership of the number, not your person.
Importantly the safety number will change since it's a new device. If you care about stuff like this, verify the new device out of band and distrust any unexpected changes. Most people don't care and they still see a huge improvement over plain SMS.
Though note that your message history is still private, as you have to manually export and import the local message history whenever you get a new device.
Services like Signal and WhatsApp can use 3P services that allow them to be notified when a phone number is rotated (given to a new user). They should ideally be doing this; I cannot verify if they are or not.
Second, Signal and other services have implemented secondary registration requirements such as a PIN, which they will require during a new device install or at other times.
Third, you can build models or crude business logic to identify when a number no longer appears to be used for a period of time. Carriers do not reassign a number immediately. Assigning a number that one user cancelled to another user is seldom done before a 90-day hibernation period.
I used to work at a cell provider.
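As a sketch of the kind of crude business logic being described - entirely hypothetical thresholds and field names, not anything Signal actually implements - the idea is to stop trusting the SMS code alone when the number has been dormant long enough that a carrier could plausibly have recycled it, or when the user has opted into a registration-lock PIN:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical policy knob: carriers commonly hold cancelled numbers ~90 days before reassigning.
DORMANCY_THRESHOLD = timedelta(days=90)

@dataclass
class Account:
    phone_number: str
    last_seen: datetime                   # last activity from any device on this account
    registration_lock_pin: Optional[str]  # set if the user enabled a registration lock

def trust_sms_code_alone(account: Account, now: datetime) -> bool:
    """Return True only when a fresh SMS code should be enough to re-register the account."""
    possibly_recycled = now - account.last_seen >= DORMANCY_THRESHOLD
    return account.registration_lock_pin is None and not possibly_recycled
```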
I absolutely do not understand why I have to link my very sensitive Signal account to a very insecure and hard to change ID: my phone number (which can be traced to my identity in too many ways).
Why Signal does not allow fully anonymous IDs (like Threema does) is a mystery to me.
Signal is fine for most users, but it is inherently _unsafe_ for high-value sensitive communications where participants can expect targeted phishing attacks.
It is not about being anonymous (though this also could be nice in some situations), it is about identity theft and credentials theft. There are numerous ways to steal my phone number and then impersonate me on Signal. For me, it is not a big deal (though a dedicated hater can probably ruin my life with that). For many people in sensitive positions, this is literally a matter of life and death.
I cannot do it in my country without physically going to some office and showing my passport. Doesn’t feel “temporary” to me.
SIM cloning is a thing. SS7 hacking is a thing. Phone numbers are _insecure_ as IDs, as simple as that. Signal’s insistence on using nothing but phone numbers is somewhat suspicious these days.
(Both major competitors in secure messaging, Wire and Threema, allow pseudonymous temporary IDs in addition to phone numbers).
This is a very simple phishing attack and I am surprised it has proven to be effective.