It has become fashionable to s*t on GnuPG. I just wish all the crypto experts doing that would point me to an alternative that is functionally equivalent.
Something that will encrypt using AES-256 with a passphrase, but also using asymmetric crypto. Oh, and I want my secret keys printable if needed. And I want to store them securely on YubiKeys once generated (https://github.com/drduh/YubiKey-Guide). I want to be able to encrypt my backups to multiple recipients. And I want the same keys (stored on Yubikeys, remember?) to be usable for SSH authentication, too.
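For reference, the two encryption modes described above are one command each in GnuPG (a sketch; the file name and recipient addresses are placeholders):

```shell
# AES-256 with a passphrase (symmetric; gpg prompts for the passphrase)
gpg --symmetric --cipher-algo AES256 -o backup.tar.gpg backup.tar

# Asymmetric, encrypted to multiple recipients at once
gpg --encrypt -r alice@example.org -r bob@example.org -o backup.tar.gpg backup.tar

# Decrypt either one
gpg --decrypt -o backup.tar backup.tar.gpg
```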
And by the way, if your fancy tool is written in the latest language du jour with a runtime that changes every couple of years or so, or requires huge piles of dependencies that break if you so much as sneeze (python, anyone?), it won't do.
BTW, in case someone says "age": I actually followed that advice and set it up just to be there on my systems (managed by ansible). Apart from the fact that it really slowed down my deployments, the thing broke within a year. And I didn't even use it. I just wanted to see how reliable it would be in the most minimal of ways: by having it auto-installed on my systems.
If your fancy tool has less than 5 years of proven maintenance record, it won't do. Encryption is for the long term. I want to be able to read my stuff in 15-30 years.
So before you go all criticizing GnuPG, please understand that there are reasons why people still use it, and are actually OK with the flaws described.
> I just wish all the crypto experts doing that would point me to an alternative that is functionally equivalent.
The entire point of every single valid criticism of PGP is that you cannot make a single functionally equivalent alternative to PGP. You must use individual tools that are good at specific things, because the "Swiss Army knife" approach to cryptographic tool design has yielded empirically poor outcomes.
If you have an example of how age broke for you, I think its maintainers would be very interested in hearing that -- I've been using it directly and indirectly for 5+ years and haven't had any compatibility or runtime issues with it, including when sharing encrypted files across different implementations of age.
Point of order: there are valid and important criticisms of PGP that have nothing to do with its jack-of-all-trades philosophy. There's no modern cryptosystem in the world you would design with PGP's packet scheme.
> Apart from the fact that it really slowed down my deployments
Is this a comparable complaint worth mentioning, and if it is, are you sure you actually need cryptography? It slowed things down a bit, so you'd rather not move on from GnuPG, which is demonstrably too complex not to have bugs?
Asking for an equivalent to GPG is like asking for an equivalent of a Swiss knife with unshielded chainsaws and laser cutters.
Stop asking for it, for your own good, please. If you don't understand the entire spec you can't use it safely.
You want special purpose tools. Signal for communication, Age for safer file encryption, etc.
What exact problems did you have with age? You're not explaining how it broke anything. Are you compiling yourself?
Age has yubikey support and can do all you described.
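For the curious, the file-encryption part of that workflow looks roughly like this in age (a sketch; file names are placeholders, the `age1...` recipient is a truncated example string, and YubiKey support comes via the separate age-plugin-yubikey project):

```shell
age-keygen -o key.txt            # writes an identity file; prints the "age1..." public key

age -p -o secrets.txt.age secrets.txt                  # passphrase mode (scrypt)
age -r age1example... -o backup.tar.age backup.tar     # to a recipient; repeat -r for multiple
age -d -i key.txt -o backup.tar backup.tar.age         # decrypt with the identity file
```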
> if your fancy tool has less than 5 years of proven maintenance record, it won't do. Encryption is for the long term. I want to be able to read my stuff in 15-30 years.
This applies to algorithms; it does not apply to cryptographic software in the same way. The state of the art changes fast, and while algorithms tend to stand for a long time these days, there are significant changes in protocol designs and attack methods.
Downgrade protection, malleability protection, sidechannel protection, disambiguation, context binding, etc...
You want software to be implemented by experts using known best practices with good algorithms and audited by other experts.
There isn't an alternative that is functionally equivalent because what PGP does is dumb. It's a Swiss Army Knife. Nobody who wants to design an excellent saw sets out to design the Swiss Army Knife saw[†]. Nobody who needs shears professionally buys a Swiss Army Knife for the scissors.
The cryptographic requirements of different problems --- backup, package signing, god-help-us secure messaging --- are in tension with each other. No one design adequately covers all the use cases. Trying to cram them all into one tool is a sign that something other than security is the goal. If that's the case, you're live action roleplaying, not protecting people.
I'd be interested in whether you could find a cryptographer who disagrees with that. I've asked around!
[†] I am aware that SAK nerds love the saw.
I'm very curious about this. Tell me more.
> Apart from the fact that it really slowed down my deployments, the thing broke within a year. And I didn't even use it. I just wanted to see how reliable it will be in the most minimal of ways: by having it auto-installed on my systems.
I didn't even catch this the first read. `age` is a command line program written in Go. It's not a system service. Simply "having it installed" on your system can't do anything.
I believe all the criticism of GnuPG is due to the fact most people grew up with Microsoft or Apple, so they are used to hand-holding.
If you read the various how-tos out there it is not that hard to use; it's just that people do not want to read anything more than 2 lines. That is the main issue.
My only complaint is Thunderbird now uses its own homegrown encryption, thus locking you into their email client. Seems almost all email clients have their own way of doing encryption, confusing matters even more. I now use mutt because it can be easily linked to GnuPG and it does not lock me into a specific client.
> If you read the various how-tos out there it is not that hard to use, just people do not want to read anything more than 2 lines. That is the main issue.
The video linked above contains multiple examples of people using GnuPG's CLI in ways that it was seemingly intended to be used. Blaming users for holding it wrong seems facile.
Okay, since there’s so much stuff to digest here and apparently there are issues designated as wontfix by GnuPG maintainers, can someone more in the loop tell us whether using gpg signatures on git commits/tags is vulnerable? And is there any better alternative going forward? Like is signing with SSH keys considered more secure now? I certainly want to get rid of gpg from my life if I can, but I also need to make sure commits/tags bearing my name actually come from me.
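For what it's worth, git has supported signing with plain SSH keys since version 2.34, and the setup is a handful of config lines (a sketch; the key paths are assumptions):

```shell
git config --global gpg.format ssh
git config --global user.signingkey ~/.ssh/id_ed25519.pub
git config --global commit.gpgsign true   # sign every commit by default

# Verification needs an allowed-signers file mapping identities to public keys
git config --global gpg.ssh.allowedSignersFile ~/.ssh/allowed_signers
git log --show-signature
```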
One of those WONTFIX's is on an insane vulnerability: you can bitflip known plaintext in a PGP message to switch it into handling compression, allowing attackers to instruct GnuPG packet processing to look back to arbitrary positions in the message, all while suppressing the authentication failure message. GPG's position was: they print, in those circumstances, an error of some sort, and that's enough. It's an attack that reveals plaintext bytes!
Are you referring to "Encrypted message malleability checks are incorrectly enforced causing plaintext recovery attacks"?
Seems like a legitimate difference of opinion. The researcher wants a message with an invalid format to return an integrity failure message. Presumably the GnuPG project thinks that would be better handled by some sort of bad format error.
The exploit here is a variation on the age-old idea of tricking a PGP user into decrypting an encrypted message and then sending the result to the attacker. The novelty here is the idea of making the encrypted message look like a PGP key (identity) and then asking the victim to decrypt the fake key, sign it and then upload it to a keyserver.
Modifying a PGP message file will break the normal PGP authentication[1] (that was not acknowledged in the attack description). So here is the exploit:
* The victim receives an unauthenticated/anonymous (unsigned or with a broken signature) message from the attacker. The message looks like a public key.
* Somehow (perhaps in another anonymous message) the attacker claims they are someone the victim knows and asks them to decrypt, sign and upload the signed public key to a keyserver.
* They see nothing wrong with any of this and actually do what the attacker wants, ignoring the error message about the bad message format.
So this attack is also quite unlikely. Possibly that affected the decision of the GnuPG project to not change behaviour in this case, particularly when such a change could possibly introduce other vulnerabilities.
Added: Wait. How would the victim import the bogus PGP key into GPG so they could sign it? There would normally be a preexisting key for that user so the bogus key would for sure fail to import. It would probably fail anyway. It will be interesting to see what the GnuPG project said about this in their response.
It's a fundamentally bad idea to have a single key that applications are supposed to look for in a particular place, and then use to sign things.
There is inherent complexity involved in making multi-context key use safe, and it's better to just avoid it architecturally.
Keys (even quantum safe) are small enough that having one per application is not a problem at all.
If an application needs multi-context, they can handle it themselves.
If they do it badly, the damage is contained to that application.
If someone really wants to make an application that just signs keys for other applications to say "this is John Smith's key for git" and "this is John Smith's key for email" then they could do that.
Such an application would not need to concern itself with permissions for other applications calling into it.
The user could just copy and paste public keys, or fingerprints when they want to attest to their identity in a specific application.
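The one-key-per-context approach costs almost nothing in practice; an ed25519 key is a line of shell each (paths and comments here are arbitrary examples):

```shell
# Separate keys per context: compromise or confusion in one doesn't leak into another
ssh-keygen -t ed25519 -f ~/.ssh/id_git_signing -N '' -C 'git commit signing'
ssh-keygen -t ed25519 -f ~/.ssh/id_prod_ssh    -N '' -C 'prod server access'

# Attest a key for one context by publishing just that public key
cat ~/.ssh/id_git_signing.pub
```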
The keyring circus (which is how GPG most commonly intrudes into my life) is crazy too.
All these applications insist on connecting to some kind of GPG keyring instead of just writing the secrets to the filesystem in their own local storage.
The disk is fully encrypted, and applications should be isolated from one another.
Nothing is really being accomplished by requiring the complexity of yet another program to "extra encrypt" things before writing them to disk.
I'm sure these bad ideas come from the busy work invented in corporate "security" circles, which invent complexity to keep people employed without any regard for an actual threat model.
> The disk is fully encrypted, and applications should be isolated from one another.
For most apps on non-mobile devices, there isn't filesystem isolation between apps. Disk/device-level encryption solves for a totally different threat model; Apple/Microsoft/Google all ship encrypted storage for secrets (Keychain, Credential Manager, etc), because restricting key material access within the OS has merit.
> I'm sure these bad ideas come from the busy work invented in corporate "security" circles, which invent complexity to keep people employed without any regard for an actual threat model.
Basically everything in PGP/GPG predates the existence of "corporate security circles".
These are not vulnerabilities in the "remote exploit" sense. They should be taken seriously, you should be careful not to run local software on untrusted data, and GPG should probably do more to protect users from shooting themselves in the foot, but the worst thing you could do is panic and throw out a process your partners and colleagues trust. There is nothing here that will disturb your workflow signing commits or apt-get install-ing from your distribution.
If you use cryptographic command line tools to verify data sent to you, be mindful of what you are doing and make sure to understand the attacks presented here. One of the slides is titled "should we even use command line tools" and yes, we should, because the alternative is worse, but we must be diligent in treating all untrusted data as adversarial.
A huge part of GPG’s purported use case is getting a signed/encrypted/both blob from somebody and using GPG to confirm it’s authentic. This is true for packages you download and for commits with signatures.
I did the switch this year after getting yet another personal computer. I have 4 in total (work laptop, personal sofa laptop, Mac Mini, Linux Tower). I used YubiKeys with gpg and resident ssh keys. All was fine except for the configuration needed to get it to work on all the machines. I also tend to forget the finer details and have to relearn the skills of fetching the public keys into the keychain etc. I got rid of all this by moving to the 1Password ssh agent and git ssh signing. Removes a lot of headaches from my ssh setup. I still have the YubiKey(s) though as a 2nd factor for certain web services. And the gpg agent is still running, but only as a fallback. I will turn it off next year.
I’ve ended up the same place as you. I had previously set up my gpg key on a Yubikey and even used that gpg key to handle ssh authentication. Then at some point it just stopped working, maybe the hardware on my key broke. 2FA still works though.
In any case I figured storing an SSH key in 1Password and using the integrated SSH socket server with my ssh client and git was pretty nice and secure enough. The fact the private key never leaves the 1Password vault unencrypted and is synced between my devices is pretty neat. From a security standpoint it is indeed a step down from having my key on a physical key device, but the hassle of setting up a new Yubikey was not quite worth it.
I’m sure 1Password is not much better than having a passphrase-protected key on disk. But it’s a lot more convenient.
> I certainly want to get rid of gpg from my life if I can
I see this sentiment a lot, but you later hint at the problem. Any "replacement" needs to solve for secure key distribution. Signing isn't hard, you can use a lot of different things other than gpg to sign something with a key securely. If that part of gpg is broken, it's a bug, it can/should be fixed.
The real challenge is distributing the key so someone else can verify the signature, and almost every way to do that is fundamentally flawed, introduces a risk of operational errors or is annoying (web of trust, trust on first use, central authority, in-person, etc). I'm not convinced the right answer here is "invent a new one and the ecosystem around it".
It's not like GPG solves for secure key distribution. GPG keyservers are a mess, and you can't trust their contents anyways unless you have an out of band way to validate the public key. Basically nobody is using web-of-trust for this in the way that GPG envisioned.
This is why basically every modern usage of GPG either doesn't rely on key distribution (because you already know what key you want to trust via a pre-established channel) or devolves to the other party serving up their pubkey over HTTPS on their website.
A lot of people are using PGP for things that don’t require any kind of key distribution. If you’re just using it to encrypt files (even between pointwise parties), you can probably just switch to age.
(We’re also long past the point where key distribution has been a significant component of the PGP ecosystem. The PGP web of trust and original key servers have been dead and buried for years.)
For anyone relatedly wondering about the "schism", i.e. GnuPG abandoning the OpenPGP standard and doing their own self-governed thing, I found this email particularly insightful on the matter:
https://lists.gnupg.org/pipermail/gnupg-devel/2025-September...
> As others have pointed out, GnuPG is a C codebase with a long history (going on 28 years). On top of that, it's a codebase that is mostly uncovered by tests, and has no automated CI. If GnuPG were my project, I would also be anxious about each change I make. I believe that because of this the LibrePGP draft errs on the side of making minimal changes, with the unspoken goal of limiting risks of breakage in a brittle codebase with practically no tests. (Maybe the new formats in RFC 9580 are indeed "too radical" of an evolutionary step to safely implement in GnuPG. But that's surely not a failing of RFC 9580.)
Nothing has improved and everything has gotten worse since I wrote that. Both factions are sleepwalking into an interoperability disaster. Supporting one faction or the other just means you are part of the problem. The users have to resist being made pawns in this pointless war.
> Maybe the new formats in RFC 9580 are indeed "too radical" of an evolutionary step to safely implement in GnuPG.
Traditionally the OpenPGP process has been based on minimalism and rejected everything without a strong justification. RFC 9580 is basically everything that was rejected by the LibrePGP faction (GnuPG) in the last attempt to come up with a new standard. It contains a lot of poorly justified stuff and some straight-up pointless stuff. So just supporting RFC 9580 is not the answer here. It would require significant cleaning up. But again, just supporting LibrePGP is not the answer either. The process has failed yet again, and we need to recognize that.
Here is the short version from someone who took part in this process: while serving as the editor of the draft, Werner did not let anything into the draft that wasn't his own idea. But for his own ideas, there were cases where a new feature was committed to spec master and released in gnupg within the week. He was impossible to work with over many years, to the point that everyone agreed that the only way forward was to leave gnupg behind. This is a bonkers decision for OpenPGP as an ecosystem, but it was not made in ignorance of the consequences. And as far as I'm aware, even with today's hindsight, no one involved in the process regrets making the decision.
Why does Fedora / RPM still rely on GPG keys for verifying packages?
This is a staggering ecosystem failure. If GPG has been a known lost cause for decades, then why haven't alternatives ^W replacements been produced for decades?
This feels pretty unsatisfying: something that’s been “considered harmful” for three decades should be deprecated and then removed in a responsible ecosystem.
(PGP/GPG are of course hamstrung by their own decision to be a Swiss Army knife/only loosely coupled to the secure operation itself. So the even more responsible thing to do is to discard them for purposes that they can’t offer security properties for, which is the vast majority of things they get used for.)
Maybe the site is overloaded. But as for the "brb, were on it!!!!" - this page had the live stream of the talk when it was happening. Hopefully they'll replace it with the recording when media.ccc.de posts it, which should be within a couple hours.
Is anyone else worried that a lot of people coming from the Rust world contribute to free software and mindlessly slap the MIT license on it because it's "the default license"? (Yes, I've had someone say this to me, no joke)
GnuPG for all its flaws has a copyleft license (GPL3) making it difficult to "embrace extend extinguish". If you replace it with a project that becomes more successful but has a less protective (for users) license, "we the people" might lose control of it.
> Is anyone else worried that a lot of people coming from the Rust world contribute to free software and mindlessly slap on it MIT license
Yeah; I actually used to do that too (use the "default license"), but eventually came to the same realisation and have been moving all my projects to full copyleft.
You are attributing a general trend to a particular language community. I also believe that you are unfairly interpreting “default license” just because you disagree with what they think the “default license” is. We all know what is meant by this. It just sounds like you think it should be something GPL.
No, you're guessing what I'm thinking. I'm telling you that a person I spoke to TOLD ME verbatim "I chose MIT because it's the default license". I'm not guessing that's what they did, that's what they TOLD ME. Do you understand the concept of literally telling someone something?
I find that this is something reflective of most modern language ecosystems, not just Rust. I actually first started noticing the pervasiveness of MIT on npm.
For me, I am of two minds. On one hand, the fact that billion-dollar empires are built on top of what is essentially unpaid volunteer work does rankle and makes me much more appreciative of copyleft.
On the other hand, most of my hobbyist programming work has continued to be released under some form of permissive license, and this is more of a reality of the fact that I work in ecosystems where use of the GPL isn't merely inconvenient, but legally impossible, and the pragmatism of permissive licenses win out.
I do wish that weak copyleft like the Mozilla Public License had caught on as a sort of middle ground, but it seems like those licenses are rare enough that their use would invite as much scrutiny as the GPL, even if it were technically allowed. Perhaps the FSF could have advocated more strongly for weak copyleft in areas where the GPL was legally barred, but I suppose they were too busy not closing the network hole in the GPLv3 to bother.
I love the MPL and I use it wherever I get the opportunity. IMO it has all the advantages of the GPL and lacks the disadvantages (the viral part) that makes the GPL so difficult to use.
I used to develop free software exclusively under GPL or AGPL.
But at some point, for things like, a very small-but-useful library or utility, I had a change of heart. I felt that it's better for the project to use non-copyleft licenses.
I do this as a rule now for projects where the scope is small and a total rewrite would not be very costly for several engineers at a large company.
For small stuff, the consideration is, I want people to use it, period.
When devs look at open source stuff and see MIT / Apache, they know they can use it no questions asked. When they see GPL etc. then they will be able to use it in some cases and not others depending on what they are working on. I don't want to have that friction if it's not that important.
For a lot of stuff I publish, it's really just some small thing that I tried to craft thoughtfully and now I want to give it away and hope that someone else benefits. Sometimes it gets a few million downloads and I get feedback, and I just like that experience. Often whatever the feedback is it helps me make the thing better which benefits my original use case, or I just learn things from the experience.
Often I'm not trying to build a community of developers around that project -- it's too small for that.
I still like the GPL and I have nothing against it. If I started working on something that I anticipated becoming really large somehow, I might try to make it GPL. And I feel great about contributing to large GPL projects.
I just feel like even though I'm friendly to the GPL, it's definitely no longer my default, because I tend to try to publish very small useful units. And somehow I've convinced myself that it's better for the community and for the projects themselves if those kind of things are MIT / Apache / WTFPL or similar.
I hope that makes sense.
I realized that I can be seen as one of those who treat the GPL as weird or not normal, because I don't really use it anymore. But I'm not trying to be an enemy of the GPL or enable embrace-extend-extinguish tactics. It's just that it's a very nuanced thing for me nowadays, I guess. Your comment caused me to reflect on this.
A company adopts some software with a free but not copyleft license. Adopts means they declare "this is good, we will use it".
Developers help develop the software (free of charge) and the company says thank you very much for the free labour.
Company puts that software into everything it does, and pushes it into the infrastructure of everything it does.
Some machines run that software because an individual developer put it there; other machines run it because a company put it there, sometimes by exerting some sort of power for it to end up there (for example, economic incentives to vendors, like android).
At some point the company says "you know what, we like this software so much that we're going to fork it, but the fork isn't going to be free or open source. It's going to be just ours, and we're not going to share the improvements we made".
But now that software is already running in a lot of machines.
Then the company says "we're going to tweak the software a bit, so that it's no longer inter-operable with the free version. You have to install our proprietary version, or you're locked out" (out of whatever we're discussing hypothetically. Could be a network, a standard, a protocol, etc).
Developers go "shit, I guess we need to run the proprietary version now. we lost control of it."
This is what happened e.g. with chrome. There's chromium, anyone can build it. But that's not chrome. And chrome is what everybody uses because google has lock-in power. Then google says "oh I'm going to disallow you running the extensions you like, so we can show you more ads". Then they make tweaks to chrome so that websites only get rendered well if they use certain APIs, so now competitors to Chrome are forced to implement those APIs, but those aren't public.
And all of this was initially built by free labour, which google took, from people who thought they were contributing to some kind of commons.
Copyleft licenses protect against this. Part of the license says: "if you make changes to the software, you have to share those changes as well; you can't keep them for yourself".
> Is anyone else worried that [...] the Rust world [...] slap on it MIT license because it's [reason you don't like]?
No... I don't think that's how software works. Do you have an example of that happening? Has any foss project lost control of the "best" version of some software?
> Not everything in software is about features.
I mean, I would happily make the argument that the ability to use code however you want, without needing my permission or having to follow my rules, is a feature. But then, stopping someone from using something in a way you don't like is just another feature of GPL software too, is it not?
I don't know the legal details, but I can't imagine that the GPL would do anything about how you use it in your home? How is that enforceable?
Again, I don't know the legal details, but I think in practical terms this affects companies trying to own a project.
> using something in a way you don't like
You're mischaracterizing what I'm saying. For one thing, you're talking about "someone" when I'm talking about "someone with power". Copyleft isn't about two people, one gaining power over the other. It's about lots of people with no power protecting themselves against one entity with a lot of power to impose itself.
> Do you have an example of that happening?
Are you new to HN? Every month there's news of projects where someone tries to wrest power from contributors using various shenanigans. Copyleft protects against a class of such attacks.
I'm not worried it might be the case. I'm certain that ubuntu and everyone else replacing gnu stuff with rust MIT stuff is done with the sole purpose of getting rid of copyleft components.
If the new components were GPL licensed there would be less opposition, but we just get called names and our opinions discarded. After all such companies have more effective marketing departments.
The vast majority of open-source software is written by people whose day job is building empires on top of other open-source software, at zero cost and without releasing modifications, which is harder to do with the GPL.
A thru-line of some of the gnarliest vulnerabilities here is PGP's insane packet system, where a PGP message is a practically arbitrary stream of packets, some control and some data, with totally incoherent cryptographic bindings. It's like something in between XMLDSIG (which pulls cryptographic control data out of random places in XML messages according to attacker-controlled tags) and SSL2 (with no coherent authentication of the complete handshake).
The attack on detached signatures (attack #1) happens because GnuPG needs to run a complicated state machine that can put processing into multiple different modes, among them three different styles of message signature. In GPG, that whole state machine apparently collapses down to a binary check of "did we see any data so that we'd need to verify a signature?", and you can selectively flip that predicate back and forth by shoving different packets into the message stream, even if you've already sent data that needs to be verified.
The malleability bug (attack #4) is particularly slick. Again, it's an incoherent state machine issue. GPG can "fail" to process a packet because it's cryptographically invalid. But it can also fail because the message framing itself is corrupted. Those latter non-cryptographic failures are handled by aborting the processing of the message, putting GPG into an unexpected state where it's handling an error and "forgetting" to check the message authenticator. You can CBC-bitflip known headers to force GPG into processing DEFLATE compression, and mangle the message such that handling the message prints the plaintext in its output.
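PGP's CFB variant differs in detail, but the malleability being exploited is the generic one for unauthenticated block-cipher modes: XOR a byte of one ciphertext block, and the corresponding plaintext byte of the next block flips the same way. A toy CBC demo with openssl (the key/iv are throwaway values, not how PGP derives anything):

```shell
key=000102030405060708090a0b0c0d0e0f101112131415161718191a1b1c1d1e1f
iv=000102030405060708090a0b0c0d0e0f
printf 'block-one-is-16battack-at-dawn!!' > pt    # exactly two 16-byte blocks
openssl enc -aes-256-cbc -nosalt -nopad -K "$key" -iv "$iv" -in pt -out ct

# XOR byte 10 of ciphertext block 1 with 'd' ^ 'l' = 0x08 ...
b=$(od -An -tx1 -j10 -N1 ct | tr -d ' ')
printf "\\$(printf '%03o' $((0x$b ^ 0x08)))" \
  | dd of=ct bs=1 seek=10 conv=notrunc 2>/dev/null

# ... and block 2 now decrypts to "attack-at-lawn!!" (block 1 becomes garbage),
# with no integrity error from the cipher itself
openssl enc -d -aes-256-cbc -nosalt -nopad -K "$key" -iv "$iv" -in ct | tail -c 16
```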
The formfeed bug (#3) is downright weird. GnuPG has special handling for `\f`; if it occurs at the end of a line, you can inject arbitrary unsigned data, because of GnuPG's handling of line truncation. Why is this even a feature?
Some of these attacks look situational, but that's deceptive, because PGP is (especially in older jankier systems) used as an encryption backend for applications --- Mallory getting Alice to sign or encrypt something on her behalf is an extremely realistic threat model (it's the same threat model as most cryptographic attacks on secure cookies: the app automatically signs stuff for users).
There is no reason for a message encryption system to have this kind of complexity. It's a deep architectural flaw in PGP. You want extremely simple, orthogonal features in the format, ideally treating everything as clearly length-delimited opaque binary blobs. Instead you get a Weird Machine, and talks like this one.
Something that will encrypt using AES-256 with a passphrase, but also using asymmetric crypto. Oh, and I want my secret keys printable if needed. And I want to store them securely on YubiKeys once generated (https://github.com/drduh/YubiKey-Guide). I want to be able to encrypt my backups to multiple recipients. And I want the same keys (stored on Yubikeys, remember?) to be usable for SSH authentication, too.
And by the way, if your fancy tool is written using the latest language du jour with a runtime that changes every couple of years or so, or requires huge piles of dependencies that break if you even as much as sneeze (python, anyone?), it won't do.
BTW, in case someone says "age", I actually followed that advice and set it up just to be there on my systems (managed by ansible). Apart from the fact that it really slowed down my deployments, the thing broke within a year. And I didn't even use it. I just wanted to see how reliable it will be in the most minimal of ways: by having it auto-installed on my systems.
If your fancy tool has less than 5 years of proven maintenance record, it won't do. Encryption is for the long term. I want to be able to read my stuff in 15-30 years.
So before you go all criticizing GnuPG, please understand that there are reasons why people still use it, and are actually OK with the flaws described.
The entire point of every single valid criticism of PGP is that you cannot make a single functionally equivalent alternative to PGP. You must use individual tools that are good at specific things, because the "Swiss Army knife" approach to cryptographic tool design has yielded empirically poor outcomes.
If you have an example of how age broke for you, I think its maintainers would be very interested in hearing that -- I've been using it directly and indirectly for 5+ years and haven't had any compatibility or runtime issues with it, including when sharing encrypted files across different implementations of age.
Is this a comparable complaint even worth mentioning, and if it is, are you sure you actually need cryptography? It slowed things down a bit, so you don't want to move on from GnuPG, which is demonstrably too complex not to have bugs?
Stop asking for it, for your own good, please. If you don't understand the entire spec you can't use it safely.
You want special purpose tools. Signal for communication, Age for safer file encryption, etc.
What exact problems did you have with age? You're not explaining how it broke anything. Are you compiling it yourself? age has YubiKey support and can do everything you described.
> if your fancy tool has less than 5 years of proven maintenance record, it won't do. Encryption is for the long term. I want to be able to read my stuff in 15-30 years.
This applies to algorithms; it does not apply to cryptographic software in the same way. The state of the art changes fast, and while algorithms tend to stand for a long time these days, there are significant changes in protocol designs and attack methods.
Downgrade protection, malleability protection, sidechannel protection, disambiguation, context binding, etc...
You want software to be implemented by experts using known best practices with good algorithms and audited by other experts.
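The "context binding" item in the list above is easy to illustrate with a toy encrypt-then-MAC construction: the authentication tag covers not just the ciphertext but also a context label (protocol, version, purpose), so a ciphertext lifted from one context fails verification in another. This is a hedged stdlib sketch with invented labels, not a production design; a real system would use a vetted AEAD such as AES-GCM or ChaCha20-Poly1305 with associated data:

```python
import hashlib
import hmac
import secrets

def tag(key: bytes, context: bytes, ciphertext: bytes) -> bytes:
    # Bind the context into the MAC with length-prefixed framing so that
    # (context, ciphertext) pairs can never be reinterpreted ambiguously.
    msg = len(context).to_bytes(4, "big") + context + ciphertext
    return hmac.new(key, msg, hashlib.sha256).digest()

def verify(key: bytes, context: bytes, ciphertext: bytes, t: bytes) -> bool:
    # Constant-time comparison to avoid a timing sidechannel.
    return hmac.compare_digest(tag(key, context, ciphertext), t)

key = secrets.token_bytes(32)
ct = b"...opaque ciphertext bytes..."
t = tag(key, b"backup-v1", ct)

assert verify(key, b"backup-v1", ct, t)      # correct context: accepted
assert not verify(key, b"email-v1", ct, t)   # replayed elsewhere: rejected
```

The same framing trick (length-prefix everything, never concatenate raw fields) is also what provides the "disambiguation" property in the list.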
The cryptographic requirements of different problems --- backup, package signing, god-help-us secure messaging --- are in tension with each other. No one design adequately covers all the use cases. Trying to cram them all into one tool is a sign that something other than security is the goal. If that's the case, you're live action roleplaying, not protecting people.
I'd be interested in whether you could find a cryptographer who disagrees with that. I've asked around!
[†] I am aware that SAK nerds love the saw.
I'm very curious about this. Tell me more.
If you read the various how-tos out there, it is not that hard to use; people just do not want to read anything longer than two lines. That is the main issue.
My only complaint is that Thunderbird now uses its own homegrown encryption, thus locking you into their email client. Almost all email clients seem to have their own way of doing encryption, confusing matters even more. I now use mutt because it can be easily linked to GnuPG and it does not lock me into a specific client.
The video linked above contains multiple examples of people using GnuPG's CLI in ways that it was seemingly intended to be used. Blaming users for holding it wrong seems facile.
Seems like a legitimate difference of opinion. The researcher wants a message with an invalid format to return an integrity failure message. Presumably the GnuPG project thinks that would be better handled by some sort of bad-format error.
The exploit here is a variation on the age-old idea of tricking a PGP user into decrypting an encrypted message and then sending the result to the attacker. The novelty here is the idea of making the encrypted message look like a PGP key (identity) and then asking the victim to decrypt the fake key, sign it, and upload it to a keyserver.
Modifying a PGP message file will break the normal PGP authentication[1] (this was not acknowledged in the attack description). So here is the exploit:
* The victim receives an unauthenticated/anonymous (unsigned or with a broken signature) message from the attacker. The message looks like a public key.
* Somehow (perhaps in another anonymous message) the attacker claims they are someone the victim knows and asks them to decrypt, sign and upload the signed public key to a keyserver.
* The victim sees nothing wrong with any of this and actually does what the attacker wants, ignoring the error message about the bad message format.
So this attack is also quite unlikely. Possibly that affected the decision of the GnuPG project to not change behaviour in this case, particularly when such a change could possibly introduce other vulnerabilities.
[1] https://articles.59.ca/doku.php?id=pgpfan:pgpauth
Added: Wait. How would the victim import the bogus PGP key into GPG so they could sign it? There would normally be a preexisting key for that user so the bogus key would for sure fail to import. It would probably fail anyway. It will be interesting to see what the GnuPG project said about this in their response.
Keys (even quantum safe) are small enough that having one per application is not a problem at all. If an application needs multi-context, they can handle it themselves. If they do it badly, the damage is contained to that application. If someone really wants to make an application that just signs keys for other applications to say "this is John Smith's key for git" and "this is John Smith's key for email" then they could do that. Such an application would not need to concern itself with permissions for other applications calling into it. The user could just copy and paste public keys, or fingerprints when they want to attest to their identity in a specific application.
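One way to realize "one key per application" without juggling many secrets is to derive each application's key from a single master secret with a labeled PRF; generating fully independent keys, as the comment suggests, works just as well since keys are small. A hedged stdlib sketch (the application labels are invented; a real system would use HKDF from a vetted library):

```python
import hashlib
import hmac
import secrets

def app_key(master: bytes, app_id: str) -> bytes:
    # Each distinct label yields an independent-looking 32-byte key from
    # the same master secret; compromise of one derived key does not
    # reveal the master or any sibling key.
    return hmac.new(master, b"app-key:" + app_id.encode(), hashlib.sha256).digest()

master = secrets.token_bytes(32)
git_key = app_key(master, "git-signing")
mail_key = app_key(master, "email")

assert git_key != mail_key                        # distinct per application
assert git_key == app_key(master, "git-signing")  # deterministic, re-derivable
```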
The keyring circus (which is how GPG most commonly intrudes into my life) is crazy too. All these applications insist on connecting to some kind of GPG keyring instead of just writing the secrets to the filesystem in their own local storage. The disk is fully encrypted, and applications should be isolated from one another. Nothing is really being accomplished by requiring the complexity of yet another program to "extra encrypt" things before writing them to disk.
I'm sure these bad ideas come from the busy work invented in corporate "security" circles, which invent complexity to keep people employed without any regard for an actual threat model.
For most apps on non-mobile devices, there isn't filesystem isolation between apps. Disk/device-level encryption solves for a totally different threat model; Apple/Microsoft/Google all ship encrypted storage for secrets (Keychain, Credential Manager, etc), because restricting key material access within the OS has merit.
> I'm sure these bad ideas come from the busy work invented in corporate "security" circles, which invent complexity to keep people employed without any regard for an actual threat model.
Basically everything in PGP/GPG predates the existence of "corporate security circles".
If you use cryptographic command line tools to verify data sent to you, be mindful of what you are doing and make sure to understand the attacks presented here. One of the slides is titled "should we even use command line tools", and yes, we should, because the alternative is worse; but we must be diligent in treating all untrusted data as adversarial.
Handling untrusted input is core to that.
In any case, I figured storing an SSH key in 1Password and using the integrated SSH agent socket with my SSH client and git was pretty nice and secure enough. The fact that the private key never leaves the 1Password vault unencrypted and is synced between my devices is pretty neat. From a security standpoint it is indeed a step down from having my key on a physical key device, but the hassle of setting up a new YubiKey was not quite worth it.
I’m sure 1Password is not much better than having a passphrase-protected key on disk. But it’s a lot more convenient.
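For reference, the 1Password SSH agent integration amounts to pointing your SSH client's `IdentityAgent` at the socket 1Password exposes. The macOS socket path below matches 1Password's documentation at the time of writing, but treat it as an assumption and verify it for your platform and install:

```
# ~/.ssh/config
Host *
  IdentityAgent "~/Library/Group Containers/2BUA8C4S2C.com.1password/t/agent.sock"
```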
I’m still working through how to use this, but I have it basically set up and it’s great!
I see this sentiment a lot, but you later hint at the problem. Any "replacement" needs to solve for secure key distribution. Signing isn't hard, you can use a lot of different things other than gpg to sign something with a key securely. If that part of gpg is broken, it's a bug, it can/should be fixed.
The real challenge is distributing the key so someone else can verify the signature, and almost every way to do that is fundamentally flawed, introduces a risk of operational errors or is annoying (web of trust, trust on first use, central authority, in-person, etc). I'm not convinced the right answer here is "invent a new one and the ecosystem around it".
This is why basically every modern usage of GPG either doesn't rely on key distribution (because you already know what key you want to trust via a pre-established channel) or devolves to the other party serving up their pubkey over HTTPS on their website.
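Trust-on-first-use, mentioned above, is simple enough to sketch: pin a fingerprint of the key the first time you see it, and alarm on any later mismatch. A minimal illustration with invented storage and host names; real TOFU, as in OpenSSH's known_hosts, also has to handle deliberate key rotation:

```python
import hashlib

def fingerprint(pubkey: bytes) -> str:
    # A stable fingerprint of the raw public key bytes.
    return hashlib.sha256(pubkey).hexdigest()

def check_tofu(store: dict, host: str, pubkey: bytes) -> bool:
    """Pin on first use; afterwards, accept only the pinned key."""
    fp = fingerprint(pubkey)
    if host not in store:
        store[host] = fp          # first contact: trust and pin
        return True
    return store[host] == fp      # later contacts: must match the pin

pins = {}
assert check_tofu(pins, "example.org", b"key-A")      # first use: pinned
assert check_tofu(pins, "example.org", b"key-A")      # same key: accepted
assert not check_tofu(pins, "example.org", b"key-B")  # changed key: rejected
```

The hard part, as the parent says, is everything around this: where the pin store lives, what the user does when a pin mismatches, and how legitimate rotation is communicated.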
(We’re also long past the point where key distribution has been a significant component of the PGP ecosystem. The PGP web of trust and original key servers have been dead and buried for years.)
What do you mean? Web of Trust? Keyservers? A combination of both? Under what use case?
> As others have pointed out, GnuPG is a C codebase with a long history (going on 28 years). On top of that, it's a codebase that is mostly uncovered by tests, and has no automated CI. If GnuPG were my project, I would also be anxious about each change I make. I believe that because of this the LibrePGP draft errs on the side of making minimal changes, with the unspoken goal of limiting risks of breakage in a brittle codebase with practically no tests. (Maybe the new formats in RFC 9580 are indeed "too radical" of an evolutionary step to safely implement in GnuPG. But that's surely not a failing of RFC 9580.)
* https://articles.59.ca/doku.php?id=pgpfan:schism
Nothing has improved and everything has gotten worse since I wrote that. Both factions are sleepwalking into an interoperability disaster. Supporting one faction or the other just means you are part of the problem. The users have to resist being made pawns in this pointless war.
>Maybe the new formats in RFC 9580 are indeed "too radical" of an evolutionary step to safely implement in GnuPG.
Traditionally the OpenPGP process has been based on minimalism and rejected everything without a strong justification. RFC 9580 is basically everything that was rejected by the LibrePGP faction (GnuPG) in the last attempt to come up with a new standard. It contains a lot of poorly justified stuff and some straight-up pointless stuff. So just supporting RFC 9580 is not the answer here; it would require significant cleaning up. But again, just supporting LibrePGP is not the answer either. The process has failed yet again and we need to recognize that.
But trust in Werner Koch is gone. Wontfix??
People who are serious about security use newer, better tools that replace GPG. But keep in mind, there’s no “one ring to rule them all”.
Why do high-profile projects, such as Linux and QEMU, still use GPG for signing pull requests / tags?
https://docs.kernel.org/process/maintainer-pgp-guide.html
https://www.qemu.org/docs/master/devel/submitting-a-pull-req...
Why does Fedora / RPM still rely on GPG keys for verifying packages?
This is a staggering ecosystem failure. If GPG has been a known-lost cause for decades, then why haven't alternatives ^W replacements been produced for decades?
Archive link: https://web.archive.org/web/20251227174414/https://www.gnupg...
(PGP/GPG are of course hamstrung by their own decision to be a Swiss Army knife/only loosely coupled to the secure operation itself. So the even more responsible thing to do is to discard them for purposes that they can’t offer security properties for, which is the vast majority of things they get used for.)
Deleted Comment
Most people have never heard of it and never used it.
But Werner at this point has a history of irresponsible decisions like this, so it's sadly par for the course by now.
Another particularly egregious example: https://dev.gnupg.org/T4493
Dead Comment
Dead Comment
As they said, they were on it...
GnuPG, for all its flaws, has a copyleft license (GPLv3), making it difficult to "embrace, extend, extinguish". If you replace it with a project that becomes more successful but has a less protective (for users) license, "we the people" might lose control of it.
Not everything in software is about features.
Yeah; I actually used to do that too (use the "default license"), but eventually came to the same realisation and have been moving all my projects to full copyleft.
For me, I am of two minds. On one hand, the fact that billion-dollar empires are built on top of what is essentially unpaid volunteer work does rankle and makes me much more appreciative of copyleft.
On the other hand, most of my hobbyist programming work has continued to be released under some form of permissive license, and this is more of a reality of the fact that I work in ecosystems where use of the GPL isn't merely inconvenient, but legally impossible, and the pragmatism of permissive licenses win out.
I do wish that weak copyleft like the Mozilla Public License had caught on as a sort of middle ground, but it seems like those licenses are rare enough that their use would invite as much scrutiny as the GPL, even if it was technically allowed. Perhaps the FSF could have advocated more strongly for weak copyleft in areas where the GPL was legally barred, but I suppose they were too busy not closing the network hole in the GPLv3 to bother.
But at some point, for things like, a very small-but-useful library or utility, I had a change of heart. I felt that it's better for the project to use non-copyleft licenses.
I do this as a rule now for projects where the scope is small and the complexity of a total rewrite is not very large for several engineers at a large company.
For small stuff, the consideration is, I want people to use it, period.
When devs look at open source stuff and see MIT / Apache, they know they can use it no questions asked. When they see GPL etc. then they will be able to use it in some cases and not others depending on what they are working on. I don't want to have that friction if it's not that important.
For a lot of stuff I publish, it's really just some small thing that I tried to craft thoughtfully and now I want to give it away and hope that someone else benefits. Sometimes it gets a few million downloads and I get feedback, and I just like that experience. Often whatever the feedback is it helps me make the thing better which benefits my original use case, or I just learn things from the experience.
Often I'm not trying to build a community of developers around that project -- it's too small for that.
I still like the GPL and I have nothing against it. If I started working on something that I anticipated becoming really large somehow, I might try to make it GPL. And I feel great about contributing to large GPL projects.
I just feel like even though I'm friendly to the GPL, it's definitely no longer my default, because I tend to try to publish very small useful units. And somehow I've convinced myself that it's better for the community and for the projects themselves if those kind of things are MIT / Apache / WTFPL or similar.
I hope that makes sense.
I realized that I can be seen as one of those people who treat the GPL as weird or not normal, because I don't really use it anymore. But I'm not trying to be an enemy of the GPL or enable embrace-extend-extinguish tactics. It's just that it's a very nuanced thing for me nowadays, I guess. Your comment caused me to reflect on this.
Maybe have it run the CLI in compatibility mode when called as `gpg`, but present a completely new interface when called normally.
A company adopts some software with a free but not copyleft license. "Adopts" means they declare: "this is good, we will use it".
Developers help develop the software (free of charge) and the company says thank you very much for the free labour.
Company puts that software into everything it does, and pushes it into the infrastructure of everything it does.
Some machines run that software because an individual developer put it there; other machines run that software because a company put it there, sometimes by the company exerting some sort of power for it to end up there (for example, economic incentives to vendors, as with Android).
At some point the company says: "you know what, we like this software so much that we're going to fork it, but the fork isn't going to be free or open source. It's going to be just ours, and we're not going to share the improvements we made."
But now that software is already running in a lot of machines.
Then the company says "we're going to tweak the software a bit, so that it's no longer inter-operable with the free version. You have to install our proprietary version, or you're locked out" (out of whatever we're discussing hypothetically. Could be a network, a standard, a protocol, etc).
Developers go "shit, I guess we need to run the proprietary version now. we lost control of it."
This is what happened, e.g., with Chrome. There's Chromium; anyone can build it. But that's not Chrome. And Chrome is what everybody uses, because Google has lock-in power. Then Google says "oh, we're going to disallow you from running the extensions you like, so we can show you more ads". Then they make tweaks to Chrome so that websites only get rendered well if they use certain APIs, so now competitors to Chrome are forced to implement those APIs, but those aren't public.
And all of this was initially built with free labour, which Google took, from people who thought they were contributing to some sort of commons.
Copyleft licenses protect against this. Part of the license says: "if you use this license and you make changes to the software, you have to share the changes as well; you can't keep them for yourself."
Deleted Comment
No... I don't think that's how software works. Do you have an example of that happening? Has any foss project lost control of the "best" version of some software?
> Not everything in software is about features.
I mean, I would happily make the argument that letting people use my work without needing my permission and without following my rules is itself a feature. But then, stopping someone from using something in a way you don't like is just another feature of GPL software too, is it not?
Again, I don't know the legal details, but I think in practical terms this affects companies trying to own a project.
> using something in a way you don't like
You're mischaracterizing what I'm saying. For one thing, you're talking about "someone" when I'm talking about "someone with power". Copyleft isn't about two people, one gaining power over the other. It's about lots of people with no power protecting themselves against one entity with a lot of power to impose itself.
> Do you have an example of that happening?
Are you new to HN? Every month there's news of projects trying to wrest power from contributors using various shenanigans. Copyleft protects against a class of such attacks.
E.g. Oracle and OpenOffice, Red Hat and CentOS.
Edit: this is literally on HN right now. Is this your first day here or something? https://old.reddit.com/r/linux/comments/1puojsr/the_device_t...
If the new components were GPL licensed there would be less opposition, but we just get called names and our opinions discarded. After all such companies have more effective marketing departments.
The attack on detached signatures (attack #1) happens because GnuPG needs to run a complicated state machine that can put processing into multiple different modes, among them three different styles of message signature. In GPG, that whole state machine apparently collapses down to a binary check of "did we see any data so that we'd need to verify a signature?", and you can selectively flip that predicate back and forth by shoving different packets into the message stream, even if you've already sent data that needs to be verified.
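This is not GnuPG's actual code, but the failure mode can be reduced to a toy: if the verifier tracks "do we need to verify a signature?" as a single boolean that any later packet can reset, an attacker who appends packets after the signed data controls the final answer. A hedged sketch with invented packet types:

```python
# Toy packet processor with the flawed design: one boolean instead of a
# proper state machine. The packet kinds are invented for illustration.
def process_flawed(packets):
    needs_sig_check = False
    for kind, payload in packets:
        if kind == "literal-data":
            needs_sig_check = True    # data that must be verified was seen
        elif kind == "plaintext-note":
            needs_sig_check = False   # oops: a later packet resets the flag
    return needs_sig_check            # does the caller demand verification?

signed_stream = [("literal-data", b"pay $100 to Bob")]
assert process_flawed(signed_stream)  # verification would be demanded

# Attacker appends one packet; the same signed data now skips verification.
tampered = signed_stream + [("plaintext-note", b"")]
assert not process_flawed(tampered)
```

A correct design would treat any packet arriving after verifiable data as a hard error instead of silently updating a flag.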
The malleability bug (attack #4) is particularly slick. Again, it's an incoherent state machine issue. GPG can "fail" to process a packet because it's cryptographically invalid. But it can also fail because the message framing itself is corrupted. Those latter non-cryptographic failures are handled by aborting the processing of the message, putting GPG into an unexpected state where it's handling an error and "forgetting" to check the message authenticator. You can CBC-bitflip known headers to force GPG into processing DEFLATE compression, and mangle the message such that handling the message prints the plaintext in its output.
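The CBC bit-flip mentioned above is a structural property of the mode, not of any particular cipher: in CBC decryption, P_i = D(C_i) XOR C_{i-1}, so flipping bit b of ciphertext block C_{i-1} flips exactly bit b of plaintext block P_i. A sketch using a toy XOR "block cipher" purely to show the structure; nothing here is secure:

```python
import hashlib
import hmac

BLOCK = 16

def toy_block_encrypt(key, block):
    # Stand-in for AES: XOR with a key-derived pad. Insecure, but the CBC
    # malleability shown below is identical with a real block cipher.
    pad = hmac.new(key, b"pad", hashlib.sha256).digest()[:BLOCK]
    return bytes(a ^ b for a, b in zip(block, pad))

toy_block_decrypt = toy_block_encrypt  # XOR is its own inverse

def cbc_encrypt(key, iv, pt):
    ct, prev = [], iv
    for i in range(0, len(pt), BLOCK):
        mixed = bytes(a ^ b for a, b in zip(pt[i:i + BLOCK], prev))
        prev = toy_block_encrypt(key, mixed)
        ct.append(prev)
    return b"".join(ct)

def cbc_decrypt(key, iv, ct):
    pt, prev = [], iv
    for i in range(0, len(ct), BLOCK):
        block = ct[i:i + BLOCK]
        pt.append(bytes(a ^ b for a, b in zip(toy_block_decrypt(key, block), prev)))
        prev = block
    return b"".join(pt)

key, iv = b"k" * 16, b"\x00" * 16
pt = b"block-one-is-16b" + b"attack-at-dawn!!"
ct = bytearray(cbc_encrypt(key, iv, pt))

ct[0] ^= ord("a") ^ ord("x")            # flip chosen bits in C_0 ...
out = cbc_decrypt(key, iv, bytes(ct))
assert out[16:] == b"xttack-at-dawn!!"  # ... and P_1 changes predictably
assert out[:16] != pt[:16]              # P_0 also changes (a real cipher
                                        # would garble it completely)
```

Without an authenticator check enforced on every exit path, this kind of targeted edit goes unnoticed, which is exactly what the error-handling bug above enables.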
The formfeed bug (#3) is downright weird. GnuPG has special handling for `\f`; if it occurs at the end of a line, you can inject arbitrary unsigned data, because of GnuPG's handling of line truncation. Why is this even a feature?
Some of these attacks look situational, but that's deceptive, because PGP is (especially in older jankier systems) used as an encryption backend for applications --- Mallory getting Alice to sign or encrypt something on her behalf is an extremely realistic threat model (it's the same threat model as most cryptographic attacks on secure cookies: the app automatically signs stuff for users).
There is no reason for a message encryption system to have this kind of complexity. It's a deep architectural flaw in PGP. You want extremely simple, orthogonal features in the format, ideally treating everything as clearly length-delimited opaque binary blobs. Instead you get a Weird Machine, and talks like this one.
Amazing work.
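The "length-delimited opaque blobs" design argued for above is trivially simple to parse: there is no mode-switching state machine, and truncation is a hard error rather than a recoverable condition. A minimal sketch of such a framing, with field tags invented for illustration:

```python
import struct

def pack(fields):
    """Encode (tag, value) pairs as: 1-byte tag, 4-byte big-endian length, value."""
    out = bytearray()
    for tag, value in fields:
        out += struct.pack(">BI", tag, len(value)) + value
    return bytes(out)

def unpack(blob):
    fields, off = [], 0
    while off < len(blob):
        tag, length = struct.unpack_from(">BI", blob, off)
        off += 5
        if off + length > len(blob):
            raise ValueError("truncated field")  # hard error, never a guess
        fields.append((tag, blob[off:off + length]))
        off += length
    return fields

msg = pack([(1, b"ciphertext..."), (2, b"auth tag")])
assert unpack(msg) == [(1, b"ciphertext..."), (2, b"auth tag")]
```

The parser's entire state is one offset; there is nothing for an attacker's crafted packet to flip.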