tux3 · 6 months ago
The NIST report that explains the decision: https://nvlpubs.nist.gov/nistpubs/ir/2025/NIST.IR.8545.pdf#p...
buu700 · 6 months ago
Really glad to see it. It sounds like their reasoning is essentially the same as mine for implementing HQC alongside Kyber in Cyph three years ago: having a different theoretical basis makes it highly complementary to Kyber/ML-KEM, but its public key sizes are much more practical than those of the more well known code-based candidate Classic McEliece.
some_furry · 6 months ago
The title is subtly incorrect: HQC is the fifth post-quantum cryptography algorithm selected for standardization by NIST, but it's only the second KEM (encryption-like).

> NIST selects HQC as fifth algorithm for post-quantum encryption

The other 3 are digital signature algorithms, not encryption.

whimsicalism · 6 months ago
Given how quickly quantum is potentially coming, I wonder if we should/could find some way of using multiple quantum-resistant algorithms simultaneously as a default, in case a flaw is found in one of them after the limited time we have to verify that there are none.

Also - should we not be switching over to these algorithms starting like... now? Am I wrong that anyone collecting https traffic now will be able to break it in the future?

jonmon6691 · 6 months ago
Correct, KEMs should be replaced ASAP since they are currently vulnerable to store-now-decrypt-later attacks. Digital signature algorithms are less urgent, but considering how long it takes to roll out new cryptography standards, the post-quantum ones should be preferred for any new designs. That said, the new PQC signatures are much larger than the 64-byte Ed25519 signatures that are most common today, which could end up being very difficult to integrate into embedded systems with low bandwidth or limited memory, e.g. CAN bus secure diagnostics, Meshtastic nodes, etc.
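
For a rough sense of the size gap, here are approximate signature sizes for common parameter sets (ballpark figures; check the final FIPS documents for exact numbers):

    # Approximate signature sizes in bytes (ballpark figures, not exact for every parameter set)
    SIGNATURE_SIZES = {
        "Ed25519": 64,                    # classical elliptic-curve baseline
        "Falcon-512 (FN-DSA)": 666,       # lattice-based, the most compact PQC option
        "ML-DSA-44 (Dilithium2)": 2420,   # lattice-based, NIST's primary signature pick
        "SLH-DSA-128s (SPHINCS+)": 7856,  # hash-based, conservative but large
    }

    for name, size in SIGNATURE_SIZES.items():
        print(f"{name:>25}: {size:5d} bytes (~{size // 64}x an Ed25519 signature)")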
daghamm · 6 months ago
Sorry, can you expand on the issues with KEM?
throw_pm23 · 6 months ago
Sorry if my question appears ignorant, but how quickly is quantum really coming? If your prior belief is "nothing practical is ever likely to come out of quantum computing", then so far there is nothing that would seriously suggest you reconsider it.

I do not say this lightly, having followed the academic side of QC for more than a decade.

FateOfNations · 6 months ago
Given how seriously the spooky parts of the US government are taking it, I would treat it with a similar level of urgency. While we obviously aren't privy to everything they know, their public actions indicate that they don't think it's a hypothetical risk, and that it's something we will need to be ready for within the next decade or so, given technology refresh cycles: they need to get these crypto algorithms in place now, for devices that will still be in production use well into the 2030s.

There's also a good chance that the initial compromises of the classical algorithms won't be made public, at least initially. There are multiple nation-state actors working on the problem in secret, in addition to the academic and commercial entities working on it more publicly.

grayhatter · 6 months ago
It's a reasonable question. The need for quantum-resistant crypto isn't because a practical attack is right around the corner (though I do enjoy the analogy that predicting when we'll get QC-based crypto attacks is like predicting when humans will land on the moon by looking at the altitude of the highest manned flight so far). It has more to do with the level of effort it takes to replace infrastructure as critical as cryptography.

Imagine if next year, via magic wand, all current TLS systems were completely and totally broken, such that the whole of the internet using TLS became effectively unencrypted. How much damage would that do to the ecosystem? Now suppose we had also just invented a new protocol that works: how long would it take to deploy it to just 50% of systems? To 80%? And how long would it take to replace the long tail?

I'll also leave record-now-decrypt-later for another commenter.

kerkeslager · 6 months ago
Based on my own following of QC, I agree with you that we're still pretty far away from QC attacks on current crypto.

The problem is, this sort of question suffers from a lot of unknown unknowns. How confident are you that we don't see crypto broken by QC in the next 10 years? The next 20? Whatever your confidence, the answer is probably "not confident enough" because the costs of that prediction being wrong are incalculable for a lot of applications.

I'd say I'm 99% confident we will not see QC break any crypto considered secure now in the next 20 years. But I'll also say that the remaining 1% is more than enough risk that I think governments and industry should be taking major steps to address that risk.

MJGrzymek · 6 months ago
Isn't that trivial in a sense? Encrypt with layer 1, then use that encrypted channel to send layer 2 (and so on). Not sure about the performance.

Signal has a post about using pre and post-quantum together: https://signal.org/blog/pqxdh/

> The essence of our protocol upgrade from X3DH to PQXDH is to compute a shared secret, data known only to the parties involved in a private communication session, using both the elliptic curve key agreement protocol X25519 and the post-quantum key encapsulation mechanism CRYSTALS-Kyber. We then combine these two shared secrets together so that any attacker must break both X25519 and CRYSTALS-Kyber to compute the same shared secret.
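
A minimal sketch of that "combine both shared secrets" step, assuming ss_x25519 and ss_kyber are the raw shared secrets already produced by the two exchanges (the real PQXDH KDF binds in more context and uses HKDF rather than a bare hash):

    import hashlib

    def combine_shared_secrets(ss_x25519: bytes, ss_kyber: bytes, transcript: bytes) -> bytes:
        # An attacker must recover BOTH inputs to reproduce the session key.
        # Toy KDF for illustration only; PQXDH specifies an HKDF construction.
        return hashlib.sha256(ss_x25519 + ss_kyber + transcript).digest()

    # hypothetical placeholder inputs, purely for illustration
    session_key = combine_shared_secrets(b"\x01" * 32, b"\x02" * 32, b"handshake transcript")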

chowells · 6 months ago
You generally don't want to layer encryption like that. It apparently really does introduce new kinds of attacks, which have been observed in the real world.

The pattern typically used for this is that the key for the high-speed symmetric encryption is split into multiple parts, each of which is encrypted with a separate public key system. One classical, one (or two, now?) with a post-quantum algorithm. As long as each part of the key is big enough, this still means you'd need to crack all the separate public key algorithms, but doesn't introduce any of the layering weaknesses.
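
A minimal sketch of that key-splitting pattern, with the two public-key wrappings left abstract (any real design would also bind the shares to the session transcript):

    import os

    def split_key(key: bytes) -> tuple[bytes, bytes]:
        # XOR secret sharing: each share alone is uniformly random,
        # so neither reveals anything about the key by itself.
        share1 = os.urandom(len(key))
        share2 = bytes(a ^ b for a, b in zip(key, share1))
        return share1, share2

    def recombine(share1: bytes, share2: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(share1, share2))

    key = os.urandom(32)          # high-speed symmetric key
    s1, s2 = split_key(key)
    # s1 would be wrapped with the classical scheme (e.g. X25519/ECDH),
    # s2 with a post-quantum KEM (e.g. ML-KEM); recovering the key
    # requires breaking both.
    assert recombine(s1, s2) == key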

__MatrixMan__ · 6 months ago
It's not clear whether NIST's recommendations are influenced by their relationship with the NSA (who would have an incentive to pressure NIST to recommend an algorithm which they--and hopefully they alone--can break). So perhaps the reasonable move is to use only one NIST-recommended algorithm and have the other layer be one that they do not recommend.
jeffrallen · 6 months ago
Or just ask NIST to STFU about cryptographic systems entirely, because their integrity is completely destroyed by their association with NSA.
EGreg · 6 months ago
What is your favorite post-quantum encryption approach?

I think lattice-based ones will eventually be broken by a quantum algorithm. I am fully on board with Lamport signatures and SPHINCS+.

buu700 · 6 months ago
In Cyph, I went with Kyber (lattice-based) combined with HQC (code-based) for encryption. NTRU Prime also may be a good option if Kyber is ever broken in a way that doesn't fundamentally break all lattice crypto.

For signing, I treat Dilithium (lattice-based) as "standard security" and SPHINCS+ (hash-based) as "high security". In particular, the former is used for end user public keys and certificates, while the latter is used for code signing where the larger public key and signature sizes are less of an issue.

In all cases, I wouldn't use PQC without combining it with classical crypto, in Cyph's case X25519/Ed25519. Otherwise you run the risk of a situation like SIDH/SIKE where future cryptanalysis of a particular PQC algorithm finds that it isn't even classically secure, and then you're hosed in the present.
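
To make the hybrid idea concrete on the signing side, here's a sketch of "accept only if both signatures verify"; the post-quantum verifier is a hypothetical stand-in, not a specific library API:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    def verify_hybrid(message: bytes,
                      ed_pub: Ed25519PublicKey, ed_sig: bytes,
                      pq_verify, pq_pub: bytes, pq_sig: bytes) -> bool:
        # Valid only if BOTH the classical and the post-quantum signatures check out.
        # pq_verify(pub, message, sig) -> bool is a placeholder for an ML-DSA or
        # SLH-DSA verifier from whatever PQC library is in use.
        try:
            ed_pub.verify(ed_sig, message)   # raises InvalidSignature on failure
        except InvalidSignature:
            return False
        return pq_verify(pq_pub, message, pq_sig)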

coppsilgold · 6 months ago
> In all cases, I wouldn't use PQC without combining it with classical crypto

With hash-based signatures, hybridization isn't required. They are the most powerful signature scheme approach by far. The security assumption hash-based signatures rely on is also shared with every other signature scheme (hashes are what are signed). Other schemes come with additional assumptions.

It's unfortunate that hash-based public key exchange is not practical. Merkle Puzzles require 2^n data exchange for 2^2n security.
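
As a small illustration of why hash-based signatures need so few assumptions, here's a toy Lamport one-time signature (one key pair signs exactly one message; schemes like XMSS and SPHINCS+ then build many-time signatures on top of structures like this):

    import hashlib, os

    H = lambda b: hashlib.sha256(b).digest()

    def keygen():
        # 256 pairs of random preimages; the public key is their hashes
        sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
        pk = [(H(a), H(b)) for a, b in sk]
        return sk, pk

    def sign(sk, message: bytes):
        digest = H(message)
        bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
        return [sk[i][bit] for i, bit in enumerate(bits)]   # reveal one preimage per bit

    def verify(pk, message: bytes, sig) -> bool:
        digest = H(message)
        bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
        return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits))

    sk, pk = keygen()
    sig = sign(sk, b"hello")
    assert verify(pk, b"hello", sig) and not verify(pk, b"goodbye", sig)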

bdcravens · 6 months ago
Completely off-topic, but in today's climate, I worry about any department like this having their efforts rendered moot if they become a DOGE target.
rsfern · 6 months ago
It's not that off-topic IMO. NIST (along with all the other federal agencies) is finalizing restructuring/reorganization plans this week in response to one of last month's executive orders. It's not clear when the extent of the cuts will be announced, or what the criteria will be for which programs are affected. That said, in the Commerce secretary's confirmation hearing there was mostly goodwill towards the kind of light-touch, non-regulatory standardization approach that NIST tries to take.
throw0101c · 6 months ago
Meta: I can understand the math problems behind RSA and DH, and the general concepts of EC, but for all the post-quantum algorithms I have yet to develop an intuitive understanding, even after reading and watching a bunch of videos trying to explain them.
tptacek · 6 months ago
The intuition for ML-KEM-style lattice cryptography is actually kind of straightforward. Generate a secret vector x and a random matrix A; multiply Ax, then corrupt it with a random vector of errors e, so instead of Ax=b you have Ax+e=b.

Solving Ax=b is like week 2 of undergraduate linear algebra. Solving Ax+e=b, in a lattice setting, is Hard in the same sense factoring is: as you run the steps of elimination to attempt to solve it, the errors compound catastrophically.
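
A toy version of that setup (tiny parameters purely to show the shape of the problem; real ML-KEM works over module lattices with carefully chosen error distributions):

    import numpy as np

    q, n, m = 97, 4, 8                      # toy modulus and dimensions
    rng = np.random.default_rng(0)

    A = rng.integers(0, q, size=(m, n))     # public random matrix
    x = rng.integers(0, q, size=n)          # secret vector
    e = rng.integers(-2, 3, size=m)         # small error terms

    b = (A @ x + e) % q                     # published: (A, b); recovering x is the LWE problem

    # Without e, Gaussian elimination recovers x in polynomial time.
    # With e, elimination amplifies the errors and no longer pins x down.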

j_chance · 6 months ago
Solving `Ax=b` is easy if `x` can be any value. But if you bound `x` such that all entries must be small (e.g. only zeroes and ones) then the problem is hard: https://en.wikipedia.org/wiki/Short_integer_solution_problem

What you described would be closer to learning with errors (https://en.m.wikipedia.org/wiki/Learning_with_errors) combined with SIS/SVP. Learning with errors is based on the parity learning problem (https://en.m.wikipedia.org/wiki/Parity_learning) in machine learning, which I take as a positive sign of security.

Interestingly, you can get information-theoretically secure constructions using lattices by tuning the parameters. For example, if you make the `A` matrix large enough, it becomes statistically unlikely that more than one solution exists (e.g. generate a random binary vector `x`, and it's unlikely that a different `x'` also maps to the same `b`).
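
A brute-force toy version of the bounded-solution point (feasible only because n is tiny; at realistic sizes the 2^n search space is exactly what makes it hard):

    from itertools import product
    import numpy as np

    q, n, m = 97, 8, 4
    rng = np.random.default_rng(1)

    A = rng.integers(0, q, size=(m, n))
    x_true = rng.integers(0, 2, size=n)          # secret binary vector
    b = (A @ x_true) % q

    # Exhaustive search over all 2^n binary vectors for solutions of Ax = b (mod q)
    solutions = [np.array(v) for v in product([0, 1], repeat=n)
                 if np.array_equal((A @ np.array(v)) % q, b)]
    print(len(solutions), "binary solution(s) found")   # usually exactly one, per the uniqueness point above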

iovoid · 6 months ago
The hash-based one (SPHINCS+) is probably the easiest to understand as its security hinges on hashes being secure, and some statistics.

I found https://er4hn.info/blog/2023.12.16-sphincs_plus-step-by-step... to be a nice introduction.

FateOfNations · 6 months ago
Yeah, I'm going to have to spend some time trying to understand the math too. I was mostly waiting for things to calm down standardization-wise. This newly selected algorithm is based on error-correcting codes (the same kind used in communications protocols), which look interesting.
WaitWaitWha · 6 months ago
I was in the same exact boat. I just could not digest it.

I found AI (a combo of Grok and ChatGPT-4o) to be the best resource for this. It was able to break it down into digestible chunks, then pull it together in a way that made sense. It even made suggestions about what math areas I needed to brush up on.

bryan0 · 6 months ago
I just did the same thing. Claude (sonnet 3.7) walked me through a toy implementation of the algorithm.
bobsmooth · 6 months ago
Don't get me wrong, it's good to be prepared, but what are the chances we'll need these algorithms by 2050?
MidnightRider39 · 6 months ago
Basically zero.

Great talk by Peter Gutmann on why this whole quantum topic is bollocks: https://www.cs.auckland.ac.nz/~pgut001/pubs/bollocks.pdf

est31 · 6 months ago
hn thread about this (with some interesting comments): https://news.ycombinator.com/item?id=43046631
mmooss · 6 months ago
There are many secrets created 25 years ago that need to be secure now; think of national security secrets.
daghamm · 6 months ago
The US is pushing for these to replace classical crypto in commercial applications by 2030.

The C in CNSA 2.0 is for "Commercial".

bawolff · 6 months ago
Predicting anything 25 years out is a fool's errand.

Cryptographically relevant quantum computers are definitely not happening in the near term, but 25 years is a long enough time horizon that it is plausible.

Just consider what tech was like 25 years ago. Would anyone in 1999 (without the benefit of hindsight) really have been able to predict modern AI, the ubiquity of smartphones, etc.? Heck, 25 years ago people still thought the internet thing was a fad. Anyone trying to predict 25 years out is full of crap.

https://xkcd.com/678/

tptacek · 6 months ago
Except for RC4, which people were side-eyeing on Usenet in the 1990s, and key size progress, I think the cryptography primitives we were using 25 years ago have held up reasonably well. A lot of things that have been major issues since then are things that we knew theoretically back in 2000, but just didn't know viscerally, because people hadn't written exploits. That's not the same thing as, like, Peter Schwabe's take on whether a particular key exchange or KEM is likely to survive until 2050.
