We can expect a quantum computer with 20 million noisy qubits to break RSA 2048 [1]
I can't speak to coherence time or circuit depth concerns, but qubit counts are doubling roughly every year. Current chips have thousands of qubits, so the exponential scaling implies we'd have 20 million qubits by 2035-2040.
edit: And from the paper, the required quantum volume ("megaqubitdays") scales between O(n^3) and O(n^4) with RSA key length. So a few years after breaking RSA 2048, you'd have a computer five times larger that could break RSA 3072.
[1] https://quantum-journal.org/papers/q-2021-04-15-433/
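As a rough sanity check of that extrapolation, here is a sketch only; the one-year doubling time and the "thousands of qubits today" starting point are the parent's assumptions, and are disputed further down the thread:

    import math

    def year_reaching(target_qubits, start_qubits, start_year, doubling_time_years=1.0):
        # Years until an exponentially growing qubit count reaches a target,
        # assuming a fixed doubling time (the parent comment's assumption).
        doublings = math.log2(target_qubits / start_qubits)
        return start_year + doublings * doubling_time_years

    # 20 million noisy qubits is the figure from the paper cited above [1].
    for start in (1_000, 5_000):  # "thousands of qubits" today, taken loosely
        print(start, "->", round(year_reaching(20_000_000, start, 2023), 1))
    # Prints roughly 2037 and 2035, i.e. inside the claimed 2035-2040 window.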
Google's "quantum supremacy" paper was about a 53 qubit chip in 2019 [1]. This year, they reported having 70 qubits [2]. Looking at the papers (and supplements), the gate fidelities and qubit lifetimes have stayed roughly the same. This really doesn't look like anything like Moore's law to me.
The last time I spent about an hour reviewing the literature, I found that the size of those computers was growing at a linear rate of around 4 qubits/year.
It got slightly faster in the last few years; I don't know if that's inherent or just random fluctuation. Yet people keep repeating the claim that the growth is exponential, based on no evidence at all.
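For contrast with the doubling extrapolation above, a tiny sketch of the same 20-million-qubit target under this linear estimate (the 4 qubits/year rate is the parent's own figure, not an established one):

    # Linear growth: ~4 qubits/year, starting from roughly today's chip sizes.
    target = 20_000_000
    for start in (70, 1_000):
        print(start, "->", (target - start) / 4, "years")
    # Roughly 5 million years either way, which is the parent's point.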
If we hadn't invented photolithography, we'd still be using computers the size of buildings, limited to universities and military installations.
Right now no one knows how to build a scalable quantum computer. But as soon as we find the equivalent of lithography for building quantum chips, the progress will come, and it will come quickly and suddenly.
The record number quantum computers have factored is 21 -- and that was by cheating: already knowing the factors are 3 and 7. There are other results that use special-form composites, which don't count.
So a QC can factor a 5-bit number with Shor's algorithm in 2023 (with some cheating). That record has not changed for 10+ years.
I publicly bet 8 years ago that nobody would factor the number 35 by 2030. I hope I'm proved wrong.
https://sam-jaques.appspot.com/quantum_landscape
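For context on what "factoring 21 with Shor" means: the quantum computer is only needed for the order-finding step; everything else is classical post-processing. A purely classical sketch (the brute-force order search below stands in for the quantum part and is my own illustration, not how any of the demos actually worked):

    from math import gcd

    def shor_postprocess_demo(n, a):
        # Brute-force the order r of a mod n (the step Shor's algorithm
        # would accelerate), then apply the standard classical post-processing.
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        if r % 2 or pow(a, r // 2, n) == n - 1:
            return None  # unlucky base a; pick another one
        half = pow(a, r // 2, n)
        return gcd(half - 1, n), gcd(half + 1, n)

    print(shor_postprocess_demo(21, 2))  # (7, 3), the record mentioned above
    print(shor_postprocess_demo(35, 2))  # (7, 5), the number in the bet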
I'm not super concerned. Cynically: one big driver of research into PQ cryptography is that it's a full employment program for academic cryptographers. Not that there's anything wrong with that! I like a Richelot isogeny as much as the next guy, or I would if I understood what they were.
At least as of 2020, it looked like both qubit counts and two-qubit gate quality were improving exponentially, and our naive extrapolation said RSA 2048 wouldn't get cracked by 2039 at 95% confidence.
https://arxiv.org/abs/2009.05045
As far as I can tell, this website suggests that two-qubit gate infidelity has continued to improve exponentially, although it's hard to tell if these are reliable datapoints:
https://metriq.info/Task/38
Because there's typically an engineering tension between qubit count and gate quality, what you want to track is something like quantum volume, which looks to be on trend:
https://metriq.info/Task/34
But it's notable that Google achieved quantum supremacy without having amazing quantum volume numbers, so it's not a perfect metric:
https://spectrum.ieee.org/quantum-computing-google-sycamore
Worth noting that once you cross the fault-tolerance threshold you will probably see a big shift in how engineering effort is distributed: many researchers think it will be harder to improve gate quality than to just increase qubit counts to make up for it, so you may see gate quality stall and extrapolation become even less reliable.
If we cross the threshold. Until then, adding a qubit requires halving the noise floor, so the Moore's law equivalent of exponential scaling is basically adding a fixed number of qubits per year.
Really? I thought we had 70 max?
https://www.ibm.com/quantum/roadmap
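A toy numerical reading of that pre-threshold claim (the starting error rate and the n ~ log2(1/error) relationship are just the comment's premise made explicit, not an established scaling law):

    import math

    # Premise from the comment above: each extra usable qubit needs the error
    # floor halved, and the error floor improves by about 2x per year.
    error_floor = 1e-2  # assumed starting two-qubit error rate
    for year in range(2023, 2029):
        usable = int(math.log2(1 / error_floor))
        print(year, f"error ~ {error_floor:.0e}", f"usable qubits ~ {usable}")
        error_floor /= 2  # one halving per year -> one extra qubit per year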
Unless you're worried about storing and/or transmitting a huge number of keys (on the order of "at least 100/second") and/or using one key "at least 100 times/second", why not just go for 4096 by default?
Because 4096-bit RSA is a lot slower and bigger and those things matter. And there isn't any upside? If you're actually worried about 2048-bit RSA (and you should not be), you should switch to one of the elliptic curve schemes.
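For a feel for how much slower, a small benchmark sketch using the pyca/cryptography package (assuming you have it installed; exact numbers depend on the machine, and private-key operations are where the gap is largest):

    import time
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    message = b"benchmark me"
    for bits in (2048, 4096):
        key = rsa.generate_private_key(public_exponent=65537, key_size=bits)
        start = time.perf_counter()
        for _ in range(50):
            key.sign(message, padding.PKCS1v15(), hashes.SHA256())
        elapsed = time.perf_counter() - start
        # Private-key (signing/decryption) cost grows roughly cubically with key size.
        print(f"RSA-{bits}: {elapsed / 50 * 1000:.1f} ms per signature, {bits // 8}-byte modulus")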
I'm not actually sure about this. The elliptic curve schemes are just as broken by quantum computers, and the larger key size of RSA seems like it might add a few years of overhead in terms of qubits needed. Not an expert though.
Do NOT switch to ECC if your threat model includes a quantum computer arriving.
Either use larger RSA keys or, more appropriately, a hybrid signature scheme combining one of NIST's PQC signatures and a traditional algorithm.
https://csrc.nist.gov/Projects/post-quantum-cryptography/sel...
Because that will soon become "why not 128k". Or perhaps we should use 2 megabytes for each prime? We don't know, but it's better to be safe than sorry.
That's a significant amount of time if we're talking about long-term file storage.
I've had my dropbox account for over 10 years now. Being concerned about a timescale of 20 to 30 years seems reasonable for things like long term file storage IMO.
Backblaze, dropbox, google drive, onedrive, AWS are all over a decade old.
https://web.archive.org/web/20230710195916/https://articles....
I think the point is that as computers get faster there is less trade-off in having longer bit lengths, rather than focusing on the potential to crack said keys.
That is, if it costs very little to have larger keys, why not have larger keys?
It is essentially hedging your bets as even if quantum computing key factorisation works, key lengths will still have an impact on the difficulty of factorisation, and it may make a difference in terms of practicality.
> Quantum computing of the sort intended to break RSA involves a breakthrough in both computing and algorithms. Normally some sort of new computing technology is invented and then algorithms are designed to enable that technology to do something useful. The quantum computing threat to RSA is different. We now have the algorithm (Shor's) but the computer to run it on only exists in our imagination.
> If someone were to invent such a computer then RSA 2048 would be immediately and trivially breakable. RSA 3072 would also be trivially breakable. The same applies to RSA 4096 and 8192.
They can't know that, however; they may say it, but they can't know it. Hence, why not choose key lengths by looking at encoding and decoding resource usage instead of crackability?
By "costs nothing" I mean that as CPUs get faster there is less performance impact from lengthening keys.
> even if quantum computing key factorisation works, key lengths will still have an impact on the difficulty of factorisation, and it may make a difference in terms of practicality.
I mean, the whole thing with quantum computer factoring is that it scales well. Getting to RSA 2048 seems really, really difficult. But if we ever get there, getting to 4096 is probably just a tiny extra step.
No one can know that at the moment, hence the trade-off: if it costs very little to do a longer key, why not do a longer key?
The current state of quantum computing suggests that increasing the available qubits takes exponentially growing effort. Going from RSA 2048 to RSA 4096 would not just double the effort required on the quantum side.
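A back-of-the-envelope sketch of how the two cost models in this thread scale for the 2048 -> 4096 jump: the quantum side uses the O(n^3)-O(n^4) estimate quoted at the top, the classical side uses the GNFS formula quoted further down with the commonly cited constant (64/9)^(1/3) and no o(1) terms; both constants are my assumptions here, so treat the output as orders of magnitude only:

    import math

    def gnfs_exponent(bits):
        # Exponent of the GNFS L-notation cost, ignoring the o(1) term.
        ln_n = bits * math.log(2)
        return (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)

    quantum_low, quantum_high = 2 ** 3, 2 ** 4      # n -> 2n under n^3 .. n^4 scaling
    classical = math.exp(gnfs_exponent(4096) - gnfs_exponent(2048))
    print(f"quantum estimate: {quantum_low}x to {quantum_high}x more work")
    print(f"classical GNFS:   ~10^{math.log10(classical):.0f}x more work")
    # Roughly 8-16x on the quantum side vs ~10^12x classically.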
> That is, if it costs very little to have larger keys, why not have larger keys?
It is often very expensive/difficult to change the length of an RSA key that is part of an existing platform/infrastructure, like a key in TPM/HSM/CA infrastructure, regardless of how fast the CPU is.
But RSA has been on the way out for a long time, and short-keyed RSA doubly so. I would estimate that since maybe 2015ish, deploying stuff that is coupled to 2048-bit RSA would have been a mistake. That gives a generous 15ish-year transition period. Anyone who cares even the slightest should manage that transition in that sort of timeframe.
I’m not a cryptographer, but I can see many more pressing reasons for migrating off RSA before 2030. Is there any reason to pick RSA for greenfield today?
RSA, to my knowledge, is vulnerable to side channels and poor parameter choices. Implementation simplicity is an underrated security parameter. The fewer feet you have, the fewer of them you can shoot.
The NSA data centers don't want to waste time on your RSA key anyway, much less your run-of-the-mill Russian black hat groups. What bites us in practice is 0-days: something stupid like Heartbleed or Rowhammer that can be automated and takes a long time to patch.
This is my (applied) cryptographic understanding as well: RSA 2048 probably won't be broken by improvements to prime factorization by 2030, but will continue to be a pointlessly dangerous (and slow) cryptosystem when compared to ECC.
https://www.ams.org/notices/199612/pomerance.pdf has a great writeup on the history of the work around this. Essentially, when you see improvements in complexity of the form
Old best: Quadratic Sieve: exp(c * (log n)^(1/2) * (log log n)^(1/2))
New best: General Number Field Sieve: exp(c * (log n)^(1/3) * (log log n)^(2/3))
I can't help but feel that the exponent we've moved to 1/3 could be moved further. Sure, we don't know how, and we've been stuck on the current best for just over 25 years, but I just feel that if you give me two methods and one moves a term like that, there's a good chance there's a way to reduce that term further. It'd be weird for that complexity statement to stay as is. It's telling me "the universe doesn't allow factorization any faster than a term raised to a power of 1/3rd" and I'm asking "why is 1/3 so special?". So I'm not convinced that there's not more here. I don't have a clue how, of course. But the history of RSA going from "256 bits is secure" to 512 bits to 1024 bits to 2048 bits being needed has me worried about the safety of prime factorization.
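To put numbers on why the 1/2 -> 1/3 move mattered so much, a sketch that evaluates both formulas for common modulus sizes, dropping the o(1) terms and plugging in the commonly cited constants (c = 1 for the quadratic sieve, c = (64/9)^(1/3) ~ 1.92 for GNFS); these constants are my assumptions, so read the output as order-of-magnitude illustrations only:

    import math

    def l_exponent(bits, alpha, c):
        # Exponent in L-notation: cost ~ exp(c * (ln n)^alpha * (ln ln n)^(1 - alpha)),
        # ignoring the o(1) term, for a modulus n of the given bit length.
        ln_n = bits * math.log(2)
        return c * ln_n ** alpha * math.log(ln_n) ** (1 - alpha)

    for bits in (1024, 2048):
        qs = l_exponent(bits, 1 / 2, 1.0)                     # Quadratic Sieve
        gnfs = l_exponent(bits, 1 / 3, (64 / 9) ** (1 / 3))   # GNFS
        print(f"{bits}-bit modulus: QS ~ 10^{qs / math.log(10):.0f}, "
              f"GNFS ~ 10^{gnfs / math.log(10):.0f}")
    # At 2048 bits the gap between the two formulas is itself ~9 orders of magnitude.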
The McEliece encryption algorithm was published in 1978 (one year later than RSA) and seems to be considered safe against both classical and quantum computers; the only downside is the huge public key size.
Lindy Effect has been the best predictor of what will still work in five years.
The Lindy effect applies to non-perishables. There are alternatives which are much more likely to meet a reasonable cryptographic equivalent of "non-perishable". Per this article, RSA strongly does not.
If you accept the current assumption of double exponential growth for both computing performance and algorithmic efficiency then you would have to accept that 256 bit elliptic curve keys will become obsolete in the 2040s. So it might be just RSA and discrete logs today but a requirement for pointlessly long EC keys will be along soon.
Yeah, but does it matter? Either that's true or it isn't. If it is true, we'll (as usual) have ample time to migrate. With RSA though, it's already more complex and slower today, with about 8x larger key sizes. And the cryptanalysis track record against it is afaik (please correct me) much more successful than against ECC, so there's a "higher risk" that the timeline gets pushed forward or that new patches are needed to avoid bad parameter choices.
> So it might be just RSA and discrete logs today but a requirement for pointlessly long EC keys will be along soon.
It wouldn’t be pointless if computers can crack those sizes. It’d only be pointless if cryptanalysis can exploit structure to reduce the effective entropy, no?
> Implementation simplicity is an underrated security parameter.
Sure, if we were implementing cryptographic algorithms from scratch that would be a proper strong consideration. However, 99% of programmers should just link to an established library/framework and use its cryptographic implementation. These established libraries already paid the price of implementation, and are very battle-tested. There's therefore very good reason to believe their RSA implementation is secure.
Choosing an algorithm should be done on other considerations then. A lower keysize would point to ECC. But maybe we don't want a single algorithm for all encryption - a mixed global ecosystem with RSA and ECC projects would be more robust.
> These established libraries already paid the price of implementation, and are very battle-tested. There's therefore very good reason to believe their RSA implementation is secure.
Many of these established libraries have fallen in battle, some several times. There's always a new up and coming library that works on platform X, in language Y, or has a better license Z, or is integrated into an init system, and while some of them learn from the experience of others, many learn by making the same mistakes others did.
Pushing towards simpler constructions gives hope that those new implementations make fewer mistakes.
> These established libraries already paid the price of implementation, and are very battle-tested.
Yes absolutely. I’m not saying users should pick the one that’s easier to implement.
Simplicity is good for implementers. It allows for more participants, eg std libs to provide their own. Also, even the security geeks are humans and make mistakes. Heartbleed is a perfect example of where even simple things can go catastrophically wrong.
As a second order effect, users benefit from simplicity in the long run, because less complex systems have fewer bugs, and thus fewer security bugs.
https://en.wikipedia.org/wiki/Harvest_now,_decrypt_later
I always thought discussions about encryption forget about this topic. Data encrypted today is only hidden today, not in the future. So any darknet criminal should consider that files he/she sends are being intercepted and will be readable in 30 years.
However, the 3 linked examples, AES, ChaCha20 and Camellia, all use a key size of at least 128 bits, with 192 or 256 bits also listed as options.
What does this current NIST key size recommendation (effective as of 2019) of 112 mean then? Does anyone use this size?
I find it amusing that the table published a conservative cutoff year and an optimistic cutoff year. Based on the trends I've seen, most non-critical software would probably have made the switch in time for the conservative year, whereas anything security-critical like a bank would probably use the optimistic year.
The author of the paper had a problem: his elliptic curve method seemed like it might overtake the best known factoring algorithm at the time, so the conservative estimate takes that into account. The elliptic curve method never managed supremacy, so the optimistic estimate is actually more relevant. That means that the actual prediction is 2040, but it seems the various national cybersecurity entities might have missed that point.