sansseriff · 2 years ago
We can expect a quantum computer with 20 million noisy qubits to break RSA 2048 [1]

I can't speak to coherence time or circuit depth concerns, but qubit counts are doubling roughly every year. Current chips have thousands of qubits, so the exponential scaling implies we'd have 20 million qubits by 2035-2040.

edit: And from the paper, the required spacetime volume ("megaqubitdays") scales between O(n^3) and O(n^4) with RSA key length. So a few years after breaking RSA 2048, you'd have a computer five times larger that could break RSA 3072.

[1] https://quantum-journal.org/papers/q-2021-04-15-433/
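
(A rough sketch of the arithmetic in the comment above. The starting qubit count, the one-year doubling time, and the O(n^3)-O(n^4) scaling are assumptions taken from this comment rather than measured values.)

```python
import math

start_qubits = 4_000        # "thousands of qubits" today (assumption from this comment)
target_qubits = 20_000_000  # Gidney & Ekera's estimate for breaking RSA 2048 [1]

# Doublings needed if qubit counts really double every year
years = math.log2(target_qubits / start_qubits)
print(f"~{years:.1f} years of annual doubling to reach 20M qubits")  # ~12.3 years

# If the required resources scale between O(n^3) and O(n^4) in key length n,
# stepping from RSA 2048 to RSA 3072 costs roughly this factor more:
for k in (3, 4):
    print(f"(3072/2048)^{k} = {(3072 / 2048) ** k:.2f}x")  # 3.38x and 5.06x
```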

fsh · 2 years ago
Google's "quantum supremacy" paper was about a 53 qubit chip in 2019 [1]. This year, they reported having 70 qubits [2]. Looking at the papers (and supplements), the gate fidelities and qubit lifetimes have stayed roughly the same. This really doesn't look like anything like Moore's law to me.

[1] https://doi.org/10.1038/s41586-019-1666-5

[2] https://doi.org/10.48550/arXiv.2304.11119
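
(For comparison, a quick sketch of the growth rate implied by the two Google datapoints cited above: 53 qubits in 2019 and 70 in 2023.)

```python
import math

q0, year0 = 53, 2019  # Sycamore, "quantum supremacy" paper [1]
q1, year1 = 70, 2023  # follow-up chip [2]

annual_growth = (q1 / q0) ** (1 / (year1 - year0)) - 1
doubling_time = math.log(2) / math.log(1 + annual_growth)

print(f"~{annual_growth:.0%} per year, doubling roughly every {doubling_time:.0f} years")
# ~7% per year, i.e. a ~10 year doubling time -- nowhere near doubling annually
```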

marcosdumay · 2 years ago
The last time I spent about an hour on a literature review, I found that the size of those computers was growing at a linear rate of around 4 qubits/year.

It has gotten slightly faster in the last few years; I don't know whether that's inherent or just random fluctuation. Yet people keep repeating the claim that the growth is exponential, based on no evidence at all.
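
(A quick illustration of why the linear-vs-exponential question matters so much. The 4 qubits/year figure is from this comment, the 20 million target is from the top comment, and the starting count of 1,000 is just a round number for illustration.)

```python
import math

start, target, linear_rate = 1_000, 20_000_000, 4  # qubits, qubits, qubits/year

linear_years = (target - start) / linear_rate   # ~5 million years at 4 qubits/year
doubling_years = math.log2(target / start)      # ~14 years if counts double annually

print(f"linear: ~{linear_years:,.0f} years, annual doubling: ~{doubling_years:.0f} years")
```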

orlp · 2 years ago
If we hadn't invented photolithography, we'd still be using computers the size of buildings, limited to universities and military installations.

Right now no one knows how to build a scalable quantum computer. But as soon as we figure out the equivalent of lithography for quantum chips, progress will come, and it will come quickly and suddenly.

sweis · 2 years ago
The record for what quantum computers can factor is 21 -- and that was by cheating, by already knowing that the factors are 3 and 7. There are other results that use special-form composites, which don't count.

So a QC can factor a 5-bit number with Shor's algorithm in 2023 (with some cheating). That record has not changed in 10+ years.

I publicly bet 8 years ago that nobody would factor the number 35 by 2030. I hope I'm proved wrong.
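
(For anyone curious what "factoring 21" involves: below is a purely classical sketch of the order-finding step that Shor's algorithm is supposed to accelerate, plus the classical post-processing that turns the period into factors. It runs in plain Python and only illustrates the structure; it is not a quantum algorithm.)

```python
from math import gcd

def shor_style_factor(N: int, a: int):
    """Classically find the order r of a mod N, then derive factors of N
    the same way Shor's algorithm would from its measured period."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # lucky guess; no period finding needed
    # Order finding: the only step a quantum computer speeds up.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None               # unlucky choice of a; retry with another base
    p = gcd(pow(a, r // 2, N) - 1, N)
    return p, N // p

print(shor_style_factor(21, 2))   # order of 2 mod 21 is 6 -> factors (7, 3)
```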

tptacek · 2 years ago
More detail on progress towards quantum factoring of RSA (among other things):

https://sam-jaques.appspot.com/quantum_landscape

I'm not super concerned. Cynically: one big driver of research into PQ cryptography is that it's a full employment program for academic cryptographers. Not that there's anything wrong with that! I like a Richelot isogeny as much as the next guy, or I would if I understood what they were.

jessriedel · 2 years ago
At least as of 2020, it looked like both qubit counts and two-qubit gate quality were improving exponentially, and our naive extrapolation said RSA 2048 wouldn't get cracked by 2039 at 95% confidence.

https://arxiv.org/abs/2009.05045

As far as I can tell, this website suggests that two-qubit gate infidelity has continued to fall exponentially, although it's hard to tell whether these are reliable datapoints.

https://metriq.info/Task/38

Because there's typically an engineering tension between qubit count and gate quality, what you want to track is something like quantum volume, which looks to be on trend

https://metriq.info/Task/34

but it's notable that Google achieved quantum supremacy without having amazing quantum volume numbers, so it's not a perfect metric

https://spectrum.ieee.org/quantum-computing-google-sycamore

Worth noting that once you cross the fault tolerant threshold you will probably see a big shift in how engineering effort is distributed: many researchers think it will be harder to improve gate quality than to just increase qubit counts to make up for it, so you may see gate quality stall and extrapolation become even less reliable.

akvadrako · 2 years ago
> once you cross the fault tolerant threshold

If we cross the threshold. Until then, adding a qubit requires halving the noise floor, so the Moore's-law equivalent of exponential progress is basically adding a fixed number of qubits per year.
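
(A toy illustration of the parent's point, under the assumed model that each additional usable qubit requires halving the error rate; the starting numbers are hypothetical.)

```python
error_rate = 1e-2    # hypothetical per-gate error rate today
usable_qubits = 50   # hypothetical usable qubit count today

for year in range(2024, 2030):
    # Exponential hardware improvement (error rate halves every year)...
    error_rate /= 2
    # ...buys only one additional usable qubit under this model,
    # i.e. linear growth in qubit count despite exponential progress.
    usable_qubits += 1
    print(year, usable_qubits, f"{error_rate:.1e}")
```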

mirekrusin · 2 years ago
> Current chips have thousands of qubits,

Really? I thought we had 70 max?

cwillu · 2 years ago
I suspect D-Wave nonsense was involved in the confusion.

MattPalmer1086 · 2 years ago
IBM released a processor with 433 qubits last year, and they say they are on track to deliver a 1121 qubit processor this year.

https://www.ibm.com/quantum/roadmap

iraqmtpizza · 2 years ago
If qubit counts are doubling every year, then why is 21 still the largest number ever factored with a quantum algorithm?

capableweb · 2 years ago
Unless you're worried about storing and/or transmitting a huge number of keys (on the order of "at least 100/second") and/or using one key "at least 100 times/second", why not just go for 4096 by default?
loeg · 2 years ago
Because 4096-bit RSA is a lot slower and bigger, and those things matter. And there isn't any upside? If you're actually worried about 2048-bit RSA (and you should not be), you should switch to one of the elliptic curve schemes.
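
(A quick way to see the cost difference: a benchmark sketch using the pyca/cryptography package. Exact timings depend on the machine, but private-key operations scale roughly with the cube of the modulus size, so RSA 4096 signing is typically several times slower than RSA 2048, and its signatures are twice as large.)

```python
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

message = b"benchmark me" * 8

for bits in (2048, 4096):
    key = rsa.generate_private_key(public_exponent=65537, key_size=bits)
    start = time.perf_counter()
    for _ in range(50):
        key.sign(message, padding.PKCS1v15(), hashes.SHA256())
    elapsed = (time.perf_counter() - start) / 50
    print(f"RSA-{bits}: {elapsed * 1000:.2f} ms per signature, {bits // 8} byte signatures")
```
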
Grimburger · 2 years ago
All other things equal, 256-bit elliptic curve cryptography is going to break before RSA 2048 does with quantum computing advances.

Do NOT switch to ECC if your threat model includes a quantum computer arriving.

Either use larger RSA keys or, more appropriately, a hybrid signature scheme combining one of NIST's PQC signatures with a traditional algorithm.

https://csrc.nist.gov/Projects/post-quantum-cryptography/sel...
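
(For illustration, a minimal sketch of what "hybrid" means here: sign with both a classical scheme and a post-quantum scheme, and accept only if both signatures verify, so the combination stays secure as long as either component survives. The Ed25519 calls use the real pyca/cryptography API; pq_sign and pq_verify are hypothetical placeholders standing in for a NIST PQC signature such as ML-DSA/Dilithium.)

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey,
)

# Hypothetical placeholders: in practice these would call into a PQC library.
def pq_sign(pq_private_key, message: bytes) -> bytes: ...
def pq_verify(pq_public_key, message: bytes, signature: bytes) -> bool: ...

def hybrid_sign(ed_key: Ed25519PrivateKey, pq_private_key, message: bytes):
    # A hybrid signature is simply both signatures, side by side.
    return ed_key.sign(message), pq_sign(pq_private_key, message)

def hybrid_verify(ed_pub: Ed25519PublicKey, pq_public_key, message: bytes, sigs) -> bool:
    ed_sig, pq_sig = sigs
    try:
        ed_pub.verify(ed_sig, message)                     # classical check
    except InvalidSignature:
        return False
    return pq_verify(pq_public_key, message, pq_sig)       # post-quantum check
```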

adgjlsfhk1 · 2 years ago
I'm not actually sure about this. The elliptic curve schemes are just as broken with quantum computers, and the larger key size of RSA seems like it might add a few years of overhead in terms of qubits needed. Not an expert though.
H8crilA · 2 years ago
Because that will soon become "why not 128k". Or perhaps we should use 2 megabytes for each prime? We don't know, but it's better to be safe than sorry.
upofadown · 2 years ago
If you accept the current assumptions, then you would have to accept that 4096 bit RSA will be obsolete by 2060.
thewataccount · 2 years ago
That's a significant amount of time if we're talking about long-term file storage.

I've had my dropbox account for over 10 years now. Being concerned about a timescale of 20 to 30 years seems reasonable for things like long term file storage IMO.

Backblaze, Dropbox, Google Drive, OneDrive, and AWS are all over a decade old.

yjftsjthsd-h · 2 years ago
So only 37 years? I think I can live with that.

devmor · 2 years ago
I would hope that someone invents a more robust cryptography protocol in the next 37 years.

slater · 2 years ago
Server appears to be hugged, web archive link:

https://web.archive.org/web/20230710195916/https://articles....

roomey · 2 years ago
I think the point is that as computers get faster there is less trade-off in having longer bit lengths, rather than focusing on the potential to crack said keys.

That is, if it costs very little to have larger keys, why not have larger keys?

It is essentially hedging your bets as even if quantum computing key factorisation works, key lengths will still have an impact on the difficulty of factorisation, and it may make a difference in terms of practicality.

nailer · 2 years ago
They cover that in the article:

> Quantum computing of the sort intended to break RSA involves a breakthrough in both computing and algorithms. Normally some sort of new computing technology is invented and then algorithms are designed to enable that technology to do something useful. The quantum computing threat to RSA is different. We now have the algorithm (Shor's) but the computer to run it on only exists in our imagination.

> If someone were to invent such a computer then RSA 2048 would be immediately and trivially breakable. RSA 3072 would also be trivially breakable. The same applies to RSA 4096 and 8192.

roomey · 2 years ago
They can't know that, however; they may say it, but they can't know it. Hence, why not choose key lengths by looking at encoding and decoding resource usage instead of crackability?

By "costs nothing" I mean that as CPUs get faster, there is less performance impact from lengthening keys.

bawolff · 2 years ago
> even if quantum computing key factorisation works, key lengths will still have an impact on the difficulty of factorisation, and it may make a difference in terms of practicality.

I mean, the whole thing with quantum computer factoring is that it scales well. Getting to RSA 2048 seems really, really difficult. But if we ever get there, getting to 4096 is probably just a tiny extra step.

alphager · 2 years ago
The current state of quantum computing suggests that increasing the number of available qubits takes exponentially growing effort. Going from RSA 2048 to RSA 4096 would not just double the effort required on the quantum side.

roomey · 2 years ago
Well, we can't know. Maybe it would be trivial, maybe it could keep your data safe for 5 more years. Maybe it could double the cost to crack.

No one can know at the moment, hence the trade off, if it costs very little to do a longer key, why not do a longer key?

hujun · 2 years ago
> That is, if it costs very little to have larger keys, why not have larger keys?

It is often very expensive/difficult to change the length of an RSA key that is part of an existing platform/infrastructure, like a key in TPM/HSM/CA infrastructure, regardless of how fast the CPU is.

zokier · 2 years ago
But RSA has been on the way out for a long time, and short-keyed RSA doubly so. I would estimate that since maybe 2015ish, deploying stuff that is coupled to 2048-bit RSA would have been a mistake. That gives a generous 15ish-year transition period. Anyone who cares even the slightest should manage the transition in that sort of timeframe.

klabb3 · 2 years ago
I’m not a cryptographer, but I can see many more pressing reasons for migrating off RSA before 2030. Is there any reason to pick RSA for greenfield today?

RSA, to my knowledge, is vulnerable to side channels and poor parameter choices. Implementation simplicity is an underrated security parameter. The fewer feet you have, the fewer of them you can shoot.

The NSA data centers don’t want to waste time on your RSA key anyway, much less your run-of-the-mill Russian black hat groups. What bites us in practice are 0-days of something stupid like heartbleed or rowhammer that can be automated, and takes a long time to patch.

woodruffw · 2 years ago
This is my (applied) cryptographic understanding as well: RSA 2048 probably won't be broken by improvements to prime factorization by 2030, but will continue to be a pointlessly dangerous (and slow) cryptosystem when compared to ECC.

AnotherGoodName · 2 years ago
I'm not convinced on the first part. That it won't be broken by improvements to prime factorization.

https://www.ams.org/notices/199612/pomerance.pdf has a great writeup on the history of the work around this. Essentially when you see improvements in complexity of the form

Old best: Quadratic Sieve: exp(c (log n)^(1/2) (log log n)^(1/2))

New best: General Number Field Sieve: exp(c (log n)^(1/3) (log log n)^(2/3))

I can't help but feel that the exponent we've moved from 1/2 down to 1/3 could be moved further. Sure, we don't know how, and we've been stuck on the current best for just over 25 years, but if you give me two methods and one moves a term like that, there's a good chance there's a way to reduce it further. It'd be weird for that complexity statement to stay as is. It's telling me "the universe doesn't allow factorization any faster than a term raised to the power of 1/3", and I'm asking "why is 1/3 so special?". So I'm not convinced there isn't more here. I don't have a clue how, of course. But the history of RSA going from "256 bits is secure" to 512 bits to 1024 bits to 2048 bits being needed has me worried about the safety of prime factorization.
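
(To make those two expressions concrete: a small sketch evaluating the standard L-notation cost estimates, using the usual constants c ≈ 1 for the quadratic sieve and c = (64/9)^(1/3) ≈ 1.92 for GNFS. These are asymptotic formulas, so the absolute numbers are only indicative, but the gap between the 1/2 and 1/3 exponents is what matters.)

```python
import math

LN2 = math.log(2)

def qs_bits(n_bits: int) -> float:
    """log2 of exp((ln n)^(1/2) * (ln ln n)^(1/2)) -- quadratic sieve."""
    ln_n = n_bits * LN2
    return math.sqrt(ln_n * math.log(ln_n)) / LN2

def gnfs_bits(n_bits: int) -> float:
    """log2 of exp(c * (ln n)^(1/3) * (ln ln n)^(2/3)) -- general number field sieve."""
    ln_n = n_bits * LN2
    c = (64 / 9) ** (1 / 3)
    return c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / LN2

for bits in (512, 1024, 2048):
    print(f"RSA-{bits}: QS ~2^{qs_bits(bits):.0f}, GNFS ~2^{gnfs_bits(bits):.0f}")
# The 1/2 -> 1/3 move is roughly what pushed recommendations from 512-bit
# moduli up to 2048 bits and beyond.
```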

forgotmypw17 · 2 years ago
Top reason to use RSA for me, as with most tech I use, is its longevity.

The Lindy effect has been the best predictor of what will still work in five years.

HappyPanacea · 2 years ago
The McEliece encryption algorithm was published in 1978 (one year after RSA) and seems to be considered safe against both classical and quantum computers; the only downside is the huge public key size.

raggi · 2 years ago
The Lindy effect applies to non-perishables. There are alternatives that are much more likely to conform to a reasonable cryptographic equivalent of "non-perishable". Per this article, RSA strongly does not fit any reasonable such definition.

withinboredom · 2 years ago
The fact that the sun rose today has no effect on whether the sun will rise tomorrow.

upofadown · 2 years ago
If you accept the current assumption of double exponential growth for both computing performance and algorithmic efficiency, then you would have to accept that 256-bit elliptic curve keys will become obsolete in the 2040s. So it might be just RSA and discrete logs today, but a requirement for pointlessly long EC keys will be along soon.

klabb3 · 2 years ago
Yeah, but does it matter? Either that's true or it isn't. If it is true, we'll (as usual) have ample time to migrate. With RSA, though, it's already more complex and slower today, with about 8x larger key sizes. And the cryptanalysis track record against it is, afaik (please correct me), much more successful than against ECC, so it's "higher risk" that the timeline gets pushed forward or that new patches are needed to avoid bad parameter choices.

> So it might be just RSA and discrete logs today but a requirement for pointlessly long EC keys will be along soon.

It wouldn’t be pointless if computers can crack those sizes. It’d only be pointless if cryptanalysis can exploit structure to reduce the effective entropy, no?

yyyk · 2 years ago
>Implementation simplicity is an underrated security parameter.

Sure, if we were implementing cryptographic algorithms from scratch, that would be a strong consideration. However, 99% of programmers should just link to an established library/framework and use its cryptographic implementation. These established libraries already paid the price of implementation, and are very battle-tested. There's therefore very good reason to believe their RSA implementation is secure.

Choosing an algorithm should therefore be based on other considerations. A smaller key size would point to ECC. But maybe we don't want a single algorithm for all encryption - a mixed global ecosystem with both RSA and ECC projects would be more robust.

toast0 · 2 years ago
> These established libraries already paid the price of implementation, and are very battle-tested. There's therefore very good reason to believe their RSA implementation is secure.

Many of these established libraries have fallen in battle, some several times. There's always a new up and coming library that works on platform X, in language Y, or has a better license Z, or is integrated into an init system, and while some of them learn from the experience of others, many learn by making the same mistakes others did.

Pushing towards simpler constructions gives hope that those new implementations make fewer mistakes.

klabb3 · 2 years ago
> These established libraries already paid the price of implementation, and are very battle-tested.

Yes absolutely. I’m not saying users should pick the one that’s easier to implement.

Simplicity is good for implementers. It allows for more participants, e.g. standard libraries providing their own implementations. Also, even the security geeks are human and make mistakes. Heartbleed is a perfect example of where even simple things can go catastrophically wrong.

As a second order effect, users benefit from simplicity in the long run, because less complex systems have fewer bugs, and thus fewer security bugs.

Tempest1981 · 2 years ago
No concern about "store now, decrypt later"?

https://en.wikipedia.org/wiki/Harvest_now,_decrypt_later

I_am_tiberius · 2 years ago
I always thought discussions about encryption forget this topic. Data encrypted today is only hidden today, not in the future. So any darknet criminal should assume that the files they send are being intercepted and will be readable in 30 years.
jbarrs · 2 years ago
I find it amusing that the table published a conservative cutoff year and an optimistic cutoff year. Based on the trends I've seen, most non-critical software would probably have made the switch in time for the conservative year, whereas anything security-critical like a bank would probably use the optimistic year.
upofadown · 2 years ago
The author of the paper had a problem. His elliptic curve method seemed like it might overtake the best known factoring algorithm at the time. So the conservative estimate takes that into account. The elliptic curve method never achieved supremacy, so the optimistic estimate is actually more relevant. That means the actual prediction is 2040, but it seems the various national cybersecurity entities might have missed that point.
Aardwolf · 2 years ago
For Symmetric encryption, it says: 'Current key size: 112 bits'

However the 3 linked examples, AES, ChaCha20 and Camellia all use a key size of at least 128 bits, with 192 or 256 bits also listed as options.

What does this current NIST key size recommendation (effective as of 2019) of 112 mean then? Does anyone use this size?
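
(For context, as I understand the NIST SP 800-57 comparable-strengths tables, "112 bits" is a security level rather than a literal symmetric key length, and it lines up with the other key sizes roughly as below; treat the exact boundaries as a paraphrase of the NIST tables, not a quotation.)

```python
# Approximate comparable security strengths per NIST SP 800-57 Part 1
comparable_strengths = {
    112: {"symmetric": "3DES (3-key)", "RSA/DH modulus": 2048,  "ECC key": "224-255"},
    128: {"symmetric": "AES-128",      "RSA/DH modulus": 3072,  "ECC key": "256-383"},
    192: {"symmetric": "AES-192",      "RSA/DH modulus": 7680,  "ECC key": "384-511"},
    256: {"symmetric": "AES-256",      "RSA/DH modulus": 15360, "ECC key": "512+"},
}
```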

upofadown · 2 years ago
That's pretty much for 3DES, which is disliked for other reasons. The point still stands though...

zokier · 2 years ago
The 112-bit effective key size almost certainly refers to 3DES (three 56-bit DES keys, reduced to about 112 bits of effective security by meet-in-the-middle attacks).