The somewhat cynical but honest answer is that all articles need some kind of pretty image at the top because when you share a link on any social media platform, the platform looks for an image to use as a thumbnail. If it doesn't find one, it gets posted as just a plain link and almost no one clicks it.
This is why Medium requires an image on every post, why every programming blog out there puts random pictures of laptops and coffee cups at the top of their articles, why Unsplash is so popular, and now why AI-generated images at the top of posts are so common.
It's dumb.
Imagine this discovery led to a larger breakthrough on prime numbers that allowed easy factorization of large integers and effectively rendered public key cryptography such as RSA ineffective overnight, by allowing anyone with a consumer-grade CPU to crack any production-size key.
Does the industry have DR plans for this scenario? Can the big players quickly switch to a different, unbroken encryption system? While it would probably be a heavenly day for jailbreakers, console modders and other "device freedom" types generally, the overall impact would be disastrous and incalculable.
Does the industry simply not consider "sudden number theory breakthrough" a possible event?
The US government used to restrict export of long RSA keys. At one point much of the world was using 128-bit RSA keys, but Dixon's method had everyone scrambling to use 512-bit keys. Then the special number field sieve had us all scrambling to use 1024-bit keys, and the general number field sieve again had us scrambling to get to 2048-bit keys. And that really wasn't that long ago, relatively speaking.
Check out RSA encryption hardware from the 80s. They are really proud of some of the hardware that can do 512 bits! (Useless today)
https://people.csail.mit.edu/rivest/pubs/pubs/Riv84.pdf
The special and general number field sieve complexity statements differ by only a few constants. Look at those constants. Do they seem like some fundamental limit to you? Is it really that unlikely that there's a way to reduce them further, making even 2048-bit keys useless?
You don’t need to ask “what would happen if RSA broke” because those of us who have been through this many times now can straight up tell you. You’ll be scrambling to once more bump up the key size and you’ll be auditing all the potential data leaked.
> At one point much of the world was using 128-bit RSA keys
When?
I was writing crypto libraries in the early 90s to support RSA, DSA and ElGamal at my company (this predates the times when good open source crypto libraries were broadly available).
Even back then 128 bit RSA keys were not used. The smallest the library supported was 512 and the smallest we ever used in production was 768 bits.
That's how far back my own memory goes. But here's a paper from Arjen Lenstra from 2001 which has a table showing computationally equivalent key sizes back to 1982.
https://infoscience.epfl.ch/server/api/core/bitstreams/c323a...
In 1982, security comparable (at the time!) to DES would have been 417 bit RSA keys.
So even in 1982, using 128 bit RSA keys made no sense!
> You’ll be scrambling to once more bump up the key size and you’ll be auditing all the potential data leaked.
If you've had to do this for RSA keys (more than once, even!) I respectfully suggest you need to be a lot more conservative picking key lengths. There has never been a sudden breakthrough in factorization that has rendered conservatively chosen RSA key lengths obsolete overnight.
If someone found a way to easily factorize large integers on consumer-grade hardware then it would be very painful, as RSA is one of the big public key algorithms.
Before you start worrying about it though consider that RSA has held up for 47 years of active cryptanalysis so far - during which time many alternative algorithms have been suggested as being superior, only to be broken a short time later.
Also the push to switch to elliptic-curve algorithms has been more down to them being easier for computers to use when encrypting/decrypting data.
Personally if I had to bet on which public key algorithm will still be around in 10 years time then I'd put my money on RSA.
RSA has effectively been broken many times. We literally had 128-bit RSA encryption hardware at one point. There were even export controls on keys beyond a certain length (512 bits) that today are trivial to break with the general number field sieve. You look at the history of RSA and it's not pretty. Dixon's method had us all scrambling to use 512-bit keys (pushing the export restrictions), the special number field sieve had us rushing to get to 1024-bit, and the general number field sieve more recently pushed us to 2048 bits. Who can tell what's next here. In fact, look at the complexity of the special vs general number field sieves and you'll see the statements are almost the same, just some constants reduced. That's worrying because there's no reason to think the current constants are a minimum here. We may well find out 2048 bits is not enough.
Heck, just read a paper on state-of-the-art dedicated RSA encryption hardware from the 80s. All now completely broken. They are very impressed with some of the 512-bit hardware!
https://people.csail.mit.edu/rivest/pubs/pubs/Riv84.pdf
(Basically your data is not encrypted with RSA; you encrypt a secondary key and send it with RSA, but the main encryption is AES. See https://en.wikipedia.org/wiki/Transport_Layer_Security#Key_e... )
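To make that parenthetical concrete, here is a minimal sketch of the hybrid pattern using the Python `cryptography` package (the message and parameter choices are purely illustrative; real protocols like TLS wrap a lot more around this):

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Recipient's RSA key pair (2048 bits, the commonly cited minimum today).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# The bulk data is encrypted with a fresh symmetric AES-GCM key...
aes_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"the actual message", None)

# ...and only that small key is wrapped with RSA-OAEP.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(aes_key, oaep)

# The recipient unwraps the AES key with the RSA private key, then decrypts.
recovered_key = private_key.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"the actual message"
```

If factoring fell, only the key-wrapping step above would break; the symmetric AES layer would be unaffected, which is why a factoring breakthrough hits key exchange and signatures rather than the bulk ciphers.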
Actually RSA has several "gotchas", so it is not that it has held up on its own, but rather that people have managed to work around those gotchas to get a working encryption system.
> Before you start worrying about it though consider that RSA has held up for 47 years of active cryptanalysis so far
True, but is there much overlap between the set of people who actively do cryptanalysis on RSA and the set of people who actively research integer factoring?
As far as I know the skills needed to make progress on integer factoring are not really needed for anything in cryptography other than that specific problem and include a bunch of other skills that have nothing whatsoever to do with cryptography and take a long time to master. It's also been an open and important problem long enough that if it is solved it is likely to be by someone who is a world class expert in it, similar to how Fermat's Last Theorem was finally solved.
Similarly the skills needed for anything in cryptanalysis other than trying to break RSA by factoring are useless for working on integer factoring. The result is that very few, if any, people have the time and ability to become both cryptanalysts and world class integer factoring researchers.
Thus I'd expect nearly all of the 47 years of active RSA cryptanalysis has been on finding and fixing the numerous mistakes that can be made when implementing RSA that allow it to be broken without factoring.
I'd guess it is also similar with the P = NP problem. A solution to that has implications for cryptography but I doubt many cryptanalysts are working on the P = NP problem. Also maybe same for the Riemann hypothesis (I'm not sure if that one has implications for cryptography).
I think this is where the move to elliptic curves comes in, and that seems well on its way. Both in signatures and handshakes (Diffie-Hellman).
Perhaps switching wouldn't be a one-minute DR operation, but I doubt everything would be open if RSA/DH was rendered insecure overnight. I know my SSH keys are a mixed bag right now.
Arguably, we know a lot more about elliptic curves than about prime number theory too. There have definitely been a lot more folks working on it.
There are other algorithms; NIST has a post-quantum project. There are options, but it's a lot more than having some algorithms: we need protocols, and they are still a bit off. Prime factorization isn't going to get easier or faster on its own, though. There might be another attack or approach to attacking, some other numerical breakthrough that leads to faster RSA cracking, but it's hard to imagine it being that practical.
The ability to prepare for catastrophic scenarios is overestimated. The ability to survive them is underestimated.
This would be a lot worse than that. Crowdstrike was bad because everyone lets relatively untested code straight into the Windows kernel - i.e. known incompetence of approach. This would be bad despite massive care taken to have the right approach.
I guess one perspective is that finding fast factoring is considered super rare. The story is that so many smart people have looked at it, it's probably impossible... for now. But that story may be its own weakness.
Anyway, the risk, while just as real as something like the grid being downed by a massive solar storm, with a multi-year, back-to-the-stone-age recovery period due to transformer manufacturing delays and no stockpiles, just seems too minuscule/theoretical to spend much time on - from that point of view.
Regarding any plan, I don't know if it's so easy to just switch to ECC, because practical asymmetric encryption with ECC depends on deriving a shared secret, which (if you're assuming an unsecured exchange channel due to RSA being broken) is more vulnerable to MITM than RSA. I don't think it's an easy swap-out.
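For what it's worth, the shared-secret step referred to here is (EC)DH key agreement. A minimal sketch with the Python `cryptography` package (curve and labels chosen arbitrarily); the point is that nothing in it authenticates the peer, which is where the MITM concern comes from:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates an ephemeral EC key pair...
alice = ec.generate_private_key(ec.SECP256R1())
bob = ec.generate_private_key(ec.SECP256R1())

# ...and derives the same shared secret from the other's public key.
alice_secret = alice.exchange(ec.ECDH(), bob.public_key())
bob_secret = bob.exchange(ec.ECDH(), alice.public_key())
assert alice_secret == bob_secret

# Derive a usable session key from the raw shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"handshake").derive(alice_secret)

# Note: nothing above authenticates the public keys. An attacker who can
# swap keys on the wire sits in the middle, which is why the exchange still
# needs signatures/certificates on top (today often RSA or ECDSA).
```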
All that aside, another point of view is RSA is probably already broken, the break is just secret to the codebreaking agencies. It would be very desirable for them to keep their breakthrough secret. That might even involve trying to find ways to suppress any "sudden number theory breakthroughs" hahaha! :)
It's the IPv6 of the security world.
I think someone working in cryptography will worry about a sudden number theory breakthrough that allows for breaking of cryptography about as much as someone working in the energy sector will worry about a sudden physics breakthrough that allows for practically free cold fusion energy.
I remember telling my high school students that if they found an efficient way to factor large numbers, they would destroy capitalism, and a large number of them got very excited about number theory after that.
Many people in the industry do not think that RSA is crackable, on the assumption that the Riemann Hypothesis and the distribution of prime numbers are such hard problems, having remained unsolved for such a long time.
A possible mitigation for things like websites would be either ECC or even the quantum-resistant encryption systems (the industry would more likely avoid the latter, since those systems are still very much prototypes and we have only just started researching them).
Since old bitcoin wallets can't be moved off of the broken scheme, you could transfer the coins to your own wallet and there is no mitigation.
I don't see how proving the Riemann Hypothesis would help cracking RSA? If it helps, couldn't you just assume it is true and start cracking RSA today? If you ever hit a point where it doesn't work then BOOOM: Riemann Hypothesis disproven!
RSA was once 128 bits and today has to be 2048 bits minimum to be secure, because it was essentially broken multiple times. There used to be 128-bit RSA encryption hardware that now doesn't work at all to protect data, due to previous mathematical breakthroughs.
The congruence-of-squares equivalence to factorization demonstrated we need at least 500 bits, and then the special number field sieve that built on this pushed it to 1024. The general number field sieve pushed it again to 2048.
Sure, it's not a log(n) break, but it's been broken. If you look at the complexity analysis of the special vs general number field sieve, the portion of the exponent going from 1/2 to 1/3 should give you pause. Can it be moved to 1/4? Could it be moved indefinitely to 1/x? The general number field sieve is relatively recent. If someone comes up with a similar breakthrough again (and this has happened many times over with RSA), your 2048-bit keys won't be secure, just as your 128-bit RSA keys from the past are no longer secure.
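For reference, the standard heuristic running times are usually written in L-notation (quoting the textbook values; treat the constants as the conventional ones rather than anything exact):

$$L_n[a, c] = \exp\big( (c + o(1)) \,(\ln n)^a (\ln \ln n)^{1-a} \big)$$

Quadratic-sieve-style methods run in roughly $L_n[1/2, 1]$, the special number field sieve in $L_n[1/3, (32/9)^{1/3}] \approx L_n[1/3, 1.53]$, and the general number field sieve in $L_n[1/3, (64/9)^{1/3}] \approx L_n[1/3, 1.92]$. The drop in the exponent from $1/2$ to $1/3$ came with the move from sieve methods to the number field sieve; SNFS and GNFS differ only in the constant $c$. A further drop in $a$, if one exists, would hit today's key sizes the way the earlier jumps did.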
From what I remember of my math class where we made those cryptography calculations by hand, the teacher said many years ago that the day we could guess prime numbers would be a disaster, because many cryptographic calculations are based on the premise that we can't guess prime numbers. I don't know if that has changed?
It's easy to find primes of a given bit length, and it's easy to multiply them together. It's hard to un-multiply a given number (public key) into its unique prime factors (private key).
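A toy illustration of that asymmetry (assuming `sympy` is available; the sizes are arbitrary):

```python
from sympy import randprime, factorint

# Easy direction: pick two large random primes and multiply them.
p = randprime(2**511, 2**512)
q = randprime(2**511, 2**512)
n = p * q                      # a 1024-bit RSA-style modulus, built instantly

# Hard direction: recovering p and q from n alone. For a small modulus this
# is instant; for the 1024-bit n above, no known classical algorithm is fast.
small_n = randprime(2**31, 2**32) * randprime(2**31, 2**32)
print(factorint(small_n))      # -> {p1: 1, p2: 1} almost immediately
```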
IIRC, Elliptic curve cryptography doesn’t rely on factorization, so there’s already an interim PKC solution in place.
I also recall there were many problems with the ECC based algorithms or at least the implementations—something about discrete approximations weakening security?
Far beyond my comprehension
ECC is very closely related though (the hidden subgroup problem over abelian groups is the category they both fall under).
It's actually concerning because RSA was broken. The reason we're not using 128-bit RSA keys anymore and instead using 2048-bit keys is because RSA was broken by the general number field sieve. We're now all using ECC to avoid working with very large keys, but there's no mathematical proof that ECC is any more difficult. In fact it's widely believed to be the same problem underneath.
That may surprise people. ECC, the thing we rely on, is not proven except by the fact that no one has broken it yet, just like RSA was until someone broke it.
We could always switch to symmetric keys (but key exchange would be somewhat problematic). Also, elliptic curve crypto doesn't rely on factoring, so even public key/private key crypto would still be around and secure.
Isn’t this the same as zero-day vulnerabilities? Typically only a bunch of people out there know how to take advantage of such holes, and eventually they get fixed.
I guess if public key cryptography gets broken, only a bunch of people would know how to take advantage of it, and eventually it would get fixed.
If there is such a breakthrough then the hackers or even spy agencies will not reveal it. They will instead silently make use of it. It will essentially be a backdoor for them.
Integer factorization is an NP problem but is not known to be NP-complete. Therefore, we do not know how to solve all NP problems in P time using a hypothetical P-time factorization.
P =? NP would remain open.
And even if problems can be solved in polynomial time, the constants involved can be prohibitively large.
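A small sketch (my own framing, not from the thread) of why the decision version sits in NP: a nontrivial divisor is a certificate that can be checked with a single modulo operation, even when finding one is hard.

```python
def check_factor_certificate(n: int, k: int, d: int) -> bool:
    """Verify a claimed answer to: does n have a factor smaller than k (besides 1)?"""
    return 1 < d < k and n % d == 0

# Checking is cheap regardless of how d was found; producing d is the hard part.
print(check_factor_certificate(91, 10, 7))    # True: 7 divides 91 and 7 < 10
print(check_factor_certificate(97, 97, 13))   # False: 13 does not divide 97
```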
People always think the structure of primes is complex, but it's not really; it's just a recursive structure of the magnitude gaps not landed on by multiples of previous gaps.
It doesn't make it easier to "predict" without tracking all prior gaps, but it's not essentially a complex structure. Kind of funny that such a simple structure is so elusive. Sorta like how the 3n + 1 sequence gives rise to such complexity. Or the logistic map with its parameter above the threshold.
A generator of "all primes" is pretty simple and deterministic. But you cannot simply generate the next prime given only the prime n without recomputing its non-trivial remainders. That means the binary representation of a number n alone doesn't provide enough information to answer quickly what the next prime is. You have to pre-compute some 'pivots' first. Basically, more complexity, but it's still simple and trivial, not even in NP.
This is incorrect. Integer factorization is believed to be NP-intermediate. Very much "in NP".
https://en.m.wikipedia.org/wiki/NP-intermediate
Also, saying factorization lacks "complexity" because sieves exist misunderstands both concepts.
> In order to talk about complexity classes such as P, NP, and co-NP, the problem has to be stated as a decision problem.
> Decision problem (Integer factorization) — For every natural numbers n and k, does n have a factor smaller than k besides 1?
> It is known to be in both NP and co-NP
https://en.m.wikipedia.org/wiki/Integer_factorization#:~:tex....
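A minimal sketch of the "simple, deterministic generator" mentioned above (a standard incremental sieve, nothing novel): it really is a few lines, but notice that it has to carry a growing table derived from every prime seen so far, which is exactly the bookkeeping being pointed at.

```python
def primes():
    """Incremental sieve: yields 2, 3, 5, 7, ... indefinitely."""
    composites = {}          # maps an upcoming composite -> primes that divide it
    n = 2
    while True:
        if n not in composites:
            yield n                              # n is prime
            composites[n * n] = [n]              # first composite n is responsible for
        else:
            for p in composites.pop(n):          # slide each known prime forward
                composites.setdefault(n + p, []).append(p)
        n += 1

gen = primes()
print([next(gen) for _ in range(10)])            # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```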
Ah yes nothing simpler than providing the foundational theory to one of the most rigorous and intellectually intimidating areas of mathematics - number theory /s
Nah, it's not that complex. Did you ever take introductory number theory? Anyway, I think you missed the point that the description is really simple, and overlooked. Hahaha! :)
https://en.wikipedia.org/wiki/Information_theory
https://en.wikipedia.org/wiki/Computational_irreducibility
https://en.wikipedia.org/wiki/Aperiodic_tiling
To see what I am talking about regarding the trivial and non-trivial zeros, see this Wikipedia animation: https://en.wikipedia.org/wiki/File:Riemann3d_Re_0.1_to_0.9_I...
Basically, it implies that there is another relationship between real and imaginary numbers we have not yet stumbled upon....
And this has implications for finding a theory of gravity, as Riemann math is involved in quantum mechanics....
Strange science that primes are, or might be, involved in gravity theory....
Something inspiring about this: "In dedicated Friday afternoon thinking sessions, he returned to the problem again and again over the past decade, to no avail."
A friend of mine used to block off his Friday afternoons for 'weekly review'. Which was part big thinking, part end-of-week nap, and mostly avoiding colleagues who had tricky tasks 'needed first thing Monday' that they had forgotten to bring up before.
"they pulled some unorthodox moves to finally break Ingham’s bound"
Why is taking methods from other fields an unorthodox move? I come from an engineering background and there it is the common case. The usage of harmonic analysis is a staple in many fields (audio, waves, electrical analysis, statistics) and of course the algorithms are pure math under the hood. If I want to find a recurring structure in an underlying system, wouldn't it be normal to try different plotting techniques and choose the one that suits my problem best?
"Save for Maynard, a 37-year-old virtuoso who specializes in analytic number theory, for which he won the 2022 Fields Medal—math’s most prestigious award. In dedicated Friday afternoon thinking sessions, he returned to the problem again and again over the past decade, to no avail. At an American Mathematical Society meeting in 2020, he enlisted the help of Guth, who specializes in a technique known as harmonic analysis, which draws from ideas in physics for separating sounds into their constituent notes. Guth also sat with the problem for a few years. Just before giving up, he and Maynard hit a break. Borrowing tactics from their respective mathematical dialects and exchanging ideas late into the night over an email chain, they pulled some unorthodox moves to finally break Ingham’s bound."
This quote doesn't suggest that the only thing unorthodox about their approach was using some ideas from harmonic analysis. There's nothing remotely new about using harmonic analysis in number theory.
1. I would say the key idea in a first course in analytic number theory (and the key idea in Riemann's famous 1859 paper) is "harmonic analysis" (and this is no coincidence because Riemann was a pioneer in this area). See: https://old.reddit.com/r/math/comments/16bh3mi/what_is_the_b....
2. The hottest "big thing" in number theory right now is essentially "high dimensional" harmonic analysis on number fields https://en.wikipedia.org/wiki/Automorphic_form, https://en.wikipedia.org/wiki/Langlands_program. The 1-D case that the Langlands program is trying to generalize is https://en.wikipedia.org/wiki/Tate%27s_thesis, also called "Fourier analysis on number fields," one of the most important ideas in number theory in the 20th century.
3. One of the citations in the Guth Maynard paper is the following 1994 book: H. Montgomery, Ten Lectures On The Interface Between Analytic Number Theory And Harmonic Analysis, No. 84. American Mathematical Soc., 1994. There was already enough interface in 1994 for ten lectures, and judging by the number of citations of that book (I've cited it myself in over half of my papers), much more interface than just that!
What's surprising isn't that they used harmonic analysis at all, but where in particular they applied harmonic analysis and how (which are genuinely impossible to communicate to a popular audience, so I don't fault the author at all).
To me your comment sounds a bit like saying "why is it surprising to make a connection." Well, breakthroughs are often the result of novel connections, and breakthroughs do happen every now and then, but that doesn't make the novel connections not surprising!
Unorthodox is maybe a bit strong, but what they're saying is that it's a novel application of an existing technique from another field. Fairly often you will see big breakthroughs like this in maths, where someone has the insight to see that there are parallels between two seemingly unconnected areas of maths, and you can use ideas from one area to give you insight into the other area.
The tricky bit is that these connections between areas are not usually obvious, so to see the similarity can require a considerable step in understanding.
It’s kind of silly. Just a reporter reporting. You could say that every discovery in mathematics involves some “unorthodox” move, since the orthodoxy is all that is known so far.
> “At first sight, they look pretty random,” says James Maynard, a mathematician at the University of Oxford. “But actually, there’s believed to be this hidden structure within the prime numbers.”
What would the pattern of primes hypothetically look like? Is there expected to be some kind of closed form formula? If the Riemann hypothesis were proven, what would be the next step to understanding the distribution? Or is the proof itself expected to hold this answer?
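For a rough sense of what "hidden structure" means here (the textbook formulation, not anything specific to the new paper): no closed-form formula for the $n$-th prime is expected. The structure is statistical; the prime counting function $\pi(x)$ tracks the logarithmic integral, and the Riemann Hypothesis is equivalent to the error being only about square-root size:

$$\pi(x) = \mathrm{Li}(x) + O\big(\sqrt{x}\,\log x\big) \quad \text{(equivalent to RH)}.$$

So a proof would not yield a formula for the next prime; it would pin down how far the count of primes can drift from its smooth approximation, and zero-density results like the Guth-Maynard bound sharpen what is known unconditionally about primes in short intervals.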
Every time I hear about James Maynard it really solidifies my opinion that he's one of those once in a generation geniuses. He's already contributed so much to prime number theory, it really feels like there might be a proof of the Riemann Hypothesis within my lifetime.
https://www.quantamagazine.org/sensational-proof-delivers-ne...
Terence Tao also provides links to a presentation by James Maynard and Larry Guth: https://www.ias.edu/video/new-bounds-large-values-dirichlet-... and https://www.ias.edu/video/new-bounds-large-values-dirichlet-...
Edit: https://en.wikipedia.org/wiki/Ulam_spiral