I made this visualization of the zeta function in JavaScript. It is infinitely zoomable, and you can play around with the parameters: https://amirhirsch.com/zeta/index.html
It might help you understand why the hypothesis is probably true. It renders the partial sums and traces the path of zeta.
In my rendering, I include all partial sums up to an automatically computed "N-critical", the point at which the phase difference between consecutive summands drops below pi (the Nyquist limit!); beyond it, the behavior of the partial sums is monotonic. The clusters are like alias modes that go back and forth while the instantaneous frequency of the summands is between k*pi and (k+1)*pi, and the random-walk section is where you only have one point per alias mode. The green lines highlight a symmetry of the partial sums, in which the clusters mirror the random-walk section; this symmetry is summarized pretty well in this paper: https://arxiv.org/pdf/1507.07631
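As a rough sketch of what the renderer is computing: the partial sums of the Dirichlet series sum n^(-s), plus the phase-gap estimate of N-critical. The phase difference between summands n and n+1 is t*ln((n+1)/n) ~ t/n, which falls below pi once n exceeds t/pi, so that is a natural cutoff. The function names here are my own illustrative choices, not the ones in the linked demo.

```python
import cmath
import math

def zeta_partial_sums(s, n_max):
    """Partial sums S_N = sum_{n=1}^{N} n^(-s) for N = 1..n_max."""
    total = 0j
    sums = []
    for n in range(1, n_max + 1):
        total += cmath.exp(-s * math.log(n))  # n^(-s) as exp(-s * ln n)
        sums.append(total)
    return sums

def n_critical(t):
    """Smallest N where the phase gap t*ln((n+1)/n) ~ t/n drops below pi."""
    return max(1, math.ceil(t / math.pi))

# Trace the partial-sum path for s = 1/2 + 25i out to twice the cutoff.
s = 0.5 + 25.0j
path = zeta_partial_sums(s, 2 * n_critical(s.imag))
```

For s with real part greater than 1, the partial sums simply converge to zeta(s); the interesting clustering behavior appears on the critical line.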
I formed an intuitive signal-processing interpretation of the Riemann Hypothesis many years ago, which I'll try to summarize briefly here. You can think of the zeta function as a log-time sampler: zeta(s) is the Laplace transform of sum(delta(t - ln n)), which samples at times t = ln n for integers n > 0, a rapidly increasing sample rate. You can imagine this as an impulse response coming from a black box, and the impulse response is either finite in energy or a power signal, depending on the real part of s.
If you suppose that the energy sum(|1/n^s|^2) is finite (i.e. Re(s) > 1/2), then the Riemann Hypothesis says that zeta(s) is non-zero. It is akin to saying that the logarithmic sampler cannot destroy information without being plugged in.
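In symbols, the finite-energy condition is just a p-series test on the term magnitudes (writing sigma for the real part of s):

```latex
\sum_{n=1}^{\infty} \left| n^{-s} \right|^{2}
  = \sum_{n=1}^{\infty} n^{-2\sigma} < \infty
  \quad \Longleftrightarrow \quad
  \sigma = \operatorname{Re}(s) > \tfrac{1}{2}
```

so the half-plane of "finite energy" is exactly the region where RH asserts there are no zeros.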
For the longest time I thought the zeta curve was some kind of sophisticated equation, but it is astonishingly simple. The "magic" of the zeta zeros happens only because of the 1/2 in the exponent of the construction below. Change that fraction at all, and the zeros do not appear.
You start with a line segment. You then draw another line segment that starts at the end of the previous line segment, and whose length is shorter than the previous segment. The length of any segment is (1/n)^(1/2) where n is the number of the segment. These segments approach a limit (think of Zeno's paradox).
Finally, you bend each segment by an angle alpha. Technically this angle is in imaginary space, but the visual in Cartesian space just looks like a spiral: each bend adds an angle, like a bullwhip, so that the whole curve spirals back around (after creating other, mesmerizing sub-spirals). Amazingly, this curve always intersects zero (per the Riemann Hypothesis). As I mentioned in my other comment, it's very useful to see this curve in 3D space.
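The two steps above can be sketched directly: on the critical line, the n-th term of the partial sum is n^(-1/2) * e^(-it*ln n), i.e. a segment of length (1/n)^(1/2) pointing at angle -t*ln(n). A minimal sketch (function name is my own, and I'm assuming this standard parameterization rather than the exact code behind the linked visualizations):

```python
import math

def zeta_spiral_points(t, n_max):
    """Vertices of the partial-sum spiral for s = 1/2 + i*t.

    Segment n has length n^(-1/2); its direction -t*ln(n) accumulates
    the per-segment bends, producing the bullwhip spiral.
    """
    x, y = 0.0, 0.0
    pts = [(x, y)]
    for n in range(1, n_max + 1):
        r = n ** -0.5             # Zeno-like shrinking segment length
        theta = -t * math.log(n)  # cumulative bend angle
        x += r * math.cos(theta)
        y += r * math.sin(theta)
        pts.append((x, y))
    return pts

# t near the first nontrivial zero (~14.1347): the path loops back toward 0.
spiral = zeta_spiral_points(14.1347, 500)
```

Plotting `spiral` (or stacking copies along a third axis as t varies, as in the Unity demo mentioned elsewhere in the thread) shows the sub-spirals directly.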
James Maynard appears regularly on Numberphile so if you'd like to hear some accessible mathematics from one of the authors of this paper I suggest you check it out:
Unfortunately, they award it on the 4n+2 years. As someone born on a 4n+0 year, I’ll have just 38 years, which is too severe a disadvantage for me to stomach, so I didn’t bother with it.
Yes, I don't think the Fields medal was intended as the "Nobel prize of mathematics", but since it was the biggest award that existed it got promoted as such despite its inequivalence. More recently there's the Abel prize, which tries to be a more direct Nobel prize analogue, but of course the Fields medal has a multi-decade head start in terms of promotion...
For anyone looking for an introduction to the Riemann Hypothesis that goes deeper than most videos but is still accessible to someone with a STEM degree: I really enjoyed this video series [1] by zetamath.
I understood everything in Professor Tao's OP up to the part about "controlling a key matrix of phases", so the videos must have taught me something!
Trying to imagine what it must feel like to have Terence Tao summarize your argument while mentioning that he'd tried something similar but failed.
"The arguments are largely Fourier analytic in nature. The first few steps are standard, and many analytic number theorists, including myself, who have attempted to break the Ingham bound, will recognize them; but they do a number of clever and unexpected maneuvers."
I haven't met him personally, but Tao's writing is very humble and very kind. He talks openly about trying things and not having them work out. And he writes in general a lot about tools and their limitations. I definitely recommend reading his blog.
I find that mathematicians tend to have the smallest egos -- eccentric as they may be. I think it's because the difficulty of mathematics reminds one of their fallibility.
In school, I typically found the math and physics teachers to be humbler than the others. Not always, but, I couldn't help but notice that trend.
Yes but it's Terence Tao. I mean, the set of living mathematicians is not well ordered on greatness but if it were, Terence Tao would be fairly close to the upper limit.
It must feel like meritocracy. When ranking, particularly in strict order, is not the norm, Terence Tao doesn't see himself "on top" of anything. It also implies solid grounding and a good understanding that someone's actions are not expected to be correlated with their reputation. This is especially the case where getting results is a personal or team effort, not a popularity contest.
It can be unexpected for anyone operating in the regular business, corporate, VC, and general academic landscape, where politics rule, meritocracy is a feel-good motivator, and popularity is the real coin.
Also curious about the potential significance of a proof. The article is vague:
> (primes) are important for securing encrypted transmissions sent over the internet. And importantly, a multitude of mathematical papers take the Riemann hypothesis as a given. If this foundational assumption were proved correct, “many results that are believed to be true will be known to be true,” says mathematician Ken Ono of Emory University in Atlanta. “It’s a kind of mathematical oracle.”
Are there some obvious, known applications where an RH proof would have immediate practical effects (beyond satisfaction and "slightly better encryption")?
Mathematics is sort of strange in this regard in that there's lots of work already done that assumes RH, so many of the consequences of the theorem itself are already worked out. And RH seems to hold up under extensive numerical searches (no counterexamples found). So the theorem being true wouldn't be earth-shattering in and of itself.
It's more about the method used to prove the theorem, which might involve novel mathematics -- and probably will in this case, considering how long it's taking to prove RH. Since this method hasn't been found yet, it's hard to say what its consequences might be.
If we knew that the extended Riemann hypothesis was true, we could use the Miller test for deterministic primality testing in O(log(n)^4) time; the AKS test, which doesn't depend on RH, runs in roughly log(n)^6.
Do we care? Not for most applications -- doing a bunch of randomized Miller-Rabin tests is fine for most practical purposes. But it would be really nice to have a faster deterministic algorithm around. AKS isn't practical for anything; Miller... miiiight be.
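For concreteness, here is a sketch of the ERH-conditional deterministic Miller test: the same strong-pseudoprime check as Miller-Rabin, but instead of random bases it checks every base up to Bach's bound 2*(ln n)^2, which suffices for all n only if the extended Riemann hypothesis holds. Function names are my own illustrative choices.

```python
import math

def miller_witness(a, n):
    """Return True if base a witnesses that odd n > 2 is composite."""
    d, r = n - 1, 0
    while d % 2 == 0:        # write n - 1 = d * 2^r with d odd
        d //= 2
        r += 1
    x = pow(a, d, n)
    if x == 1 or x == n - 1:
        return False
    for _ in range(r - 1):   # repeated squaring; must hit n - 1 if n is prime
        x = x * x % n
        if x == n - 1:
            return False
    return True

def is_prime_erh(n):
    """Deterministic Miller test; correct for all n only assuming ERH,
    via Bach's bound: checking bases a <= 2*(ln n)^2 suffices."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    bound = min(n - 1, int(2 * math.log(n) ** 2))
    return not any(miller_witness(a, n) for a in range(2, bound + 1))
```

Each witness check is a modular exponentiation, and there are O(log(n)^2) bases to try, which is where the overall log(n)^4-ish cost comes from.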
In the realm of applications, most engineers would say that our confidence in the RH (or more realistically, downstream consequences thereof) is high enough to treat it as true. The proof is, for applications, a formality (albeit a wide-ranging, consequential one!).
More likely, a proof of the Riemann Hypothesis would require new ideas, techniques, and math. It is probable that those devices would have broader reach.
Advances in math often work that way: as we forge through the jungle, the tools we develop to make our way through prove more useful than the destination.
What are your opinions of all the theorems that rely on RH as an excluded middle?
Constructivists reject exmid, saying instead that a proof of "A or B" requires you to have in hand a proof of A or a proof of B. And nobody yet has a proof of RH nor a proof of ~RH. This matters in so-called incomplete logical systems, where some statements are neither provable nor disprovable, and exmid is therefore an inadmissible axiom.
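The distinction shows up concretely in proof assistants. In Lean 4, for example, excluded middle lives in the `Classical` namespace and is backed by the axiom of choice rather than being derivable in the constructive core:

```lean
-- `Classical.em` gives p ∨ ¬p for any proposition, but it is axiom-backed;
-- constructively, proving `p ∨ ¬p` would require a proof of one specific side.
example (p : Prop) : p ∨ ¬p := Classical.em p
```

A proof of "RH or ~RH" by `Classical.em` tells you nothing about which disjunct holds, which is exactly the constructivist's complaint.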
The whole point of provability is that it is a purely syntactic process that can be verified in finite time. Ideally it would coincide with truth, but there are some caveats.
How do you know something is true if you don't have a proof? It all depends on your views on the philosophy of mathematics. Are there "true" statements that don't have proofs? Some say yes: there are platonic truths that hold even if they aren't provable. Others say, "What does it even mean to call something true if there is no proof? What you really have is a conjecture."
This comments section is very oddly filled with people who don't actually understand the subject matter but want to sound smart, and accomplish the opposite. Let go of your insecurities, people: it's OK not to understand some things and to be open about it. Everyone fails to understand more things than they understand.
Apart from one flagged comment I find the comments to be quite profound and interesting, we even have a cool visualization demo of the Riemann zeta function:
The comment you link is two hours newer than mine. I also didn't say it's only that type of comment. But when I commented, there were several comments that were just completely off base or bordering on nonsensical due to not understanding the subject. This is a toxic phenomenon that we don't really address, and I thought was interesting to highlight and also share a positive message to those afflicted by the malady of insecurity. You can choose to read in whatever you want, but that was (pretty obviously) my intent.
> This comments section is very oddly filled with people who don't actually understand the subject matter but want to sound smart, and then accomplish the opposite.
Mine is in Unity and shows the spiral in 3D, up the Y axis. I think it's helpful to see in three dimensions: https://github.com/atonalfreerider/riemann-zeta-visualizatio...
Still, it's funny to see how many people have attempted this! It's a fun programming exercise with a nice result.
https://www.youtube.com/playlist?list=PLt5AfwLFPxWJdwkdjaK1o...
Source: https://www.youtube.com/watch?v=eupAXdWPvX8&list=PLt5AfwLFPx...
4n+2 people, you have no excuses.
[1] https://www.youtube.com/watch?v=oVaSA_b938U&list=PLbaA3qJlbE...
"The arguments are largely Fourier analytic in nature. The first few steps are standard, and many analytic number theorists, including myself, who have attempted to break the Ingham bound, will recognize them; but they do a number of clever and unexpected maneuvers."
In school, I typically found the math and physics teachers to be humbler than the others. Not always, but, I couldn't help but notice that trend.
[0]: https://en.wikipedia.org/wiki/Larry_Guth
[1]: https://en.wikipedia.org/wiki/James_Maynard_(mathematician)
[1] https://www.sciencenews.org/article/why-we-care-riemann-hypo...
At least that's my layman's understanding of it.
If RH is unprovable one way or the other, then certainly no counterexample to RH can exist, because otherwise you could find that counterexample and thereby prove RH false. Hence if RH is unprovable, it must be true. (This argument, of course, runs in a logic outside the system in which RH itself is formalized.)
https://news.ycombinator.com/item?id=40571995#40576767
Your comment is rather condescending and kind of feels like a form of projection rather than a meaningful contribution.
Just this one?