I am so tired of seeing massless neutrinos being described as a "prediction" of the Standard Model, and finite neutrino masses as beyond-Standard Model physics or even a "mystery". This is especially disappointing coming from a supposedly serious magazine like Symmetry.
Neutrinos were originally hypothesized in order to solve a problem which did not require them to have mass, and for a long time after they were actually observed, their measured masses remained within error bars straddling zero. It therefore made perfect sense to model them as massless.
But to actually include neutrino masses in the Standard Model is trivial, and was done long ago.
The most straightforward way to do it is to give them quark-like mass terms. This requires introducing a right-handed partner for each known (left-handed) neutrino, which some people don't like because right-handed particles don't partake in weak interactions, and weak interactions are the only (known) neutrino interactions (apart from gravity), so you end up with undetectable particles.
The main alternative is to use Majorana mass terms, making neutrinos their own anti-particles, which some people don't like because it deviates from the pattern of all other fermions in the Standard Model.
A third way is to say "it's both", typically involving the seesaw mechanism, which some people don't like because it requires unfashionable GUT-style beyond-Standard Model physics.
Point is, there is neither a failed "prediction" nor a great "mystery" here. There is uncertainty about which kind of mass term we should use for neutrinos, because the experimentally observable differences between the alternatives are really, really tiny.
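For concreteness, here is roughly what the three options look like as mass terms in the Lagrangian (a schematic, standard-textbook sketch; nu_L is the known left-handed neutrino, nu_R the hypothetical right-handed partner):

```latex
% Dirac mass term (requires a right-handed partner \nu_R):
\mathcal{L}_{\mathrm{Dirac}} = - m_D \left( \bar{\nu}_L \nu_R + \bar{\nu}_R \nu_L \right)

% Majorana mass term (uses the charge conjugate \nu^c, so the
% neutrino is its own antiparticle):
\mathcal{L}_{\mathrm{Majorana}} = - \tfrac{1}{2} m_M \left( \overline{\nu_L^{c}}\, \nu_L + \bar{\nu}_L\, \nu_L^{c} \right)

% Seesaw ("it's both"): diagonalizing the combined mass matrix
M = \begin{pmatrix} 0 & m_D \\ m_D & M_R \end{pmatrix}
% gives a light eigenvalue m_\nu \approx m_D^2 / M_R,
% naturally tiny when M_R sits at a GUT-like scale.
```

The seesaw's appeal is that a huge M_R automatically explains why the observed neutrino masses are so small.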
> This requires introducing a right-handed partner for each known (left-handed) neutrino, which some people don't like because right-handed particles don't partake in weak interactions, and weak interactions are the only (known) neutrino interactions (apart from gravity), so you end up with undetectable particles.
For anyone else wondering: yes, this does make said right-handed (aka sterile) neutrinos a candidate for dark matter, assuming some of them are much heavier than their left-handed counterparts, to account for the 'cold' properties of the dark matter that we observe.
> This requires introducing a right-handed partner for each known (left-handed) neutrino, which some people don't like because right-handed particles don't partake in weak interactions, and weak interactions are the only (known) neutrino interactions (apart from gravity), so you end up with undetectable particles.
I'd be curious to hear more about this. What mechanism forces you to add right-handed partners? Is it some conservation property? Are calculations too difficult without them?
Maybe it's also not clear if there really is a difference between saying "right-handed neutrinos exist but are undetectable" and "right-handed neutrinos are fake particles added for ease of modeling".
Edit: Wikipedia also says
> The neutral-current Z^0 interaction can cause any two fermions in the standard model to deflect: Either particles or anti-particles, with any electric charge, and both left- and right-chirality, although the strength of the interaction differs.
So could right-handed neutrinos be detected this way?
A Dirac mass term (the kind used for all other Standard Model fermions) involves both left-handed and right-handed particles:

https://en.wikipedia.org/wiki/Sterile_neutrino#Mass

So you can't have one without both. (BTW, the linked section says "there are no Dirac mass terms in the Standard Model's Lagrangian", but it should really say that the Dirac mass terms in the Standard Model Lagrangian arise as a consequence of the Higgs mechanism.)
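Schematically, that Higgs-generated Dirac mass comes from a Yukawa coupling (a sketch in standard notation; y is a dimensionless Yukawa coupling, v the Higgs vacuum expectation value):

```latex
% Yukawa coupling of the lepton doublet L to the Higgs doublet H
% (with \tilde{H} = i\sigma_2 H^*) and a right-handed neutrino \nu_R:
\mathcal{L}_Y = - y \, \bar{L} \tilde{H} \nu_R + \mathrm{h.c.}

% After electroweak symmetry breaking the Higgs acquires a vev v,
% leaving an ordinary Dirac mass term:
m_D = \frac{y \, v}{\sqrt{2}}, \qquad v \approx 246~\mathrm{GeV}
```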
> Wikipedia also says
I can't find that quote, but I guess it's about experimentally observed particles. It would not apply to a right-handed neutrino:

https://en.wikipedia.org/wiki/Sterile_neutrino#Properties
I think you're correct to say a lot of people simplify what the problem is with neutrino mass. In principle it seems like there is no problem: you just add a mass term for the neutrino just like for any other particle. Just b/c at first we didn't expect that term to be there doesn't mean it's a problem now, or that the original expectation was all that meaningful. And again, as you point out, there are a couple of potential ways to add that mass term in, either the "normal" way with a right-handed neutrino, or with some fancy seesaw Majorana term, or some combination thereof.
The issue is, though, that right now the standard model is at least ambiguous in terms of the Majorana mass term. If it ends up that the neutrino gets its mass from only the "normal" mass term, then why doesn't it have a Majorana mass term? There's no current symmetry that says there can't be a Majorana mass. If the neutrino's Majorana mass is zero, then you'd probably have to introduce a symmetry into the standard model that says Majorana particles can't exist.
But if the neutrino does end up having a non-zero Majorana mass term, then that means the neutrino is a Majorana particle, and can undergo lepton-number-violating processes (e.g. neutrinoless double beta decay). Again, that's new physics.
So no matter how you give the neutrino mass, you're gonna have to modify the standard model in some "significant" way to accommodate it. Either by specifically saying Majorana particles can't exist, or by allowing for lepton-number-violating processes.
Now you could say, well, then it might be the case that Majorana particles don't exist b/c that would require lepton-number-violating processes, so I don't need to introduce a new symmetry, I can just take advantage of one that's already lying around. That might be a valid claim to make...I'm not sure. I think the issue with that comes down to the difference between lepton number being a global vs. an accidental symmetry in the standard model.
Honest question from a physics novice: Would it be wrong to say the same is roughly true of all of the first 4?
I have heard it is plausible that Dark Matter is merely another particle that fits in the standard model. I've heard it plausible that Dark Energy is e.g. a WIMP or other new particle in the standard model. And, on more-matter-than-antimatter, I imagine _some_ explanations of baryon asymmetry could come from outside the standard model but others (boundary condition, mirror anti-universe) would be fully standard-model-compatible, right?
That would leave only #5 as a mystery: Why is gravity as we know it in general relativity so different (weaker) than the force that a standard-model graviton would predict?
> I have heard it is plausible that Dark Matter is merely another particle that fits in the standard model. I've heard it plausible that Dark Energy is e.g. a WIMP or other new particle in the standard model.
Whoever told you that was getting dark matter and dark energy mixed up.
Dark matter could be "another particle", possibly a sterile neutrino:

https://en.wikipedia.org/wiki/Sterile_neutrino#Sterile_neutr...

Dark energy however is definitely something else. WIMPs in particular are hypothetical dark matter particles:

https://en.wikipedia.org/wiki/Weakly_interacting_massive_par...
If I understand this correctly, the "right hand" means antiparticles? Do I have that right?
And if so, then I have two questions.
1. Antiparticles don't participate in the weak force? So if I had antimatter, and I made a nucleus of some kind, then it couldn't beta decay? If so, does this say anything about the matter/antimatter asymmetry in the universe?
2. At various times, I have seen references to anti-neutrinos. They seemed to say that what made them "anti" was simply that the spin was in the opposite direction relative to the direction of motion compared to a "regular" neutrino. Were they wrong? And if they were right, what is it about the direction of spin that makes them unable to participate in the weak force?

No, it's about chirality:

https://en.wikipedia.org/wiki/Chirality_(physics)
We know that either:

a) there are mysterious undetectable particles, or

b) something beyond the standard model is happening, as in your second and third points.

There’s a failed “prediction” that all fermions have similar mass terms, and that failure suggests either something strange (undetectable particles), something strange (fermions without consistent mass mechanisms), or something strange (novel physics).

I think that qualifies as a mystery.
"either something strange (undetectable particles)" There is nothing strange here. Undetectable in this context means it would be only detectable by gravitation detectors - all masses/energies have gravitation changes.
One other mystery not mentioned is the problem of fine tuning. The standard model requires certain parameters (like alpha, the fine structure constant) to take their observed values to many orders of magnitude of precision for the universe as we know it to exist. There are two philosophical schools of thought about that -- (a) we're in the universe we're in, so by definition it must exist, and there's a selection bias there; or (b) there is an underlying detailed structure that gives the values of supposed 'fundamental' quantities their shape as an emergent property of something more beautiful – and thus they're not "free" at all. This is one of the things that SUSY was supposed to solve – but the LHC has found no experimental evidence for it. A good introduction about this (in the context of the Higgs mass, where the need for fine tuning is really apparent) is here: https://www.physicsmatt.com/blog/2016/11/17/paper-explainer-...
“This is rather as if you imagine a puddle waking up one morning and thinking, 'This is an interesting world I find myself in — an interesting hole I find myself in — fits me rather neatly, doesn't it? In fact it fits me staggeringly well, must have been made to have me in it!' This is such a powerful idea that as the sun rises in the sky and the air heats up and as, gradually, the puddle gets smaller and smaller, frantically hanging on to the notion that everything's going to be alright, because this world was meant to have him in it, was built to have him in it; so the moment he disappears catches him rather by surprise. I think this may be something we need to be on the watch out for.”

--Douglas Adams
The puddle being in a hole implies a world outside of the hole. If the puddle has no means to perceive the outside world except a careful inspection of its own bounds, isn't that an interesting mystery?
To me the fine tuning problem is a bit like saying "circles wouldn't exist without π being exactly the value it is" and wondering why π has that value specifically and not any other value.
I don't think A and B are mutually exclusive. The values seem perfectly tuned for our universe because we exist in our universe, and they are probably emergent from a more fundamental parameter, possibly something like the particular Calabi-Yau manifold topology that happens to correspond to our universe (in the case of superstring theory). If we lived in a different CY topology that was capable of supporting intelligent life then we'd wonder why that one's constants are so precisely tuned for us.
But then, I'm an idiot who just watches PBS Space Time and nods his head, not a physicist.
That’s not that great an analogy for fine tuning. The numeric representation of π is arbitrary, an artifact of our number system; the relationship that defines it is fixed. If that relationship were different, circles indeed would not exist.
Let’s move the analogy to triangles: a triangle’s angles sum to 180 degrees, but that only holds true on a flat surface; with curvature, the sum can be more or less than 180 degrees.
So this isn’t a fine tuning problem on its own: if you lived in a universe where triangles had less or more than 180 degrees, that would just be a universe with negative or positive curvature.
The issue with fine tuning is that the curvature of the universe is directly tied to the mass/energy density, and any deviation from the extremely narrow range our universe sits in, out of all possible values, would not just produce a universe whose triangles have fewer or more degrees than 180; it would produce a universe that either collapses on itself within the blink of an eye or expands so fast that gravity is never strong enough for even the most basic structures to form.
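A rough back-of-envelope version of this is the classic flatness problem; here is a toy sketch, assuming radiation domination all the way back to the Planck time (approximate numbers):

```python
# |Omega - 1| = |k| / (a H)^2 grows roughly like t during radiation
# domination, so today's near-flatness implies an absurdly flat
# early universe.

t_now    = 4.3e17   # age of the universe, seconds (approx.)
t_planck = 5.4e-44  # Planck time, seconds

omega_dev_now = 0.01  # rough observational bound on |Omega - 1| today

# Extrapolate backwards with the ~t scaling (toy assumption):
omega_dev_planck = omega_dev_now * (t_planck / t_now)
print(f"|Omega - 1| at the Planck time ~ {omega_dev_planck:.0e}")
# ~ 1e-63: the early density had to match the critical density
# to roughly 60 decimal places.
```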
So the issue really is that to produce a universe which will form galaxies and stars, and survive long enough to produce life, you need a lot of parameters at certain very specific values; even the smallest of deviations would not produce a universe that would ever support life, let alone intelligent life. What’s even stranger, iirc, is that the values have to be exactly what they are now; you can’t simply 2x all of them to maintain the proportions and get the same result.
And this is really what people are looking to solve. Yes, the anthropic principle is a solution (if they were anything other than what they are now, we wouldn’t be here to discuss why), but the issue is that out of all the other possible combinations you don’t seem to find another stable state, and that is the true mystery.
According to Leonard Susskind[1], fine tuning is a compelling argument by itself[2], and the strongest case is the cosmological constant.[3] In a nutshell it is a sort of repelling force, first proposed by Einstein to make a workable model of a static universe; he later regretted it as one of the biggest mistakes of his career. However, the idea is now back, with Nobel-prize-winning research showing that expansion is accelerating, which requires a positive value. It could explain "dark energy."
When expressed in one way, it is 10^-122 "units of the square Planck length". I'm not smart enough to completely understand it, but it is (according to physicists) an incredibly precise ingredient in the various properties of physics that make our Universe possible. Any larger or smaller and the model falls apart. If it is an accident, that is one hell of a lottery ticket.

[1] https://en.wikipedia.org/wiki/Leonard_Susskind

[2] https://www.closertotruth.com/interviews/3081

[3] https://en.wikipedia.org/wiki/Cosmological_constant
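For what it's worth, the 10^-122 figure is easy to check numerically (a sketch using approximate published values, not a derivation):

```python
# The measured cosmological constant expressed in Planck units:
# Lambda has dimensions 1/length^2, so multiplying by the Planck
# length squared gives a dimensionless number.
Lambda   = 1.1e-52     # cosmological constant, 1/m^2 (approx.)
l_planck = 1.616e-35   # Planck length, m

print(f"Lambda * l_P^2 ~ {Lambda * l_planck**2:.1e}")  # ~ 2.9e-122
```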
> and wondering why π has that value specifically and not any other value.
Maybe the question should be "why circles" then. Early models of the solar system suffered under the assumption that everything should be modeled using circles when ellipses modeled everything better.
It's an explanatory theory with a lot - a lot - of gaps.
It has been extended with some nice predictions like the Higgs. But basically it's a Franken-patchwork of math glued together quite awkwardly.
Because that is what it is. Literally. It was developed by thousands of grad students and their supervisors throwing math at the wall and keeping anything that matched observations. So there was a lot of random searching involved.
What's missing is a central guiding metaphor.
Relativity has one. In comparison, the Standard Model is a very epicycle-ish tool for calculating Lagrangians, with plenty of "Yes but" and "Except when".
"The partner of the Higgs, the higgsino."

"The partner of the top quark, the stop squark."

The stop squark is one of the sfermions (the superpartner particles of their associated fermions). As such they are all sparticles. Some of the other sfermions would be the sup squark, the scharm squark, the sstrange squark, the selectron, or the stau. [1]
In my opinion physicists are great at naming things :D

[1] https://en.wikipedia.org/wiki/Sfermion
Those supersymmetric particles have the disadvantage of not having any evidence they exist. I am sad for all those physics grad students who went into supersymmetry and string theory.
> (a) we're in the universe we're in, so by definition it must exist and there's a selection bias there;
This has basically 3 possible explanations - insane coincidence, multiverse (with many versions of constants; certain forms of mathematicism also provide such "multiverse") or "reason" (simulation admin/God).
Somewhat related, I think, is the enormous disparity between the strengths of the strong/weak/electromagnetic forces and gravity (30 or 40 orders of magnitude); a possible indication that there's something missing.
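The disparity is easy to illustrate: compare the electrostatic and gravitational forces between two protons (a quick sketch with approximate constants; the separation cancels in the ratio):

```python
# Ratio of the electrostatic to the gravitational force between
# two protons; both fall off as 1/r^2, so the distance cancels.
k   = 8.988e9     # Coulomb constant, N*m^2/C^2
G   = 6.674e-11   # Newton's constant, N*m^2/kg^2
e   = 1.602e-19   # proton charge, C
m_p = 1.673e-27   # proton mass, kg

ratio = (k * e**2) / (G * m_p**2)
print(f"F_electric / F_gravity ~ {ratio:.1e}")  # ~ 1.2e36
```

For electrons the same ratio comes out around 10^42, hence the commonly quoted 30-40 orders of magnitude.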
Why is this a problem? If the other three forces were perfectly equal, or in regular intervals, maybe this would have meant something, but the other three forces' strength varies by x, y, z between them. Other than human intuition, there is nothing inherently different between x = 2 and x = 10^30-40.
It's only framed as a problem when it is assumed that the values could be other than what they are. But we have no reason to suspect that they could be different. We have no idea how unlikely the current selection is given our observation is a single observable. For all we know, it's certain.
'What really interests me is whether God had any choice in creation' - Albert Einstein

https://www.bretthall.org/fine-structure.html
The problem is that, from an experimental/observational technology perspective, we have been for many decades now in some sort of "evidence desert" that pushes against fundamental technology boundaries and is not conducive to solving big "mysteries".
Unifications, re-interpretations, new conceptualizations (new forces, etc.) are the mental tools through which we solve previous "mysteries" (and create new ones). Right now there are more physicists alive than ever, and even a tiny piece of important news could lead to a revolution - within a couple of years. But what you really want is a firehose of new data points, "a new window". This has not happened and may not happen for generations (for the attentive reader: gravitational waves are at the very, very edge of the detectable).
As Feynman might say, the Universe doesn't owe us a continuous stream of gee-wow moments
But if I had to bet where the breakthrough might come from I'd say it would probably be cosmology rather than elementary particles...
I've recently heard a rather interesting and optimistic take on this. Since we have had so many brilliant minds looking in so many places for new physics and still have not seen evidence of it, that suggests whenever we do find new physics, it will have to be so bafflingly strange that all these brilliant people could never imagine it. It may very well be a bigger paradigm shift than the jump from classical to modern physics.
Yep, that makes sense. It doesn't give us a timescale for when such a "jump" might happen, but it suggests that it could be "big" in the context of our discoveries so far.
My best guess at a timescale (following up on the cosmology theme) has to do with our rate of utilizing the inner solar system as a clean and quiet laboratory for ultra-sensitive observations and experiments, whether LISA, extremely sensitive telescopes, or any other probes.

So be patient for a few more decades :-)
It seems crazy to say that we have not seen evidence for new physics when the size estimates of dark matter and dark energy account for about 95% of known energy in the observable universe. How is that not evidence for new physics?
If our physical theories cover only about ~5% of what we observe… that seems like a bit of an issue, no matter how accurately they model that 5%.
IMO, the breakthrough will happen once they finish building that super collider that will find glimpses of the tiny 4th dimension (the Kaluza-Klein hypothesis). With that will come a whole new Standard Model's worth of 4d particles, and physicists will be busy for another century.
2, 3 and 4 are not Standard Model problems, but cosmological problems. There are cosmological theories that explain them using only General Relativity, without any changes to particle physics. For example, you can check out Nick Gorkavyi's cosmological papers:

https://pos.sissa.it/335/039/

https://academic.oup.com/mnras/article/476/1/1384/4848298

https://academic.oup.com/mnras/article/461/3/2929/2608669

https://arxiv.org/abs/2110.10218

https://www.sao.ru/Doc-k8/Science/Public/Bulletin/Vol76/N3/A... (this one is available only in Russian for now)
I would add, why do certain particles decay into other particles? For example, the tau particle contains nothing else, as far as we can tell, yet it decays into certain other particles (and not the same ones every time).
More generally, the standard model records a lot of particles and things that happen, but not why those instead of others. I suspect there's a simple model underneath, but I have no idea what it is.
As a physicist I'd say these things are well understood.
- Why do certain particles decay into other particles?
Quantum mechanics is totalitarian: whatever is not forbidden is mandatory. Forbidden: excluded by some symmetry principle (it would violate a conserved quantity, like energy, angular momentum, ...).
- The tau decays into certain other particles (and not the same ones every time).
Tau carries electric charge, fermion number, angular momentum. The decay products' total quantum numbers match that of the tau. But the quantum-mechanical totalitarian principle says that every possible combination that satisfies that constraint happens with some amplitude.
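As a concrete example, here are the tau's main decay channels with rough branching fractions (approximate PDG values); in each one the final state's charge, lepton number, and angular momentum match the tau's:

```latex
\tau^- \to e^- \, \bar{\nu}_e \, \nu_\tau      \quad (\sim 17.8\%) \\
\tau^- \to \mu^- \, \bar{\nu}_\mu \, \nu_\tau  \quad (\sim 17.4\%) \\
\tau^- \to \mathrm{hadrons} + \nu_\tau         \quad (\sim 64.8\%)
```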
- The standard model records a lot of particles and things that happen, but not why those instead of others. I suspect there's a simple model underneath, but I have no idea what it is.
If by 'those' and 'others' you mean all the varied observed phenomena, then yes, there is a simple model underneath and it IS the standard model. If by 'those' and 'others' you mean 'why is the SM the way it is', that's (likely) an out-of-bounds question for the SM in the first place. But a modern perspective on the SM is to think of it as a low-energy effective field theory anyway.

http://backreaction.blogspot.com/2021/11/why-can-elementary-...
I’d add to this, that we know very little about the spatial distribution of nuclear matter at and below the scale of a nucleus. The standard model excels at understanding the salient characteristics of asymptotic states before and after an interaction, things like spin, lepton number, etc. But we still can’t tell you how gluons are distributed inside the proton.
The neutrino mass thing is much less a surprise than the rest. Neutrino mass always was quite possible as an optional add-on. Basically, the situation for the quarks, where all six have mass, can be copied to apply to the leptons as well.
They forgot at least one mystery: What determines the fermion mass spectrum and the specific mass values? Why do these particle masses take such seemingly random values?
Related: Why are these masses so much smaller than the Planck mass? For comparison, the electric charge is sqrt(alpha_em) or ~1/11th the value of the Planck charge.
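Spelled out with the standard definitions (the ~1/11 figure checks out):

```latex
% Planck charge (SI conventions) and the electron charge:
q_P = \sqrt{4 \pi \varepsilon_0 \hbar c}, \qquad
\frac{e}{q_P} = \sqrt{\alpha_{\mathrm{em}}}
             \approx \sqrt{1/137.036}
             \approx 0.0854 \approx \frac{1}{11.7}
```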