“Supernova (SN) cosmology is based on the key assumption that the luminosity standardization process of Type Ia SNe remains invariant with progenitor age. However, direct and extensive age measurements of SN host galaxies reveal a significant (5.5σ) correlation between standardized SN magnitude and progenitor age, which is expected to introduce a serious systematic bias with redshift in SN cosmology. This systematic bias is largely uncorrected by the commonly used mass-step correction, as progenitor age and host galaxy mass evolve very differently with redshift. After correcting for this age bias as a function of redshift, the SN data set aligns more closely with the cold dark matter (CDM) model” [1].
[1] https://academic.oup.com/mnras/article/544/1/975/8281988?log...
I know the team that did this. In fact, I was listening to their seminar just a few days ago. They are very careful and have been working on this for a long time. One caveat they readily admit is that the sample used to build the luminosity-age relation has some biases, such as galaxy type and a relatively low redshift range. They will be updating their results with Rubin LSST data in the next few years.
Exciting times in cosmology after decades of a standard LCDM model.
Could you help me understand this sentence: "After correcting for this age bias as a function of redshift, the SN data set aligns more closely with the cold dark matter (CDM) model”?
I did a deep dive into cosmology simulations about a year ago. It was striking how much is extrapolated from the brightness of a small number of galaxy-surface pixels. I was looking at this for galaxies and stars, and observed something similar. The cosmology models are doing their best with sparse information, but to me the predictions about things like dark matter and dark energy are presented with more confidence than the underlying data supports. Not enough effort is spent trying to come up with new models (not to mention the tendency to shut down alternatives to Lambda CDM, or the lack of a better understanding of the consequences of GR and of the assumptions behind applying Newtonian instant-effect gravity in simulations).
Whenever I read things like "This model can't explain the bullet cluster, or X rotation curve, so it's probably wrong" my internal response is "Your underlying data sources are too fuzzy to make your model the baseline!"
I think the most established models are doing their best with the data they have, but there is so much room for new areas of exploration based on questioning assumptions about the feeble measurements we can make from this pale blue dot.
That fuzziness can be quantified. It's called error bars. Whenever physicists perform a measurement, they always derive a confidence interval from the instruments they use. They take great care to account for the limits of each individual instrument, propagate the errors, and report the uncertainty of the final result.
Consider figure 5 of the following article for example:
https://arxiv.org/abs/1105.3470
The differently shaded ellipses represent different confidence levels. For the largest ellipse, the probability of the true values being outside of it is less than 1%. We call that 3-sigma confidence.
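To put rough numbers on the sigma talk, here is a minimal sketch (illustrative only, not from the article) of how independent 1-sigma errors combine in quadrature and how a sigma level maps to a probability for a one-dimensional Gaussian; the shaded ellipses in the figure are the two-dimensional analogue, drawn at chi-square thresholds:

    from math import erf, sqrt

    # Independent 1-sigma uncertainties on a sum combine in quadrature.
    def add_in_quadrature(*sigmas):
        return sqrt(sum(s * s for s in sigmas))

    # Two-sided coverage probability of a +/- n-sigma interval for a 1-D Gaussian.
    def coverage(n_sigma):
        return erf(n_sigma / sqrt(2.0))

    print(add_in_quadrature(0.03, 0.04))   # 0.05: the combined error bar
    for n in (1, 2, 3):
        print(f"{n} sigma: {100 * (1 - coverage(n)):.2f}% chance of lying outside")
    # 1 sigma: ~31.73%, 2 sigma: ~4.55%, 3 sigma: ~0.27% -- hence "less than 1%".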
> Whenever I read things like "This model can't explain the bullet cluster, or X rotation curve, so it's probably wrong" my internal response is "Your underlying data sources are too fuzzy to make your model the baseline!"
Well, then do some error analysis and report your results. Give us sigmas, percentages, probabilities. Science isn't based on gut feelings, but cold hard numbers.
> type Ia supernovae, long regarded as the universe’s "standard candles", are in fact strongly affected by the age of their progenitor stars.
A key point in the article. From what I understand, this is the main way we measure things of vast distance and, from that, determine the universe's rate of expansion. If our understanding of these supernovae is wrong, as this paper claims, that would be a massive scientific breakthrough.
I'm really interested in the counterargument to this.
It could be a big discovery, and it also aligns with the findings from DESI BAO [1] and with those of another Korean group using galaxy clustering to infer the expansion history [2].
[1] https://arxiv.org/abs/2404.03002
[2] https://arxiv.org/abs/2305.00206
Indeed. It's so hard to definitively prove things that are, that the most significant breakthroughs prove things that aren't (so to speak), imho.
Significant breakthroughs do both: prove things aren't as we thought, and are as the new model suggests.
I'm dumb and barely understand things at a high level, but standard candles never sat right with me, so it's interesting to hear that they might not be so standard after all. But then again, who knows.
This is mostly my physics ignorance talking, but if we measure distance in space-time and not just space, and speed or velocity is space-time/time (which are somehow both relative to each other), and the derivative of velocity is acceleration, can't acceleration mean either expanding "faster" in the sense of distance, OR time speeding up or slowing down? All of it seems so self-referential that it's hard to wrap my head around.
We measure distance in space, and time intervals in time, and so velocity is just plain old distance/time. Special relativity doesn't change that. What changes is that if you start traveling at a different velocity, your measurements of distances and time intervals deviate.
The expansion rate of the universe is not a velocity in the usual sense of distance/time. It's actually in units of velocity/distance, which reduces to 1/time. An expansion rate of r hertz means that, if the rate stayed constant, a given span of distance would intrinsically grow by a factor of e every 1/r seconds (and double roughly every 0.7/r seconds). The objects occupying the space don't "move" in any real sense due to expansion. They just wind up farther apart because space itself grew.
And, just like measurements of distance and time, measurements of the expansion rate change if you change your velocity. There is a special velocity in our universe which causes the expansion in all directions to be the same. From this special perspective, which is traveling at a kind of cosmic "rest" velocity, you can calculate the expansion rate. It turns out that the Sun is traveling at approximately 370 km/s with respect to that special "rest" velocity.
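To make the 1/time units concrete, here is a small sketch assuming a round Hubble constant of 70 km/s/Mpc (an illustrative value, not one taken from the paper):

    # "Velocity per distance" reduces to a rate: convert H0 to hertz and to a time.
    KM_PER_MPC = 3.0857e19        # kilometres in one megaparsec
    SECONDS_PER_YEAR = 3.156e7

    H0_km_s_Mpc = 70.0            # assumed round value
    H0_hz = H0_km_s_Mpc / KM_PER_MPC            # ~2.3e-18 s^-1
    hubble_time_gyr = 1.0 / H0_hz / SECONDS_PER_YEAR / 1e9

    print(f"H0 ~ {H0_hz:.2e} s^-1, i.e. an e-folding (Hubble) time of ~{hubble_time_gyr:.1f} Gyr")
    # ~14 Gyr: comparable to, but not the same thing as, the age of the universe.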
Yes, it is the same thing, but since the objects are in free fall and there is no traditional force causing the acceleration, the better viewpoint is that this is accelerated expansion of the universe. In flat spacetime, a forward light cone can be identified with an expanding (non-accelerating) universe where objects just fly away from a single point with constant but different speeds, i.e. an explosion. But in this model, space, taken as a slice of constant local time since the explosion, is not flat. Also, the data seem to indicate that space is flat while spacetime is curved on large scales, so this picture is too simple.
Seems like the problem should be pretty easy to figure out. Just need to wait ~5 gigayears and see which model is right. I'm personally hoping for deceleration so that we have more total visitable volume.
I'll set a reminder to check back at that time to see who was right.
With 5 gigayears to work with I'm going to move a few star systems over, break down all the matter orbiting the star into a Dyson sphere made of computronium, and simulate visiting any world I could possibly ever want to.
Anyone know how credible this is? If true, then that means the big bounce is back on the menu, and the universe could actually be an infinitely oscillating system.
At least The Guardian has a comment from an independent expert:
"Prof Carlos Frenk, a cosmologist at the University of Durham, who was not involved in the latest work, said the findings were worthy of attention. “It’s definitely interesting. It’s very provocative. It may well be wrong,” he said. “It’s not something that you can dismiss. They’ve put out a paper with tantalising results with very profound conclusions.”"
As an academic, that is exactly the kind of noncommittal, don't-burn-your-bridges-with-colleagues-and-funding-bodies thing that I would say about even clearly flawed research if I were put on the spot by a popular-press publication. In fact, if you know you can rebut flawed research in time, you might want to assist in hyping it first so that your rebuttal will then make a bigger splash and benefit your personal brand.
> If true, then that means the big bounce is back on the menu
I don't think so. Deceleration does not imply recollapse. AFAIK none of this changes the basic fact that there isn't enough matter in the universe to cause it to recollapse. The expansion will just decelerate forever, never quite stopping.
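A quick way to see that claim is the Friedmann equation with matter and curvature only (no dark energy); the sketch below assumes a matter density parameter of roughly 0.3, a round illustrative number:

    # Friedmann equation with matter + curvature, no dark energy, in units of H0:
    #   (da/dt)^2 = Omega_m / a + (1 - Omega_m)
    # For Omega_m < 1 the right-hand side never reaches zero, so expansion
    # decelerates but never stops or reverses.
    from math import sqrt

    OMEGA_M = 0.3   # roughly the observed matter density parameter (assumed)

    for a in (1, 10, 100, 1000):
        adot = sqrt(OMEGA_M / a + (1.0 - OMEGA_M))
        print(f"scale factor a = {a:5d}: da/dt = {adot:.3f} (in units of H0)")
    # da/dt falls toward sqrt(0.7) ~ 0.84 but never hits zero: deceleration without recollapse.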
AFAIK the previous models that all assumed that Type Ia supernovae were not affected by the age of the progenitor stars had no actual analysis to back that up; it was just the simplest assumption. This research is now actually doing the analysis.
> AFAIK the previous models that all assumed that Type Ia supernovae were not affected by the age of the progenitor stars had no actual analysis to back that up; it was just the simplest assumption.
Why would you assume this? It's not correct.
Type Ia supernovae aren't even assumed to be "standard candles" as is often claimed: rather, they're standardizable, i.e. with cross-checks and statistical analysis, they can be used as an important part of a cosmological distance ladder.
A great deal of analysis has gone into the development of that distance ladder, with cross-checks being used wherever it's possible to use them.
They look at surface brightness fluctuations in the same galaxies, Tully-Fisher distances[1], tip of the red giant branch distances[2], and even baryon acoustic oscillations[3].
Is it possible that this one single paper has upended all that? Theoretically. Is it likely? No.
[1] https://en.wikipedia.org/wiki/Tully%E2%80%93Fisher_relation
[2] https://en.wikipedia.org/wiki/Tip_of_the_red-giant_branch
[3] https://en.wikipedia.org/wiki/Baryon_acoustic_oscillations
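For context, the "standardizable" part usually looks something like the Tripp-style relation sketched below, using SALT2-style stretch (x1) and colour (c) parameters. The coefficient values are ballpark placeholders, and the age term is a hypothetical illustration of the kind of correction the paper argues for, not their actual fit:

    # Tripp-style standardization: mu = m_B - M_B + alpha*x1 - beta*c,
    # optionally plus a host-mass step and (hypothetically) an age-dependent term.
    # All coefficient values below are placeholders, not published fits.
    def distance_modulus(m_B, x1, c, M_B=-19.3, alpha=0.14, beta=3.1,
                         host_logmass=None, gamma=0.05,
                         host_age_gyr=None, age_slope=0.0):
        mu = m_B - M_B + alpha * x1 - beta * c
        if host_logmass is not None and host_logmass > 10.0:
            mu += gamma                      # conventional "mass step" (sign/size vary by analysis)
        if host_age_gyr is not None:
            mu += age_slope * host_age_gyr   # hypothetical progenitor-age correction
        return mu

    # A supernova with typical light-curve stretch and colour parameters:
    print(distance_modulus(m_B=24.0, x1=0.5, c=0.02, host_logmass=10.5))  # ~43.4 mag, i.e. a few Gpc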
“We can’t observe the whole universe, so cosmology is not really about the universe. It’s about the observable patch and the assumptions we make about the rest.”
(paraphrasing George Ellis)
We’re in a bounding sphere, with a radius that’s roughly 46.5 billion lightyears, so any observation we make may be true for our local observable range, but there’s no (known) way to know what’s beyond that sphere.
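For reference, that 46.5-billion-light-year figure is roughly the comoving particle horizon of a flat LCDM model. A rough sketch, assuming round parameters (H0 = 68 km/s/Mpc, Omega_m = 0.31) rather than any particular published fit:

    # Comoving particle horizon: D = (c/H0) * integral_0^1 da / sqrt(Om*a + OL*a^4),
    # evaluated with a simple midpoint rule (radiation ignored).
    OM, OL = 0.31, 0.69                 # matter and dark-energy density (flat universe)
    C_KM_S = 299_792.458                # speed of light, km/s
    H0 = 68.0                           # km/s/Mpc, assumed round value
    MLY_PER_MPC = 3.2616                # million light-years per megaparsec

    N = 200_000
    integral = 0.0
    for i in range(N):
        a = (i + 0.5) / N               # scale factor from ~0 (big bang) to 1 (today)
        integral += (1.0 / N) / (OM * a + OL * a**4) ** 0.5

    hubble_dist_gly = (C_KM_S / H0) * MLY_PER_MPC / 1000.0   # c/H0 in billions of light-years
    print(f"~{integral * hubble_dist_gly:.0f} billion light-years")
    # prints ~47; including early-universe radiation nudges it down toward ~46.5.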
The more we learn, the less we end up knowing about how "everything" works. Some things are mathematical in nature and demonstrate absolutes, but frameworks shift and grow more complex, and exceptions to things we thought were absolute have occurred throughout history.
For claims about how the universe works at scales and timeframes so utterly beyond anything testable, it's a little difficult to say this is credible at all. Not to dunk on the researchers, but in order to validate their conclusions there's a whole chain of dependencies and assumptions you'd have to follow along with, each of which is its own complex bird's-nest tangle of assertions, and I don't see how you can really say one way or the other until you have a lot more information and a much better Theory of Everything than we've got right now.
For what it's worth, for all the impact it'll have on anyone's life outside of academia, I'd say they're 100% correct and people should buy them free beers at their local pubs for at least the next year in return for explaining their ideas at length.
Standard candles (all these measurements of redshift as a function of distance require us to actually get the distance of what we are measuring right) are the gift that keeps on giving.
This study (and many others, depending on the cosmic scales they use) mainly uses supernovae of Type Ia, i.e. the energy emitted when a white dwarf in a binary system accretes mass from a very nearby companion star, growing until it approaches the Chandrasekhar limit, ignites runaway carbon fusion, and blows itself apart as a supernova with all the added energy.
That luminosity was (and still is, with some corrections found since the middle of the last century) supposed to be the same everywhere. The problem is, we keep finding new corrections to it, like the one this study claims.
That is in fact the big claim of this study (ignoring the universe-expansion part): that they have found a new correction to the Type Ia supernova luminosity. It's a very big claim and extremely interesting if confirmed. But, like all big claims, it needs big confirmation. I'm a bit skeptical, TBH.
Out of curiosity, what data are you drawing on, or what qualifications do you have, that support your skepticism over three different modes of analysis (as well as pretty much every recent development in the field) supporting this claim:
"Remarkably, this agrees with what is independently predicted from BAO-only or BAO+CMB analyses, though this fact has received little attention so far."
Well, the part you mention, for instance: "though this fact has received little attention so far".
A change in the standard-candle calibration would be a huge deal for cosmology and galactic astronomy (and other fields) and would not be taken lightly at all. There are all sorts of ramifications from this, and if astronomers aren't all up in arms about it, it is because big proof is needed for big claims.
And a change in the standard-candle calibration is indeed a very big claim.
I would not be surprised if the universe were somewhat elastic: it expands, then contracts, then expands, ad infinitum.
After all, existence in itself is irrefutable and cannot not exist by definition.
If we subscribe to a theory of the multiverse, set theory, likelihood, and interaction-driven evolution based on gradient-type fundamental laws, locally changing. Obviously everything shares a fundamental quality that is part of existence itself. But obviously there are sets, there is differentiation. It is not created, though: the infinity of unconstrained possibilities exists in the first place and reorganizes itself, a bit like people being attracted to people who share some commonalities or have something they need from each other, and forming tribes. The same kind of process works for synapse connections, for molecule formation, for atoms, etc. Everything is mostly interacting data.
We could say that the concept of distance is a concept of likelihood. The closer is also the most likely.
Just a little weird idea. I need to think a bit more about it. Somewhat metaphysical?
I can say the same about forgnoz, which is something I've just invented that must exist by definition.
You'd need to try a bit harder to make existence actually inevitable.
You have a material view of existence perhaps.
How would the notion of nothingness even exist if there was no existence in the first place?
And even if we accepted that nothingness were possible, which in itself doesn't make any sense, how would something ever start to exist?
Well, the contradiction is already there in the fact that there is a preexisting concept of nothing in the first place.
Existence is impredicative too.
It defines itself. That's a fact.
Being impredicative doesn't mean it has to be hard to understand, I think. If anything, it's almost a tautology.
Oh, by the way, forgnoz exists: you made it up to designate something. It doesn't have to refer to something material; it could be an idea. After all, inventions don't exist by being material in the first place. But ideas at least have material support (your brain signals) and the data acquired through your body, as far as we know.
You’re being downvoted, but your point is true — something can exist “by definition”, and yet not exist in our real world. The thing that exists “by definition” is just a version that we have imagined to exist by definition. But imagining something with property X doesn’t imply anything can actually be found with property X.
Side-note: the ontological argument is an argument for the existence of God which uses the same principle as the grandparent. “Imagine God. Imagine God is good. A good God should exist, because otherwise that god is not good. Therefore, the good God we imagined has the property of existence. Therefore God exists.” The issue is exactly the same: we can imagine something with property X, but that doesn't mean we can find something with property X.
I think they mean existence in general, not the existence of any specific thing. Meaning that if there were no “existence” then we wouldn’t be here to consider its nonexistence.
Eventually we will find that the heat death of the universe and the big bang are the same thing, since the totality of the universe is always a oneness, then from the universal perspective the infinitely small and infinitely large are the same thing (one), then they by nature bleed into (and define) each other like yin and yang.
Universe gong.
A funny coincidence is that the solar system was formed 4.6 billion years ago which is exactly when the universe's rate of expansion peaked according to figure 3.
If you want to believe in an intelligent creator—not that I do—it's as if they were accelerating the expansion until the solar system was formed, then turned the control knob down.
But wavering around a line above y = 0.
As a non-scientist I've always found the Cosmic Distance Ladder likely to be inaccurate, due to its assumption about the constant brightness of standard-candle stars over their lifetime and the compounding of error at each rung of the ladder. Direct measurement of the CMB seems simpler, with less chance of error.
https://en.wikipedia.org/wiki/Cosmic_distance_ladder
Direct measurement of the CMB can also have problems if our assumptions about it are wrong. A major point of having two methods is that they should converge on the same result within the margin of error; the fact that they didn't told us we were missing something.
> I've always found the Cosmic Distance Ladder as likely to be inaccurate due its assumption about the constant brightness of Standard Candle stars over their lifetime
Stars are just basic nuclear physics and gravity; that's why they're expected to be stable and predictable.
> Direct measurement of the CMB seems to be simpler with less chance of error.
Direct measurement of the CMB doesn't tell you anything on its own; you have to interpret the data in terms of a model. If you had a completely different model, say one without dark energy or without dark matter, CMB measurements would tell you something different than they do under LCDM.
The Type Ia supernova's luminosity depends on its composition, which reflects both the age of the progenitor and that of the donor star, and that can be inferred from the supernova's light curve.
https://en.wikipedia.org/wiki/Type_Ia_supernova