culi · 2 years ago
> Since its creation, STEM has evolved to cover a broader set of key air pollutants, such as ozone and particulate matter,

I've always found air quality metrics focusing on particle size confusing. E.g. PM2.5 (which describes fine inhalable particles, with diameters that are generally 2.5 micrometers and smaller)

But surely there must be a massive qualitative difference between particulate matter of, e.g., lead vs particulate matter of, e.g., pollen or fungal spores, right? How useful is it really to lump all of these together?

jvanderbot · 2 years ago
Yes - there are other poisons as well hidden behind those particles, but it's very hard to measure the composition of those particles in real time. Particulate matter can be measured with a $20 sensor from Sparkfun, so it's an often-used proxy.

As for why the particle sizes are differentiated, I think it's because some sizes can cross barriers into the blood - and are therefore concerning regardless of the contents. PM2.5 can reach the deepest parts of the lungs, muck up that whole area regardless of what it's made of, and cause all kinds of systemic inflammation. PM10 generally doesn't reach that deep but still causes eye/nose/throat problems IIRC.

If there is near-zero or undetectable particulate matter, then that would preclude lead particles as well, obviously.
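For a sense of what those cheap sensors actually report, here's a minimal sketch of reading a Plantower PMS5003-style module (the part inside many ~$20 particulate boards) over serial. The port name and the frame layout are assumptions based on the commonly published datasheet, so check your own sensor's docs before trusting it.

```python
# Minimal sketch: read PM1.0 / PM2.5 / PM10 from a Plantower PMS5003-style
# serial sensor. Port name and frame layout are assumptions; verify against
# your sensor's datasheet.
import struct
import serial  # pip install pyserial

PORT = "/dev/ttyUSB0"  # assumed; adjust for your setup

with serial.Serial(PORT, baudrate=9600, timeout=2) as ser:
    while True:
        # Sync on the 0x42 0x4D frame header
        if ser.read(1) != b"\x42" or ser.read(1) != b"\x4d":
            continue
        frame = ser.read(30)  # frame length + 13 data words + checksum
        if len(frame) != 30:
            continue
        data = struct.unpack(">14H", frame[2:])
        checksum = data[-1]
        if (0x42 + 0x4D + sum(frame[:-2])) & 0xFFFF != checksum:
            continue  # corrupted frame, skip it
        # Words 4-6 are the "atmospheric environment" PM values in µg/m³
        pm1, pm25, pm10 = data[3], data[4], data[5]
        print(f"PM1.0={pm1}  PM2.5={pm25}  PM10={pm10}  µg/m³")
```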

hatmatrix · 2 years ago
It's widely thought but not conclusively found that there are certain constituents that are more toxic than others. As you say, the problem lies in the ability to measure these constituents more widely.

However, there is a hypothesis [1] that maybe there is something about the collective mass itself, since a negative association with health effects has been found all over the world even where the composition changes drastically. A more recent hypothesis [e.g., 2] is that many health problems from particles stem from persistent reactive oxygen species that cause oxidative stress; these reactive species form through many different mechanisms, so their abundance likely scales with the collective mass, which has therefore been serving as a surrogate metric.

[1] https://doi.org/10.1016/S0048-9697(99)00513-6

[2] https://doi.org/10.1038/srep32916

culi · 2 years ago
This seems like a reasonable explanation, thanks
wnevets · 2 years ago
Focusing on PM2.5 makes sense because anything that size and smaller tends to cause the most harm. [1]

[1] https://www.epa.gov/pm-pollution/particulate-matter-pm-basic...

> Particulate matter contains microscopic solids or liquid droplets that are so small that they can be inhaled and cause serious health problems. Some particles less than 10 micrometers in diameter can get deep into your lungs and some may even get into your bloodstream. Of these, particles less than 2.5 micrometers in diameter, also known as fine particles or PM2.5, pose the greatest risk to health.

BobaFloutist · 2 years ago
Presumably smaller particles are also more difficult to filter (and therefore more likely to make their way indoors without dedicated filtration), and also expected to remain airborne for a greater distance and duration (because of square cube things).
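A rough Stokes' law estimate backs this up: for particles this small, weight scales with d³ while drag scales with d, so terminal settling velocity goes roughly as d². A quick sketch, assuming unit-density spheres in still air (order-of-magnitude only):

```python
# Rough Stokes' law estimate of terminal settling velocity in still air,
# assuming unit-density (1000 kg/m³) spheres. Order-of-magnitude only.
g = 9.81        # m/s², gravity
mu = 1.8e-5     # Pa·s, dynamic viscosity of air
rho_p = 1000.0  # kg/m³, assumed particle density

def settling_velocity(d_m):
    """Terminal velocity (m/s) of a small sphere of diameter d_m (metres)."""
    return rho_p * g * d_m**2 / (18 * mu)

for d_um in (2.5, 10.0):
    v = settling_velocity(d_um * 1e-6)
    print(f"{d_um:>4} µm: {v * 1000:.2f} mm/s, ~{2.0 / v / 3600:.1f} h to fall 2 m")
```

So a 10 µm particle settles out of a 2 m column of still air in minutes, while a 2.5 µm particle of the same density stays up for hours.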
4death4 · 2 years ago
Yes, but isn’t lead getting into your bloodstream worse than pollen? I mean oxygen gets into your bloodstream via inhalation…
luma · 2 years ago
PM2.5 is of interest because it's a size that your body has a harder time getting rid of. Larger particles have an easier time being captured by the upper respiratory system. Smaller things might make it into the bloodstream and hopefully be filtered by the kidneys. 2.5 microns sits right between those two regimes, where neither is likely to happen and the stuff instead tends to get stuck in your lungs.
nonamousnoose · 2 years ago
They are separated by size because that determines which airways they migrate to, and which can enter the bloodstream. Yes, what they're made of matters - the latter is also measured in specific studies, but the former is a lot easier/cheaper to systematically measure.
photochemsyn · 2 years ago
Getting data on the chemical composition of the PM2.5 fraction of air pollution requires fairly advanced instrumentation and lab capabilities. It is, however, the important data for estimating toxicity. For example:

https://meetingorganizer.copernicus.org/EGU24/EGU24-881.html

> "It was found that PM chemical composition was major determinant in toxicity assessment rather than its mass concentration."

BunsanSpace · 2 years ago
Canada is updating its index to report the worst of PM2.5 and its traditional models of common pollutants.

Part of the issue is that the existing model averages over 3 hours, whereas PM2.5 can change rapidly (fire, traffic surge, &c). So they have a second metric that is averaged over an hour.
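The mechanics are simple enough to sketch, assuming plain rolling means (an illustration of the idea only, not Canada's actual formula): compute a 3-hour and a 1-hour rolling average of PM2.5 and report whichever is worse, so short spikes from a fire or a traffic surge aren't smoothed away.

```python
# Illustration only, not Canada's actual index: report the worse of a
# 3-hour and a 1-hour rolling mean of PM2.5 so short spikes aren't hidden.
import pandas as pd

# Hypothetical 10-minute PM2.5 readings in µg/m³, with a brief spike
pm25 = pd.Series(
    [8, 9, 7, 10, 9, 8, 85, 120, 90, 20, 12, 10, 9, 8, 9, 10, 8, 9],
    index=pd.date_range("2024-01-01 00:00", periods=18, freq="10min"),
)

avg_3h = pm25.rolling("3h").mean()  # smooths the spike out
avg_1h = pm25.rolling("1h").mean()  # reacts to it quickly
reported = pd.concat([avg_3h, avg_1h], axis=1).max(axis=1)

print(pd.DataFrame({"3h": avg_3h, "1h": avg_1h, "reported": reported}).round(1))
```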

hadlock · 2 years ago
Putting a frozen pizza in the oven can spike most apartments to 300 in under 10-12 min, especially if you leave it in too long at 450F
londons_explore · 2 years ago
Indeed - clearly it matters what the particles are made from. Some stuff won't even be toxic, like particles of water (fog).
taeric · 2 years ago
I mean, yes, toxins in the air are worse than non-toxins in the air. I don't think anyone will really debate that.

The banal answer to your question is that you measure what is easy to measure. And it is easier to build a device that can measure the size of particles than one that can enumerate all of their types. Especially since you will take the same actions regardless? Put up a filter and run it.

hollerith · 2 years ago
>Put up a filter and run it.

It's not that simple, because any filtration media that will filter out, e.g., diesel exhaust or pollen will also lyse the bacteria that are always floating in the air, thereby adding to the air that leaves the filtration device the lipopolysaccharide (LPS) toxin that would otherwise remain relatively safely inside the bacterium. Yes, even without any air purifiers, you will absorb some LPS toxin from the bacteria in the air you breathe, but running an air purifier approximately doubles the dose. I claim to be able to tell the difference: when I'm in a small room with an air purifier running at high fan speed, I am much more likely to feel a certain kind of non-severe, but not-good, headache-like feeling.

I.e., you don't want to run an air purifier that removes very small particles from the air when you don't need to -- at least if you are as sensitive to LPS as I am. And you don't want the purifier's fan speed to be higher than necessary.

hendler · 2 years ago
Air quality is not thought about enough in terms of localized data. Some modeling works at larger scales, but if you want to know the forecast for other pollutants (especially ones that disperse or transition), or the source of methane leaks, you need very localized data from many modalities (wind direction, temperature, topography, season, traffic, etc.).

I worked at https://aclima.io on air quality for 6.5 years. My role was managing backend data pipelines, but I worked with scientists and data scientists who were pushing the boundaries of models' capabilities. Models are complex and expensive - any advancement here, like GraphCast, is very important. [1] One job our team was responsible for was reducing the cost of high-quality data, so we drove vehicles around to collect very localized data, which ended up being temporally sparse. Modeling can fill those gaps to some measurable level of certainty.
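To make "fill gaps" concrete, here's one of the simplest possible approaches, inverse-distance weighting. This is purely illustrative (it's not Aclima's method) and it ignores wind, time, terrain, and all the other modalities mentioned above; the coordinates and values are made up.

```python
# Toy gap-filling between sparse drive-by measurements using
# inverse-distance weighting. Illustrative only; not any real model.
import numpy as np

# Hypothetical measurements: (x_km, y_km, pm25_ugm3)
obs = np.array([
    [0.0, 0.0, 12.0],
    [1.0, 0.2, 35.0],
    [0.4, 1.1,  9.0],
    [1.3, 1.4, 18.0],
])

def idw(x, y, power=2.0, eps=1e-6):
    """Estimate PM2.5 at (x, y) as a distance-weighted mean of observations."""
    d = np.hypot(obs[:, 0] - x, obs[:, 1] - y) + eps
    w = 1.0 / d**power
    return float(np.sum(w * obs[:, 2]) / np.sum(w))

# Estimate on a coarse grid between the drives
for gy in (1.25, 0.75, 0.25):
    row = [f"{idw(gx, gy):5.1f}" for gx in (0.25, 0.75, 1.25)]
    print(f"y={gy:4.2f} km:", " ".join(row))
```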

It should also be said that policy is far behind the science, but the burden will remain on science and data to continue making conclusions irrefutable.

[1] - https://deepmind.google/discover/blog/graphcast-ai-model-for...

selimthegrim · 2 years ago
Is there a good email address/way to reach you? I have a couple questions about your past job as well as the product at your current gig.
hendler · 2 years ago
sure - twitter.com/Hendler
banish-m4 · 2 years ago
The cost of changing from, say, a model like the US EPA AQI to something else is that it creates fragmentation and breaks comparability with the infrastructure that came before. An agreed-upon international "AQ" metric should reflect real risk to sensitive groups and long-term health outcomes.
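For reference, the US EPA AQI is just a piecewise-linear map from concentration onto a 0-500 scale, and every jurisdiction picks its own breakpoints and formula, which is exactly why the indices aren't comparable. A sketch using the pre-2024 24-hour PM2.5 breakpoints (EPA revised them in 2024, so check the current table before using this for anything real):

```python
# US EPA AQI for PM2.5: linear interpolation between breakpoint pairs.
# Pre-2024 24-hour PM2.5 breakpoints; EPA revised the table in 2024.
BREAKPOINTS = [  # (C_lo, C_hi, I_lo, I_hi): µg/m³ -> index points
    (0.0,    12.0,    0,  50),
    (12.1,   35.4,   51, 100),
    (35.5,   55.4,  101, 150),
    (55.5,  150.4,  151, 200),
    (150.5, 250.4,  201, 300),
    (250.5, 350.4,  301, 400),
    (350.5, 500.4,  401, 500),
]

def pm25_to_aqi(c):
    """Convert a 24-hour average PM2.5 concentration (µg/m³) to an AQI value."""
    for c_lo, c_hi, i_lo, i_hi in BREAKPOINTS:
        if c <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo)
    return 500  # capped above the highest breakpoint

for c in (8.0, 35.4, 55.5, 300.0):
    print(f"{c:6.1f} µg/m³ -> AQI {pm25_to_aqi(c)}")
```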
wiz21c · 2 years ago
It uses https://people.cs.vt.edu/~asandu/Software/Kpp/ which is free software !!!
perlpimp · 2 years ago
I like how, when the freon patent was running out, people cooked up the ozone hole scare, alleging that using hair spray and freezers would irradiate everyone and kill life on earth. The science establishment gave generous window dressing to this, now firmly forgotten, and the adjacent replacements are patented and just as "ozone depleting" as the previous ones.
jgord · 2 years ago
One reason we might want the best possible model of sulphur particulates ..

.. is because we might need to use them to reduce warming :

https://e360.yale.edu/features/aerosols-warming-climate-chan...

It's one of the very few levers we have to pull.

[ keep in mind net-zero == max-CO2 == max-heat .. we're arguably on track for +2.0C by 2040 .. the heat itself may not be 'survivable' for large populations ]
