https://spectrum.ieee.org/in-2016-microsofts-racist-chatbot-...
Parallelism doesn't magically add non-determinism of this kind unless you intentionally build it to be non-deterministic. Nothing prevents you from processing an array in order in parallel.
You would have to explicitly order the terms prior to reduction, but you don't always have that level of control.
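To make the point concrete, here is a minimal sketch (names and chunking strategy are my own, not from the thread): partial sums are computed in parallel, but combined in a fixed chunk order, so the floating-point result is the same on every run regardless of which thread finishes first.

```python
from concurrent.futures import ThreadPoolExecutor

def deterministic_parallel_sum(xs, chunks=4):
    # Split the array into fixed, index-ordered chunks.
    n = len(xs)
    bounds = [(i * n // chunks, (i + 1) * n // chunks) for i in range(chunks)]
    with ThreadPoolExecutor() as pool:
        # Chunks may be summed concurrently in any order...
        partials = list(pool.map(lambda b: sum(xs[b[0]:b[1]]), bounds))
    # ...but the partial sums are combined in chunk-index order,
    # so the rounding of each float addition is identical every run.
    total = 0.0
    for p in partials:
        total += p
    return total
```

The non-determinism only appears if you instead fold partial results in completion order (e.g. pulling from a queue as workers finish), since float addition is not associative.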
Couple nits: - the frequency estimate should be more aggressive; to quantify the maximum potential impact, you should be looking for an upper bound across all the numbers. I would crank that up to once a day or more.
- a hole allowing twice as much sounds like a complete guess. Could it be 100x, or 10,000x? Or perhaps less than 10%? Other comments make the ozone layer sound more important; the hole-punching effects on that layer are unknown (to me, and the article does not mention them). In effect, this could be a light show, or it could be a routine perforation of each layer of the atmosphere (more data needed).
- that extra radiation is not spread out across the area of the earth. What matters is how many people and animals are present under the hole: everyone underneath presumably gets the full blast. So it's simple: those people get twice the radiation (or whatever the factor is). It is not averaged across the whole planet, because it is a local effect; the area sizes only matter for determining how many people and animals receive the extra exposure.
- Starlink donated its services to Ukraine. That one act of donation does not make it a public good. Starlink is a for-profit service, not a public good (some services can be quite good, but that is different from being "a public good").
I think the right way to approach this is to consider the unshielded radiation flux over the hole and the time the hole takes to close. That would give a good back-of-envelope upper bound on the increase in cancer risk. There are probably other effects, but all I care about is harm to individuals.
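The back-of-envelope bound above is just arithmetic: (flux multiplier − 1) × baseline dose rate × hours the hole stays open, applied only to whoever is directly underneath. A sketch, where every numeric value is a hypothetical placeholder (none of these figures come from the article or from measurements):

```python
def extra_dose_upper_bound(baseline_rate, hole_factor, hours_open):
    """Crude upper bound on the extra dose received by someone directly
    under the hole, in whatever units baseline_rate uses per hour.
    All inputs are placeholders to be replaced with measured values."""
    return (hole_factor - 1.0) * baseline_rate * hours_open

# Purely illustrative: if the baseline rate were 0.06 units/hour and the
# hole doubled it (factor 2.0) for 3 hours, the extra dose would be
# (2.0 - 1.0) * 0.06 * 3 = 0.18 units.
```

This deliberately overcounts (it assumes the full multiplier for the whole time the hole is open), which is exactly what you want from an upper bound on individual harm.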