This logic always bugs me, because no one truly lives in a vacuum. People are flawed and generally need help from a community, and a small community can't really fight off a well-endowed industry like gambling companies. The whole (stated) reason Android is losing unsigned sideloading is that grandmas in SEA are sideloading gambling apps.
It's obvious to me that gambling preys on a general vulnerability in the human psyche. For many, it short-circuits something in their brain and forms genuine addiction.
It's actually insane to me to use this vulnerability as a tax base to fund roads and schools. Regardless of where the funds go, the incentives are perverse, and those incentives dictate that more people must keep losing their money to out-of-state firms because a small portion of it might fund roads and schools.
The incentives basically state: "a percentage of our population must become sick and addicted to risk and reward in order for society to function." Is this not the concept of Omelas?
This adds to the burden of deciding what to ban, which may differ depending on who you ask.
If addicts create a massive burden, you can still require the gambling industry to pay more to offset it.
This takes me back to arguing with Gentoo users 20 years ago who insisted that compiling everything from source for their machine made everything faster.
The consensus at the time was basically "theoretically, it's possible, but in practice, gcc isn't really doing much with the extra instructions anyway".
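Where per-machine builds can pay off is auto-vectorization. A minimal sketch, assuming GCC on an AVX2-capable x86-64 box; the function and flags are illustrative, not from the original discussion:

    /* scale.c -- toy loop to compare codegen.
     *
     *   gcc -O3 -S scale.c                 -> generic x86-64, 128-bit SSE2 code
     *   gcc -O3 -march=native -S scale.c   -> on an AVX2 machine, 256-bit ymm code
     *
     * Same C either way; the per-machine build only changes which
     * instructions the compiler is allowed to emit.
     */
    void scale(float *dst, const float *src, int n) {
        for (int i = 0; i < n; i++)
            dst[i] = src[i] * 2.0f;
    }

But for typical scalar, branchy code, the two builds come out nearly identical, which is why the promised Gentoo speedup rarely materialized.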
Then there's stuff like glibc, which has custom assembly versions of functions like memcpy and selects among them at startup. I'm not sure whether that was common 20 years ago, but it is now.
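For the curious, the mechanism glibc uses for this is ELF indirect functions (IFUNC): a resolver runs once at dynamic-link time and picks an implementation for the running CPU. A minimal sketch, assuming GCC on x86-64 Linux; the names and the trivial fallback bodies are mine, not glibc's actual code:

    #include <stddef.h>

    /* Stand-ins for hand-tuned variants; glibc's real ones are assembly. */
    static void *memcpy_generic(void *dst, const void *src, size_t n) {
        char *d = dst;
        const char *s = src;
        while (n--)
            *d++ = *s++;
        return dst;
    }

    static void *memcpy_avx2(void *dst, const void *src, size_t n) {
        return memcpy_generic(dst, src, n);  /* pretend this uses ymm loads */
    }

    /* The resolver runs at startup, before main(), and returns the
       implementation that calls should be bound to from then on. */
    static void *(*resolve_my_memcpy(void))(void *, const void *, size_t) {
        __builtin_cpu_init();  /* required: resolvers run before constructors */
        if (__builtin_cpu_supports("avx2"))
            return memcpy_avx2;
        return memcpy_generic;
    }

    void *my_memcpy(void *dst, const void *src, size_t n)
        __attribute__((ifunc("resolve_my_memcpy")));

So a single generic binary ships every variant, and the dispatch cost is paid once per process rather than per call.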
It's cool that after 20 years we can finally start using the newer instructions in binary packages, but it still doesn't seem to matter all that much.
As in, are there any common libraries or parts of the system that typically slow things down, or was this more about a time when hardware was more limited, so improving everything would have made the whole system feel faster?
https://store.ubisoft.com/on/demandware.static/-/Sites-maste...
https://www.gamesaktuell.de/screenshots/1280x/2002/10/adel.j...
Even AoE and Settlers (both preferably in their second editions) look soooo much better to me than most of the games I can find today, which just look strange, whether they're remakes from "brand name" studios or smaller games like you'd find on Apple Arcade.
It would be quite hard to come up with sane defaults for all sorts of configs, e.g. a multi-AP setup as in my case.
Most trials have long lists of excluded conditions. As you say, one reason is reducing variability among subjects so effects of the treatment can be determined.
This is especially true when the effects of a new treatment are subtle but still quite important. If subjects with serious comorbidities are included, treatment effects can be obscured by those conditions. For example, if a subject is hospitalized, was that because of the treatment, another condition, or some interaction between the condition and the treatment?
Initial phase 3 studies necessarily strive for as "pure" a study population as possible. Later phase 3/4 studies could in principle cautiously add more severe cases and patients with specific comorbidities. However, there's a sharp limit to how many variations can be systematically studied, due to the intrinsic cost and complexity.
The reality is that the burden of sorting out the use of treatments in real-world patients falls to clinicians. It's worth noting that support for clinicians reporting their observations has, if anything, declined over the decades. IOW, valuable information is lost in the increasingly bureaucratic and compartmentalized healthcare systems that now dominate the delivery of services.