I prefer Greg Egan's interpretation in Permutation City, where simulations can become self-bootstrapping and need no "simulator" at all. No one has loaded universe.exe in some higher-level reality; it runs itself.
That still won't stop me from attempting rowhammer attacks.
I'm not sure why so many people are still dealing with legacy manual certificate renewal. Maybe some regulatory requirements? I even have a wildcard cert covering my entire local network that is generated and deployed automatically by a cron job I wrote about 5 years ago. It works perfectly, and it would probably take me longer to track down exactly what it's doing than to rewrite it from scratch.
For 99.something% of use cases, this is a solved problem.
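For context, a setup like the one described above can be sketched roughly as below. This is a hypothetical reconstruction, not the commenter's actual script: it assumes certbot with the Cloudflare DNS plugin, and the domain, paths, and schedule are all invented for illustration.

```shell
# Hypothetical crontab entry: try daily; certbot only renews near expiry.
# 17 3 * * * /usr/local/bin/renew-wildcard.sh >> /var/log/renew-wildcard.log 2>&1

#!/bin/sh
set -eu

DOMAIN="example.lan"          # hypothetical local domain
DEPLOY_DIR="/etc/ssl/local"   # hypothetical deploy target

# Wildcard certs require a DNS-01 challenge; --dns-cloudflare is one
# such plugin (swap in whichever DNS provider plugin applies).
certbot certonly --non-interactive --agree-tos \
  --dns-cloudflare --dns-cloudflare-credentials /root/.secrets/cf.ini \
  -d "$DOMAIN" -d "*.$DOMAIN"

# Deploy the renewed cert and reload the service that uses it.
install -m 600 "/etc/letsencrypt/live/$DOMAIN/fullchain.pem" "$DEPLOY_DIR/"
install -m 600 "/etc/letsencrypt/live/$DOMAIN/privkey.pem" "$DEPLOY_DIR/"
systemctl reload nginx
```

Running the attempt daily is the usual pattern: certbot is a no-op until the cert is within its renewal window, so the cron frequency doesn't matter much.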
I'm never quite sure what this is meant to mean. Is it a comparison to other simulations, like computer games or physical simulations, where you could change a seed or a data structure and have it manifest in reality? What is a simulation expected to do differently from reality? And what does it even mean to make this distinction when we are observing from inside the very process we are trying to classify as real or simulated?
FTFY
But most importantly, apart from breaking away from "UNIX-philosophy tools", what do you lose in practical terms?