Even assuming your compute demands stay fixed, it's possible that a future generation of accelerator will be sufficiently more power/cooling efficient for your workload that upgrading is a positive return on investment, more so when you take into account that you can start depreciating the hardware again.
If your compute demands aren't fixed, you have to work around limited floor space/electricity/cooling capacity/network capacity/backup generators/etc., and so moving to the next generation is required to meet demand without extremely expensive (and often slow) infrastructure projects.
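A back-of-the-envelope sketch of that ROI calculation, with entirely made-up numbers (power draw, PUE, electricity price, and hardware cost are all hypothetical placeholders):

```python
# Hypothetical numbers only -- substitute your own power draw, PUE,
# electricity price, and hardware cost.
old_kw_per_node = 10.0        # assumed draw of the old generation
new_kw_per_node = 6.0         # assumed draw of the newer generation
pue = 1.5                     # assumed cooling/facility overhead
price_per_kwh = 0.12          # assumed electricity price (USD)
hours_per_year = 24 * 365

annual_savings = (old_kw_per_node - new_kw_per_node) * pue * hours_per_year * price_per_kwh
new_node_cost = 30_000.0      # assumed purchase price of the replacement

print(f"Annual power+cooling savings: ${annual_savings:,.0f}")
print(f"Simple payback: {new_node_cost / annual_savings:.1f} years")
```

Whether the payback period is attractive obviously depends on your actual numbers; the point is only that efficiency gains alone can justify the upgrade even before capacity is considered.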
This has been my experience every time I try Linux. If I had to guess, tracking down all these little things is that last mile that is so hard and isn't the fun part of making an OS, which is why it always gets ignored. If Linux ever did it, it would keep me.
Why? Kids can combine the power of their ideas with crayons, markers, and pencils.
From the website it seems like a great way to generate some black and white outlines that kids can still color in. If used like that it seems almost strictly more creative than a coloring book, no? There are plenty of other ways kids can express creativity with pre-made art too. Maybe they use them to illustrate a story they dreamed up? Maybe they decorate something they built with them?
Also, some children might want to have fun and be creative in ways that don't involve visual arts. I was never particularly interested in coloring or drawing and still believe myself to be a pretty creative individual. I don't think my parents buying me some stickers robbed me of any critical experience.
Unpleasant, but it comes with the territory (I don’t like it when it’s done to me).
That said, I’m not sure that kind of scolding is particularly effective, either.
A software developer's primary job is to develop software for their users, not to comply with a third party distributor that repackages their software.
Really, the whole raison d'être of Debian is to move at this pace to prioritize stability/compatibility. If you don't like that philosophy, there are other distros, but a package maintainer's primary job is to repackage software for their distro (which presumably users have chosen for a reason), not to comply with upstream.
Distributed systems with files as a communication medium are much more complex than programmers think, with far more failure modes than they can imagine.
Like… this one, which took out a cloud for hours!
I think the communications piece depends on what other systems you have around you to build on; it's unlikely this planner/executor is completely freestanding. Some companies have large distributed filesystems with well-known/tested semantics, schedulers that launch jobs when files appear, ~free access to a database with strict serializability where they can store a serialized version of the plan, etc.
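As a minimal sketch of the database option, assuming some strictly-serializable store is available (SQLite stands in here purely for illustration; the table name and plan contents are hypothetical):

```python
# Sketch: hand a serialized plan to the executor through a transactional
# database instead of dropping files on a shared filesystem.
import json
import sqlite3

conn = sqlite3.connect("plans.db")
conn.execute("CREATE TABLE IF NOT EXISTS plans (id INTEGER PRIMARY KEY, body TEXT)")

plan = {"steps": ["extract", "transform", "load"], "version": 1}

# The write either fully commits or doesn't happen at all -- no torn or
# partially-written files for the executor to misread.
with conn:
    conn.execute("INSERT INTO plans (body) VALUES (?)", (json.dumps(plan),))

# Executor side: read back the latest committed plan.
row = conn.execute("SELECT body FROM plans ORDER BY id DESC LIMIT 1").fetchone()
print(json.loads(row[0]))
```

The appeal is that you inherit the store's well-understood atomicity and ordering guarantees instead of reinventing them on top of file renames and directory listings.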
...and including the erroneous entry is squarely the author's fault.
Papers should be carefully crafted, not churned out.
I guess that makes me sweetly naive.
> Papers should be carefully crafted, not churned out.
I think you can say the same thing about code, and yet, even with code review, bugs slip by. People aren't perfect and problems happen. Trying to prevent 100% of problems is usually a bad cost/benefit trade-off.