In contrast, this design lets you know precisely what is recomputed on each refresh, even if the price is having to spell everything out explicitly. Any and all timing errors will be present in the code as written.
The need for human verification can never be eliminated entirely with these sorts of systems: you always have to check that the definitions used in the final statement mean what you think they mean, and that all of the base axioms are acceptable.
And of course, there's always the possibility of bugs in the kernel. I even recently found a bug [0] in a verifier for Metamath, which is designed to be so simple that its only built-in logic is typed string substitution. But such bugs should hopefully be unlikely in non-adversarial settings.
So it is currently a persistent time sink, and rewriting it so that it can sit inside the browser sandbox will probably be a significant amount of work in its own right. If that's work that nobody wants to do, then it's difficult to see what your solution actually is.
As for immediate work, some in this thread have proposed compiling libxslt to WASM and using that, which sounds perfectly viable to me, if inefficient. WASM toolchains have progressed far enough that very few changes are needed to a C/C++ codebase to get it to compile and run properly, so all that's left is to set up the entry points.
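To make the "entry points" part concrete, here's a rough TypeScript sketch of what the browser-side glue could look like, assuming an Emscripten build of libxslt that exposes a single hypothetical `xslt_transform` function. Every name below is an assumption for illustration, not an existing package:

```ts
// Sketch only: assumes libxslt has been compiled with Emscripten and that the C side
// exports a single entry point, char *xslt_transform(const char *xml, const char *xsl).
// createXsltModule stands in for the Emscripten-generated loader; both names are assumptions.
declare function createXsltModule(): Promise<{
  cwrap: (name: string, returnType: string, argTypes: string[]) => (...args: any[]) => any;
}>;

async function transform(xmlText: string, xslText: string): Promise<string> {
  const mod = await createXsltModule();
  // Emscripten's cwrap marshals JS strings to and from WASM linear memory for us.
  const xsltTransform = mod.cwrap("xslt_transform", "string", ["string", "string"]);
  return xsltTransform(xmlText, xslText) as string;
}

// Usage: fetch the source document and stylesheet, transform client-side, render the result.
async function render(xmlUrl: string, xslUrl: string): Promise<void> {
  const [xml, xsl] = await Promise.all([
    fetch(xmlUrl).then((r) => r.text()),
    fetch(xslUrl).then((r) => r.text()),
  ]);
  document.documentElement.innerHTML = await transform(xml, xsl);
}
```

The details of loading the generated module vary by build flags, but the overall shape (one exported C function plus a thin string-marshalling wrapper) is the whole job.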
(And if there really were no one-for-one replacement short of a massive labor effort, then current XSLT users would be left with no simple alternative at all, which would make this decision all the worse.)
Is that why Chrome unilaterally releases 1000+ web APIs a year, many of them quite complex and spanning a huge range of things that can go wrong (including access to USB, serial devices, etc.)? To reduce the attack surface?
Which is miles better than having to use calc()s for CSS animation timing, which requires a kludge of CSS variables and the like to keep track of when something begins and ends time-wise, if you want to avoid requiring JavaScript. And some years ago Firefox, IIRC, didn't even support time-based calcs.
When Chromium announced its intent to deprecate SMIL a decade back (before relenting), it was far too early to consider, given that CSS at the time lacked much of what SMIL allowed for (including motion along a path and SVG attribute value animations, which only saw CSS support later). It also set off a chain of articles and never-again-updated notes warning about SMIL, which just added to the confusion. I remember even an LLM mistakenly believing SMIL was still deprecated in Chromium.
And there's one of the issues: browser devs are perfectly happy if user JS can be used to replicate some piece of functionality, since then it's not their problem.
In the absence of anyone raring to do that, removal seems the more sensible option.
Does it, though? Browsers already have existing XSLT stacks, which have somehow gotten by practically unmodified for the last 20 years. The basic XSLT 1.0 functionality never changes, and the links between the XSLT code and the rest of the codebase rarely change, so I find it hard to believe that slapping it into a sandbox would suddenly turn it into a persistent time sink.
I think the idea behind that is reasonable. If I used XSLT on my tiny, low-traffic blog, I think it would be reasonable for browser devs to tell me to update my code. Even if 100 people like me said the same thing, that's still a vanishingly small portion of the web, a rounding error, protesting it.
I'd expect the protests to be disproportionate in number and loudness because the billion webmasters who couldn't care less aren't weighing in on it.
Now, I'm not saying this with a strong opinion on this specific proposal; it doesn't affect me either way. It's more about the general principle that a loud contingent of small webmasters opposing the move doesn't mean it's not a good idea. Like, people loudly argued against removing <marquee> back in the day, but that happened to be a great idea.
(And if you did want to tell the entire world to update their code, and have any chance of them following through with it, you'd better make sure there's an immediate replacement ready. Log4Shell would probably still be a huge issue today if it couldn't be fixed in place by swapping out jar files.)
What I'm trying to say is that it's a false dichotomy in most cases: implementations could almost eliminate the attack surface while maintaining the same functionality, and without devoting any more ongoing effort, for instance via JS polyfills or WASM blobs, which could be subjected to the usual security boundaries no matter how bug-ridden and ill-maintained they are internally.
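For illustration, here's a hedged sketch of what such a polyfill could look like: a drop-in XSLTProcessor stand-in that delegates to an assumed WASM-backed transform. `wasmXsltTransform` is hypothetical, and only the two most common methods are shown:

```ts
// Sketch of a drop-in XSLTProcessor polyfill backed by a WASM XSLT engine.
// wasmXsltTransform is a hypothetical glue function (e.g. the transform() sketch above);
// the point is that the messy engine stays behind the WASM/JS security boundary.
declare function wasmXsltTransform(xml: string, xsl: string): string;

if (typeof (globalThis as any).XSLTProcessor === "undefined") {
  class XSLTProcessorPolyfill {
    private xslSource = "";

    importStylesheet(stylesheet: Node): void {
      // Serialize the stylesheet DOM back to text for the WASM engine.
      this.xslSource = new XMLSerializer().serializeToString(stylesheet);
    }

    transformToDocument(source: Node): Document {
      const xml = new XMLSerializer().serializeToString(source);
      const output = wasmXsltTransform(xml, this.xslSource);
      return new DOMParser().parseFromString(output, "application/xml");
    }
  }

  (globalThis as any).XSLTProcessor = XSLTProcessorPolyfill;
}
```

However crusty the engine behind it is, everything it does is confined to the same sandbox as any other page script.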
But removing the functionality is often seen as the more expedient option, and so that's what gets picked.
Excerpt from the user agreement:
People bury their heads in the sand over Reddit for some reason, but it's worse than FAANG.