Presuming this goes ahead, I believe this is the first time a standard, baseline-available feature will be removed.
There have been other removals, but few of them were even of specified features, and I don’t think any of them have been universally available. One of the closest might be showModalDialog <https://web.archive.org/web/20140401014356/http://dev.opera....>, but I gather mobile browsers never supported it anyway, and it was a really problematic feature from an implementation perspective too. You could argue Mutation Events from ~2011 qualifies¹; it was supplanted by Mutation Observers within two years, yet hung around for over a decade before being removed. As for things like Flash or FTP, those were never part of the web platform. Nor were they ever anything like universal anyway.
And so here they are now planning to remove a well-entrenched (if not especially commonly used) feature against the clearly-expressed will of the actual developers, in a one-year time frame.
—⁂—
¹ I choose to disqualify Mutation Events because no one ever finished their implementation: WebKit heritage never did DOMAttrModified, Gecko/Trident heritage never did DOMNodeInsertedIntoDocument or DOMNodeRemovedFromDocument. Flimsy excuse, probably. If you want to count it, perhaps you’ll agree to consider XSLT the first time a major, standard, baseline-available feature will be removed?
Look, I wouldn't want to be responsible for maintaining anything to do with XML or XSLT either. All the technical arguments outlined for removing support make sense. But can users really call it an "update" if you could view an XML/XSLT document in Internet Explorer 6 or Chrome 1 but not the newest version?
I think this sets a concerning precedent for future deprecations, where parts of the web platform are rugpulled from developers because it's convenient for the browser vendors.
The security argument isn't that great. Google has been grumbling about XSLT for more than a decade. If security were really their concern, they could have replaced the compiled C library with an asm.js version ten years ago, much as they did for PDF rendering. They could use wasm now. They don't need to deprecate it.
> As for things like Flash or FTP, those were never part of the web platform. Nor were they ever anything like universal anyway.
I feel like there is a bit of a no true scotsman to this.
XSLT was always kind of on the side. If FTP or Flash weren't part of the web platform, then I don't know that XSLT is either. Flash might not be "standard", but it certainly had more users in its heyday than XSLT ever did.
Does removal of TLS 1.1 count here? It's all kind of a matter of definitions.
Personally i always thought the <keygen> tag was really cool.
XSLT is an integrated part of the web platform: browsers can load XML documents that use an XSLT stylesheet, and even inside HTML documents XSLTProcessor is available.
FTP was never integrated: it just so happened that some platforms shipped a protocol handler for it, and some browsers included an FTP protocol handler themselves. But I don’t believe you could ever, say, fetch("ftp://…").
Flash, like applets, was even more clearly not part of the web platform. It was a popular third-party extension that you had to go out of your way to install… or wait for it to be installed by some shady installer Adobe paid off. Though I have a vague feeling Chrome shipped with Flash at some point? I don’t remember all the history any more, this is a long time ago.
Older versions of TLS is definitely a more interesting case. It’s a different kind of feature, but… yeah, I might consider it.
<keygen> was an interesting concept that in practice went nowhere.
Yeah… on the one hand I don’t care about XSLT, haven’t used it in more than 20 years, and never intend to use it again.
On the other… I’m still a bit uncomfortable with the proposed change because it reads as another example of Google unilaterally dictating the future of the web, which I’ve never liked or supported.
XSLT is not trendy technology but I doubt it's worse than WebBluetooth, WebUSB or WebGL from a complexity/maintenance/security perspective.
This change definitely feels like moving a (tiny) step into the direction of turning the Web platform into something akin to the Android dev experience.
The post indicates WHATWG has "broad agreement" about removing XSLT. I don't know how many seats Google has there, but on the surface it doesn't sound like a unilateral decision.
<marquee> still works fine. Better than it used to, honestly, as at least Firefox and Chromium removed the deliberate low frame rate at some point in the last decade.
<blink> was never universal, contrary to popular impression: <https://en.wikipedia.org/wiki/Blink_element#:~:text=The%20bl...>, it was only ever supported by Netscape/Gecko/Presto, never Trident/WebKit. Part of the joke of Blink is that it never supported <blink>.
> Netscape only agreed to remove the blink tag from their browser if Microsoft agreed to get rid of the marquee tag in theirs during an HTML ERB meeting in February 1996.
Fun times. Both essentially accusing the other of having a dumb tag.
One might think that as technology progresses more and more pieces of older technologies get revived and incorporated into the available tooling. Yet the very opposite thing happens: good and working parts are removed because the richest companies on Earth "cannot afford" to keep them.
In 19th century Russia there was a thinker, N. F. Fedorov, who wanted to revive all dead people. He saw it as the ultimate goal of humanity. (He worked in a library, a very telling occupation. He spent most of what he earned to support others.) We do not know how to revive dead people or if we can do that at all; but we certainly can revive old tech or just not let it die.
Of course, this job is not for everyone. We cannot count on the richest, apparently, they're too busy getting richer. This is a job for monks.
The browser vendors are arguing XSLT is neither good - its adoption has always been lacking because of complexity, and it has now become a niche technology because better alternatives exist - nor working; see the mentioned security and maintenance issues. I think they have a good point there.
Well, one can argue that Metafont fonts are "niche". No font library supports them. But from what I know they could easily be technically superior to, say, Type 1 fonts. Of course a technology will be niche if it is treated as a poor relative. XSLT could be an alternative to CSS, for example. It is definitely more powerful than CSS, because it actually transforms the document, rather than just altering the appearance and sprinkling some automatic content here and there. And it is actually used this way in XSL-FO, which, I think, powers a substantial share of technical publishing.
> One might think that as technology progresses more and more pieces of older technologies get revived and incorporated into the available tooling. Yet the very opposite thing happens: good and working parts are removed because the richest companies on Earth "cannot afford" to keep them.
I think it is because nobody, except a handful of people around the world, feels the need to use XSLT in lieu of CSS. Hence, CSS has evolved over time while XSLT has not.
This is how the world works: technology advances and old things become obsolete over time.
Pertinent to your point, he wanted to resurrect ancestors so that they, too, could participate in the general resurrection. The analogy being old technology resurrected to work alongside contemporary technology towards a shared goal.
XSLT is, to my knowledge, the only client-side technology that lets you include chunks of HTML without using JavaScript and without server-side technology.
XSLT lets you build completely static websites without having to use copy paste or a static website generator to handle the common stuff like menus.
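As a sketch of how that works (the file names, the `page` element, and the menu entries here are all hypothetical): each page is a small XML file that points at one shared stylesheet via a processing instruction like `<?xml-stylesheet type="text/xsl" href="site.xsl"?>`, and the shared chrome lives once in the stylesheet:

```
<?xml version="1.0"?>
<!-- site.xsl: hypothetical shared stylesheet; every page references it
     with <?xml-stylesheet type="text/xsl" href="site.xsl"?> -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/page">
    <html>
      <head><title><xsl:value-of select="@title"/></title></head>
      <body>
        <!-- the menu is written once here, not copy-pasted per page -->
        <ul>
          <li><a href="index.xml">Home</a></li>
          <li><a href="about.xml">About</a></li>
        </ul>
        <!-- copy the page's own content through unchanged -->
        <xsl:copy-of select="node()"/>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```

A page is then just `<page title="Home">…content…</page>` with the stylesheet PI at the top; the browser applies the transform client-side when the XML file is opened.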
I did that. You can write .rst, then transform it into XML with 'rst2xml' and then generate both HTML and PDF (using XSL-FO). (I myself also did a little literate programming this way: I added a special reStructuredText directive to mark code snippets, then extracted and joined them together into files.)
XSLT is great, but its core problem is that the tooling is awful. And a lot of this has to do with the primary author of the XSLT specification keeping a proprietary (and expensive) library as the main library that implements the ungodly terse spec. Simpler standards and open tooling won out, not just because they were simpler, but because there wasn't someone chiefly in charge of the spec essentially making the tooling an enterprise sales funnel. A shame.
Once upon a time HTML was defined as a kind of SGML (and XHTML later recast it as XML), which is why the current version is very similar to XML and hence painful to write. This in turn is why we tend to use programmatic tools to handle the HTML, and you should if you work with XML too.
For what it's worth, this is the difference between private-sector and public-sector development. The public sector would have instead argued for some budget to hire developers to maintain libxslt and issue RFPs for grant money to rewrite it in Rust for memory safety guarantees. The private sector decides that it's just not a profitable use of resources and moves to cancel support.
The question isn't whether or not you use XSLT yourself, it's whether you use a different feature that could be deemed unprofitable and slammed on the chopping block. And therefore a question of whether it wouldn't be better for everyone for this work to be publicly funded instead.
I’m lost at “the public sector would have argued for some budget”. XSLT and libxslt are used across a non-trivial number of deployments.
Why would the public sector feel bound to support it as opposed to pivot in the same direction the winds are blowing?
Outside the idiocy of this particular administration in the US, government is pivoting toward more commercial norms (with compliance regimes for gov cloud and the like).
> Why would the public sector feel bound to support it
The underlying axiom is the Pareto principle - that you get 80% of the benefit from the first 20% of the work, and getting the last 20% of the benefit takes up 80% of the work. The private sector will stop funding after the first 80% of benefit (it's not profitable to chase the last 20%) but the public sector is usually mandated to support everybody so it is indeed required to put in that extra effort.
I'm quite unconvinced by this - it seems very easy to come up with all sorts of counterexamples, particularly in terms of public infrastructure; public services, too, are regularly cut if the organising body doesn't see that service as achieving its goals any more.
It is true that public bodies are less concerned with profitability, which changes how they make decisions around deprecations and removals, but being cost-effective is still important for them, especially when budgets are low and need is high. In situations like that, it's not uncommon for, say, a service to get cut so that funding can be reallocated elsewhere where it's more needed.
I don't think publicly funding this sort of work would necessarily significantly change the equation here. The costs of XSLT are relatively high because of its complexity and the natural security risks that arise from that complexity. Meanwhile, it is very rarely used, and where it is used, there are better alternatives (generally loading a sandboxed library rather than using the built-in tooling).
One extremely important use-case is RSS/Atom feeds. Right now, clicking on a link to a feed brings up a wall of XML (or worse, a download prompt). If the feed has an XSLT stylesheet, it can be presented in a way that a newcomer can understand and use.
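For illustration, a minimal stylesheet for an RSS 2.0 feed might look like the sketch below (the file name `feed.xsl` and the exact markup are hypothetical); the feed references it with a `<?xml-stylesheet type="text/xsl" href="feed.xsl"?>` processing instruction at the top:

```
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html"/>
  <!-- render the raw RSS 2.0 feed as a simple readable page -->
  <xsl:template match="/rss/channel">
    <html>
      <body>
        <h1><xsl:value-of select="title"/></h1>
        <p><xsl:value-of select="description"/></p>
        <xsl:for-each select="item">
          <!-- {link} is an attribute value template filling in each item's URL -->
          <p><a href="{link}"><xsl:value-of select="title"/></a></p>
        </xsl:for-each>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```

With this in place, a browser hitting the feed URL shows a plain list of linked entries instead of raw XML, while feed readers still consume the same document.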
> I think this sets a concerning precedent for future deprecations, where parts of the web platform are rugpulled from developers because it's convenient for the browser vendors.
The precedent was already set when they tried to remove alert/prompt. See https://dev.to/richharris/stay-alert-d and https://css-tricks.com/choice-words-about-the-upcoming-depre...
Only a large public outcry stopped them, barely.
To quote from the first link:
--- start quote ---
Meanwhile, we don't seem to be learning from the past. If alert is fair game for removal, then so is every API we add to the platform if the web's future stewards deem it harmful.
Given Chrome's near-monopoly control of the browser market, I'm genuinely concerned about what this all means for the future of the web. An ad company shouldn't have this much influence over something that belongs to all of us. I don't know how to fix the standards process so that it's more representative of the diversity of the web's stakeholders, but I'm increasingly convinced that we need to figure it out.
--- end quote ---
These aren't horrible formats or standards. XSLT is actually somewhat elegant.
Yes. Just like we don't have Flash everywhere or ActiveX. Good riddance to them and to XSLT and, fingers crossed, XML in the future.
Feeling quite conflicted.
I’m not a Chrome dev but I think they have decent reasons for going this way.
Flash was the web technology.
XSLT isn't about styling documents; it's more like ETL (Extract, Transform, and Load).
> XSLT lets you build completely static websites without having to use copy paste or a static website generator to handle the common stuff like menus.
How many people ever do this?
REPO: https://github.com/gregabbott/skip
DEMO: https://gregabbott.pages.dev/skip
(^ View Source: 2 lines of XML around a .md file)
https://web.archive.org/web/20140101011304/http://www.skeche...
They don't anymore. It was a pretty strange design.
http://www.blogabond.com/xsl/vistacular.xml
The upside is that the entire HTML page is content. I defy Google to not figure out what to index here:
view-source:http://www.blogabond.com/xsl/vistacular.xml
The downside is everything else about the experience. Hence my 15 years of not bothering to implement it in a usable way.
Easy: ignore due to no content-type header.
I’m confused by your comment. My XSLT stylesheets are like this:
```
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
```
But someone who hasn't seen/used an RSS reader will see a wall of plain-text gibberish (or a prompt to download the wall of gibberish).
XSLT is currently the only way to make feeds into something that can still be viewed.
I think RSS/Atom are key technologies for the open web, and discovery is extremely important. Cancelling XSLT is going in the wrong direction (IMHO).
I've done a bunch of things to try to get people to use XSLT in their feeds: https://www.rss.style/
You can see it in action on an RSS feed here (served as real XML, not HTML): https://www.fileformat.info/news/rss.xml