First, there is modeling ambiguity: too many ways to represent the same data structure (attributes vs. child elements, for a start). That means you can’t parse into native structs; instead you get a heavy DOM object, and it sucks to interact with.
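To make that concrete, here is a minimal sketch (record and field names invented) of what the ambiguity costs you. The same record can legitimately arrive as attributes or as child elements, and only out-of-band knowledge of the producer’s convention tells you which branch to take:

```python
import xml.etree.ElementTree as ET

# The same record, modeled two equally "valid" ways.
as_attributes = '<user id="42" name="Ada"/>'
as_elements = "<user><id>42</id><name>Ada</name></user>"

def parse_user(xml_text):
    root = ET.fromstring(xml_text)
    # The reader has to know which convention the producer picked.
    if root.attrib:
        return dict(root.attrib)
    return {child.tag: child.text for child in root}

print(parse_user(as_attributes))  # {'id': '42', 'name': 'Ada'}
print(parse_user(as_elements))    # {'id': '42', 'name': 'Ada'}
```

In JSON there is exactly one obvious spelling of that record, which is why deserializing straight into a native struct is the default there.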
Then schemas sound great, until you run into DTD, XSD, and RELAX NG. RELAX NG only exists because XSD is pretty much incomprehensible.
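To show what I mean, here is the XSD needed just to say "a user element containing one string name", with the equivalent RELAX NG compact schema in a comment for contrast. The validation code assumes lxml, since the Python stdlib ships no schema validator at all:

```python
from lxml import etree  # third-party: pip install lxml

# All of this just says: a <user> element containing one string <name>.
xsd_text = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="user">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="name" type="xs:string"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

# The same constraint in RELAX NG compact syntax, in full:
#   element user { element name { text } }

schema = etree.XMLSchema(etree.XML(xsd_text))
print(schema.validate(etree.XML("<user><name>Ada</name></user>")))  # True
```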
Then let’s talk about entity escaping and CDATA, and how you break entire parsers because CDATA is a separate incantation, its own node type in the DOM.
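A quick stdlib demonstration of that last point (the document is made up). With minidom, naive code that reads only the first text child silently drops the CDATA half of the content, because CDATA lands in a separate node:

```python
from xml.dom.minidom import parseString

doc = parseString("<msg>a &lt; b and <![CDATA[c < d]]></msg>")
msg = doc.documentElement

# Naive: grab the first text node, silently lose the CDATA section.
print(repr(msg.firstChild.data))  # 'a < b and '

# Correct: walk every child node, text and CDATA alike.
print("".join(node.data for node in msg.childNodes))  # 'a < b and c < d'
```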
And in practice, XML is always over-engineered. It’s the AbstractFactoryProxyBuilder of data formats. SOAP and WSDL are the canonical examples of this, versus looking at a JSON response and simply understanding what it is.
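For flavor, here is a hypothetical "fetch user 42" request in each world (the service name and namespace are invented):

```python
# Hypothetical "fetch user 42" request; service name and namespace invented.
soap_request = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetUser xmlns="http://example.com/users">
      <Id>42</Id>
    </GetUser>
  </soap:Body>
</soap:Envelope>"""

json_request = '{"method": "GetUser", "id": 42}'

print(len(soap_request), "characters of SOAP vs", len(json_request), "of JSON")
```

And the SOAP version is only this readable because there’s no WS-Security header, no WS-Addressing, and no multi-kilobyte WSDL describing it.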
I worked with XML and all the tooling around it for a long time. Zero interest in going back. It’s not the angle brackets or the serialization efficiency. It’s all of the above brain damage.
Then, if there was any problem in my XML, I had to decipher horrible error messages to figure out what I did wrong.
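To be fair, modern tooling at least lets you get at the validator’s complaints programmatically. A sketch assuming lxml again (schema and document invented; the exact wording of the messages comes from libxml2, and it is not friendly):

```python
from lxml import etree  # third-party: pip install lxml

xsd_text = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="user">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="id" type="xs:int"/>
        <xs:element name="name" type="xs:string"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""
schema = etree.XMLSchema(etree.XML(xsd_text))

# Invalid on two counts: <id> is not an int, and <name> is missing.
doc = etree.XML("<user><id>not-a-number</id></user>")

if not schema.validate(doc):
    for err in schema.error_log:
        print(f"line {err.line}: {err.message}")
```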
The docs sucked and were "enterprise grade", the examples sucked (either too complicated or too simple), and the tooling sucked.
I suspect it would be fine nowadays with LLMs to help, but back in its heyday, XML was a huge hassle.
I once worked on a robotics project where a full 50% of the CPU was spent on XML serialization and parsing, which made it hard to have the robot actually do anything. XML is violently wordy, and parsing strings is expensive.
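You can reproduce the flavor of that in a few lines. A rough micro-benchmark (payload invented, absolute numbers depend on the machine); the XML path typically comes out several times slower than JSON for the same small message:

```python
import json
import timeit
import xml.etree.ElementTree as ET

# A hypothetical sensor reading, encoded both ways.
xml_msg = "<reading><x>1.0</x><y>2.0</y><z>3.0</z></reading>"
json_msg = '{"x": 1.0, "y": 2.0, "z": 3.0}'

def from_xml():
    root = ET.fromstring(xml_msg)
    return {child.tag: float(child.text) for child in root}

def from_json():
    return json.loads(json_msg)

n = 100_000
print("xml :", timeit.timeit(from_xml, number=n))
print("json:", timeit.timeit(from_json, number=n))
```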