I don't even think something like a RasPi 4 physically can pull that much.
Is this chip-shortage related? Could they just not source some switching regulator and so used a linear one instead?
I opened a bug with Chromium when I first encountered that behavior ~10 years ago, since it was an obvious security and privacy concern to me. Needless to say, the Chromium devs didn't think it was an issue.
You would think browsers would ask permission for sites to do things like modify your clipboard, see when you copy/paste, track your mouse movements and text selections, etc., but Google obviously isn't going to care about protecting the user from such things.
I know that circa 2018 the ad industry suffered a shock as major players started measuring more precisely what value ads brought, and CPC went down. I think things just got worse at that point because websites crammed in more ads in response.
If that can be broken, I don't think 10% is infeasible. Though I think Microsoft would have to really screw up to lose their status as the "default" OS.
And that's exactly the problem - macOS and Windows don't have "packages" in the sense that Linux distributions do. You can install software on both, of course, but it's, relatively speaking, a mess, and Homebrew / Chocolatey don't really suffice to make the experience anywhere near as clean and consistent as it is on literally any Linux distribution.
Other than that, I would agree, though for computer experts I would argue that Linux gives you the ability to fully understand how your system functions and control it at every level, and that this can be valuable. It's also a lot easier to use primarily open-source software on Linux. On the other hand, Linux can't run a lot of proprietary programs that are readily available on other systems.
You can add some JS here and there for the few really interactive elements of the document, but my browser already has all the features needed to render documents and links perfectly fine. People have been able to "click around" since 1991, and we never needed to download, parse, and execute 2 MB of JS for that.
Your book is probably big, and I'm probably not reading it in one go, so if it includes images and videos, downloading it all is probably unnecessary and the book is probably best split into several HTML pages. If you want to allow me to consult it offline, that's very kind and noble. Just put a zip file somewhere I can download.
Sorry for the rant, but I'm a bit fed up with having to download and run megabytes of JavaScript I can't control (or even read, because yay, bundles!) to browse the web, just because.
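To be concrete about "some JS here and there", something like this is all I mean; this is a made-up sketch (the element IDs and classes are invented for illustration), and the page renders and navigates fine if it never runs:

```typescript
// Hypothetical "entire interactivity budget" for a document-style page:
// a handful of lines, loaded with <script type="module" defer>, that add
// a "collapse all chapters" button on top of plain <details> elements.
const button = document.querySelector<HTMLButtonElement>("#collapse-all");
const chapters = document.querySelectorAll<HTMLDetailsElement>("details.chapter");

button?.addEventListener("click", () => {
  chapters.forEach((chapter) => {
    chapter.open = false; // close every expanded chapter
  });
});
```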
To play devil's advocate: a majority of web traffic is on phones and tablets now, especially for long-form content, where you will frequently see people request a page on a desktop and then request it two minutes later from a phone or tablet where they can read it more comfortably.

99% of mobile users will be happier when a text-heavy site is a PWA that caches itself than with a static HTML site that asks them to download a zip file, install an app that can handle zip files on their device, unzip it to a folder of hopefully-relevantly-named HTML files, and then browse those, in the process breaking link sharing, link navigation (depending on the OS), cross-device reading and referencing of highlights/notes, site search, and so on. Not to mention the limitations imposed on file:/// URIs, like browser extensions not working on them by default, which is a real problem for users relying on them for accessibility (e.g. dyslexia compensation, screen reader integration, stylesheet overrides).

A lot of times that won't even be possible on a dedicated reading device: my ereader will cache PWAs but will not download arbitrary files, so if you make your site a PWA I can read it during my commute, and if you make it static HTML with a zip file I can't. These are features most users appreciate a lot more than not having to load a 60 kB JS bundle (the current size of React gzipped).
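And for what it's worth, the "PWA that caches itself" part doesn't require a framework at all. A rough sketch of a cache-first service worker (file names and cache name invented for illustration) is enough to keep the text readable offline:

```typescript
// sw.ts -- hypothetical cache-first service worker for a multi-page book.
// Pre-caches the listed pages on install, then answers fetches from the
// cache and only falls back to the network on a miss, so the content
// stays readable offline (e.g. on a commute, or on an ereader that caches PWAs).
const CACHE = "book-v1";
const PAGES = ["/", "/chapter-1.html", "/chapter-2.html", "/style.css"];

self.addEventListener("install", (event: any) => {
  event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(PAGES)));
});

self.addEventListener("fetch", (event: any) => {
  event.respondWith(
    caches.match(event.request).then((hit) => hit ?? fetch(event.request))
  );
});
```

Register it from the page with navigator.serviceWorker.register("/sw.js"), add a small web app manifest, and the same static HTML keeps working unchanged for everyone else.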
https://terminalbytes.com/reviving-kindle-paperwhite-7th-gen...