Still enjoying my sub-$30 bills and bloatware-free device
Suing GitHub = signing up for a ~decade-long, incredibly expensive and time-consuming legal battle against one of the richest companies in the world
There may be a slight difference in effort between these two options.
Sure, I get it - a whole bunch of stuff has to come down before the document's load event can fire, and from a JS perspective that's time the user spends looking at nothing. What I like to do is set up the default screen as a fixed-position blocking layer with my company logo on it, and the last thing my load handler does is remove it. On the initial load they see it for 1.4 seconds. Subsequent reloads from browser cache it's up barely long enough to notice.
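In case it's useful, here is a minimal sketch of that blocking layer; the element id, logo path, and inline styling are placeholders of mine, and the removal hangs off the window load event, which fires once everything on the page has come down:

  <!-- Blocking layer shown until the page has fully loaded -->
  <div id="boot-overlay"
       style="position: fixed; inset: 0; z-index: 9999; background: #fff;
              display: flex; align-items: center; justify-content: center;">
    <img src="/logo.svg" alt="Loading">
  </div>
  <script>
    // Runs after all CSS, JS and images have finished downloading.
    window.addEventListener('load', function () {
      var overlay = document.getElementById('boot-overlay');
      if (overlay) overlay.remove();
    });
  </script>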
So the problem of the initial load is easily managed and, honestly, not that big of a problem in the first place. Simply set up caching on the web server to persist the files as long as possible (I think a year is the max in most places now) and the problem is solved, right?
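As a purely illustrative example, if the assets happened to be served through Express (my assumption, along with the directory name), a one-year cache lifetime is one option on the static middleware:

  // Hypothetical Express setup: serve static assets with a one-year
  // Cache-Control header, roughly the longest lifetime caches will honor.
  const express = require('express');
  const app = express();

  app.use(express.static('public', {
    maxAge: '365d'   // emits Cache-Control: public, max-age=31536000
  }));

  app.listen(3000);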
Now occasionally I run across some browser on some platform that's greedy about caching and refuses to pay proper attention to cache headers, never replacing cached files with newer versions when they come down. It's a little more of a pain, but still well worth it, to simply add a fingerprint to each remote resource fetched:
<link rel="stylesheet" href="/foo/bar.css">
becomes:
<link rel="stylesheet" href="/foo/bar.css?id=FINGERPRINT">
Now at build or deploy time I incorporate a little script (sketch below) that uses sed to replace FINGERPRINT with the current time in seconds (123888238823482834), so that the browser sees:
<link rel="stylesheet" href="/foo/bar.css?id=123888238823482834">
This is a unique URL and forces the browser to pull down the new asset. It's easy to do with resources pulled down in the <head> section and more problematic with images; however, you can just change the image name from "image" to "image_v2" on edits or changes and the problem goes away. It's easy enough to iterate over the image directory, bumping the "*.jpg" files to a new version number and making the same replacements in the HTML and JS files, if you really want to get tricky about it.
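For what it's worth, the deploy-time substitution can be a couple of lines; this Node version (the output file path is my assumption) does the same thing as the sed approach described above:

  // Deploy-time fingerprinting: replace every FINGERPRINT placeholder with the
  // current Unix time in seconds, forcing browsers to fetch fresh copies.
  const fs = require('fs');

  const stamp = Math.floor(Date.now() / 1000);  // current time in seconds
  const file = 'dist/index.html';               // assumed build output file

  const html = fs.readFileSync(file, 'utf8');
  fs.writeFileSync(file, html.replace(/FINGERPRINT/g, String(stamp)));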
Now, new files are fetched once and only once per device, and the page weight of a particular JS file becomes practically irrelevant.
> Subsequent reloads from browser cache it's up barely long enough to notice.
Are you measuring the time on your personal machine, or a machine that represents what your typical visitor is using? If you're using a recent MacBook, that's going to have very different performance characteristics than, say, an old Android phone. Something that's instantaneous on a MacBook could take ages on an old Android.
JS devs who have never used desktop or mobile GUI toolkits have no idea what is missing in the browser.
react-data-grid is 13.8KB gzipped[1]. React itself is ~45KB gzipped, so that's ~60KB total, nearly half the size of W2UI.
edit: And if you want a full-fledged component library, you could also throw in react-bootstrap, which is <40KB gzipped.
[1] https://bundlephobia.com/package/react-data-grid@7.0.0-beta....
I know this shouldn't ever happen, but you can well imagine plenty of legacy or badly configured setups where it would. More pertinently, where it would happen no matter how loudly you warn about it in release notes, etc.
When you're as mature and widely used a platform as Node, you just can't risk things like this, unfortunately, no matter how much more convenient it would be for the vast majority of users.
No way to win. At least I could wrap all the ad locations so they didn’t shift the page when they finally popped in.
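For anyone curious, the wrapper trick is essentially just reserving the slot's footprint up front; the class name and dimensions here are my own guesses at a typical setup:

  <!-- Reserve the ad's space so the page doesn't jump when it loads late -->
  <div class="ad-slot" style="min-height: 250px; overflow: hidden;">
    <!-- ad network script injects its iframe here -->
  </div>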