I feel like the 1MB limit is excessively generous, especially for text-only pages. But maybe that's what makes it so damning when pages fail to adhere to it. I know at least one website I maintain fails it spectacularly (though in my defense it's entirely because of that website being chock-full of photos, and full-res ones at that; pages without those are well under that 1MB mark), while other sites I've built consist entirely of pages within a fraction of that limit.
It'd be interesting to impose a stricter limitation to the 1MB Club: one where all pages on a given site are within that limit. This would disqualify Craigslist, for example (the listing search pages blow that limit out of the water, and the listings themselves sometimes do, too).
I also wonder how many sites 1mb.club would have to show on one page before it, too, ends up disqualifying itself. Might be worthwhile to start thinking about site categories sooner rather than later if everyone and their mothers starts spamming that GitHub issues page with sites (like I'm doing right now).
My toy project https://k8.fingswotidun.com/static/ide/?gist=ad96329670965dc... gives you an emulator, an 8-bit AVR Assembler, and an IDE for just over 500k transferred. Almost all of it is JavaScript.
Using math.js is by far the heaviest part but at least your Asm already knows things like solarMass and planckConstant :-). CodeMirror comes in second heaviest but for user-experience-per-byte you're always going to be streets ahead of a gif.
At the risk of over-complicating things, perhaps there could be limits per resource type. 10MB of images might be reasonable (e.g. for a photojournal), but only 128KB of JS, and 128KB for everything else. Something along those lines.
It's over one hundred thousand lines of JavaScript code minified and gzipped into a 300KB bundle, which should fully load in about 300ms on a decent computer.
No kidding. I just checked and the average text-only page on my blog is well under 100kb. Even the image-heavy front page is under 1MB... Even the page where I used images came in at juuuust about the limit -- to the point where you'd have to make a ruling on whether the rest of the page gzipped counted, because over the wire it was technically <1MB but it was just above that after decompression. The site does specifically say "downloaded", haha.
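For anyone who wants to check which side of that line a page lands on, here is a rough sketch (not the club's official methodology, just the standard Resource Timing API) that you can paste into the DevTools console after a page finishes loading. It sums over-the-wire bytes and decompressed bytes separately; cross-origin resources that don't send Timing-Allow-Origin report zero, so treat the result as a lower bound:

    // Rough sketch: bytes over the wire vs. bytes after decompression.
    const entries = [
      performance.getEntriesByType('navigation')[0],  // the HTML document itself
      ...performance.getEntriesByType('resource'),     // scripts, CSS, images, fonts, XHR
    ];
    let wire = 0, decoded = 0;
    for (const e of entries) {
      wire += e.transferSize || 0;       // compressed body + headers, as downloaded
      decoded += e.decodedBodySize || 0; // body size after gzip/brotli decompression
    }
    console.log(`over the wire: ${(wire / 1024).toFixed(1)} kB`);
    console.log(`decompressed:  ${(decoded / 1024).toFixed(1)} kB`);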
Personally, I stuck to the 40K best-practice recommendation for a basic web page (the guidance in place when I started) as a target, later extended to up to 140K including webfonts. Notably, this is without images.
(This is a flexible target, depending on the complexity of a page. E.g. for a "bloated" page, a fully styled video display for competition winners showing 200+ entries and 280+ individual videos in categorized views is about 250K, including a few images, two and a half font families and the Vimeo Player SDK, but excluding the load of any external video streams. However, with compression we still manage the 140K mark.)
Then reality hits: the client insists on a full-width photographic hero image as if it's still 2014, there are the usual controversies about a full-size intro video (autoplay, of course), we must have this highly intrusive chat asset installed, etc., etc. And we easily blow the 1MB limit.
Something about "all pages being under the limit" instead of "every page being under the limit" changes the exact meaning to something that I cannot agree with. That was the meaning I replied to below; I wrote this after realizing you might have meant "every" when you wrote "all".
https://dictionary.cambridge.org/grammar/british-grammar/all...
Let's say you write a daily blog. A single A4 page of text contains on average 3,000 characters; your posts average slightly above that, at 4,000 characters.
How long until the text content alone is above the 1MB limit? At roughly 4KB of plain text per post, you'd cross it after about 250 daily posts, i.e. well within a year.
I doubt you can find a single text blog on a specific topic that wouldn't be improved by limiting its total text to 1MB.
Being more verbose is generally just poor writing. Now, using a separate website per topic seems like a silly limitation, but the more topics being discussed, the less relevant the discussion.
I think an onload limit is much more useful than a file-size limit.
A 700kB JavaScript page can take up to 10 seconds to render on older mobile devices. And a 500kB image can contain megapixels that will slow down browsers without GPU acceleration.
Personally, I always go for a max 2-second limit on all devices.
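If the budget is a time rather than a byte count, the Navigation Timing API reports it directly. A minimal sketch (the 2-second threshold below is just the figure from the comment above, not any standard):

    // Minimal sketch: time from navigation start to the end of the load event.
    window.addEventListener('load', () => {
      // Defer one tick so loadEventEnd is populated.
      setTimeout(() => {
        const nav = performance.getEntriesByType('navigation')[0];
        const loadMs = nav.loadEventEnd - nav.startTime; // startTime is 0 for the navigation entry
        console.log(`onload after ${Math.round(loadMs)} ms`);
        if (loadMs > 2000) console.warn('over the 2-second budget');
      }, 0);
    });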
I think it should be relative to content, as in versus the actual text on the screen. This metric can also be applied per-page and per-site, with less ambiguity for SPAs: every new load brings in more bytes, but also more text, thereby contributing to the ratio.
Even better: tie fixed site-wide assets (i.e. CSS, JS) to features, so that loading an entire framework only to use a small percentage of its features is penalised.
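A crude way to put a number on that content-to-bytes idea (my own sketch; "content" here is just the rendered text, which undercounts pages whose value is images or interactivity, and cross-origin resources without Timing-Allow-Origin report zero transferSize):

    // Crude sketch: downloaded bytes per byte of visible text.
    const textBytes = new TextEncoder().encode(document.body.innerText).length;
    const downloaded = [
      performance.getEntriesByType('navigation')[0],
      ...performance.getEntriesByType('resource'),
    ].reduce((sum, e) => sum + (e.transferSize || 0), 0);
    console.log(`text: ${textBytes} B, downloaded: ${downloaded} B`);
    console.log(`download-to-text ratio: ${(downloaded / textBytes).toFixed(1)}x`);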
From a "Why, who cares" perspective website (and app) speed are highly correlated with conversion and engagement.
Google's headline research on the subject says "Improving your load time by 0.1s can boost conversion rates by 8%."
Some additional data, sourced from the Nielsen Norman Group, below:
- Google found that increasing the load time of its SERPs by half a second resulted in a 20% higher bounce rate.
- Google found 53% of mobile visits ended if a page took longer than 3 seconds to load.
- Akamai aggregated data from 17 retailers (7 billion pageviews) and found that conversion rates were highest for pages that loaded in less than 2 seconds; longer load times correlated with 50% drops in conversion rates and increased bounce rates, especially for mobile visitors.
- BBC found that for every extra second of page load time, 10% of users will leave.
Want to sell / fix this?
Here are the three best simple resources illustrating objective third-party results from increasing site speed:
- https://blog.hubspot.com/marketing/page-load-time-conversion...
- https://wpostats.com/
- https://www.cloudflare.com/learning/performance/more/website...
Here is a more compelling, deeper look from a leader in the UX space:
- https://www.nngroup.com/articles/the-need-for-speed/
Here's a really well written article about how to PROVE to the powers that be that site speed is worth testing:
- https://web.dev/site-speed-and-business-metrics/
I used to work for a major e-commerce company and will back this up. The importance of performance, of course, depends on the use case.
In e-commerce, the biggest factors (that come to mind) to compete on are—SEO and brand aside—price, offerings, and convenience. Convenience has two dimensions: UX and performance.
If a user has a very clear idea of what they want from your site, they'll probably be patient. If a user is channel surfing (and the vast majority are when it comes to shopping, comparing options, etc.), then every millisecond matters. Every millisecond spent loading is a millisecond not spent selling/convincing.
I totally believe that the vast majority of page views are from channel surfing (or bots), but have a hard time believing that has any correlation to actual purchases.
If I'm spending an hour's pay on something I really want, of course I'll wait 10 seconds for the page to load, or if I can't get it to load at all, I'll make a note to try again later from a different browser or internet connection. I'll manually enter my address details if they didn't auto-fill properly. I'll respond to emails about what I want to declare for customs, and make various other efforts.
I, and people I know, feel like we buy very little on a whim. Is that unique? Are there whales who buy everything you can put in front of their face, or a different demographic who searches for something to buy and then changes their mind in precisely 850 milliseconds?
I would accept that, like candy in the checkout lane, the profit is small but still worth more than the extra effort it takes to put the offering there, and that the revenue is small compared to the stuff people actually need to buy. But analytics suggesting that hundreds of people click a link to your page and most close it faster than you can read the headline just seem unbelievable.
Those studies all come from companies trying to sell you something.
They seem to date from when mobile phones were much slower and there was a valid use for AMP pages. A/B test data is very easy to cherry-pick, and you can see a 10% increase in conversion based on random chance alone. I can see my marketing team misinterpreting data like that and using it for promotion. I am skeptical.
I improved the initial loading speed of a website so it was 30% faster on an old mobile phone, and it made no difference in conversion according to the analytics.
Most people used faster phones with faster internet, so it really didn't matter. After the first page load, most of the bloated assets were cached, so even those with slow connections navigated further relatively quickly.
There could be some types of websites where speed matters more, such as promoting clickbait articles with high bounce rates. But if you have high-quality content, it isn't going to be that important unless you have really terrible bloat.
To take control of other people's content and, by extension, revenue streams? Performance was a selling point to the web dev; I, as a user of the web, was never even asked my opinion, and I can't (easily) disable AMP.
Alphabet cared. It cared about what corporations can care about: making money. It invested so much because there was a good return on investment, there was a lot of cash on hand waiting for investment, and AMP was big enough to absorb enough of that cash to be worth pursuing. With billions of cash on hand, fixed overhead makes small investments not worth the bother. For Alphabet, a hundred million dollars is less than a rounding error.
To compete with Facebook in centrally controlling the internet and the revenue that flows through it. It's their Plan B after their Plan A, Google+, failed.
But when you start this way, you quickly end up with the bloat that is modern software, because each extra bit of waste seems justified at every step along the way.
I don't mind highlighting and curating small sites for fun, that's neat.
But calling larger sites a "cancerous growth on the web" just feels immature to me. Everything is just cost/benefit. There's no need to bring a black-and-white fanatical exaggerated mindset to it.
You can celebrate something you like without having to make the alternative a disease, sheesh.
With all due respect, modern news websites are a disease, and I find it pretty hard to disagree with the label "cancerous growth", since the behavior is spreading and being normalized by these websites.
There is zero benefit to me in loading 20mb of garbage just so I can read 10kb of text.
I think the reason why blog posts are written in this hyperbolic format is because it catches people’s attention, or “click bait”, if you will. If it were toned down in the way you suggested (which I agree with in principle), it might lose more readers before they get to the substance. Just hypothesizing here btw.
That’s just it — no benefit to you, but a lot of benefit to them. They need analytics to sustain their business. They need ads to sustain their business. They (maybe) need fancy interactions to differentiate and thereby sustain their business. If you expect everyone else to only do things that benefit you instead of mediating between benefiting you and benefiting themselves, you’re gonna be pretty disappointed.
1) Many websites are not news sites. In fact, I'd guess the vast majority. So this point doesn't feel relevant.
2) Sometimes things might be useful to people other than you. The world doesn't exist to cater to your needs, and referring to things that aren't exactly what you want as cancerous is childish.
And of those 10kb, one line is the actual content and the rest is ELI5 filler. In their defense, one can say that news websites are predominantly compatible with noscript.
As much as it "feels" immature, it is still an understandable analogy, regardless if you're a fanatic. As other users pointed out, it has become a norm for software/webdev houses to convert what is mostly static content into weird, flashy, bloated SPAs that load a ridiculous about of resources for little content. I see it in my peers too, and I saw it even when I was attending college. Frankly, I think it's immature to pretend this is not a problem as some seem to suggest.
Most webpages that load megabytes of bullshit today would be much better for everyone (developers, operators, hosting, and end users) if they just didn't do that.
Sure there is a place for nuance, but there is also a place for bold writing without all the pussyfooting. Exclusively using either style makes it boring for the reader. We need the sweet with the sour ;)
>You can celebrate something you like without having to make the alternative a disease, sheesh.
You can, but would you be equally successful at it?
How accurate is this list? I see it mentions that visiting https://danluu.com/ downloads only 9.1 kB but when I actually visit this website with Google Chrome, in the Developer Tools' Network tab, I see that a total of 21.7 kB is transferred.
Name          Status  Type       Size
------------  ------  ---------  -------
danluu.com    200     document   2.5 kB
analytics.js  200     script     18.9 kB
collect       200     xhr        67 B
favicon.ico   404     text/html  266 B
Shameless plug: I maintain a lightweight home page myself at https://susam.in/ and I try to keep it as lean as possible without compromising on some custom formatting I care about. So far I have got the transfer size down to less than 6 kB.
Nice page. I only miss Firefox reader mode on these simple pages (for my preferred font size, and dark mode). I wonder if it's possible to hint to the browser that it should be available, even if the markup is simple?
Ah, the late 90s and early 00s when we still had user.css (in a meaningful sense).
There are vestiges of user.css like the Style sheet option in Safari. [1] I do wish they were better supported, and that user styles automatically had priority over webpage styles.
[1]: https://support.apple.com/guide/safari/advanced-ibrw1075/mac
Just for your information, from their legal page: "...linking to this website without written permission is prohibited." Not sure what is meant by this, but I found it funny.
That same sentence starts (paraphrasing) "[Copying or giving out of any stuff gotten on this domain ... is banned]" so merely quoting the legal terms is also "illegal".
What I think they mean by this is that you shouldn't link to resources on their website to make it seem like they endorse your (product, website, whatever).
Want to tell them you like it? Better buy a stamp.
"If you have any comments about our WEB page, you can write us at the address shown above. However, due to the limited number of personnel in our corporate office, we are unable to provide a direct response."
I see they have a link to "Berkshire Activewear". Now that's a much much more heavyweight page.
Can it be assumed that this same website has been in place since 1978? Obviously not exactly like it is now, but probably not far off.
I wonder what the logic for the 1978 date is. It's hard for me to believe they had any reasonably connected predecessor of this in 1978.
But, the website looks rather similar to how it did back in 2001 (with the recognizable two-column list of bullet points):
https://web.archive.org/web/20011129002047/https://www.berks...
How about you average their subsidiary web pages? Start with DQ.com (Dairy Queen)
I was once asked to debug an extremely slow-loading page. Within the first 5 minutes it was evident that the client was doing things it wasn't supposed to do: it was downloading 300MB worth of resources to show the webpage. Incorrect implementations and inefficient use of libraries are the reason why we're seeing bloated websites all over the web.
> It was downloading 300MB worth of resources to show the webpage. Incorrect implementations and inefficient use of libraries are the reason why we're seeing bloated websites all over the web
In situations like that, it's right and proper to ask who built the site. Then shake your head with absolute contempt.
I've had to fix web apps like that, and no, that's not always right and proper. One example I can give is a tool which loaded 50MB of deeply nested JSON, because when it was written 5 years ago the payload per item was 80% smaller and the company had 0.1% of the number of items (which puts the original payload on the order of 10KB).
The correct response is to work out who was responsible for maintaining the site for the past five years.
For context, 1MB is the same order of magnitude as the original Doom which was about 2.4MB in size. [1]
[1]: https://www.wired.com/2016/04/average-webpage-now-size-origi...
[0] https://1mb.club/favicon.ico
https://output.jsbin.com/yudonidujo
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 48 48"><path fill="#662113" d="M27.3 5.7c4.9-3.1 11-3.9 16.8-4.1A162 162 0 0031 10l-3.7-4.5zm4.1 5.6c4.8-3.6 10.1-6.5 15.3-9.6a18 18 0 01-4.3 12.7l-11-3z"/><path fill="#c1694f" d="M44 1.6l3.3-.3-.6.4c-5.2 3.1-10.5 6-15.3 9.6 3.7 1.2 7.3 2.2 11 3l-6 6.2c-3.6-.8-7.4-2.8-11-2.4a70.5 70.5 0 00-17 18.3c2.7.2 5.5.4 8.2.4h.5c-3.3.2-6.6.4-9.9.9 3.6.6 7.2.5 10.8 1-3.7 2.2-8.2 1.7-12.4 1.7-1.6 2.4-2.7 5.3-5 7.1.7-2.8 2.6-5.2 3.8-7.8.1-2-.4-4.1-.5-6.2L6.7 36l.3-.3c5.2-7 10.7-13.7 17-19.7l-1.6-7.7L27 5.4l.3.2 3.6 4.5c4.3-3 8.7-5.9 13.2-8.5z"/><path fill="#d99e82" d="M8 21.2C11.6 16 17 12 22.3 8.3L24 16A178 178 0 007 35.7c-1-4.8-1.3-10 1-14.5zm.5 15.2c4.6-6.8 10-13.4 16.8-18.3 3.7-.4 7.5 1.6 11.1 2.4L33 24.1c-2.5-.2-4.9-.4-7.4-.4l5.4 2.4-3.2 3.5c-3.7-.3-7.4-1-11.2-1l9.7 3a10.7 10.7 0 01-9.5 5.2c-2.7 0-5.4-.2-8.2-.4z"/></svg>
I went with:
Renders a black square, everywhere except Safari. The black square actually goes with the brand.
Performance is important, but if your site is clocking in at 17kb then you're probably doing okay.
Does an analogous window exist for QUIC and HTTP/3?
[1]: https://hpbn.co/building-blocks-of-tcp/#cwnd-growth
Yes, my goal (very hard to do sometimes) is to fit all of HTML + Push CSS within 14kb
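Assuming that 14kb target refers to TCP's initial congestion window (the subject of the hpbn.co link above), the arithmetic behind the figure is roughly this; the segment size is an approximation and varies with the path MTU:

    // Back-of-the-envelope: data a server can send in the first round trip.
    // RFC 6928 raised the initial congestion window to 10 segments.
    const initcwndSegments = 10;
    const mss = 1460; // typical max segment size with a 1500-byte MTU
    const firstRttBytes = initcwndSegments * mss;
    console.log(`${firstRttBytes} bytes ≈ ${(firstRttBytes / 1024).toFixed(1)} KiB in the first round trip`);
    // 14600 bytes ≈ 14.3 KiB, hence the rule of thumb to keep critical HTML + CSS under ~14 kB.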
Please use uBlock Origin.
"If you have any comments about our WEB page, you can write us at the address shown above. However, due to the limited number of personnel in our corporate office, we are unable to provide a direct response."
I see they have a link to "Berkshire Activewear". Now that's a much much more heavyweight page.
https://www.checkbot.io/
Includes analytics, chat client, big product screenshot, theming and payment integration, so you can still do a lot well within 1MB.