Browsers are now so fast and capable that you can emulate a whole OS, with another browser in it, which you can then use to browse.
So why do half the websites in 2021 feel so sluggish?
Gmail, which was famous for loading fast, sometimes takes forever to load.
I need to wait minutes when reading news websites while elements randomly jump around the page (seriously, why do pages do that? Why do elements jump around for the first two minutes? What is the problem there?)
I don't want to just blindly blame JS. Look at this page! JS can be amazing! And I love stuff like Figma (as it's so fast and optimized). But why does the average experience suck so much?
It's basically Parkinson's law. If someone is given NASA's resources to move a piece of paper across the room, they will find a way to use a space shuttle to do it and call you old-fashioned for doing it by hand.
Basically, more resources mean that people will use more of those resources. And therefore any optimizations that improve resource effectiveness (in this case browser efficiency) will be negated.
The short answer: Tracking and advertising.
Longer answer: All of the above + bloated frameworks (frontend and backend) + unoptimized/excessive static assets.
I guess we all know this, so I apologize if you meant it more as a rhetorical question.
As for Gmail, I've never had a problem, it's always pretty fast, considering all the features it has.
Why does tracking and advertising take so much time to load and display?
I would not really mind ads and being tracked if they didn't sometimes make the internet literally unusable.
"Bloated frameworks"... I remember when React was touted as "faster than the native DOM"! Which didn't make much sense even back then, because the work has to happen in the actual DOM eventually... but there was always that push for speed.
So where is the bloat from, really?
Javascript can be really fast. It's just, nobody really cares? I guess?
I can remember in the late 90s it would take about 20 minutes to download a 3 MB MP3 file. My employer's homepage is now about 8 MB, according to the browser dev tools.
> I don't want to just blindly blame JS.
JS is a tool, a language. Don’t blame the tools as they have been refined to an astounding degree over the last 25 years. As a long time front end developer blame the business, specifically:
* Developer incompetence. Do you really need the largest frameworks humanity has written and a million dependencies to put a couple lines of text on the page? Yes. Well, no, but most developers will claim otherwise and most businesses will refuse to hire those who are so capable.
* Stalking. Analytics code is a silent performance killer and generally responsible for a lot of JS on many commercial websites. This wonderful stuff allows for session tracking via advertisements across various websites and at times serves as a point of malicious intent by both valid business interests and criminal organizations.
Javascript is one of the major reasons, but also trackers, huge graphics and loads of videos. Old pages still load fast so it's not the browser's fault. Say what you want about https://www.lingscars.com/, but it loads instantly, is full of all animations and doesn't lag down the browser.
The problem is modern design, basically. We invented server-rendered pages, then started rendering everything client-side with frameworks like React, then started pre-rendering React on the server and completing content in the client, and now the pendulum seems to be swinging back the other way again, to technologies like Flutter that just render the application in a canvas (shudder).
People want interactivity, and developers want that interactivity to be consistent across their website. To do so, they need to recreate and emulate everything the browser does. In the case of React and other JS frameworks, that even includes constructing fake DOMs. Back buttons get overridden, links get turned into buttons that do custom routing, you name it and there's a layer of Javascript you can download to avoid having to do the hard work.
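To make the "fake DOM" point concrete, here is a toy sketch of what a virtual-DOM layer does (not React's actual code; the function names are made up): rebuild a tree of plain objects on every render, then diff it against the previous tree to compute patches for the real DOM.

```javascript
// Minimal virtual-DOM sketch (hypothetical; illustrates the indirection,
// not any real framework's implementation).
function h(tag, props, ...children) {
  return { tag, props: props || {}, children };
}

// Walk both trees and collect the minimal set of changes.
function diff(oldNode, newNode, path = []) {
  const patches = [];
  if (!oldNode || oldNode.tag !== newNode.tag) {
    patches.push({ op: 'replace', path, node: newNode });
    return patches;
  }
  newNode.children.forEach((child, i) => {
    const oldChild = oldNode.children[i];
    if (typeof child === 'string') {
      if (child !== oldChild) {
        patches.push({ op: 'text', path: [...path, i], text: child });
      }
    } else {
      patches.push(...diff(oldChild, child, [...path, i]));
    }
  });
  return patches;
}

const before = h('ul', null, h('li', null, 'one'), h('li', null, 'two'));
const after_ = h('ul', null, h('li', null, 'one'), h('li', null, 'three'));
console.log(diff(before, after_)); // → one 'text' patch for the second <li>
```

All of this bookkeeping happens in JavaScript before the browser's own (already optimized) DOM is ever touched, which is the overhead the comment above is describing.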
This is partially because of how demanding people have become. They expect any proper web page to be like Facebook. People, especially customers, want features, and they want them fast. Big companies that have their own developers are pushing their users towards their apps, sometimes intentionally sabotaging their website (looking at you, Reddit) to force people to download their invasive, native code.
Websites have become applications, and applications are inherently taxing on most systems. Google tried to combat this in their own, misguided way with AMP. If it wasn't for their stupid caching architecture, I'd be a fan of AMP because of the speed it can provide compared to "modern" web pages.
I don't think any of this will change unless we convince web developers to stop relying on all of these "modern" technologies and just write proper web pages. From the reactions here on HN, there are two groups of people in this debate: the people like us, who lament how slow the web has become, and the people who will never give up their fancy frameworks because of the productivity it allows them, and will never give in to the web Luddites who want to take away their fancy cross-platform tools that run just fine on their $2000 M1 Macs.
There are also a lot of web developers out there that are just shit, but we can't help those.
> I don't think any of this will change unless we convince web developers to stop relying on all of these "modern" technologies and just write proper web pages.
It’s not the modern tech that is at fault. When used correctly it can work very well as it was intended.
Problem is it’s just about easy enough to be used wrongly by low skill developers who don’t engage with anything they’re making deep enough to even become critical of how it’s functioning.
And worse, when you start pointing it out, they always come up with lame excuses, as if it's normal for a page displaying 30 items to take 15+ seconds to load, and you have to spend your time lecturing them that yes, links should open in a new window when ctrl-clicked, and that URLs should actually work when you go to them directly.
Dan Luu reports that Google penalizes his pages in search rankings because they don't include enough CSS and JS. It's not really half of the websites; it's the half you're being guided to.
There's a dead comment asking if you have a source, and the commenter appears to have attempted finding it themselves.
As an aside, I vouched for the comment (which still shows as [dead]), because a quick glance at posting history shows just the singular comment as dead, with no indications of twattery to justify it. It's like a commenter with not-interesting post history just got a single comment randomly killed.
Lack of care and attention. By developers who are pressured to keep sprint velocity up. By product owners who follow only the ideal path. By designers who want sites to work like their mock-ups. By testers who don't have enough time to test under-specified features. By clients who are focused on reporting upwards and tracking business goals.
And by prioritising new learning experiences over boring reliable solutions.
The current compromise for monetization on the web is advertising. Web apps and sites are not optimized for snappiness, responsiveness, or user experience. They are optimized for tracking and profiling, and the market-leading front-end frameworks mostly enable exactly that.
> Browsers are now so fast and capable that you can emulate whole OS with another browser in them, which you can then use to browse
Yes, they can emulate a whole OS. It's browsing that they suck at. Seconds wasted on the TLS handshake, seconds wasted downloading megabytes of JS, then realizing content is missing, another TLS handshake, another download, then some ads, some tracking scripts, and finally the page is displayed. Of course, if you don't use Chrome or Google DNS you will be penalized. And don't forget to accept all cookies.
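A back-of-the-envelope sketch of that waterfall shows how the seconds add up. Every number here is invented purely for illustration (100 ms round-trip time, 5 Mbit/s effective throughput), not a measurement of any real site.

```javascript
// Rough latency budget for the waterfall described above.
// All figures are assumptions for the sake of arithmetic.
const RTT = 100; // ms per round trip (assumed)
const mbps = 5;  // effective throughput in Mbit/s (assumed)

// Time in ms to transfer a payload of the given size in megabytes.
const downloadMs = (megabytes) => (megabytes * 8 * 1000) / mbps;

const budget =
  2 * RTT +         // TCP + TLS 1.3 handshake to the first host
  downloadMs(2) +   // 2 MB of JS before anything can render
  2 * RTT +         // new connection for the late-discovered asset
  downloadMs(0.5);  // ads and tracking scripts

console.log(budget); // 4400 (ms) before the page is usable
```

Even with generous assumptions, sequential round trips plus megabytes of blocking JS easily push a page past the four-second mark before anything useful appears.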
> But why does the average experience suck so much?
To the current crop of web designers/developers/etc.: they literally may not remember a time when the web was a lot faster.
They don't remember the days of websites loading just fine (ok, most of the time...) over a 56k modem. They've grown up with the web the way it is, JavaScript and all.
Most people will put up with slow-loading pages up to some extreme threshold.
So, if you reduce your page's loading time from, say, 1 second to 400 millis, you won't gain any significant number of users.
On the other hand, if you use those extra 600 millis to hold an auction for ad space on the page, you make more money immediately.
Therefore we can expect any for-profit site to load only fast enough to keep users from giving up (even if they feel frustrated or whatever), and no faster than that.
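That incentive can be written down as a toy model. Every number below (view counts, abandonment threshold, CPM figures) is invented purely to illustrate the argument, not measured from any real site or ad exchange.

```javascript
// Toy model: below some abandonment threshold, extra speed keeps no
// additional users, while a slower ad auction raises revenue on every
// remaining page view. All numbers are made up.
const views = 1_000_000;

// Fraction of users who give up; assume nobody abandons under 3 seconds.
const abandonRate = (totalMs) => Math.max(0, (totalMs - 3000) / 10000);

// Assumed CPM (dollars per 1000 views): auctioned ads pay more than a fixed fill.
const cpm = (auctionMs) => (auctionMs > 0 ? 2.5 : 1.0);

function revenue(loadMs, auctionMs) {
  const kept = views * (1 - abandonRate(loadMs + auctionMs));
  return (kept / 1000) * cpm(auctionMs);
}

console.log(revenue(400, 0));   // 1000 — fast page, cheap ads
console.log(revenue(400, 600)); // 2500 — slower page, auctioned ads
```

Under these (made-up) assumptions the 600 ms auction more than doubles revenue while losing zero users, which is exactly why pages settle at "just fast enough not to be abandoned."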
Sorry for the offtopicness, but I need you to see this and don't have another way to contact you.
You have posted a great many comments that broke the HN guidelines egregiously. (I'm not talking about this thread.) Comments like this are bannable offenses on HN:
https://news.ycombinator.com/item?id=29670648
https://news.ycombinator.com/item?id=29657097
https://news.ycombinator.com/item?id=29657047
https://news.ycombinator.com/item?id=29657028
https://news.ycombinator.com/item?id=29657006
https://news.ycombinator.com/item?id=29346327
https://news.ycombinator.com/item?id=29352983
I'm not going to ban you right now, partly because those threads are all at least a week old and partly because your account has years of history on HN. However, we need you to review https://news.ycombinator.com/newsguidelines.html and stick to the rules from now on. If you keep breaking them like this, we're going to have to ban you. In particular, it's absolutely not acceptable to attack other users the way you've been doing, no matter how strongly you disagree with them. That goes against everything this site is supposed to be for.
A 5 y/o girl, watching me troubleshoot an error on a TV at a family gathering, suggested I restart the TV. I told her mother the child had demonstrated sufficient knowledge to work in modern IT support.
You can also say this of software in general including operating systems.
Sure, the hardware and network are a miracle.
My 2 cents is that there is so much software to be written that most programmers are mediocre at what they do and/or their working environment makes it even more difficult to create efficient software systems.
Hardware and networks get the easy stuff; it's just moving lots of data around. Software is where all the messy human stuff happens, so naturally it's a reflection of human beings. I.e., messy.
I'd say all the software on earth is done cheap. Quality doesn't really matter for most projects. Where it does matter, if there's money it happens.
I'm not sure what sort of workflow and mail client you desire, but Notmuch supports arbitrary tagging. Most coverage I've seen of it integrates with emacs or mutt (or similar text-based mail clients). That said, in searching for the project homepage, I also saw this project[1], which might be interesting if you're looking for something a bit less terminal. I'm sure there are more alternatives, but hopefully this is useful as a starting point for you.
I broadly agree, but you lost me at Figma being fast. Granted I am using an older MacBook Pro (2015), but using Figma in a browser usually requires that I close other apps to let Figma do its thing. Figma and Miro are the only sites I have encountered that incur this level of performance hit.
Incidentally, mightyapp.com was founded to make browsing less resource-intensive by doing the rendering in the cloud. They highlight Figma and Miro (among others) prominently, so it seems others have similar experiences with Figma.
I suppose most devs don't use tree-shaking when importing modules.
The frameworks themselves don't add much bloat usually.
I do believe, however, that single-page apps are a fad and we are going to return to SSG/SSR, which is already apparent in newer JS frameworks like Next.js.
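For what it's worth, the tree-shaking point can be shown with a small sketch. The import lines in the comments are hypothetical (the package `some-utils` doesn't exist), and the two functions are inlined so the snippet is self-contained.

```javascript
// A bundler can only "tree-shake" (drop unused exports) what it can
// analyze statically. These two functions stand in for a large
// hypothetical utility library with dozens of exports.
const clamp = (x, lo, hi) => Math.min(Math.max(x, lo), hi);
const mean = (xs) => xs.reduce((a, b) => a + b, 0) / xs.length;

// Shakable: static named imports let the bundler see exactly which
// bindings are used and drop the rest.
//   import { clamp } from 'some-utils';
//
// Hard to shake: namespace or CommonJS-style imports can force the
// bundler to retain the whole package.
//   const utils = require('some-utils');

console.log(clamp(150, 0, 100)); // 100
console.log(mean([1, 2, 3]));    // 2
```

The page only ever calls one or two helpers, but with the second import style the entire library may ship to every visitor anyway.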
The web has grown from something that displays documents into a platform for applications. Focus has been placed heavily on optimizing JS JIT compilation, but browser DOM and CSS layout engines have not caught up, leaving a performance disparity in major browsers.
Yeah I agree with all the parents other points, but Gmail has been an island of sanity, very rarely annoying me. Now, I’m usually using it on a fast computer but typical sites can’t even live up to that standard on the same machine.
Browsers are fast, too. Think about all of the hard work which has gone into advanced hardware acceleration, JIT engines, etc. That work deserves respect even if the median front-end developer squanders it with bad decisions.
You did not pay for any page to load fast or to serve you what you are looking for. Every web request, every click you make in your browser, can deliver a response that is unknown to you or your browser beforehand. Even if it were possible to buy page loads, there would be no way to see what you are buying before buying it.
Be thankful you are not in the EU, where you get the cookie and GDPR spam thrown at you as well. Worse than the 1999 popups.
Arguably, Safari is the IE of our times.
Safari somehow always has rendering issues. I use Firefox, as it's closest to the standards; I hate bending standard code to fit a garbage browser's special needs.
I wouldn't go as far as calling it an "IE", because those of us who lived through that know it was a completely different experience.
But yes, Safari is a bit slow in implementing new features, and it does get annoying from time to time. But it's nowhere near the IE hell of ~2000-2008.
On iOS you have to use Safari if you want to use a content/ad blocker, which means I’m kind of stuck with it on my iPhone/iPad.
Then you bring macOS into play, and the way Safari on mobile syncs with Safari on desktop is really nice. Password sharing in particular is great, so you end up getting sucked in to Safari!
Jeez, the error message in Safari was below the scroll line. I got annoyed with the spinning circle loading the page for Navigator 3 and thought, "oh, I get it... haha, it's being slow like back in the day".
After thinking, “no, it wasn’t _this_ slow back in the day,” I scroll down to see..
Sorry, OldWeb.today can not run in this emulator in your current browser.
Maybe hide the spinny circle thingy if you’re smart enough to know you’re not going to load anyway.
Oh wow. I wasn't expecting it to actually load and emulate an entire OS in my browser. It's an actual full OS too, with finder/explorer fully working. I thought it would be something like a proxy + web archive.
I chose GeoCities as of 1996. It booted Mac OS 7. Then IE threw an error at me. Then I entered google.com into the URL bar out of curiosity, thinking it would load the actual modern Google, only to be greeted with "Google Search Engine Prototype". Lol.
Wow, very well done. I am one of those who grew up in the late 90s / early 2000s browsing the internet, forums, blogs, fansites, personal websites, and from time to time I go back to web.archive.org to see again the webpages that made me happy during my childhood / teen years. This is a great complement for that.
Thank you very much whoever did this - my inner self is really thankful.
Did anybody here ever use the ISDN data ports in payphones in Japan? I remember seeing them here and there in 1998 (IIRC). At the time I wondered about popping over to one and getting on the web just to see how it would work. ISDN was serious biz to my mind at that time, compared to rural USA dialup.
In 1997 or so, I was an intern in the remote connectivity group at Sun. My job was primarily to help maintain connectivity for the people at home rocking their 128k dual-channel ISDN lines. I helped maintain local modem banks and remotely manage end user hardware and software. I remember drooling at the prospect of such a fat internet pipe.
https://store.steampowered.com/app/504230/Celeste/
https://apps.apple.com/us/app/castlevania-grimoire-of-souls/...
When the hardware was restricted, such games used to be squeezed onto a 48-megabit (6 MB) SNES cartridge.
The technical term is webshits.
I found this post [1], but it seems a bit outdated and references AMP, which the Core Web Vitals metrics are supposed to replace (right?).
https://danluu.com/web-bloat/
The few hundred milliseconds is worth it to stop someone sniffing my browsing details on an open network.
I can remember browsing the web using 56k. That was slow.
Well, these at least are fixable. The web is unusable without uBlock Origin and NoScript.
boss: "browsers today are so fast and capable, we don't need to waste resources on something that doesn't give us tangible ROI."
that's what I gather, anyway.
A similar correlation holds for government.
It's been that way since the ad industry took over the web.
(Sadly, my workflow is heavily dependent on multiple labels per email, and no IMAP client seems to work acceptably in that situation.)
[0] https://notmuchmail.org/
[1] https://add0n.com/notmuch-email-client.html
I agree with your point, but I have to point out that Figma is mostly built on C++/WASM.
Also, the front-end parts that operate on the DOM are necessarily JS.
Devs not bothering to put element dimensions in the markup, or otherwise not making the dimensions available/calculable at page load.
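A sketch of what that fix buys: when the markup carries the intrinsic `width`/`height` attributes (standard HTML; the numbers here are invented), the browser can reserve the image's box before any bytes arrive, so nothing jumps. The reserved height is just the intrinsic aspect ratio applied to the layout width.

```javascript
// With markup like:
//   <img src="hero.jpg" width="1600" height="900" style="width:100%; height:auto">
// the browser reserves a box of the correct aspect ratio immediately.
// This computes that reserved height from the intrinsic dimensions.
function reservedHeight(layoutWidth, intrinsicW, intrinsicH) {
  return layoutWidth * (intrinsicH / intrinsicW);
}

// A 1600x900 image laid out at 800px wide reserves a 450px-tall box.
console.log(reservedHeight(800, 1600, 900)); // 450
```

Without the attributes the box starts at zero height and snaps open when the image decodes, which is exactly the "elements jumping around" the original question complains about.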
This actually means that you love C++. But HN crowd will hate that lol.
https://www.figma.com/blog/building-a-professional-design-to...
Wow, it really does feel like using an old browser.
(Safari Version 15.2 (17612.3.6.1.6))
Most of the time I think it is just dev laziness.
Can anyone tell us which web feature is blocking compatibility with Safari?
I think at that time we just used it to browse WAIS.
I remember reading about T1 lines and being blown away at how fast that was
What amazed me was how quickly that went from state of the art to pretty mediocre.