A big point of contention in the comments here is what a website should be.
A school of thought in web development believes the web to be the next frontier in application development. For them it makes sense that websites like this feel and act like apps, both for the end user (animations, transitions without full-page reloads, rich dynamic content, etc.) and for the developer (client-side state management, frontend/backend separation with an API in between, modular application structure, etc.).
Apps don't load in 10ms, but they can also support some offline functionality, given their independence from the server. Overriding browser behaviour and managing your own loading behaviour makes sense, because the default browser behaviour is not the experience you're striving for; it's not the app experience. These people are usually the ones who have worked on large web projects, too: the developer experience that web developers have built for themselves in "userland" (JavaScript) is pretty good, and has evolved a lot of features that make developing the exact behaviour you want easier, and iterating correctly on a codebase quicker.
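As a rough sketch of what "overriding browser behaviour and managing your own loading behaviour" looks like in practice (illustrative only; the "data-spa-link" attribute and "#content" selector are made up, not from any particular framework):

    // Intercept link clicks, fetch the new page's content, and swap it in
    // without a full page reload -- the core move of client-side routing.
    document.addEventListener('click', async (event) => {
      const link = event.target.closest('a[data-spa-link]');
      if (!link) return;             // let ordinary links behave normally
      event.preventDefault();        // override the default navigation

      history.pushState({}, '', link.href); // update the address bar ourselves

      // Fetch only a content fragment instead of a whole new document.
      const html = await fetch(link.href).then((res) => res.text());
      document.querySelector('#content').innerHTML = html;
    });

    // Because we took over navigation, back/forward must be handled too.
    window.addEventListener('popstate', () => location.reload());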
A separate school of thought wishes websites had stayed true to their origins as enriched documents, and thinks trying to evolve them in the direction of applications is counter-productive to their true purpose: interactive surfaces for getting and sending information. If I am just going to scroll and see a few pictures, why do I need to download anything other than the structure of the page and the pictures themselves? If all the building blocks are already there, in the browser, why do people need to re-invent them and ship them as yet more JavaScript?
What should a website be, though? The fact that there isn't consensus about this suggests there really isn't a clear answer.
Per the document-like school of thought, facebook.com just keeps straying further and further from the ideal, but as far as the app-like school of thought goes, the new facebook.com is a pretty remarkable achievement.
Well I certainly want Google Maps to act like an app (which it does): panning, zooming and so on. And I'm happy for Hacker News to act like a traditional website. At that point it's a false dichotomy: different solutions for different problems, yes?
As an analogy: some pop-up books are amazing works of art. But reading would be frustrating if every book was a pop-up book.
On one hand I appreciate how elaborately the parent comment made their point. On another, I appreciate how succinctly you made your point. A conundrum within a conundrum, indeed.
Things that act like apps also can't be adequately prevented from compromising the general data privacy of the population. What this means is that it doesn't matter very much whether you or I have mere preferences for something to have rich app-like functionality or not. Our wanting it does not play a role unless it can be made verifiably secure and not abusive of user privacy rights. So far, general web applications cannot be made verifiably non-abusive of users' data privacy, so we must focus only on websites as purely structured documents with no app-like capability.
I sure wish companies wouldn’t abuse data privacy so we could instead care about user preferences for app-like functionality, but we don’t yet live in a world like that.
I don't think you articulate this quite right: it's possible to both think that websites and web applications are worthy uses of the web, but also that many web applications would have been best engineered using 'classical' techniques from early websites. There's a strong argument to be made in Facebook's case, since the core value proposition of Facebook hasn't changed much since its inception, and it began its life as a server-side-rendered 'website.'
In any case, this argument is operating at the wrong level of abstraction: the issue here isn't the distinction between these two things conceptually, but whether there would be less incidental complexity overall if what are typically called web applications took a different approach to implementing their features, while still maintaining the same user experience.
It's hard to look at all of the crap you need to do to get a functioning web app working and not think there must be a better approach.
> it's possible to both think that websites and web applications are worthy uses of the web, but also that many web applications would have been best engineered using 'classical' techniques from early websites.
I agree with this.
> There's a strong argument to be made in Facebook's case, since the core value proposition of Facebook hasn't changed much since its inception, and it began its life as a server-side-rendered 'website.'
Yes, but I assume that's from the perspective of the value the site brings to you, not in general and not to everyone. If someone solely gets value from facebook.com as a site to send and receive information to/from friends and the world, then yeah, it hasn't changed much.
Facebook today offers a richer experience, and that might be part of its value for other people. On facebook.com you can IM a friend while watching a video in a PIP window and engaging in a real-time conversation on the comment thread of an event post. You can then switch back and forth between an online marketplace and a streaming service without losing the state of your chat window. The ability to do these things is part of the value proposition facebook.com now offers many users, and delivering that value can be harder with a solely SSR'd website.
> there would be less incidental complexity overall if what are typically called web applications took a different approach to implementing their features, while still maintaining the same user experience.
If you've figured out a way that's better, do share! I'm sure there are instances in the wild, but I don't think an experienced engineer would ship their own client-side networking in JavaScript if there were a better way to achieve what they want without shipping more JS.
> It's hard to look at all of the crap you need to do to get a functioning web app working and not think there must be a better approach.
To be clear, you can get a functional "hello world" web app with a single line of code (specifically, thanks to the fact that HTML is very permissive with improperly formatted documents). Everything afterwards depends on the decisions you make for the experience you want. Is getting rid of that 200ms flicker between full page loads worth the 500ms it might take to load and initialize your client-side routing library? Is making your site available offline worth the effort of setting up all the extra PWA business? Some will think so, some will not.
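To illustrate the "single line of code" point (a contrived example of my own, assuming Node.js): one line gives you a web server, and because HTML is so permissive, browsers will happily render its bare, tag-less response as a document.

    // One line, no framework: serves a "document" with no doctype, <html>,
    // or <body> -- browsers render it anyway.
    require('http').createServer((req, res) => res.end('Hello, world!')).listen(8080);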
> ... as far as the app-like school of thought goes, the new facebook.com is a pretty remarkable achievement.
I agree, they’ve done a fantastic job. Not only that, but as far as corporate engineering blogs go, this article is one of the best I’ve ever read.
Usually I either know the subject too well to learn anything, or I don’t know the subject well enough to understand what they’re saying in the amount of time it takes to read an article.
In this case, they found the perfect depth, they had great judgment on when and how to use visuals, and what they’re conveying is so clearly valuable.
If you usually skip the article and just go straight to comments, consider actually reading this one!
Thank you. This is one of the first things I tell people in my web-dev workshop: they're probably not going to be building websites; they'll most likely be building web applications.
There is something between 10ms and 10s, especially after the initial load. The point of contention is not the website-vs-webapp debate; it's more that a webapp doesn't have to be this bloated.
The whole point of SPAs initially was a better user experience, notably through faster response times and loading.
If you are not building for the first-time experience, I do not understand why you would have a problem with the initial load of a web app taking seconds. It's like an install, just way faster. No one uses Facebook only once, so no one should care. Android apps are so shit that the "slow" webapps are still miles ahead. I do not understand why the Android Google Drive app takes 5 seconds to load.
Under the document-like school of thought, fewer "engineers" would need to be employed. Fewer computing resources would be required on both the client and server side. Money, not to mention time, could be saved.
Perhaps the application-like school of thought allows for more user manipulation, metrics, and tracking.
Not sure about the resources part. One key point of having app-like interactions is that you forgo the complete page request/render cycle a document-oriented implementation requires.
Perhaps we should split the web into two worlds. Create dedicated browsers for just enriched documents, etc. Add some sort of constraints on what the browser can do, and on how much control is removed from the user.
They tried that in the 90s; it's called Java. Didn't really work out for that purpose.
In practice, things like Twitter and Facebook, interactive programs, should really be just that: programs you run. If the interface is nigh static and the purpose is content interaction rather than primarily consumption, you should be opening the Facebook program that gives you this interface and uses its client/server communication to feed messages to and from the interface, not provide the whole thing over the wire spread across document addresses.
And they are that on mobile. Who uses Facebook's mobile website? Everyone uses the app. The contention only exists on "desktop" OSes because Windows and OS X don't provide a UX workflow to push an app at users (at least they didn't when it mattered) the way a mobile site can. And the app environments on both were way worse than the Android or iOS SDKs for making a dumb GUI for something like Facebook.
And then all those non-techy people will ask why this website doesn't open... and they'll need to download another browser to have web apps. I hope you can see how badly this can go.
I'm pretty sure it's because the document style isn't viable for Facebook as a business; their money comes from all the interactive JS crap they insist on shoving into their product.
Tech has this global problem of technology products never being what they say they are on the tin. It's like those front-end/back-end iceberg memes -- what the user wants and what the business wants are just barely in alignment. This needs to end.
I'm currently reading "The Dream Machine", and it makes me instantly go back in time and imagine reading the same comment on discussions around "batch mode" vs "time-sharing" and the development of CTSS [1].

[1]: https://amturing.acm.org/award_winners/corbato_1009471.cfm
I mean, almost by definition, a single-page app is a mini browser inside a browser. So you can make it as fast and clever as you want; you still added another layer to render boxes, text, and, probably most important, ads.
I'm actually really surprised by the number of comments in this thread about how the new redesign is slower. I've had it since yesterday and it genuinely feels much faster and more responsive than the old Facebook UI - though, to be fair, that's not a huge accomplishment given that the old UI would take forever to finish painting or respond to input. I'd consider it a success, especially when compared to the disaster that was and continues to be Reddit's redesign.
Across mobile Messenger, messenger.com and the Messenger desktop app, the website was the only place left where you could have multiple chats open at once. Now that moves to one at a time as well. That is a huge usability regression (unless it's been fixed since the last time I tried the prerelease. Edit: Just switched back, and it does look like they maintained chat windows at the bottom instead of just chat heads on desktop, although I can still only have two chats open at once.) Friend lists also got left with the legacy interface, which can't bode well for them in the long run. They are still Facebook's most squandered opportunity.
The neverending quest to reduce information density is a usability disaster, despite the misguided belief that cleanliness = usability. Zooming out on the new FB interface to restore some information density leaves a comical amount of whitespace. Wells Fargo has turned its desktop interface into a giant stretched mobile app. Nothing is a hyperlink that supports right-click or open-in-new-tab anymore.
Considering Facebook engineering has gone into detail about how the new site is much faster and transmits much less JS and CSS, I would be a little surprised if the opposite is true. I tend to not implicitly trust HN comments about things being extremely slow, because for whatever reason there are so many of these complaints I’ve never experienced myself. I still haven’t even had performance problems with Electron apps, and those seem to be widely panned on HN as having abysmal performance. They work fine for me.
Yep, works fast for me too. A random profile opens in 2-3 seconds max for me. But for some people websites are slow for some reason. I've heard complaints about Gmail taking 30 seconds to load, while it takes 2-3 seconds from a cold start for me.
I don't understand how you think a 2-3 second load time is "fast" for such an enormous platform, given the actual content the user sees. I sure get that Facebook is way more than that, but thinking a page that loads in 2-3 seconds is fast is something you could've gotten away with in 2000, but in 2020..? I genuinely don't understand why you find it quick.
It's by design. Reddit doesn't want you using the website. They want you to download the app. As numerous annoying popups and notifications will tell you when visiting the website (especially on mobile).
> Surely they are hiring world-class devs, so what’s holding them back?
Facebook Engineering has a notorious "not invented by me" culture. It's not unique there, but a lot of our "world-class" engineers are just acting economically rationally and hole-digging on some new bespoke framework or tool to cement their position in the company. You end up with a massive amount of churn and over-engineered vanity projects, and it's manifesting downstream in basically every product we've turned out for the last five years. That's why the applications are bloated and terrible.
The joke inside the company is that it used to be "move fast and break things" but now it's "breakfast, vest and move things around". It's really an engineering culture of decadence and waste these days.
There are some brain-dead behaviours on Reddit's site. They do a JS render-then-fetch, which is the worst way to load data. They also seem to stick the data fetch inside a requestAnimationFrame, which means it only runs for foregrounded tabs. This is basic stuff. I don't see how this could be an accident.
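For illustration, a sketch of the pattern being described (my own reconstruction, not Reddit's actual code; renderShell and renderFeed are made-up helpers):

    // Render-then-fetch: ship an empty shell, render it, and only then
    // start the data request -- serializing work that could overlap.
    function mountApp() {
      renderShell(); // user sees gray placeholder boxes first

      // Gating the fetch on requestAnimationFrame makes it worse: rAF
      // callbacks don't fire for backgrounded tabs, so a tab opened in
      // the background never even starts loading its data.
      requestAnimationFrame(() => {
        fetch('/api/feed')
          .then((res) => res.json())
          .then((data) => renderFeed(data));
      });
    }

    // Better: start the fetch immediately, independent of rendering.
    const feedData = fetch('/api/feed').then((res) => res.json());
    function mountAppBetter() {
      renderShell();
      feedData.then((data) => renderFeed(data));
    }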
Reddit seems like a place where the kind of experienced and talented people needed to turn it around could make a lot more money (via stock grants in addition to salary) and frankly have a lot more impact, at any of FAANG.
I've not seen anything to indicate that Reddit is hiring, or trying to hire, "world-class devs".
I guess it's optimized for some computers/networks but not for others. Right now I'm in an area without optical fiber, so I have to use a 4G modem (12 MBps). I have an old 2013 MacBook Pro, and even though my setup is far from fast, I have no problems with most web pages; a few load kinda slow, but Facebook is in a different category: it's totally unusable, and some stuff never even gets to load. If I want to check Facebook I have to use Safari (the new theme is not supported in Safari yet).
New Reddit and all these JS-based websites are painfully slow for me on Safari on a modern MacBook despite being on an enterprise connection with 1Gbps and a few milliseconds ping to most of Europe. Bandwidth is not the only issue, processing all those megabytes of JS and (poorly) reimplementing browser behaviours in it is the main problem.
I'm using a first-gen Surface Book. Dual-core, 6th-gen i7. Not an awful computer, but not a monster either. Seeing the other responses in this thread, I think it may have more to do with network bandwidth.
Quite sincerely, it's a total failure. I got the chance to try the new interface, and it's so slow that it's barely usable. It's even slower than the old website, which was already painfully slow.
Loading a random profile takes 8 seconds. Opening a messenger discussion takes 6 seconds. It reminds me of the new Reddit website. Facebook was more enjoyable to use 12 years ago.
It's really sad that in 2020, 10k+ engineers can't make a photo, video, post and message sharing website that is not a pain to use. We collectively failed as a profession. If one needs 2MB of CSS for such a website, there is clearly a problem.
I agree with you; however, the key disconnect is that Facebook is not a photo, video, post and message sharing website. It's a marketing platform intended to extract the most value out of you, the viewer, and transfer that value to Facebook and its advertisers.
If you think of it this way, you can see how you may need 2MB of CSS: to battle the bots trying to scrape your information and replicate your network, to sidestep the evil developers of adblocker software that threaten to destroy the sweet value transfer, the JS required to track every single movement you make both online and off, the A/B testing framework that allows you to determine how to most efficiently extract that extra 0.001% of valuable eyeball time, and so forth...
Connecting the world? Well, I guess that could be a nice side-effect...
A "free" ad-driven social networking site that brings in gigantic revenue, but that has to pay thousands of high-priced engineers to implement all of the cruft you just described.
versus ...
A subscription-based, non-ad-driven social networking site (perhaps operating as a member-owned cooperative?) that brings in much more modest revenue but that also can operate with many fewer engineers because it can be largely cruft-free.
I know there have been a gazillion attempts at the latter and none has succeeded in any way comparable to the "free" sites. It's too bad, because if any of them were to ever achieve Facebook scale, the subscription price would probably be quite modest.
That's probably something that can be measured: if the profile/wall fills out over several seconds, when do the ads appear? First, before everything else? That would be cynical, but I agree they might monetize the delays by ensuring ads appear before anything else.
I guess they probably could build a nice and fast website if it was up to them. But there's probably a lot more requirements than just that.
Things like https://twitter.com/wolfiechristl/status/1071473931784212480... are probably not decided on and implemented by the engineering team but are coming down as a requirement from the top. This is probably the case for a lot of other decisions that slow down the page ("We need this A/B test framework", "this needs to be hidden to increase retention",...)
I've always suspected that the Div-itis plaguing fb's website is a result of React's dependence on the 𝚘̶𝚟̶𝚎̶𝚛̶𝚞̶𝚜̶𝚎̶ misuse of higher order components.
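To make that concrete, a contrived sketch (React itself doesn't add these wrappers; each higher-order component's author adding "just one more" div does):

    const React = require('react');

    // Each HOC wraps its target in one more div for its own purposes.
    const withTracking = (C) => (props) =>
      React.createElement('div', { 'data-track': '1' }, React.createElement(C, props));
    const withTheme = (C) => (props) =>
      React.createElement('div', { className: 'theme' }, React.createElement(C, props));
    const withLayout = (C) => (props) =>
      React.createElement('div', { className: 'layout' }, React.createElement(C, props));

    const Link = (props) => React.createElement('a', { href: props.href }, props.label);

    // One anchor now renders as <div><div><div><a>...</a></div></div></div>.
    const DecoratedLink = withTracking(withTheme(withLayout(Link)));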
Heh just yesterday I was trying to click on a zoom meeting link for a Facebook event, and it wasn't working for whatever reason, so I tried to pull the url out of the page with the inspector, and I saw the same thing -- hopelessly deep nesting of the element, with the URL hard to find and its text version obfuscated ... and that's not even an ad!
The problem there is that '10k+' "engineers" are trying to make the same 'photo, video, post and message-sharing website'.
It's a structural problem and little more: the website (and app) is their main money-maker, so they're going to give it a disproportionate amount of resources.
Imagine you hire ten thousand people to lay one railroad track. [note; see end of post] If any single one of them doesn't contribute directly in some way, you'll fire them. This seems kind of strange, doesn't it? Sure, it probably requires more than a single person to lay a track. But ten thousand people to lay one? How is that supposed to work, mechanically? This would be enough to warrant shareholder revolt.
Now, the railroad track gets broken a few hundred times, maybe they hammer it enough to make it twice as long, whatever. It now no longer resembles a railroad track. Certainly no train could go across it. Send a few hundred people to go ask the managers of this project for a replacement track. Okay, we're now at...maybe a tenth of people having contributed? Repeat this process until everyone's contributed. Maybe the manager gives different groups different materials for the track to fuck with them, whatever. But somehow, every single person manages to not get fired.
What's the outcome look like? You have a single railroad track, probably not even well-fit for the job (sparks fly whenever trains run on it; maybe it causes them to tilt, so on), but it's laid! And ten thousand people are employed!
It's the same thing with a website. You can't put a terabyte onto a user's device every single time they load your website; you just can't. So you have a window of performance you have to hit. Between ten thousand people trying to have things thrown onto user devices? Good luck making anything resembling 'decent'.
It's the same problem that Dave Zarzycki noted in his talk about launchd[1], but worse. Instead of 512 megabytes shared between some random abstract parties you can basically ignore, it's <10MB shared between ten thousand coders, translators, graphic designers, users, managers, etc. Does something seem strange about this?
[note]: This is the appropriate comparison here; at the scale of 'Over ten thousand people working on one program', it's grunt work, not art, science, or even programming. There's a word for implementation-grunts that's fallen out of favor in the past few decades: coders. This was seen as distinct until recently.
I don't know how many people FB actually allocates for their main app, but this reminds me of a chapter in The Mythical Man-Month. It is said over 1000 people and 5000 man-years went into OS/360. I don't see it anywhere today.
Instead, the book proposes The Surgical Team, i.e. about 10 people taking specialized roles, with the system being the product of the mind of a few key people. I wonder how well this aged.
For me the performance seems better. It would also be strange if the site slowed down when one of their two main publicly stated goals was to increase performance (the other being ease of maintenance). Maybe you have extensions interfering?
Also, the set and scale of features in the Facebook app make it literally one of the most complex webapps out there. It's far more than just multimedia posts + messaging: it's a marketplace, dating, games, apps, groups, pages, and more. Nobody's "failing". And the 2MB of CSS was the "before", uncompressed. The "before" compressed was 400KB, and this update appears to reduce it to under 80KB compressed. That's 96% less than the 2MB you're complaining about, more than an entire order of magnitude.
So Facebook seems to be improving here, no? I fail to see what is a "total failure" or "clearly a problem".
Really? I really like it. I almost solely use Facebook to organise Blood Bowl matches, so it's a lot of group chats and events, and it's so much better than the old design.
I haven't noticed it being slower either; it's certainly not fast, but it's not really something I notice.
I would be really interested to find one, only ONE, website where React/Angular really brings a better experience and a better final product than standard pure JS with a simple Ajax system.
I'm not a Facebook (the company) apologist by any means, and only use it because there's a few groups on it relevant to a company I own.
That being said, I find the new FB to be insanely fast. I don't even block ads on it.
I do agree Facebook was way better 12 years ago (I saw real updates and photos about friends, rather than companies and ads). But speed right now hasn't been the problem.
I can't decide whether I love or hate the UX of this new stack. It certainly feels more like an app now than a website. I like the new basic layout, "app shell" or tier-1 rendering. It feels like the First Contentful Paint has improved and some random layout shifts have been eliminated. It might take a couple of seconds more to load something, but it appears where you expect it to appear.
On the other hand, navigation and clicking around is still sooo slow. My 60-year-old aunt called me and asked if she needs a new PC because Facebook makes her laptop fans spin like crazy. I couldn't explain to her all this react-redux-graphql business, and frankly, she doesn't care. All she cares about is that Facebook is slow, and all she does is post photos and talk with friends like she did 10 years ago.
Agreed. Using this on an 8-core i9 16" MBP and the thing just isn't fluid. Has anyone at Facebook even bothered testing this on computers people actually use? Like a 2015 MacBook Pro? Or a 2014 MacBook Air or whatever?
I wouldn't even want to know how it runs on those.
I've got a 2015 MBP. FB is pretty sluggish, and I wonder why. When you send someone a list of notifications, it's probably worth getting the data ready in case they click it. And the resource usage is pretty big as well.
The mobile app seems to be just fine though, perhaps they want to push people to use that.
You’re missing their main goal: having an easier-to-change codebase.
Apparently this is way more economically rewarding than performance for Facebook.
With that in mind, who cares if the site is slow (btw, this is the only complaint in your rant)? If the software requires only a few devs to change and a few eyes to maintain, they can literally scale as much as they want. And actually, they're now probably in a way better position than if they had developed a super-performant but unmaintainable site.
The quote "premature optimization is the root of all evil" is still very much valid, IMHO.
This was my experience at Facebook. Attempting things with a small team (or, heaven forbid, by yourself) was heavily frowned upon because it didn't justify manager and director salaries. As a result you ended up with poorly performing, over-engineered code bases that preferred complex, expensive systems that would take multiple teams to build, while for whatever reason complexity that would improve performance was frowned upon. I'm sure this is common at many big tech companies. I didn't work on the mainline FB app, but it seemed like part of the culture.
It really is appalling. I'm on a top of the line laptop with Gigabit internet and I can't do anything on Facebook without waiting several seconds for loading. Usually I only open it to check notifications. I just refreshed and counted and it took 9 seconds for the page to load and to show my notifications.
At a company I used to work for, we worked so hard to make sure our web app would load extremely fast… just to end up losing the battle with the data and analytics team over analytics scripts. They used a tag manager (Tealium) which by itself can be used for good, but it ultimately gave the other team the ability to overload our site with 3rd-party scripts.
Yeah, I think that Twitter did a pretty good job. After the initial load, it even works offline, so the actual API calls are the only thing that it's fetching over the network.
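For reference, the usual mechanism behind "works offline after the initial load" is a service worker caching the app shell (whether Twitter does exactly this is my assumption; the file names below are placeholders):

    // sw.js -- minimal app-shell caching, the generic PWA pattern. After
    // the first visit the shell is served from cache, so only API calls
    // still go over the network.
    const SHELL = ['/', '/app.js', '/app.css'];

    self.addEventListener('install', (event) => {
      event.waitUntil(caches.open('shell-v1').then((cache) => cache.addAll(SHELL)));
    });

    self.addEventListener('fetch', (event) => {
      event.respondWith(
        caches.match(event.request).then((hit) => hit || fetch(event.request))
      );
    });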
It is not a failure of the profession. There are engineering teams out there that excel at software performance. Granted they may not have billions of users. It is a matter of mindset and core values and those are hard to change.
We'll see what the data show. I have been reading comments about Facebook's supposed decline for as long as I've been aware of Facebook and yet their published numbers continually show greater engagement. https://jakeseliger.com/2018/11/14/is-there-an-actual-facebo...
The unquestioning supplication at the altar of 'engagement' (a sterile marketing term if there ever was one) is what led to where we are now in the first place. This is an affliction that pervades the entire consumer internet sector, but the folks at Facebook seem to have refined it to its fullest potential.
The other day I got a facebook notification on my phone, which said something along the lines of "You have 4 new messages". Of course, thinking it was from my friends I opened the app to look at them. 3 of my 4 "messages" were notifications for friend requests from people I had never met. The last one was a photo someone had posted of cake she'd baked (not to me specifically, just in her feed). To someone sitting at her desk at facebook, looking at an engagement metrics chart, the notification would seem to have served its purpose - another data point, another person enticed to open the app in response, engagement maximized. But of course, this was deception. I found this experience distasteful enough to disable notifications entirely - probably another data point for their metrics team - and annoyed enough to complain about in an HN comment.
There are a lot of fake profiles, though, and I think a lot more than they're prepared to admit. Even brazen binary-options trading scam profiles don't get removed; it appears they're happy as long as the numbers are going up.
>and yet their published numbers continually show greater engagement.
Do you think this might have anything to do with the fact that, as an advertising company, it's crucial that they are able to tell companies that engagement is increasing?
I would be very surprised if engagement wasn't down among Americans under 50; it may well be counteracted by growth in other markets and in other apps (especially Instagram), but there's _much_ less activity on Facebook.com from my peers than there was five years ago.
Looking at the internet today, I think we need to lower our expectations and be realistic. At least in the US.
We are still driving 60mph on freeways and what trains we have do not travel at 300kph.
Perhaps many of us flipped out when we only had 9600 baud modems, but you could get up, brew some tea, walk the dog, or read a book while waiting for a page to load. We all had so much more patience back then.
Why do we need instant gratification with FB and other social media? Maybe, or maybe not /s.
Let's not even get started with their mobile apps: horribly large in size, poorly engineered (performance-wise), and privacy loopholes everywhere (perhaps by design).
Comments like these remind me of an old post on Slashdot, "What makes a good website?" Ask "geeks" what they prefer, and it's usually minimalism, no images, consistent text styling. In the end, the ideal format becomes a text file without markup. I think we need to accept that the opinions of techies are increasingly irrelevant in tech. It's like being a fine artist getting paid to design flyers, or a chef making burgers.
Everyone hates slow webpages. Not just geeks. We can all argue over whether minimalism or eye-candy is preferred. But if your site feels like running in mud, it's frustrating regardless of the design.
And all these SPA, client-side-rendered sites seem guilty of this. You navigate to a page, and it loads up "instantly", except you see nothing but gray placeholder images. Then content starts loading in, but haphazardly. You see a link you want to click, and you go to click it, when BAM! it jumps down 37 pixels because some stupid widget just loaded above it on the page.
I really hate the modern web. Not the look of it, or the styling. The mechanics and slowness.
But, perhaps ironically, Slashdot was killed because its design updates made it more "designer-y" but much less usable. I remember one update in particular that added a ton of whitespace; it gave it a "cleaner" look, I guess, but it meant there were like 1/4 as many posts on the screen, so it just took longer to peruse the comments. They also f'd up how they showed voting, so it became much harder to just scan for popular comments and valuable discussion. I remember going to Slashdot pretty much daily, and after that update I just said screw this, I'll have to find something else.
Design updates can be useful, but just like for engineers, "beware lots of highly paid people looking for something to do".
Wrong. The average user does not give a shit if the page is rendered server-side or if it's an SPA.
Geeks prefer speed, like everyone. There are plenty of papers showing that a reduction in latency improves conversion rates. And it does not have to be ugly to be fast.
"Geeks", as a class, will tend to focus on technical issues before aesthetic ones. Looking at that fact and immediately equating it to an absurd extreme is a fun game, if you don't care about describing reality. I know some accomplished engineers who are also good designers, and vice-versa.
Second, if my professional opinions are not being taken seriously, that usually means one of two things: I'm too far out over my skis, or am in the wrong place with the wrong people. Especially so if you feel like a chef making burgers.
Of course, if "in total control" and "irrelevant" are the only two states of being one sees, I suppose I see how you get there.
It's Facebook's own standard: "We knew we wanted Facebook.com to start up fast, respond fast, and provide a highly interactive experience.". The person you're responding to is saying they haven't met their own metrics for success based on their experience. You can't really attribute this "preference" solely to them.
On the mobile web or desktop, either one. (We're running it off of one server, it might get the HN effect, we'll see.)
We have been building our own open-source social networking platform, and we have tried to make a lot of things more efficient while doing so. The site I linked to didn't minify any files or optimize images. However, it loads things on demand as needed, and even lazy-loads entire components.
Is it faster than Facebook? We have our own component system, not React.
I've never been on Facebook so I can't compare, but your site responds pretty quickly for me. Also, you left out the 'h' in GitHub in your link, so it goes to a domain-for-sale site.
The new design is so laughably bad. I was trying to send a message to someone on the website; it took 10+ seconds to open the chat window, and then it couldn't keep up with my typing (I'm not an especially fast typist either). It was like having a 5-second ping over SSH. This is on top of the (pinned) tab regularly needing to be closed as it slowly eats up system resources.
This is all on my 8core/32gb workstation. I can't even imagine how much utterly useless crap they are running in JS to make that kind of experience.
On the bright side, it does mean I am weaning myself off: keeping a pinned tab open is a non-starter, so I can't just have a quick refresh. And I'll be fucked if I'm installing their apps on my phone.
So I guess thanks needs to go to the FB engineers for making their new website so utterly garbage that the tiny dopamine hits driven by the FB algorithms are worth less than the pain caused by using the site.
I wonder if they fixed the bug where if you visit the site with Safari on an iPad, when you try to type a comment, every space becomes two spaces. Also, I wonder if paste (command-v) is also randomly blocked at times.
I use mbasic.facebook.com as much as possible. Occasionally I'll use m.facebook.com. I've had the mobile apps uninstalled for ages.
I still don't understand how the biggest websites get away with being so unbelievably bloated. My guess is that most people have medium-to-old phones and PCs that are bogged down with nonsense running in the background, so Facebook, Instagram, Twitter, etc. run extra slow, but people just put up with it.
I don't use Facebook, but how do you know it's not backend calls that are creating the slow experience? It sounds like they rewrote the frontend, not the entire system.
Does the compensation of those "engineers" reflect that they have "failed"? Perhaps there is another way to evaluate the work, not from the perspective of the user waiting in front of a screen. Do not forget that the money to pay the salaries of those who do this work does not come from users.
I don't know if I have the new stuff or not, but I agree that some parts are currently frustratingly slow to use on desktop. I figured it was because so many people are spending much more time on it (including, tbh, me). But I was trying to message with an old friend and simply gave up a few days ago.
Yes, in particular, for large discussions, when I get a notification about a comment I made on it, and I click to jump to that comment, it takes a while to load -- and in a way oddly proportional to the size of the discussion and frequency of the posts on that group.
I'll always see the very top post of the group and watch its whole page load; then slowly my discussion will come up, and then it will scroll down to that comment. And if I do anything, that breaks the whole process.
It's like no one ever considered the concept of just loading a piece of the discussion, like reddit does.
What's more, there are all kinds of UX nightmares, like how, if I open the messenger in one Facebook tab, it opens in every tab, blotting out content I want to read.
Or how a FB livestream event will just randomly stop playing, giving me no indication that I'm lagging behind the current video -- I've done trivia nights that way and I only find out I'm behind after my team members suggest answers to questions I haven't heard yet.
I'm curious what your response to the Cambridge Analytica stuff was? Open APIs to build your own experience are like 100 times worse than that, at least the APIs that CA used were limited (such that they didn't provide enough info to actually recreate FB) and required CA to sign a developer agreement with FB to restrict how they could be used.
But then how would they force us to look at ads? How would they keep all their valuable data locked in their walled garden? Sadly I don’t think we’re ever going back to the glory days of open APIs.
Facebook isn’t trying to maximize your enjoyment, they’re trying to maximize their profits. By that measure they are doing vastly better than 12 years ago.
This is 100% incidental complexity. It's painful to consider that this level of sophisticated engineering is needed to render a website quickly in 2020. What went wrong?
I'm personally excited about things like Turbolinks and Phoenix LiveView, which may provide a path out of this mess.
Facebook rendered just fine a decade ago. What changed between now and then, in terms of actual improvement to end user experience, to make it so slow this kind of crap is needed?
My guess is:
- Desire to offload more processing to end user machines to save compute
- More and more ads and user analytics in order to pick which ads to show
- More engineers that irrationally hate the simplicity of PHP
Duct tape on top of abstractions on top of duct tape in order to make a document platform behave like an application platform. Isn't it time to just replace web browsers with something that doesn't suck?
It's not the browsers that suck, it's what companies like Facebook do with browsers that sucks. And then all the other non-thinking middle managers in other companies want to copy these terrible things because they're incapable of leadership.
> Facebook rendered just fine a decade ago. What changed between now and then
It does way more things. In particular, there are a lot more interactive experiences. A decade ago, it just loaded a web page and nothing changed until you refreshed. Now live videos and other content types have streams of comments and reactions pushed to the client in real time.
> Duct tape on top of abstractions on top of duct tape in order to make a document platform behave like an application platform. Isn't it time to just replace web browsers with something that doesn't suck?
Be careful what you wish for. You just described native mobile apps.
I don't think it's any of those. It's about interactivity in the webpage. Imagine you have the template for the HTML of comments implemented in PHP. Now if the user posts a comment and you want to show the comment on their screen before it gets to the server and back, then you need the template itself on the client. The template being on the server in PHP isn't any help.

You could imagine the server filling the template in with dummy values and then giving that to the client to fill in, but that only works for the simplest of templates, where a few values need to be substituted in. You have to create another template language both the server and client understand if you have more complicated logic, like many conditional or repeated parts.

When you have one part of the UI the client needs to render on its own, it's easy to make a one-off solution for it, but in Facebook's case, I imagine they have a ton of UI that they want to be server-rendered but also renderable on the client. React is a template system that can run on both the server and client, and can live-update already-rendered elements on the client as the user interacts with the page.
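A toy version of that last point (my own sketch, not Facebook's code; renderComment and /api/comments are invented names): if the template is a plain function in a language both sides run, the server renders the initial HTML and the client reuses the same code for an optimistic update.

    // comment.js -- one template function, shared by server and client.
    // (Escaping omitted for brevity; a real version must HTML-escape.)
    function renderComment({ author, body }) {
      return '<div class="comment"><b>' + author + '</b>: ' + body + '</div>';
    }

    // Server (Node): render the initial page with real data, e.g.
    //   res.end('<div id="comments">' + comments.map(renderComment).join('') + '</div>');

    // Client: show the user's new comment immediately with the *same*
    // template, before the server round-trip completes.
    function postComment(body) {
      document.querySelector('#comments')
        .insertAdjacentHTML('beforeend', renderComment({ author: 'me', body }));
      fetch('/api/comments', { method: 'POST', body: JSON.stringify({ body }) });
    }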
The frontier of web design is apparently digital Rube Goldberg machines, existing probably more for job security rather than consideration for the end-user.
On the contrary, it does seem like jumping through these kinds of hoops has become strictly necessary if you are building web apps you would like to be responsive.
You have a ten-digit user base that you need to engage, so you have a four- (five-?) digit engineering team that somehow needs to coordinate their patches in a way that doesn't cause the complexity of the site to explode.
I never said it wasn't impressive. I was saying that it's sad that this level of incidental complexity is necessary to run what is, for all intents and purposes, a website.
The article talked a lot about their new dark mode feature, how they wouldn't have been able to implement it in their old tech stack, and how they were able to reduce their CSS size while adding a dark mode.
But is dark mode all that important? Even as a developer, I don't care at all about FB having a dark mode. Did they really need to rewrite their entire site to implement features no one cares about? Also, for a photo- and video-sharing site, is CSS size really important? I just loaded the page and it loaded 13.2MB worth of data while making 249 requests. Thanks for cutting down your 400KB CSS file, though, I guess.
"I don't care about this feature" !== "nobody cares about this feature." I feel like this is a fallacy a lot of programmers (myself included) tend to fall into, but it's a dangerous trap.
What I find weird about the article is that they spend so much time talking about the performance of the new design, yet fail to include any hard numbers comparing the old vs. new implementations. They do throw out a few numbers, like how the new page downloads only 20% of the previous 400KB of CSS for the homepage, but I'm surprised to see no actual browser benchmarks backing up their claims. How else would they be measuring this internally?
Great write up. Glad to see they're making some changes as the current interface still feels like 2005 with the main feed at 500px wide. And as always, dark mode is welcome.
I really hope this helps performance on the site. In the past year or so I've been noticing that when the page sits in an unfocused tab for a while, clicking back usually takes 20+ seconds to actually load and I'm stuck at a white screen. It actually locks up the tab pretty well too, so navigating to other addresses and such takes a pretty long time.
Roughly every 3 minutes, the new interface drove me a little mad. With the old one, photos open and you can click almost anywhere to close them.
With the new (desktop) interface, you have to find the X to close the photo. No, no, not the X you find on the top right of every other interface ever. This time it's on the top left!
I've been using the new Facebook and the interface is even slower than before. It also feels (though it may not actually be) less information-dense, forcing me to interact with the page more to see the same information, compounding the problem.
I don't get this infatuation with dark mode, beyond it looking cool. People claim it helps with eye strain, but a brighter background constricts your pupils, improving focus.
In my experience light mode will work fine on things like e-paper displays, but when you have an emissive or transmissive display the bright light can be straining.
I also feel like dark mode highlights text more, making it easier to identify the text. If I use light mode in text editors I get completely lost very quickly.
I think people might have different sensitivities to this issue. Before I changed everything to dark mode, I had trouble concentrating on a bright screen especially in a dimly lit room. Dark mode feels like a total game changer for my productivity.
It never occurred to me that this could be the reason for dark mode blurriness, but it makes sense intuitively. (Think you mean "constricts" rather than "dilates" though.)
Facebook used to be a reference for speed and usability.
On web, for a while. Remember when Facebook's iPhone app came out, it was a disaster. It was so slow, I could open the app, then take the elevator down to the basement of my building, drop off some outgoing mail, and return to my apartment before it finished updating the news feed. It was legendary in its time for its slowness.
At first, nobody complained because there weren't a lot of "apps" available. But then all the other apps came out, and everyone complained for years that all the other app-building companies could build responsive apps, but Facebook couldn't.
Then one day Facebook updated its app and it was a little better. Then another update came and it was good enough, and everyone stopped complaining and forgot.
The other day I logged in again out of curiosity. It's so sad: strange interface, slow, unresponsive.
I only use Facebook once a week, to update the page for a web site I manage. It is terribly slow on web. On both of the computers I use it on, loading the first page takes upwards of 15 seconds. Clicking on the text field to enter a new post takes eight to ten seconds for the editor to load.
I don't know how people who are addicted to Facebook manage to use it so much without going mad.
But seriously, where do you draw the line between enriched documents and apps?
CSS? JS used for styling? d3 for visualizations? WebGL?
Websites have gotten so bloated that the only sane future I see is serving people Unix + X in WASM.
It's like a bad copy of "the new Twitter", and even Twitter isn't really good.
The only end-user software from FB is Messenger Lite. It's quick and does what I expect it to do.
Even the voice chat is good and I didn't expect a lite version of a messenger to have it.
https://medium.com/signal-v-noise/why-i-love-ugly-messy-inte...
The cluster-f of the old, old Facebook interface was beautiful.
> I still haven't even had performance problems with Electron apps

They generally work fine. They just annihilate battery life and processing capacity while doing so.
Surely they are hiring world-class devs, so what’s holding them back?
What makes you think that?
Reddit seems like a place where the kind of experienced and talented people needed to turn it around could make a lot more money (via stock grants in addition to salary) and frankly have a lot more impact, at any of FAANG.
I've not seen anything to indicate that Reddit is hiring, or trying to hire, "world-class devs".
Loading a random profile takes 8 seconds. Opening a messenger discussion takes 6 seconds. It reminds me of the new Reddit website. Facebook was more enjoyable to use 12 years ago.
It's really sad that in 2020, 10k+ engineers can't make a photo, video, post and message sharing website that is not a pain to use. We collectively failed as a profession. If one needs 2MB of CSS for such a website, there is clearly a problem.
If you think of it this way, you can see how you may need 2MB of CSS: to battle the bots trying to scrape your information and replicate your network, to sidestep the evil developers of adblocker software that threaten to destroy the sweet value transfer, the JS required to track every single movement you make both online and off, the A/B testing framework that allows you to determine how to most efficiently extract that extra 0.001% of valuable eyeball time, and so forth...
Connecting the world? Well, I guess that could be a nice side-effect...
A "free" ad-driven social networking site that brings in gigantic revenue, but that has to pay thousands of high-priced engineers to implement all of the cruft you just described.
versus ...
A subscription-based, non-ad-driven social networking site (perhaps operating as a member-owned cooperative?) that brings in much more modest revenue but that also can operate with many fewer engineers because it can be largely cruft-free.
I know there have been a gazillion attempts at the latter and none has succeeded in any way comparable to the "free" sites. It's too bad, because if any of them were to ever achieve Facebook scale, the subscription price would probably be quite modest.
Things like https://twitter.com/wolfiechristl/status/1071473931784212480... are probably not decided on and implemented by the engineering team but are coming down as a requirement from the top. This is probably the case for a lot of other decisions that slow down the page ("We need this A/B test framework", "this needs to be hidden to increase retention",...)
I've always suspected that the div-itis plaguing FB's website is a result of React's dependence on the overuse, or rather misuse, of higher-order components.
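For illustration, a hypothetical sketch (names invented, not Facebook's code) of how wrapper-rendering higher-order components multiply DOM nodes:

    // Each HOC here renders an extra wrapper div around its child.
    const withTheme = (Component) => (props) =>
      <div className="theme-wrapper"><Component {...props} /></div>;
    const withTracking = (Component) => (props) =>
      <div className="click-tracker"><Component {...props} /></div>;

    const Button = ({ label }) => <button>{label}</button>;

    // One button now renders as <div><div><button>...</button></div></div>.
    const TrackedThemedButton = withTheme(withTracking(Button));

HOCs don't have to render wrappers at all, which is why this reads as misuse rather than an inherent React flaw.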
Yeah, welcome to the real world. We _all_ have to handle requirements like that, except maybe when we build our portfolio sites.
It's a structural problem and little more: the website (and app) is their main money-maker, so they're going to give it a disproportionate amount of resources.
Imagine you hire ten thousand people to lay one railroad track. [note; see end of post] If any single one of them doesn't contribute directly in some way, you'll fire them. This seems kind of strange, doesn't it? Sure, it probably requires more than a single person to lay a track. But ten thousand people to lay one? How is that supposed to work, mechanically? This would be enough to warrant shareholder revolt.
Now, the railroad track gets broken a few hundred times, maybe they hammer it enough to make it twice as long, whatever. It now no longer resembles a railroad track. Certainly no train could go across it. Send a few hundred people to go ask the managers of this project for a replacement track. Okay, we're now at...maybe a tenth of people having contributed? Repeat this process until everyone's contributed. Maybe the manager gives different groups different materials for the track to fuck with them, whatever. But somehow, every single person manages to not get fired.
What's the outcome look like? You have a single railroad track, probably not even well-fit for the job (sparks fly whenever trains run on it; maybe it causes them to tilt, so on), but it's laid! And ten thousand people are employed!
It's the same thing with a website. You can't put a terabyte onto a user's device every single time they load your website; you just can't. So you have a window of performance you have to hit. Between ten thousand people trying to have things thrown onto user devices? Good luck making anything resembling 'decent'.
It's the same problem that Dave Zarzycki noted in his talk about launchd[1], but worse. Instead of 512 megabytes shared between some random abstract parties you can basically ignore, it's <10MB shared between ten thousand coders, translators, graphic designers, users, managers, etc. Does something seem strange about this?
[note]: This is the appropriate comparison here; at the scale of 'Over ten thousand people working on one program', it's grunt work, not art, science, or even programming. There's a word for implementation-grunts that's fallen out of favor in the past few decades: coders. This was seen as distinct until recently.
[1] https://youtu.be/SjrtySM9Dns?t=255
Instead, the book (The Mythical Man-Month) proposes The Surgical Team: about 10 people in specialized roles, with the system being the product of the minds of a few key people. I wonder how well this has aged.
Also, the set and scale of features in the Facebook app makes it literally one of the most complex webapps out there. It's far more than just multimedia posts+messaging -- it's a marketplace, dating, games, apps, groups, pages, and more. Nobody's "failing". And the 2MB of CSS was the "before" uncompressed. The "before" compressed was 400 KB, and this update appears to reduce it to under 80KB compressed. That's 96% less than the 2MB you're complaining about, more than an entire order of magnitude.
So Facebook seems to be improving here, no? I fail to see what is a "total failure" or "clearly a problem".
This is an anecdotal datapoint that is insanely useless in the real world, but the fact that it is the top comment is typical of this site.
I haven’t noticed it being slower either, it’s certainly not fast, but it’s not really something I notice either.
Still haven’t found a use case for React/Angular or SASS/whatever.
If I'm guilty of anything, it's not recognizing the validity of those tools, as I'm sure there are valid use cases.
But 2MB CSS is simply inconceivable to me.
I would be really interested to find one, just ONE, website where React/Angular really brings a better experience and a better final product than standard pure JS with a simple Ajax setup.
Sass, SCSS, Less, etc. are kind of great though. It sucks that you have to compile them to CSS, but you can do this:
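(The example code appears to have been lost from this comment; presumably something like this nesting, sketched here:)

    // Nested rules: the .card selector is written once instead of
    // being repeated for every child rule.
    .card {
      padding: 1rem;
      .title { font-weight: bold; }
      &:hover { box-shadow: 0 1px 4px rgba(0, 0, 0, 0.2); }
    }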
Saves a lot of time and effort.

That being said, I find the new FB to be insanely fast. I don't even block ads on it.
I do agree Facebook was way better 12 years ago (I saw real updates and photos about friends, rather than companies and ads). But speed right now hasn't been the problem.
On the other hand, navigation and clicking around is still sooo slow. My 60-year-old aunt called me and asked if she needed a new PC because Facebook makes her laptop fans spin like crazy. I couldn't explain all this React-Redux-GraphQL stuff to her, and frankly, she doesn't care. All she cares about is that Facebook is slow, and all she does is post photos and talk with friends like she did 10 years ago.
The 2MB was for the old site. The new site loads 20% of that, about 400KB.
I wouldn't even want to know how it runs on those.
The mobile app seems to be just fine though, perhaps they want to push people to use that.
Apparently this is way more economically rewarding than performance for Facebook.
With that in mind, who cares if the site is slow (btw this is the only complaint of your rant). If the software requires a few devs to change and a few eyes to maintain, they can literally scale as much as they want. And actually now they’re probably in a way better position than they would if they had developed a super performant but unmaintainable site.
The quote "premature optimization is the root of all evil" is still very much valid, IMHO.
Too many cooks spoil the stew.
I might even go so far to say that 10 engineers would have a larger chance of success than 10k+ engineers.
Facebook is still pretty slow even on a Ryzen 3900X with 32GB of 3600MHz RAM. It's a lot better than it used to be, though.
We'll see what the data show. I have been reading comments about Facebook's supposed decline for as long as I've been aware of Facebook and yet their published numbers continually show greater engagement. https://jakeseliger.com/2018/11/14/is-there-an-actual-facebo...
The other day I got a facebook notification on my phone, which said something along the lines of "You have 4 new messages". Of course, thinking it was from my friends I opened the app to look at them. 3 of my 4 "messages" were notifications for friend requests from people I had never met. The last one was a photo someone had posted of cake she'd baked (not to me specifically, just in her feed). To someone sitting at her desk at facebook, looking at an engagement metrics chart, the notification would seem to have served its purpose - another data point, another person enticed to open the app in response, engagement maximized. But of course, this was deception. I found this experience distasteful enough to disable notifications entirely - probably another data point for their metrics team - and annoyed enough to complain about in an HN comment.
For example - Flame wars increase engagement, even if people feel drained and frustrated afterward.
I understand why it's a useful metric - It's particularly valuable if your business model depends on time-on-site to sell ads.
But I wouldn't recommend them as a proxy for enjoyment by any means.
Do you think this might have anything to do with the fact that, as an advertising company, it's crucial that they are able to tell companies that engagement is increasing?
We are still driving 60mph on freeways and what trains we have do not travel at 300kph.
Perhaps many of us flipped out when we only had 9600 baud modems, but you could get up, brew some tea, walk the dog, or read a book while waiting for a page to load. We all had so much more patience back then.
Why do we need instant gratification with FB and other social media? Maybe, or maybe not /s.
And all these SPA, client-side-rendered sites seem guilty of this. You navigate to a page, and it loads up "instantly", except you see nothing but gray placeholder images. Then content starts loading in, but haphazardly. You see a link you want to click, and you go to click it, when BAM! it jumps down 37 pixels because some stupid widget just loaded above it on the page.
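That jump is avoidable when a site reserves space for late-loading content up front; a minimal CSS sketch (class names invented):

    /* Reserve room for the widget before it loads, so links below
       it don't move when it finally appears. */
    .late-widget { min-height: 120px; }

    /* Let the browser reserve image space from the aspect ratio. */
    img.lazy { width: 100%; aspect-ratio: 16 / 9; }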
I really hate the modern web. Not the look of it, or the styling. The mechanics and slowness.
Design updates can be useful, but just like for engineers, "beware lots of highly paid people looking for something to do".
Geeks prefer speed, like everyone else. There are plenty of papers showing that a reduction in latency improves conversion rates. And it does not have to be ugly to be fast.
"Geeks", as a class, will tend to focus on technical issues before aesthetic ones. Looking at that fact and immediately equating it to an absurd extreme is a fun game, if you don't care about describing reality. I know some accomplished engineers who are also good designers, and vice-versa.
Second, if my professional opinions are not being taken seriously, that usually means one of two things: I'm too far out over my skis, or am in the wrong place with the wrong people. Especially so if you feel like a chef making burgers.
Of course, if "in total control" and "irrelevant" are the only two states of being one sees, I suppose I see how you get there.
Consumers (apparently) prefer form over function (or at least, they are more easily fooled into thinking the more form, the more function)
On the mobile web or desktop, either one. (We're running it off of one server, it might get the HN effect, we'll see.)
We have been building our own, open source social networking platform and we have tried to make a lot of things more efficient while doing so. The site I linked to didn't minify any files or optimize images. However, it loads things on demand as needed, and even lazy-loads entire components.
Is it faster than Facebook? We have our own component system, not React.
Here is a site that did minify and combine all files: https://intercoin.org
And here is the platform we used: https://github.com/Qbix/Platform (warning: not all of it is documented, but enough, at https://qbix.com/platform/guide).
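As a sketch of the lazy-loading idea mentioned above (hypothetical module and function names, not the Qbix API):

    // Load the comments module only when its container scrolls into view.
    const observer = new IntersectionObserver(async (entries) => {
      for (const entry of entries) {
        if (!entry.isIntersecting) continue;
        observer.unobserve(entry.target);
        const { renderComments } = await import('./comments.js'); // hypothetical
        renderComments(entry.target);
      }
    });
    observer.observe(document.querySelector('#comments'));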
This is all on my 8-core/32GB workstation. I can't even imagine how much utterly useless crap they are running in JS to make that kind of experience.
On the bright side, it does mean I am weaning myself off, as keeping a pinned tab open is a non-starter, so I can't just have a quick refresh. And I'll be fucked if I'm installing their apps on my phone.
So I guess thanks needs to go to the FB engineers for making their new website so utterly garbage that the tiny dopamine hits driven by the FB algorithms are worth less than the pain caused when using the site.
I use mbasic.facebook.com as much as possible. Occasionally I'll use m.facebook.com. I've had the mobile apps uninstalled for ages.
Were you using a machine with a gigabit connection, 32GB of RAM, and a 10th-gen Intel CPU, like the devs?
I'll always see the very top post of the group, and its whole page load, and then slowly my discussion will come up and then it will scroll down to that comment; and if I do anything, that breaks the whole process.
It's like no one ever considered the concept of just loading a piece of the discussion, like reddit does.
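Conceptually it only takes one endpoint (hypothetical here, not Facebook's or Reddit's actual API):

    // Fetch just the permalinked comment's subtree instead of the whole
    // discussion, then render it with no scroll-and-jump dance.
    async function loadCommentSubtree(postId, commentId) {
      const res = await fetch(`/api/posts/${postId}/comments/${commentId}?depth=2`);
      if (!res.ok) throw new Error(`comment fetch failed: ${res.status}`);
      return res.json(); // e.g. { id, text, children: [...] }
    }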
What's more, there are all kinds of UX nightmares, like how, if I open the messenger in one Facebook tab, it opens in every tab, blotting out content I want to read.
Or how a FB livestream event will just randomly stop playing, giving me no indication that I'm lagging behind the current video -- I've done trivia nights that way and I only find out I'm behind after my team members suggest answers to questions I haven't heard yet.
* Team for web component A => CSS, JS
* Team for web component B => CSS, JS

And so on with 1000+ components. It ends up as a big pile of mud where everything may be duplicated under a different name, never getting optimised away or removed.
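A hypothetical illustration of that duplication:

    /* Team A's component */
    .team-a-button { padding: 8px 16px; border-radius: 4px; font-weight: 600; }

    /* Team B ships the same declarations under a different name, and
       no tool can safely merge the two. */
    .team-b-cta { padding: 8px 16px; border-radius: 4px; font-weight: 600; }

The article's atomic-CSS approach attacks exactly this: each declaration becomes a single shared class, so identical rules collapse instead of accumulating.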
I just opened FB with the cache disabled and it downloaded 5.85MB (19.76MB uncompressed).
Most of it happens after the page has rendered, which is great, but that's a lot of stuff. There are 13.74MB of uncompressed JavaScript.
It's not stealing when the users agreed to it.
I'm personally excited about things like turbolinks and phoenix liveview, which may provide a path out of this mess.
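The core turbolinks trick can be sketched by hand (this shows the idea, not the library's real API): keep server-rendered pages, but intercept navigation and swap the document instead of doing a full reload.

    // Hand-rolled sketch of the idea behind turbolinks.
    document.addEventListener('click', async (event) => {
      const link = event.target.closest('a[href^="/"]');
      if (!link) return;
      event.preventDefault();
      const html = await (await fetch(link.href)).text();
      const next = new DOMParser().parseFromString(html, 'text/html');
      document.body.replaceWith(next.body);
      history.pushState({}, '', link.href);
    });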
My guess is:
- Desire to offload more processing to end user machines to save compute
- More and more ads and user analytics in order to pick which ads to show
- More engineers that irrationally hate the simplicity of PHP
Duct tape on top of abstractions on top of duct tape in order to make a document platform behave like an application platform. Isn't it time to just replace web browsers with something that doesn't suck?
It's not the browsers that suck, it's what companies like Facebook do with browsers that sucks. And then all the other non-thinking middle managers in other companies who want to copy these terrible things because they're incapable of leadership.
It does way more things. In particular, there are a lot more interactive experiences. A decade ago, it just loaded a web page and nothing changed until you refreshed. Now live videos and other content types have streams of comments and reactions pushed to the client in real time.
Facebook of 12 years ago didn't have groups, marketplace, dating, live videos, stories, ...
Be careful what you wish for. You just described native mobile apps.
It’s huge and pretty impressive.
The article talked a lot about their new dark mode feature, how they wouldn't have been able to implement it in their old tech stack, and how they were able to reduce their CSS size while adding a dark mode.
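(The usual way to do this is CSS custom properties; a sketch with invented names, not necessarily Facebook's exact system:)

    /* One set of rules serves both themes; only variable values change. */
    :root { --bg: #ffffff; --text: #1c1e21; }
    .dark-mode { --bg: #18191a; --text: #e4e6eb; }
    body { background: var(--bg); color: var(--text); }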
But is dark mode all that important? Even as a developer I don't care at all about FB having a dark mode. Did they really need to rewrite their entire site to implement features no one cares about? Also, for a photo and video sharing site, is CSS size really important? I just loaded the page and it pulled down 13.2MB worth of data while making 249 requests. Thanks for cutting down your 400KB CSS file, though, I guess.
From yesterday:
https://news.ycombinator.com/item?id=23101483 Hello, World – Zerodha, India's largest stock broker
Facebook most likely has a different attitude towards software development.
I really hope this helps performance on the site. In the past year or so I've been noticing that when the page sits in an unfocused tab for a while, clicking back usually takes 20+ seconds to actually load and I'm stuck at a white screen. It actually locks up the tab pretty well too, so navigating to other addresses and such takes a pretty long time.
With the new (desktop) interface, you have to find the X to close the photo. No, no, not the X you find on the top right of every other interface ever. This time it's on the top left!
That doesn’t sound right.
Mac OS and at least some Linux desktop environments have buttons at top left.
Still, I agree with your general point: why change away from a tradition for little to negative benefit?
But yeah, that's annoying. We've had lightboxes for like 15 years now and people still can't get it right.
This is on an i7 laptop.
Product Lead to Zuck: user engagement up 5% after this update!
Zuck: great, here's your bonus check
I think we need to shift back to 4:3 displays. Widescreen is great for consuming content, but we lose so much vertical space.
I don't get this infatuation with dark mode, beyond it looking cool. People claim it helps with eye strain, but a brighter background constricts your pupils, improving focus.
I also feel like dark mode highlights text more, making it easier to identify the text. If I use light mode in text editors I get completely lost very quickly.
I left it ~4 years ago.
The other day I logged in again out of curiosity. It's so sad: strange interface, slow, unresponsive.
On web, for a while. Remember when Facebook's iPhone app came out? It was a disaster. It was so slow that I could open the app, take the elevator down to the basement of my building, drop off some outgoing mail, and return to my apartment before it finished updating the news feed. It was legendary in its time for its slowness.
At first, nobody complained because there weren't a lot of "apps" available. But then all the other apps came out, and everyone complained for years that all the other app-building companies could build responsive apps, but Facebook couldn't.
Then one day Facebook updated its app and it was a little better. Then another update came and it was good enough, and everyone stopped complaining and forgot.
I only use Facebook once a week, to update the page for a web site I manage. It is terribly slow on web. On both of the computers I use it on, loading the first page takes upwards of 15 seconds. Clicking on the text field to enter a new post takes eight to ten seconds for the editor to load.
I don't know how people who are addicted to Facebook manage to use it so much without going mad.
It's a fun hack, and I enjoy spectacle, but it might be a sign that you are complicating your app more than you need to.
Zuckerberg's mobile-first push was a major effort to save the company, and ultimately a success story (this happened right after the IPO).
Their big mistake was non-native applications.
That's the thing with the addiction. Once you get used to it, it doesn't matter how bad it is.