I have a bunch of privacy-enhancing addons installed, which have now all been disabled. If I hadn't read HN this morning, I wouldn't even have known why. Until now, I had no idea that it was even possible to remotely disable my addons.
And now Mozilla are saying that the "fix" is to allow them to install & run "studies" on my machine? What are they smoking? I'm having a hard time trusting a company that randomly & remotely disabled all my addons, regardless of the cause.
This is not entirely accurate. Nothing was done remotely to disable the add-ons. It happened locally. A certificate that's on your machine as part of the Firefox install expired. When that happened, add-ons that were signed via a cert chain that included the expired one started appearing to be invalidly signed. And that's why it requires an update to completely fix. That part is remote, because they need to push a new valid certificate to you to replace the old one.
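The mechanics are easy to sketch. This is a toy illustration (made-up names and dates, not Firefox's actual code) of how a single expired intermediate certificate makes everything signed beneath it appear invalid at once:

```python
from datetime import datetime

# Toy model: a signature is only as trustworthy as every certificate
# in its chain, so one expired intermediate invalidates everything
# signed under it, all at once.

class Cert:
    def __init__(self, name, not_after):
        self.name = name
        self.not_after = not_after

def chain_valid(chain, now):
    """Every certificate in the chain must still be inside its validity window."""
    return all(cert.not_after > now for cert in chain)

root = Cert("root", datetime(2025, 1, 1))
intermediate = Cert("signing-intermediate", datetime(2019, 5, 4))  # the one that expired
addon_cert = Cert("addon", datetime(2021, 1, 1))

chain = [addon_cert, intermediate, root]

print(chain_valid(chain, datetime(2019, 5, 3)))  # True: add-ons look fine
print(chain_valid(chain, datetime(2019, 5, 5)))  # False: every add-on "invalid" at once
```

Note that nothing in the sketch contacts a server; the state flips purely because the clock passed the intermediate's not-after date.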
I do think that the UX should ideally be a bit more graceful; one of my add-ons is Multi-account Containers and its being disabled suddenly caused the window I was actively browsing in to just close, among other side effects.
But that kind of UX polish for what should be an exceptional case is obviously not going to be super-high priority, unfortunately.
Clearly, downstream distributors need to create a patch which causes their distributed Firefox builds to only check certificates on add-on installation (and to check revocations too, sure): it should never be possible for a browser to fail into an unsafe configuration.
I enjoy a nice cup of outrage in the morning just like the next guy, but this one is really weak and lacks that fresh taste of evil conspiracy that I really crave.
You use a browser that has remote update capability, which allows them to install and run new software on your machine all the time. There is a whole separate section of the Preferences that says "Privacy" in large print that has a section that clearly identifies the Studies feature and lets you turn it off. And you use a browser that lets you install privacy-enhancing add-ons in the first place, and in fact which invented the whole concept of add-ons. When the browser discovered that it couldn't verify the add-on integrity with a valid cert, it did what it's supposed to do, it disabled them to protect you from someone backdooring these add-ons.
Someone at Mozilla fucked up, and they're trying in good faith to fix it. I don't know what else people are expecting them to do, putting on sackcloth and ashes won't resolve the problem.
Here's the thing, though: yes, we most certainly are giving them a lot of trust by allowing them to install software on our machines. Which means outrage when they screw up is totally justified, because they broke that trust.
Here's a metaphor: Let's say you let someone seemingly trustworthy watch your kid. (In this metaphor you have a kid). And they let your kid get a broken arm through gross negligence (let's say they passed out drinking beer), and then someone said "well, obviously, you should have never trusted that person, after all, they can do anything with your kid while you're gone, so why are you outraged?" You probably would still be pretty outraged right? You would certainly question your decision to trust them, but at the end of the day you have to trust someone, you'd be a complete shut-in if you could never hire a baby-sitter.
Your addons have not been remotely disabled. They were marked as trustworthy by a certificate that expired and thus are no longer considered trustworthy. The effect is similar, the mechanism is different. You could also enable loading of unsigned extensions, that would “fix” the issue, too.
>You could also enable loading of unsigned extensions, that would “fix” the issue, too.
Which is impossible unless you're either running Linux or running Nightly or Developer Edition. That setting is willfully ignored in normal Mac/Windows/Android builds most people are on.
They were effectively remotely disabled: there was a hidden dead man's handle that was triggered to effect the result. It's logically equivalent from an end-user perspective -- an external agency caused my add-ons to be disabled without my authorisation.
"A certificate chain has expired, do you want to disable all add-ons?"
How hard is that?
No one remotely disabled anything. There's a certificate deployed with Firefox. The certificate Firefox used to check addons was only valid till yesterday. So, when the browser started next time it couldn't validate the addons and disabled them. That all happened locally.
You could say it was remotely disabled by design. What other piece of software randomly just breaks because of the calendar date? I can boot up almost any 20 year old piece of Windows software and it'll work fine, it might not make sense in the current world but it won't go "2019? Fuck off!"
> And now Mozilla are saying that the "fix" is to allow them to install & run "studies" on my machine? What are they smoking?
Can you elaborate on your concern with "studies"? By installing a Firefox that updates automatically, the user is already giving Mozilla control of the software and letting it decide what's best. How is modifying software logic using studies different from modifying logic by updating the binary?
They have not remotely disabled add-ons. The certificate expired, and the add-ons were disabled, which is the correct behaviour when a signature can't be validated. Nobody triggered a switch to disable add-ons.
If you can disable all my addons by having a certificate expire, you can effectively remotely disable all my addons. And that's exactly what happened. The fact that this was (presumably?) not intentional is irrelevant. The switch may not be an actual switch, but it's there nevertheless. And it shouldn't be.
Since they can easily push code without much hassle using Studies, I think this is an elegant-ish solution to a problem that shouldn't even have happened (expired certs are entirely avoidable), but errors happen.
The problem is that Firefox does not have sufficient built in privacy settings by default. Users shouldn't have to crawl the internet for lists of recommended addons, then have to trust such a variety of authors, to have basic privacy. Like I said elsewhere, I'm using Brave because of this.
> Even better would be to set things up to only do a verify on install instead of on every startup.
That would defeat the purpose of verification: "Add-on signing in Firefox helps protect against browser hijackers and other malware by making it harder for them to be installed." [1]
And it's not just malware that was doing that. Microsoft force-installed the ".NET Framework Assistant" into Firefox on Windows, and you had to edit the registry to remove it. [2] If I recall correctly, AVG and Logitech were also among the list of offenders.
[1] https://support.mozilla.org/en-US/kb/add-on-signing-in-firef... [2] https://support.microsoft.com/en-us/help/963707/how-to-remov...
Those still would have their certificates checked on installation.
And honestly, I think it is security theater to attempt to defend against attackers on the same or higher privilege level. If Microsoft wants to force something down your throat on Windows, then there's not much you can do.
The problem is that Mozilla turns the failures of others into its own problem and then tries to fix it itself. That scope and responsibility creep leads to the fallout we're seeing now.
Could you explain why verifying on every startup, instead of just on install, is necessary? The page you linked doesn't mention it.
Edit: Let me amend my question - why is it necessary for the certificates to expire? If a plugin is signed by Mozilla, why wouldn't it be trusted once it gets old?
> And it's not just malware that was doing that. Microsoft force-installed the ".NET Framework Assistant" into Firefox on Windows, and you had to edit the registry to remove it.
If it's not a malicious extension, verifying the signature doesn't prevent a forced install.
Verify-on-install doesn't protect you against addons that steal someone's signing certificate to push out an update, because by the time the stolen certificate is discovered, it will have been installed on a bunch of users' systems.
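As a rough sketch of that trade-off (all names hypothetical), verify-on-install freezes trust at install time, while re-verifying at startup lets a later blocklist entry, say for a stolen signing key, actually take effect:

```python
# Toy model of the two policies. verify() here just consults a blocklist
# of known-bad signers; a real implementation would check the full
# signature chain.

blocklist = set()

def verify(addon, signer):
    """An add-on is acceptable if its signer isn't known to be bad."""
    return signer not in blocklist

def install(addon, signer, installed):
    if verify(addon, signer):
        installed[addon] = signer

def startup_addons(installed, recheck):
    """Return the add-ons allowed to run under each policy."""
    if not recheck:
        return list(installed)                      # trust frozen at install time
    return [a for a, s in installed.items() if verify(a, s)]

installed = {}
install("good-addon", "signer-A", installed)
install("bad-addon", "signer-B", installed)        # signer-B not yet known stolen

blocklist.add("signer-B")                          # key later discovered stolen

print(startup_addons(installed, recheck=False))    # both still run
print(startup_addons(installed, recheck=True))     # bad-addon gets disabled
```

The downside the thread is living through, of course, is that the same startup re-check also fires when the signer merely expires.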
IMHO it seems problematic that they can remotely push code changes, including replacement of the trusted certificate, and bypass package managers.
I don't expect software to (significantly?) change during runtime, outside of what was packaged, signed, distributed and installed as part of apt/yum/pacman/etc.
I understand (not that I like or agree with) that some apps are just embedded web browsers, and load everything externally, and that Firefox extensions are in the end just some JS/CSS/HTML loaded outside of system's package manager. However, extensions have limited API they can interact with, and you need to allow permissions for each extension. Having Mozilla owned extension, that can modify core functionality, seems a bit scary.
If you didn't have browsers auto updating no-one would update them manually, meaning bad news for web developers wanting to take advantage of newer features.
> meaning bad news for web developers wanting to take advantage of newer features.
I think you spelled “mass compromise of unsuspecting users due to unpatched security holes” wrong.
Like it or not, browsers, as the primary networked application that people use, are the prime target for exploiting users. They connect to unknown endpoints of questionable trustworthiness (unlike most other networked apps) and execute code loaded from there. They also handle people's secrets, such as online banking credentials. We maybe shouldn't be at that point, but here we are, and browser vendors need to handle that responsibility. Quick auto updates are crucial for that. Expert users might dislike them, but let's face it, we're not the majority.
I understand the appeal of that for developers, but it comes at the cost of users' agency and control over their own systems. I've been very annoyed by even simple UI changes in Firefox updates, since I simply didn't ask for or want any such change. Reading other comments here, it's clear I'm a dying breed of old and stubborn users who prefer full control and agency over their own system. Making it easier for web developers to implement new features is absolutely not a tradeoff I'd make willingly at the cost of my system's consistency and reliability. The reason I use Firefox is that of all the major browser vendors they seem the most aligned with those values, although this seems to be changing more and more every year.
Good news for users is sometimes bad news for developers. Anyway, too often these "newer features" are just new ways to exploit people or shiny add-ons without much societal value.
I wonder if there is someone out there in the middle of the ocean with a browser-extension-based communication and navigation system which is dead in the water?
It sounds to me that the real headline here is that every copy of firefox out there was timebombed and we only noticed because someone forgot to elongate the fuse.
Could other code signing systems like macOS gatekeeper also be vulnerable to problems like this?
IMO this seems like just plain bad design. The Firefox addon certificate should never have had an expiry date. If they ever needed to revoke it, they could distribute an updated version of the browser with the previous intermediate explicitly marked as revoked.
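A toy sketch of what that proposal might look like (hypothetical names; not how Firefox actually works): trust a pinned signing key indefinitely, and ship an explicit revocation list with browser updates, so nothing hinges on the wall clock:

```python
# Hypothetical alternative design: no expiry date on the add-on signing
# key. Trust is pinned, and withdrawn only by an explicit revocation
# entry delivered through a normal browser update.

PINNED_SIGNING_KEYS = {"addons-intermediate-2015"}
REVOKED_KEYS = set()   # updated by shipping a new browser build

def addon_trusted(signing_key):
    """Valid iff the key is pinned and has not been explicitly revoked."""
    return signing_key in PINNED_SIGNING_KEYS and signing_key not in REVOKED_KEYS

print(addon_trusted("addons-intermediate-2015"))   # True, regardless of the date

REVOKED_KEYS.add("addons-intermediate-2015")       # explicit revocation via update
print(addon_trusted("addons-intermediate-2015"))   # False
```

The trade-off is that revocation then depends entirely on users actually receiving updates, rather than on a built-in deadline.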
That is my biggest complaint. Only the Linux builds of Firefox include an about:config option to turn it off. Android, Windows, and Mac have no way to do so. It's still broken on my phone. Wtf were they thinking?
The browser itself continued working fine. Are you aware of any life-critical extension? Leaving this particular issue aside, your hypothetical "browser extension for people in the middle of the ocean" was doomed from its inception if it was designed to run as a browser extension (though it opens the door for an interesting discussion about similar scenarios that are happening, like pilots relying on iPads).
> your hypothetical "browser extension for people in the middle of the ocean" was doomed from its inception if it was designed to run as a browser extension
Why? You haven't backed up that statement at all. Especially before they killed XUL it was easy to make a non-doomed app that runs as a browser extension, and it's still plenty possible.
No (non-demo) program should brick itself if it can't connect home.
There are _many_ applications that exist as browser extensions, including critical communications applications.
I don't personally know of any obviously life critical application done this way, mostly because I try to stay as far away from that sort of insanity.
If you don't think it's at least a plausible thing that could eventually happen you haven't been paying attention.
I personally got stuck stranded because of Signal's stupid built-in timebombing, when I was relying for communication on a device where no untrusted third party could shove in silent software updates.
I hate to say all these things because I use Firefox all the time, but...the communication around the add-ons issue has been poorly handled by Mozilla. I only learned of the problem by visiting HN. But what of the thousands of other users who don't visit HN?
If you visit the Mozilla homepage, there is nothing to acknowledge the problem (at least at the time of writing this message). Let's try the Support page. Where is it? Scroll down to the bottom of the lengthy Mozilla homepage to the page footer to find the link. (How many visitors will make it to the bottom?)
When you click through to the Support page, an easy-to-miss banner in tiny text appears at the top of the page that mentions the problem - screenshot here: https://imgur.com/a/TAHZSWa
Additionally, when the add-ons are disabled, Firefox misleadingly says: "These extensions do not meet current Firefox standards so they have been deactivated". This is probably a generic message, but it's also an example of when a generic message is misleading.
Finally, poorly named settings like "Normandy" and "studies" that give no hint of their meaning only add to the confusion.
The way I see it, people might have gotten used to software breaking from time to time. Once software breaks, it is reasonable to expect it to get fixed in a couple of days when it is updated. At least this was probably the experience for the majority of users who noticed the issue.
The sad reality is Mozilla has been losing mindshare to Chrome for a long time and this will rapidly accelerate it. People don't expect things to break. They expect things to work, and when things break they get angry.
I love Firefox. It's my daily driver. It will continue to be. But this is a huge fuck-up and they're probably going to pay big in usership because of it.
That's sadly not the first time Mozilla has failed to communicate appropriately about issues/changes that are pushed down to end users. They should reshuffle some of their marketing resources to work on proper non-promotional communication instead, so that current users at least know what they're dealing with.
If I understand the bug report comments correctly, they didn't close the trees to other code changes to prioritize fixing this, they did it because the cert expiry broke some important tests at the same time as it broke every end user's browser.
This is one of the things about this whole episode that I find baffling. Stuff like adjusting bug priorities and arranging for someone to tweet an announcement is the work of a good engineering manager. This is the right person to run interference and handle comms and deal with things outside of the critical path, like bugzilla updates.
Because people were staying up until the wee hours of the morning working on fixing it instead of toggling priorities in Bugzilla. This was treated as a five-alarm fire.
The normal practice would be to have someone operating as an incident communication manager who would be taking care of status/things like this.
Saying "we were too busy fixing to communicate" is actually a really bad sign, because it's not just about what you are communicating to the outside world, but also, for example, about making sure people that need to be brought in are getting consistent information.
I agree with you, it was more important to do the work than to signal.
However, I bet it’s likely they have procedures and policies for work that first involve signaling like for example the priority level.
I’d be willing to bet lots of things surrounding this issue weren’t handled in a by the book manner. So if you are always going to wing it, why have a book (or a public priority level system) at all?
I don't think it bothers me personally but it's funny you said that. Presumably you mean a "'no-alarm fire' because who has time to set off an alarm when there's a fire to fight"?!
I'm also interested in the postmortem to explain the processes that failed to allow the certificate to expire, but let's not overdramatize the situation by nitpicking about filling in form fields on bugzilla. The fact that the tree was closed is equivalent to DEFCON-1, which is all the priority anyone needs to understand the severity of this bug.
I'm also interested in why existing add-ons are failing to run due to this problem. (There was a similar question in another thread about the issue here at HN.)
I understand why an add-on update or new installation would be prevented from succeeding by a certificate expiration. But why would a certificate expiration prevent an already-installed add-on from running? Any already-installed add-ons were previously validated at installation time and should (IMO) run as-is. It seems unnecessary to continuously check the status of an add-on's certificate if it has not been changed. Am I missing something?
When a certificate is no longer valid, the authority it represents expires too. Grandfathering trust in various places would make cert management even more difficult to get right, because there'd be no hard deadline when a certificate is no longer in force.
Looking at the changeset [1], I'm curious why the explicit check for expiry (line 644/646) didn't work. Unfortunately the mentioned bug is rather light on details; presumably they were collaborating on IRC or something instead.
[1] GitHub mirror to not stress their infra: https://github.com/mozilla/gecko-dev/commit/1d1260c7615f1d9a...
I know Firefox isn't being malicious, but ugh, this seems like the worst possible PR move for this, optics-wise. "Hey so uh, we accidentally broke your browser, so you need to opt in to becoming a guinea pig. But don't worry! You probably were already opted in anyway and just didn't realize it! Also it might take six hours to work."
So that's pretty unfair.
1) They state they are working on a fix for normal, release channel users who don't want to run studies
2) they tell you to temporarily run studies to get the fix within up to six hours (could be faster; it sets an expectation)
3) You can explicitly install nightly or 66.4 before it's pushed if you want a fix now
Yes, it's unfortunate. I'd expect them to meet it head on, push a tested fix in a timely manner, admit a mistake was made, explain publicly how/why, and apply the learning moving forward. Beyond that, what's your expectation?
Not saying that their current actions are wrong, just that the optics of it are terrible for them.
There was a chain of bad decisions that led them here, though: 1) thinking it's OK to disable software after it's installed (using cert expiration -- I'm OK if the cert was revoked, but that's a totally different discussion), 2) taking more control of people's local software than many people are comfortable with, especially considering that their main market is tech-savvy people who tend to be more sensitive to this than most, 3) making some of these things opt-out rather than opt-in, giving the perception that they may value data collection and control more than their users' privacy.
I'm not sure I care how unfair the characterization is. I heavily use container tabs — ahem, #usecontainers — and all of my open container tabs disappeared at once, with no indication of why or what to do about it, when this happened. I lost an absurd amount of work and state because of that. I only knew what caused it by inference, because I'd just previously read The Fine Article (which, btw, gave no indication that losing state like that was something I should expect, merely, "No active steps need to be taken to make add-ons work again"...)
I still prefer Firefox over all the other browsers, and will continue to use it, but the project has lost a lot of trust and goodwill over this.
The optics are indeed awful, and this was fully preventable. Firefox fucked up, full stop.
They only point out the opt-in instructions for the few people that voluntarily opt out of Shield studies and wish to get the fix sooner.
Most Firefox users have that checkbox enabled by default, and so most Firefox users received the fix within 0-6 hours of the blog post's publication.
HN readers often take special care to prevent Mozilla from updating Firefox, but that in no way represents the wider population of either all addons users or all Firefox users.
I am sure an update will be pushed quickly. I'm waiting too, as a testing user.
You can fix this temporarily by setting "xpinstall.signatures.required" to false. Toggle it back to true once the update is released and you install it.
Meanwhile I'm hijacking this comment near the top of the tree to state this: the way the community treats Mozilla and Firefox is horribly, inexplicably, unacceptably unfair.
This is nothing compared to innumerable other fuckups in software history, and even recent ones like goto fail, heartbleed, or Chrome logging you into Sync w/o notice.
This is a mistake, an easily recoverable one, and it is not intentional or malicious. Firefox is developed out in the open; all the processes are public. And people, with an absurd entitlement and malice, go as far as to call things backdoors or malware. Meanwhile the alternative actually is backdoor-ridden malware.
If Mozilla Studies are implemented the way other "push" update systems are, then probably your browser has an ID that it hashes to get a bucket ID that it builds into a URL to check for updates, plus a cron time offset for running those checks. Then, the experiments are rolled out by walking up the bucket ID list and gradually adding the addon to said buckets.
Usually, this mechanism is explained as being helpful to ensure a rollout of an experimental update can be rolled back if it's failing. That's not so much a concern in this case, I think. But this mechanism has another effect: it works as a solution to the thundering-herd problem. Every browser updating at once is bad, not just for Mozilla's servers, but for every piece of Internet infrastructure that those browsers (and their arbitrary set of addons) talk to when they update/restart. Within the time budget you have for running a rolling update, you ideally want as few machines updating concurrently as possible, just because you don't want to generate mysterious correlated traffic bursts that make NOCs paranoid.
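A minimal sketch of that kind of bucketing, assuming nothing about Mozilla's real implementation (all names and numbers are illustrative):

```python
import hashlib

# Each client hashes its ID into a stable bucket; the server side ramps
# a rollout by raising the fraction of buckets that are "in".

NUM_BUCKETS = 100

def bucket_id(client_id):
    """Stable hash of the client ID into one of NUM_BUCKETS buckets."""
    digest = hashlib.sha256(client_id.encode()).hexdigest()
    return int(digest, 16) % NUM_BUCKETS

def update_enabled(client_id, rollout_percent):
    """Clients whose bucket falls below the ramp threshold get the update."""
    return bucket_id(client_id) < rollout_percent

# Ramping from 10% to 100% eventually reaches everyone, without every
# client updating in the same burst:
clients = [f"client-{i}" for i in range(1000)]
for pct in (10, 50, 100):
    reached = sum(update_enabled(c, pct) for c in clients)
    print(pct, reached)
```

Because the bucket is a stable hash rather than a random draw, a client that's "in" at 10% stays in at every later stage, which is what makes rollbacks and ramp-ups coherent.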
Poor communication with zero accountability. "A few hours" could be 2, or it could be 12.
“I (name the individual accountable) will give you an update at 12:00 PT (name a time) as an update to this post (name the communication channel) with the current status and latest information on this issue (don’t promise time to resolution, just time to info).”
Simple, clear, concrete, and unambiguous. I had hoped that Firefox had better communication procedures in the event of global-impact P1 issues.
To err is human.
There was a similar issue[0] a few years ago that was only caught a month in advance.
Even better would be to set things up to only do a verify on install instead of on every startup.
[0] https://bugzilla.mozilla.org/show_bug.cgi?id=1267318
xpinstall.signatures.required works on the Fennec version of Firefox.
Also, the IceCat version of Firefox wasn't affected, AFAIK.
Also, why did it take 6 hrs to assign P1 to the bug?
I would assume the delay in assigning P1 is really just a result of assigning P1 not being as high priority as fixing the damn problem.
Random user: What the fuck is a tree and why is the priority of this not higher yet?
I understand why an add-on update or new installation would be prevented from succeeding by a certificate expiration. But why would a certificate expiration prevent an already-installed add-on from running? Any already-installed add-ons were previously validated at installation time and should (IMO) run as-is. It seems unnecessary to continuously re-check the status of an add-on's certificate if the add-on hasn't changed. Am I missing something?
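To make the distinction concrete, here is a minimal sketch (with hypothetical dates and a deliberately simplified chain model, not Firefox's actual verification code) of why the same certificate chain can verify fine at install time and then fail a later re-check once an intermediate expires:

```python
from datetime import datetime, timezone

def chain_is_valid(chain, now):
    """Return True if every certificate in the chain is inside its
    validity window at time `now` (simplified toy model)."""
    return all(cert["not_before"] <= now <= cert["not_after"] for cert in chain)

# Hypothetical chain: the intermediate expires in May 2019,
# roughly when Mozilla's add-on signing intermediate expired.
chain = [
    {"name": "addon-leaf",
     "not_before": datetime(2017, 5, 4, tzinfo=timezone.utc),
     "not_after":  datetime(2021, 5, 4, tzinfo=timezone.utc)},
    {"name": "signing-intermediate",
     "not_before": datetime(2017, 5, 4, tzinfo=timezone.utc),
     "not_after":  datetime(2019, 5, 4, tzinfo=timezone.utc)},
]

install_time = datetime(2018, 6, 1, tzinfo=timezone.utc)
check_time   = datetime(2019, 5, 4, 1, 0, tzinfo=timezone.utc)

print(chain_is_valid(chain, install_time))  # True: valid at install time
print(chain_is_valid(chain, check_time))    # False: same chain, re-checked later
```

If validity were only ever checked at install time (or if the install-time verdict were cached), the expiry would not have disabled anything already installed; re-verifying the chain periodically against the current clock is what turned the expiry into a mass deactivation.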
[1] GitHub mirror to not stress their infra: https://github.com/mozilla/gecko-dev/commit/1d1260c7615f1d9a...
Yes, it's unfortunate. I'd expect them to meet it head on, push a tested fix in a timely manner, admit a mistake was made, explain publicly how and why it happened, and apply the lessons going forward. Beyond that, what's your expectation?
There was a chain of bad decisions that led them here though: 1) thinking it's OK to disable software after it's installed (based on cert expiration; I'd be fine with it if the cert had been revoked, but that's a totally different discussion), 2) taking more control of people's local software than many people are comfortable with, especially considering that their main market is tech-savvy people who tend to be more sensitive to this than most, and 3) making some of these things opt-out rather than opt-in, giving the perception that they may value data collection and control more than their users' privacy.
I still prefer Firefox over all the other browsers, and will continue to use it, but the project has lost a lot of trust and goodwill over this.
The optics are indeed awful, and this was fully preventable. Firefox fucked up, full stop.
Given that it has happened, I expect them to provision a new certificate and push a fixed version within an hour or two to all release channels.
What I would emphatically ‘not’ expect, is a hack that might take up to 6 hours to be applied.
Most Firefox users have that checkbox enabled by default, and so most Firefox users received the fix within 0-6 hours of the blog post's publication.
HN readers often take special care to prevent Mozilla from updating Firefox, but that in no way represents the wider population of either all addons users or all Firefox users.
You can fix this temporarily by setting "xpinstall.signatures.required" to false. Toggle it back to true once the update is released and you've installed it.
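For reference, that pref lives in about:config, and can also be set from a `user.js` file in the Firefox profile directory. (One caveat: stock release builds may ignore this pref entirely; historically it has only been honored on Nightly, Developer Edition, and ESR builds.)

```javascript
// user.js in your Firefox profile directory.
// Temporary workaround: stop enforcing add-on signature validity.
// Flip this back to true once the fixed update has landed.
user_pref("xpinstall.signatures.required", false);
```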
Meanwhile, I'm hijacking this comment near the top of the thread to state this: the way the community treats Mozilla and Firefox is horribly, inexplicably, unacceptably unfair.
This is nothing compared to innumerable other fuckups in software history, and even recent ones like goto fail, heartbleed, or Chrome logging you into Sync w/o notice.
This is a mistake, an easily recoverable one, and it was neither intentional nor malicious. Firefox is developed out in the open; all its processes are public. And yet people, with absurd entitlement and malice, go as far as to call things backdoors or malware. Meanwhile the alternative actually is backdoor-ridden malware.
Please don't be this ungrateful.
Usually, this mechanism is explained as being helpful to ensure a rollout of an experimental update can be rolled back if it's failing. That's not so much a concern in this case, I think. But this mechanism has another effect: it works as a solution to the thundering-herd problem. Every browser updating at once is bad, not just for Mozilla's servers, but for every piece of Internet infrastructure that those browsers (and their arbitrary set of addons) talk to when they update/restart. Within the time budget you have for running a rolling update, you ideally want as few machines updating concurrently as possible, just because you don't want to generate mysterious correlated traffic bursts that make NOCs paranoid.
You roll out a little, wait to see how the canary fares, then go further.
HN has this problem too.
“I (name the individual accountable) will give you an update at 12:00 PT (name a time) as an update to this post (name the communication channel) with the current status and latest information on this issue (don’t promise time to resolution, just time to info).”
Simple, clear, concrete, and unambiguous. I had hoped that Firefox had better communication procedures in the event of global-impact P1 issues.