I think the fundamental disconnect here is that Google's idea of a "user" is a "Chrome/Android user who shops from SERP pages", i.e. someone Google makes money from, versus the more nebulous "user" of "the (open) web", which is probably only understood by the few people who were alive and online in the pre-web world (roughly, people 35 and older).
Google does not care about the latter and only wishes to make more money from the former. Google has a clear and blatant monopoly position over ad-based web monetization, so most of the web will follow Google's will. We all need paychecks. The group of old farts who saw the world change is growing older and more irrelevant.
I am extremely pessimistic about the future of "the (open) web" as the vehicle of our modern low-friction economy, given the big wins these corporate gatekeepers (Google and Microsoft) have scored recently.
Good luck out there. The World Wide Web (old school) and Old Fashioned HTTP+HTML are under grave threat from carpetbaggers.
Is there any chance of a hard fork? What about, let's say, a web 1.1 where we intentionally remove all the fancy new web APIs and mostly revert back to what we had in the late 90s? Sure, things like video support can remain but all the crazy stuff for building web apps would go away. Let the current web rot away under its corporate overlords and then, maybe, we can have the fork go back into being a fun way of publishing and sharing information.
Have you tried the dark web? For the sake of anonymity, everyone has Javascript disabled when browsing Tor hidden sites — so such sites must be designed to conform to web 1.1 principles.
It's actually a very interesting frontend platform to design for, because you don't get any Javascript support, but you get full modern CSS support.
We'll build our own internet! With blackjack and hookers!
More seriously, I see echoes of the gentrification cycle. At the end of the cycle nobody wants to live in the soulless corporate hellscape they've helped create, so they follow the cool kids to the next up-and-coming neighbourhood. It works for social media sites, so why not for an entire protocol?
If you can figure out a protocol where ads don't work, I'm in.
The only hope is anti-trust breakup of Google. Chrome has to be pried forcefully from their hands.
We should launch massive campaigns not just in the US, but also Europe and other critical markets.
We shouldn't back down even if they abandon WEI. They'll just keep trying as they have with AMP, Manifest v2, WHATWG [1], etc.
Google can never be allowed to build browser tech so long as they control search.
The web must remain open.
[1] WHATWG took unilateral control over the HTML spec. They abandoned the Semantic Web, which had led to RSS, Atom, etc., and would allow documents to expose information you could scrape and index without Google Search. Google wanted documents to remain forgiving and easy to author (but messy, without standard semantics, and hard to scrape info from)
There are multiple platforms trying to provide this (neocities most prominently, mmm.page most recently, various others that occasionally get posted to HN). Of course, we don't need a platform; we need a culture, and infrastructure, and protocols, and some balance of organization and search. And have it all not sitting on Amazon's servers. And a way to pay for the parts people can't or won't provide for free.
> Is there any chance of a hard fork? What about, let's say, a web 1.1 where we intentionally remove all the fancy new web APIs and mostly revert back to what we had in the late 90s?
Sure. It's really just a matter of mass appeal. We could fork the existing browser base and eliminate the new attestation API. Some projects are already doing this from what I understand.
What will keep attestation from being used is that websites lose business if their customers can't access the site. We went through this with user-agent string checking in the '90s/'00s, when IE and Netscape/Mozilla were at war and every site had a very strong opinion about which browser it would support. Even today you occasionally see sites that hit you with "unsupported browser" errors if you aren't running a specific version of something.
The way this resolved was that everyone realized they were throwing money away by excluding a large portion of their customer base. At the time no single browser really dominated the market, so it was easy to see that an IE-only site was losing 33% of internet traffic. These days everything is basically Chrome-based, so this hasn't been as much of an issue.
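The browser-wars gatekeeping described above really did reduce to a one-line substring check. A sketch; the rule is invented for illustration, not any real site's policy:

```python
# Sketch of browser-wars-era gatekeeping: an "IE only" site of the kind
# described above, deciding who gets served based on the User-Agent string.
def ie_only_site_allows(user_agent: str) -> bool:
    return "MSIE" in user_agent

print(ie_only_site_allows("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))
print(ie_only_site_allows("Mozilla/5.0 (X11; Linux i686; rv:1.7) Gecko/20040614"))
```

The countermeasure back then was trivial: browsers spoofed their User-Agent strings (which is why nearly every UA still begins with "Mozilla/"). The key difference with attestation is that a cryptographically signed verdict can't be spoofed the same way.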
So in the future we'll see this same thing. Non-attestable browsers will be locked out of attested sites and it will be a numbers game to see if sites want to risk losing these customers/viewers.
At the end of the day, you have to remember that everything on the web is just a TCP socket and some HTTP which is a flexible text protocol. We can build pretty much anything we want but it takes inertia to keep it going.
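To make the "just a TCP socket and some text" point concrete, here is a minimal HTTP/1.1 request built by hand; actually sending it is one `socket.create_connection` call away:

```python
# HTTP/1.1 is just lines of text separated by CRLF. This builds a full
# GET request by hand; socket.create_connection((host, 80)) followed by
# sendall(req) is all it would take to actually use it.
def build_request(host: str, path: str = "/") -> bytes:
    lines = [
        f"GET {path} HTTP/1.1",
        f"Host: {host}",
        "Connection: close",
        "",  # blank line terminates the header block
        "",
    ]
    return "\r\n".join(lines).encode("ascii")

req = build_request("example.com")
print(req.decode("ascii"))
```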
I would like to think so but as someone who's tried to hack on the chromium codebase I'd say it's easier to make a new browser from scratch than to figure out how to make meaningful changes to chromium.
>My web browser (currently Mozilla Firefox running on Debian GNU/Linux, thank you very much) will never cooperate with this bizarre and misguided proposal.
Mozilla used to be about user freedoms. Lately, Mozilla has been a front-runner in disabling plain non-TLS HTTP support. They will likely be one of the first browsers to remove support for it, and eventually for HTTP/1.1 as a whole. ref: https://blog.mozilla.org/security/2015/04/30/deprecating-non...
Given that HTTP/3 as implemented by Mozilla cannot connect to websites with self-signed TLS certs, the future of Firefox is a browser that can only visit websites that third-party TLS CA corporations periodically approve (even if those corporations are currently benign, like Let's Encrypt). Does this remind you of anything? That's not to say other browsers are better in this respect. Mozilla's Firefox and its forks are the least bad... it's just that everything is getting much worse all at once.
Having personally experienced what happens to my webpages when Comcast realizes that it can do whatever it wants to bare HTTP requests all the way up to and including inserting invasive advertisements loaded with arbitrary javascript, I think that "least worst" is exactly the right word for requiring HTTPS everywhere. I do agree that it would have been nice if there was a standard that required encryption without also requiring authentication, but this is the world we live in now.
I also had Comcast do that. It made my Steam (games service) browser unusable one time. So in commercial contexts HTTPS makes sense.
But there is more to the web than the commercial, the institutional, and the like. Websites run by actual human beings, with no profit motive and no need to be constrained by the realities of CA-issued TLS, exist. The major browsers are all about money these days, so they'll prioritize the safety of monetary transactions above user freedoms. But just because this is the right decision for a profit-driven company or institution doesn't mean it's the right thing for everyone, or that it should be applied to everyone. In fact, doing so will ruin the web.
> a browser that can only visit websites that third party TLS CA corporations periodically approve
Er... no. It means that Firefox will only connect to websites that the domain administrator of the system approves of. You, as the administrator of a computer, can install whatever X.509 roots of trust you want. Including a root of trust you own, which can issue certificates for whatever websites you approve of.
Today, where there are residential users who can't get the attention of big companies, you'd probably then run a local forward-proxy that re-wraps connections to sites you trust, with certificates rooted in your root-of-trust.
But this is just a sociological evolution of the original design intent of X.509: where each corporate/institutional/etc domain would directly manage its own trust, acting as its own CA and making its own trust declarations about each site on the internet, granting each site it trusts a cert for that site to use when computers from that domain connect to it. Just like how client certs work — in reverse.
(How would that work? You'd configure your web server with a mapping from IP range to cert+privkey files. Made sense back when there was a 1:1 relationship between one class-A or class-B IP range, one Autonomous System, and one company/institution large enough to think of itself as its own ISP with its own "Internet safety" department.)
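The per-domain scheme in that parenthetical boils down to a lookup table from client network to credential. A sketch using the stdlib `ipaddress` module; the networks and file names are made up:

```python
import ipaddress

# Hypothetical server-side lookup for the scheme described above: each
# large network's own CA issued us a cert, and we present whichever one
# matches the connecting client. Networks and file names are invented.
CERTS_BY_NETWORK = {
    ipaddress.ip_network("18.0.0.0/8"): "certs/classA-issued.pem",
    ipaddress.ip_network("130.149.0.0/16"): "certs/classB-issued.pem",
}

def cert_for_client(client_ip: str, default: str = "certs/self-signed.pem") -> str:
    addr = ipaddress.ip_address(client_ip)
    for net, cert_file in CERTS_BY_NETWORK.items():
        if addr in net:
            return cert_file
    return default  # unknown networks get our own self-issued cert

print(cert_for_client("18.26.4.9"))
print(cert_for_client("8.8.8.8"))
```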
> You, as the administrator of a computer, can install whatever X.509 roots of trust you want. Including a root of trust you own, which can issue certificates for whatever websites you approve of.
That is a completely unreasonable assumption. The barriers to entry have been greatly increased.
How many users have devices that they are really the administrators of? Fewer and fewer.
What is the technical challenge of setting up your own HTTP server that can be browsed with an off the shelf browser on your local computer?
My guess is that if only encryption were the goal, then browsers should trust self-signed certs or at least upon first visit, present the cert and ask whether to trust it in the future. *
Instead, the current system depends on a set of built-in trusted root certificates run by opaque monopolies (at least pre-Let's Encrypt), plus a lot of hassle to add self-signed certs, if that's even supported at all. (IIRC some browsers like Chrome will ignore system-trusted CAs in an attempt to "help the user be more secure"; ref: https://serverfault.com/questions/946756/ssl-certificate-in-...)
* There is precedent for this, for things like Remote Desktop or SSH where only encryption is the goal, their default behavior is exactly this: confirm upon first access, and remember the approved cert for the future. You do not need to get your server blessed by a CA to connect over ssh :)
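The SSH-style behavior described above (confirm on first access, remember thereafter) is simple to state in code. A sketch, with SHA-256 fingerprints standing in for real certificate handling:

```python
import hashlib

# Trust-on-first-use, as SSH's known_hosts does it: remember a host's
# certificate fingerprint on first contact, and only raise an alarm if
# a later connection presents a different one. No CA is ever consulted.
known_hosts: dict[str, str] = {}

def check_tofu(host: str, cert_der: bytes) -> str:
    fingerprint = hashlib.sha256(cert_der).hexdigest()
    if host not in known_hosts:
        known_hosts[host] = fingerprint   # first visit: trust and pin
        return "trusted-on-first-use"
    if known_hosts[host] == fingerprint:
        return "ok"                       # same cert as before
    return "FINGERPRINT MISMATCH"         # cert changed: warn loudly

print(check_tofu("example.org", b"cert-v1"))
print(check_tofu("example.org", b"cert-v1"))
print(check_tofu("example.org", b"cert-v2"))
```

The trade-off is the same one SSH makes: an attacker present on the very first connection goes undetected, but any later substitution is caught.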
Mandating encryption scares me because it limits being able to be seen online to only those with control over DNS.
And there's too many countries that have power to mess with their citizen's DNS resolving, and too many ways for domain names to be taken down.
This is creating a system filled with more absolutes than there should be, and the people doing the encrypting aren't willing to put in any time or effort for other basic affordances. Opportunistic encryption isn't really trustworthy (it doesn't prevent a MITM), but it has many upsides, like letting you know you're still talking to the same people as before. If we had an ever more robust, rather than ever narrowing, stance on what we could do to encrypt, the security picture would be much less scary. Instead we are requiring more and more layers of systems to be involved, with more chokepoints for governments, in a way that seems ossifying and fragile.
There are ways to do an encrypted web that don't require a corporation to approve your website every 90 days. If Firefox changed its release builds so that the linked HTTP/3 library accepted self-signed certs, the problem would be almost entirely mitigated while still retaining a 100% encrypted web (assuming trust-on-first-use is acceptable to you). But these things aren't going to happen while everyone keeps ignoring and downplaying the implications of HTTP/3 support as shipped now, and how it creates a handful of content-approval gateways, just like WEI.
When HTTP/1.1 is a thing of the past and Firefox won't load any endpoint without CA TLS on HTTP/3 then the fact that there are only a handful of corporate entities you can get a TLS cert from means they'll be an even more tempting target for those that wish to apply pressure and restrict access to whatever topics they don't like. It wouldn't be the first time a CA has been pressured to drop a site and it certainly won't be the last if things go this way.
Additionally, it significantly increases the complexity of setting up a visitable personal website. There are packages for ACMEv2 and some CAs that can hide this complexity, but it is still there, and it does break. It acts as a roadblock to what I see as one of the viable contributors to keeping the web open: self-hosting.
But again, I brought it up because the original linked article suggests Mozilla would never accept something as bad as WEI. With the way FF HTTP/3 is implemented they already have done something similar in outcome. So I do think we need to make noise about WEI (and HTTP/3).
What happens when CA's say "you can't get a certificate if you supported <insert ideology here>"? Or "you can't get a certificate if you are racist"? Or "you can't get a certificate if your credit score is too low"? Or "you can't get a certificate if your website contains <pornography/warez/p2p encryption/firearms/anything we don't like>"?
Not to mention the CA could potentially, intentionally or not, leak keys and allow governments, hackers or other interested entities to decrypt traffic.
Centralising trust will always be a bad idea, regardless of context.
This sounds like a great way to get lots of people to run old software. I'm sure most people wouldn't even bat an eyelid when they go on to install an out of date browser to make sure a website they want to visit works.
Security people can complain as much as they want, but it's these kinds of anti-user practices that make users hate updating.
> Security people can complain as much as they want, but it's these kinds of anti-user practices that make users hate updating.
Indeed, I've always thought the classic saying about those who give up freedom for security is very relevant in the current times. I'm quite certain that it's possible to respect the user and improve security (for the user), but instead they've been using security as an excuse to do worse to the users.
I'd guess that ship has sailed for many people. I never update my "consumer" software because every update makes it worse. I can't be the only one. Nobody is getting any kind of positive reinforcement on updating, best case scenario it does nothing, mostly it makes stuff worse or takes away freedoms.
That would do the opposite. The main idea of this feature is that DRM and banking sites could block access when they can't verify the browser is untampered with. So your old browser would just be shown an error page telling you to install the latest chrome to continue.
This would likely be a great way to get lots of people to run old software for a while, until criminals take advantage of all those juicy unpatched vulnerabilities and all their devices start showing them ads for penis pills on every webpage and their credit card number gets stolen every other week.
That would be pretty dumb, then, because there is plenty of older IoT gear that you won't be able to access anymore with FF. I'm sick and tired of all these companies, foundations, and other silos telling people what they can and cannot do with their own hardware.
If I want to visit scary non encrypted websites I should be able to do so.
Yes, absolutely. Nobody else will trust it, but you can always set up your own CA for use by computers you use.
Which is fundamentally still better than insecure HTTP, because it's at least possible to take steps to trust it and make sure it's the same server you expect to talk to.
My typical website's visitor is someone on the other side of the earth I don't know and will never know. Getting my root cert in their trust store just isn't a feasible option in this extremely common use case for a public website.
On the topic of user freedom, Firefox also doesn't allow installing extensions not signed by Mozilla unless you use a fork, Nightly, or Developer Edition (which is just a badly named beta)[0]. The hilarious thing is that Safari, the web browser from the company infamous for walled gardens and not letting you control your device, does let you install unsigned extensions on desktop[1].
This is actually no different from Firefox: "The Allow Unsigned Extensions setting resets when you quit Safari; set it again the next time you launch Safari."
You can run an unsigned add-on in regular Firefox by opening about:debugging#/runtime/this-firefox and clicking Load Temporary Add-on...
In both cases, it only lasts until you quit the app.
- "Who should your computer take its orders from? Most people think their computers should obey them, not obey someone else. With a plan they call “trusted computing,” large media corporations (including the movie companies and record companies), together with computer companies such as Microsoft and Intel, are planning to make your computer obey them instead of you. (Microsoft's version of this scheme is called Palladium.) Proprietary programs have included malicious features before, but this plan would make it universal."
Is there a link to an article that actually goes into WEI on a technical level that isn't the proposal itself?
So many things posted to HN about it have been the grand overview, which is a perspective worth diving into but also has drowned out every other perspective to the point where it's very difficult to figure out what's really happening with the proposal here.
1. The authors don't understand the tech they use. (I'll leave that to your judgment, but it would be even worse for the Internet.)
2. The authors don't understand that something unrealistic is unrealistic.
The idea that you can give companies or corporations tools to check "did the user modify their environment" and that they would not use those tools to exclude users is stupid or disingenuous, because advocates of this proposal have commented in exactly that way: we want it to do precisely that.
Google then tries to defend it by saying "we will return an invalid 'false' for some fraction of Chrome users/requests" [to make sure websites can't rely on the signal], which is bad not only because it sets up a "when Google revokes this policy we're in an even worse situation" scenario, but also because it leaves open how Google decides who gets this 'false':
Randomizing over time can be rejected immediately: a website could easily circumvent it by checking n times, to the user's detriment, and it would also contradict WEI's official documentation (same token for the same input from the same user).
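The retry attack on a per-request randomized holdout is easy to simulate; the 10% holdout rate and five retries here are arbitrary assumptions for illustration:

```python
import random

# Why a per-request randomized "false" verdict fails, per the argument
# above: a site can simply re-check n times and take the best answer.
HOLDOUT_RATE = 0.10  # assumed fraction of honest requests given "false"

def attest_once(rng: random.Random) -> bool:
    return rng.random() >= HOLDOUT_RATE  # True = "environment verified"

def attest_with_retries(rng: random.Random, n: int = 5) -> bool:
    return any(attest_once(rng) for _ in range(n))

rng = random.Random(0)
trials = 10_000
still_blocked = sum(not attest_with_retries(rng) for _ in range(trials))
# A single check blocks ~10% of honest users; five checks block about
# 0.1**5 of them, i.e. essentially nobody, so the holdout protects nothing.
print(still_blocked, "of", trials, "honest sessions blocked after retries")
```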
And this leads to another point: if Google wants to return false negatives, it must either keep records of who is supposed to get 'false' (the EU will not be happy with that, and it contradicts the "some Chrome users" framing), or, more likely, implement the holdout in Chrome itself.
Now that we've established that implementation in Chrome is most probable, we can also establish the options:
A) Per profile: companies will ask you to reset your profile if you land in the holdout.
B) Per connection: companies will ask you to refresh.
C) By device age / OS version / type: Google can even make the manufacturers happy with this one.
…as you can see, at most this will be a nuisance. And even if, in some weird way:
Z) It's implemented on some super-complicated basis, it will still be circumvented, because…
3. Google plays a disingenuous word game with us here by saying "we won't destroy the open web".
Other Chromium browsers may simply ignore Google's X% false-negative holdout (Google may lose a few % of users before it scraps the policy). And there is no need for Google to do anything itself when other companies will misuse this API for it.
In simple words: the part that should worry you is not that Google will destroy the web by using this API in Chrome and its services; it's that other companies will do that for Google, and Google will wash its hands of it by saying "we wanted good, but it didn't work". You can already see that tone from Google: "we don't want that, so we created these holdouts".
Don't let Google move the Overton window. They are proposing something that any sensible person sees as a clear-cut attack (or a stupid idea that can only work as one) on privacy and on your right to use your device (and, for some people, your OS and/or browser) as you want to use it. They are at fault here.
As far as I can tell, there are two reasons for this feature, the legitimate one is that users largely have browser extensions which are malware. They may have even been legitimate when they installed them, but then auto updated to be malware later. This poses a problem for banking sites because desktop browsers can no longer be trusted to be secure so they push you to use the mobile app or at least confirm transactions with the app which is trusted.
The illegitimate reason is they can stop ad blockers and content downloaders / DRM bypassers.
I'm not trying to understand its "technical merits" but what exactly it is. Even experts quoted in The Register call it "nebulous".
So what exactly are we even talking about here? The idea of attestation or just this proposal or is it a Google thing? Does this compare to cloudflare private tokens or safetynet or are they completely different? If proposal goes through what does that functionally mean for browsers both ones based on chromium and ones not?
I don't know why it's so difficult to find these details and I'm instead being told to just accept the idea that the premise is unacceptable.
I stopped reading after the explainer's intro section. The first example is making it easier for websites to sell ads (lmao), and for the other three it's extremely questionable whether the proposed remedy even helps. And it's presented as a benevolent alternative to browser fingerprinting, as if we must choose between these two awful options. It's an absolute joke of a proposal.
May I suggest something like "Enterprise Environment Integrity"? How does the public know that the enterprise (i.e. Google) it's dealing with is healthy?
The public should have an entity that receives detailed attestation data to assess that. Failing the attestation would revoke the business permit, along with a public announcement.
> How does the public know that the enterprise (i.e. google) it's dealing with is healthy?
Because they will pinky promise.
I find it funny that for some reason companies get the benefit of the doubt when it comes to dealing with data in a responsible manner. Yes, it's possible that they do. But it is also possible that they don't, and no matter what they say in public, those are just words; they don't prove anything about what is really going on, and that's before we get to honest mistakes.
There is simply no way to be sure, all you know is that once you transmit data to any other host on the internet that it is quite literally out of your hands whether or not that data will one day show up elsewhere.
It goes a bit deeper than that. Many companies these days "choose" to get certified under a variety of standards (the most common one is ISO 27001), everyone who hasn't been completely ignorant is looking for or already got cybersecurity insurance and on top of that comes the entire GDPR saga. Basically, you got three levels of auditors that at least make sure the basics are covered, and on top of that come industry specific requirements such as TISAX [1], US SOX Act compliance or whatever AWS had to go through for GovCloud.
It doesn't matter to the public. Each site chooses what attestors it trusts and the site can keep track of how useful that signal is. If the signal turns out to be useless the site doesn't have to use it for anything or can stop collecting it.
> In the normal world, you show up at the store with a five dollar bill, pick up a newspaper, and the store sells you the newspaper (and maybe some change) in exchange for the bill. In Google’s proposed world, five dollar bills aren’t fungible anymore: the store can ask you about the provenance of that bill, and if they don’t like the answer, they don’t sell you the newspaper. No, they’re not worried about the bill being fake or counterfeit or anything like that. It’s a real five dollar bill, they agree, but you can’t prove that you got it from the right bank. Please feel free to come back with the right sort of five dollar bill.
Side note: This at least would occasionally happen if you tried to spend Scotland or NI £5 notes in England.
IDK why people try so hard to cram metaphors into things, especially when the metaphor is more confusing than the thing they're trying to explain. It's not at all like currency and fungibility.
It's like Android SafetyNet where apps can work out if the device is rooted and running custom software underneath the browser/app.
No it's not, Scottish bank notes aren't of a different currency - they're still pound sterling. The reason they're typically not accepted in English shops (at least, those not on the Scottish border) is most often because they're rather uncommon so it's more difficult for cashiers to detect fakes. My understanding is also that some banks, when depositing, require the English and Scottish notes to be separated and may charge a fee to convert them to English notes, so it's more effort to accept and handle them.
I can't convey how disgusted I am at the thought of WEI becoming a reality.
It will lead to three webs: the remainder of the open web, the new closed web, and the pirate web.
Personally I'll do my bit to preserve openness, even if that means working socially and technically to support the new world of piracy. It will always be a losing battle without institutions fighting for openness, though.
This is a moment when Sun's old line - "the network is the computer" - starts to look hideous and dystopian. Prophetic, but maybe not how we thought.
It's not immediately obvious to me that the closed web will have anything good on it. People who want other people to see their stuff won't lock down who can visit; it seems like it's mainly for ad-supported crap? Optimistically, the web will break apart into some AOL Disneyland Cable shit experience and an actual good internet whose participants are not just pretending to have engaging content so they can get ad views. I know that sounds too optimistic; what's the flaw in it? Google will use its monopoly on a few things to push it. I'm happy to move away from Gmail, and I don't use Google search anyway. What other practical changes will there be?
Your online banking will stop working on your unapproved software, just like your banking app stopped working on your rooted/old Android phone some 3-5 years ago.
I want to see it; I don't know the path there.
Sure. It's really just a matter of mass appeal. We could fork the existing browser base and eliminate the new attestation API. Some projects are already doing this from what I understand.
What will keep attestation from being used is websites will lose business if their customers can't access the site. We went through this with user-agent string checking in the 90's/00's when IE and Netscape/Mozilla were at war and every site had a very strong opinion on which browser they would support. Even today you occasionally see sites that will hit you with "unsupported browser" errors if you aren't running a specific version of something.
The solution to this was everyone realized they were throwing money away by excluding a large portion of their customer base. At the time no single browser really dominated the market share so it was easy to see that an IE-only site was losing 33% of internet traffic. These days everything is basically chrome-based so this hasn't been as much of an issue.
So in the future we'll see this same thing. Non-attestable browsers will be locked out of attested sites and it will be a numbers game to see if sites want to risk losing these customers/viewers.
At the end of the day, you have to remember that everything on the web is just a TCP socket and some HTTP which is a flexible text protocol. We can build pretty much anything we want but it takes inertia to keep it going.
I would like to think so but as someone who's tried to hack on the chromium codebase I'd say it's easier to make a new browser from scratch than to figure out how to make meaningful changes to chromium.
>My web browser (currently Mozilla Firefox running on Debian GNU/Linux, thank you very much) will never cooperate with this bizarre and misguided proposal.
Mozilla used to be about user freedoms. Lately Mozilla has been a front-runner on turning off and disabling non-TLS just HTTP support. They will likely be one of the first browsers to remove support for it and eventually HTTP/1.1 as a whole. ref: https://blog.mozilla.org/security/2015/04/30/deprecating-non...
Given that HTTP/3 as implemented by Mozilla cannot connect to self-signed TLS cert websites this means the future of Firefox is as a browser that can only visit websites that third party TLS CA corporations periodically approve (even if those corporations are currently benign, like LetsEncrypt). Does this remind you of anything? That's not to say other browsers are better in this respect. Mozilla's Firefox and it's forks are the least worst... it's just everything is getting much worse all together.
But there is more to the web than just the commercial, institutional, and the like. Websites that are run by human people without profit motive and without any need to be constrained by the realities of CA TLS exist. The major browsers are all about money these days so they'll prioritize the safety of monetary transactions above user freedoms. But just because this is the right decision for an profit driven company or institution doesn't mean it's the right thing for everyone and should be applied to everyone. In fact doing so will ruin the web.
Er... no. It means that Firefox will only connect to websites that the domain administrator of the system approves of. You, as the administrator of a computer, can install whatever X.509 roots of trust you want. Including a root of trust you own, which can issue certificates for whatever websites you approve of.
Today, when residential users can't get the attention of big companies, you'd probably run a local forward proxy that re-wraps connections to sites you trust, with certificates rooted in your own root of trust.
But this is just a sociological evolution of the original design intent of X.509: where each corporate/institutional/etc domain would directly manage its own trust, acting as its own CA and making its own trust declarations about each site on the internet, granting each site it trusts a cert for that site to use when computers from that domain connect to it. Just like how client certs work — in reverse.
(How would that work? You'd configure your web server with a mapping from IP range to cert+privkey files. Made sense back when there was a 1:1 relationship between one class-A or class-B IP range, one Autonomous System, and one company/institution large enough to think of itself as its own ISP with its own "Internet safety" department.)
That is a completely unreasonable assumption. The barriers of entry have been greatly increased.
How many users have devices that they are really administrators of? Fewer and fewer.
What is the technical challenge of setting up your own HTTP server that can be browsed with an off the shelf browser on your local computer?
I still don't understand your disdain for the idea of a 100% encrypted web.
Rather than saying “does this remind you of anything”, can you tell us what it reminds you of?
I guess the issue is that, eventually, CAs could decide not to issue certificates to certain people classified as malicious/nefarious/etc.?
Can you clearly articulate your position on this point?
Instead, the current system depends on a set of built-in trusted root certificates run by opaque monopolies (at least pre-Let's Encrypt), plus a lot of hassle to add self-signed certs, if that's even supported at all. (IIRC some browsers, like Chrome, will ignore system-trusted CAs in an attempt to "help the user be more secure"; ref: https://serverfault.com/questions/946756/ssl-certificate-in-...)
* There is precedent for this. For things like Remote Desktop or SSH, where encryption alone is the goal, the default behavior is exactly this: confirm upon first access, then remember the approved key or cert for the future. You do not need to get your server blessed by a CA to connect over SSH :)
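That SSH-style trust-on-first-use (TOFU) model is simple enough to sketch. The names below are invented for illustration and are not any browser's actual API; a real client would persist the store to disk, like SSH's known_hosts file:

```python
import hashlib


def fingerprint(der_cert: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(der_cert).hexdigest()


def tofu_check(host: str, der_cert: bytes, known_hosts: dict) -> str:
    """Trust-on-first-use: remember the first certificate seen for a
    host, and flag any later change instead of consulting a CA."""
    fp = fingerprint(der_cert)
    if host not in known_hosts:
        known_hosts[host] = fp  # first contact: remember it
        return "trusted-on-first-use"
    return "ok" if known_hosts[host] == fp else "MISMATCH"
```

On "MISMATCH" the client would warn loudly, exactly as ssh does when a host key changes. This proves continuity ("same server as last time"), not identity, which is the trade-off the parent comment describes.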
And there are too many countries with the power to mess with their citizens' DNS resolution, and too many ways for domain names to be taken down.
This creates a system with more absolutes than there should be, and the people doing the encrypting aren't willing to put any time or effort into other basic affordances. Opportunistic encryption isn't really trustworthy (it can't prove you aren't being MITMed), but it has many upsides, such as letting you know you're still talking to the same people. If we took an ever-broadening rather than ever-narrowing stance on what we can do to encrypt, the security picture would look much less scary. Instead, more and more layers of systems have to be involved, with more chokepoints for governments, in a way that seems ossifying and fragile.
When HTTP/1.1 is a thing of the past and Firefox won't load any endpoint without CA TLS on HTTP/3 then the fact that there are only a handful of corporate entities you can get a TLS cert from means they'll be an even more tempting target for those that wish to apply pressure and restrict access to whatever topics they don't like. It wouldn't be the first time a CA has been pressured to drop a site and it certainly won't be the last if things go this way.
Additionally, it significantly increases the complexity of setting up a visitable personal website. There are ACME client packages and some CAs that can hide this complexity, but it is there and it does break. It acts as a roadblock to what I see as one of the viable contributors to keeping the web open: self-hosting.
But again, I brought it up because the original linked article suggests Mozilla would never accept something as bad as WEI. With the way FF HTTP/3 is implemented they already have done something similar in outcome. So I do think we need to make noise about WEI (and HTTP/3).
Centralising trust will always be a bad idea, regardless of context.
Security people can complain as much as they want, but it's these kinds of anti-user practices that makes users hate updating.
Indeed, I've always thought the classic saying about those who give up freedom for security is very relevant in the current times. I'm quite certain that it's possible to respect the user and improve security (for the user), but instead they've been using security as an excuse to do worse to the users.
A quick nod to Tor Browser, the Firefox fork which will always support HTTP in order to support the vast majority of Tor hidden services.
If I want to visit scary non encrypted websites I should be able to do so.
But I would prefer my grandma be blocked from all non-encrypted sites, sorry!
Which is fundamentally still better than insecure HTTP, because it's at least possible to take steps to trust it and make sure it's the same server you expect to talk to.
[0]: https://wiki.mozilla.org/Add-ons/Extension_Signing
[1]: https://developer.apple.com/documentation/safariservices/saf...
You can run an unsigned add-on in regular Firefox by opening about:debugging#/runtime/this-firefox and clicking Load Temporary Add-on...
In both cases, it only lasts until you quit the app.
This fundamentally comes down to "do you really control your computer, or does someone else?":
https://youtu.be/Ag1AKIl_2GM?t=57
- "Who should your computer take its orders from? Most people think their computers should obey them, not obey someone else. With a plan they call “trusted computing,” large media corporations (including the movie companies and record companies), together with computer companies such as Microsoft and Intel, are planning to make your computer obey them instead of you. (Microsoft's version of this scheme is called Palladium.) Proprietary programs have included malicious features before, but this plan would make it universal."
https://www.gnu.org/philosophy/can-you-trust.en.html
So many things posted to HN about it have been the grand overview, which is a perspective worth diving into, but it has drowned out every other perspective to the point where it's very difficult to figure out what's really happening with this proposal.
1. The authors don't understand the tech they use. (I'll leave that to your judgment, but it would be even worse for the Internet.)
2. The authors don't understand that something unrealistic is unrealistic.
The idea that you can give companies or corporations tools to check "did the user modify their environment" and that they would not use those tools to exclude users is stupid or disingenuous, because advocates for this have commented in exactly that way: we want this proposal to do precisely that.
Again, Google tries to defend it by saying "we will return an incorrect 'false' for some Chrome users/some of the time" [to make sure websites cannot rely on it], which is bad not only because it means that when Google revokes this policy we are in an even worse situation, but also because it leaves open the question of how Google decides who gets this 'false':
I reject the "some of the time" option immediately, not only because a website can easily circumvent it (by checking n times) to the user's detriment, but because it would also contradict the official WEI documentation (same token for the same input from the same user).
And this leads us to another point: if Google wants to return false negatives, it would need either to keep a record of who is supposed to get 'false' (the EU will not be very happy with that, and it also contradicts "some Chrome users"), or, more likely, to implement it in Chrome itself.
Now that we've established that implementation in Chrome is most probable, we can also establish the options:
A) Implement it per profile: companies will ask you to reset your profile if you are a false negative.
B) Implement it per connection: companies will ask you to refresh.
C) Implement it by device age / OS version / device type: Google could even make the manufacturers happy with this one.
… as you can see, at most this will be a nuisance, and even if, in some weird way, they:
Z) Implement it on some super-complicated basis: this will still be possible, because…
3. Google is playing a disingenuous word game with us here by saying "we won't destroy the open web".
Other Chromium browsers may ignore Google's X% of false negatives (Google may lose a few percent of users before it scraps this policy). And there is zero need for Google to actually do anything when other companies misuse this API.
In simple words: the part that should worry you is not that Google will destroy the web by using this API in Chrome and its own services; what should worry you is that other companies will do that for Google, and Google will wash its hands of it by saying "we meant well, but it didn't work out". You can already see that tone from Google: we don't want that, so we created these "holdouts".
Don't let Google move the Overton window [0]. They are proposing something that any sensible person sees as a clear-cut attack (or a stupid idea that can only work this way) on privacy and on your right to use your device (and, for some people, your OS and/or your browser) as you want to use it. They are at fault here.
The illegitimate reason is that it lets them stop ad blockers and content downloaders / DRM bypassers.
The premise is unacceptable and discussion on the technical merits will only give it the fuel to make it more material.
[0] - https://en.wikipedia.org/wiki/Overton_window
We're in it now, and fully understanding the issue and the problems it purports to solve is incredibly important.
Dialogue is all we have, and to even build a solid argument against WEI, understanding the details matters.
So what exactly are we even talking about here? The idea of attestation in general, just this proposal, or a Google-specific thing? How does this compare to Cloudflare's private access tokens or SafetyNet, or are they completely different? If the proposal goes through, what does that functionally mean for browsers, both Chromium-based and not?
I don't know why it's so difficult to find these details and I'm instead being told to just accept the idea that the premise is unacceptable.
https://github.com/mozilla/standards-positions/issues/852
https://github.com/RupertBenWiser/Web-Environment-Integrity/...
I stopped reading after the explainer's intro section. The first example is making it easier for websites to sell ads (lmao), and for the other three it's extremely questionable whether the proposed remedy even helps. And it's presented as a benevolent alternative to browser fingerprinting, as if we must choose between these two awful options. It's an absolute joke of a proposal.
The public should have an entity that receives detailed attestation data to assess that. Failing the attestation would revoke the business permit, along with a public announcement.
Because they will pinky promise.
I find it funny that, for some reason, companies get the benefit of the doubt when it comes to handling data in a responsible manner. Yes, it's possible that they do. But it is also possible that they don't, and no matter what they say in public, that's just words; it doesn't prove anything about what is really going on, and that's before we get to honest mistakes.
There is simply no way to be sure, all you know is that once you transmit data to any other host on the internet that it is quite literally out of your hands whether or not that data will one day show up elsewhere.
It goes a bit deeper than that. Many companies these days "choose" to get certified under a variety of standards (the most common one is ISO 27001), everyone who hasn't been completely ignorant is looking for, or has already gotten, cybersecurity insurance, and on top of that comes the entire GDPR saga. Basically, you've got three levels of auditors that at least make sure the basics are covered, and on top of that come industry-specific requirements such as TISAX [1], US SOX Act compliance, or whatever AWS had to go through for GovCloud.
[1] https://en.wikipedia.org/wiki/Trusted_Information_Security_A...
Side note: This at least would occasionally happen if you tried to spend Scotland or NI £5 notes in England.
It's like Android SafetyNet where apps can work out if the device is rooted and running custom software underneath the browser/app.
Pretty much all non-technical users -- and a great many technical users -- have never heard of SafetyNet and don't know what it does.
Metaphors are imperfect; that's inherent to their very nature. That doesn't make them useless.
It will lead to three webs: the remainder of the open web, the new closed web, and the pirate web.
Personally I'll do my bit to preserve openness, even if that means working socially and technically to support the new world of piracy. It will always be a losing battle without institutions fighting for openness, though.
This is a moment when Sun's old line - "the network is the computer" - starts to look hideous and dystopian. Prophetic, but maybe not how we thought.
Your online banking will stop working on your unapproved software, just like your banking app stopped working on your rooted/old Android phone some 3-5 years ago.