I naively assumed from the headline that the author would complain about users blocking cookies. I was very pleasantly surprised to see a post written by someone who appreciates that some users will want to do this and is actively working to deliver them a useful experience anyway!
As someone working on a ticketing purchase flow, this is critical... we can't exactly just turn people away! I was also surprised that localStorage throws exceptions.
Given that many cookie banners still (IMHO illegally, to be verified by courts) give the choices "accept tracking or fuck off", I guess you could. But it speaks well of you that you assumed it would be a stupid idea. :)
I often think that instead of completely blocking cookies, it would be better to accept them and then throw them away. Same with localStorage. Just store it temporarily.
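A minimal sketch of what that could look like as a user script (assuming you can run it before any site code; the shim mirrors the Storage API surface, but everything lives in memory and dies with the page):

    // Hypothetical user-script setup: swap window.localStorage for an
    // in-memory Map, so sites can "store" data but nothing outlives the page.
    const memory = new Map();
    const ephemeralStorage = {
      getItem: (key) => (memory.has(key) ? memory.get(key) : null),
      setItem: (key, value) => { memory.set(key, String(value)); },
      removeItem: (key) => { memory.delete(key); },
      clear: () => { memory.clear(); },
      key: (i) => [...memory.keys()][i] ?? null,
      get length() { return memory.size; },
    };
    Object.defineProperty(window, 'localStorage', { value: ephemeralStorage });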
I’ve been using Cookie AutoDelete for that purpose for the last few years. It works flawlessly for me and brings me comfort in knowing that I am only being tracked online by my browser fingerprint and IP.
https://addons.mozilla.org/en-US/firefox/addon/cookie-autode...
Same, I absolutely love this extension. You can whitelist the websites you use frequently, and for everything else it's Groundhog Day every time.
The cookie banners can be super annoying sometimes, but they are easily removed with uBlock Origin. I also frequently have to solve captchas, but it's not so bad. For example, every time I visit amazon.com to order toilet paper or whatever, it thinks I'm a bot, but at least Amazon's captchas are less annoying than some of the others.
One thing worth noting: at least when I installed it, the auto-clean functionality (the bit that actually removes the cookie and other site data) was disabled by default. This means it needs configuration to actually do anything beyond manual cleaning.
Came here to mention this. This is probably the most useful extension for preserving privacy while also not breaking things. Sites can store all the cookies they want now, but a few minutes after the tab goes away it's like we never met. Very nice.
I do this with Firefox's Temporary Containers. Every manually opened tab is a new browsing session, with no cookies etc. Closed tabs' data gets deleted after 15 mins. Fantastic addon, and the usage is as seamless as it gets.
https://addons.mozilla.org/en-US/firefox/addon/temporary-con...
I do this too! I pair Temporary Containers with the Containerise add-on, which lets me create persistent containers for a few specific sites that I want to stay logged in to.
This setup works well with one glaring exception... Cloudflare and their stupid bot checks. Using temporary containers for everything has really shone a light on just how much of the web Cloudflare is gobbling up. Cloudflare throws a captcha at me every time I visit any website they gatekeep for. I'm talking mostly about random sites that turn up in web searches. It's annoying enough that when I encounter a Cloudflare captcha, I just close the tab and try the next site.
Now I'm wondering if there's a way to filter sites behind Cloudflare out of web search results with something like the uBlacklist add-on.
Well, my reaction to the article was "of course Google's browser breaks everything if you try to block tracking".
There is absolutely no reason to let the JavaScript know that you've blocked some functionality. It just adds a new tracking signal.
Anyway, the sensible thing to do is to store the values for the lifetime of the page. Simply throwing them away can be an option, but it's a bad default. Browsers not funded by ads do get this right.
This is the answer. Blocking the APIs is just asking for a broken internet, and I have very little sympathy.
Furthermore, blocking the API is a detectable characteristic and increases the surface area of your fingerprint. It has exactly the opposite of the intended effect on privacy.
An editor warning you that your work hasn’t been saved?
Just wondering, is there a fully automatic way in e.g. FF to do this? Like a right-click 'open in new private tab which automatically accepts all cookie dialogs'?
Maybe. I run a word game (https://squareword.org) that uses localStorage to store stats. This allows me to give users statistics without requiring any sort of account or signup. Even so, I often hear from people who have had their stats cleared, for example by iOS evicting localStorage after 7 days of not visiting a site.
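For illustration only (the key and fields here are invented, not the game's actual code), the pattern is roughly this, with eviction or blocking simply resetting players to a blank slate:

    function loadStats() {
      const defaults = { played: 0, won: 0, streak: 0 };
      try {
        const raw = localStorage.getItem('squareword-stats');
        return raw ? { ...defaults, ...JSON.parse(raw) } : defaults;
      } catch (e) {
        return defaults; // storage blocked, evicted, or corrupted
      }
    }
    function saveStats(stats) {
      try {
        localStorage.setItem('squareword-stats', JSON.stringify(stats));
      } catch (e) { /* nothing to be done without an account */ }
    }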
I pick between the two. Of late that responsibility has been pretty well taken care of by the Forget Me Not extension on Firefox, although I think it's endangered (like a lot of things that have to do with Firefox and its extensions.)
You can set rules with three clicks, four clicks if you want that rule to be temporary and thrown away on browser restart. You may choose between never deleting, deleting on browser close, deleting on tab close, or just throwing them away. The initial setup for default policy has a few UI issues, but the author put a lot of work into it.
https://addons.mozilla.org/en-US/firefox/addon/forget_me_not...
I agree. Private browsing takes care of this for me. I close the tab, the cookies are deleted, and I occasionally spot-check that this actually happens just to be certain. There's no need to get obsessive or self-righteous about cookies when JavaScript is the real scourge. I cannot respect anyone who blocks cookies but does not surf with JavaScript disabled. Though HTML5 has nearly, but not quite, made JavaScript irrelevant, the scourge now seems to be built into HTML itself.
Agreed - Firefox has very legit, usable workflows for avoiding tracking while still having the web function for you. Chrome has, IMO, a purposely unusable approach. It's theatre: they give you the option, but that option breaks the web so badly that you're not going to want to use it. Which makes sense; they're an ad sales company, and tracking is crucial to their business.
This is how I use Firefox: things are only stored for the session, for the given container. Containers are better than first-party isolation, because many sites expect to share data with third parties.
It is also better to fake API responses than to block access to them. In Firefox the privacy.resistFingerprinting option takes care of this. It was originally developed for the Tor Browser.
What if browsers made it so that when you turned off cookies, instead of not allowing anything to be written, each page you visited got its own fresh cookie jar that was cleared when you navigated away?
This is loosely what Firefox's temporary containers [0] extension does. Each tab (with options to control whether a tab spawned from a parent tab should inherit the cookie-jar context of the parent) gets its own temporary context. I don't recall whether it clears the jar on navigating away, but you can have that jar cleared when the tab is closed, and you can configure new jars when opening a new tab to a new site origin (i.e. domain).
[0] https://addons.mozilla.org/en-US/firefox/addon/temporary-con...
I use and love this extension. The main complication that would prevent it from being a mainstream solution to cookie clearing is automating the decision of when to create a new container vs continue to use the existing one when links are clicked. Going by domain name (using Public Suffix List) breaks a lot of SSO implementations, and the occasional payment processor/verification flow, and other situations that redirect to another site, but pass information (or save state to have on return) via cookies.
That's effectively what Incognito does, except not at that granularity to make it simpler for users to understand.
It probably can't be every site in its own partition, since that breaks a number of things sites do (like opening third-party windows to complete transactions), but it could probably be done with some kind of logical grouping. Maybe different window colors could signify the partitions.
Similar to suggestions that Android offer the option to provide fake location data to apps that demand it without good reason: it's a fantastic idea and seems easy to implement, but it would make it less painful for users to opt out of all the tracking that makes the internet so friendly to advertisers and other groups who would like to surveil your activity, which is presumably why it doesn't happen.
Firefox Focus on Android basically does that by default.
If you set it as default browser then all links you click will be opened there. Hit the back button when you're done and everything is deleted.
Another use case is if you quickly want to open some website to look something up: open the website, maybe click a few links, and when you're done everything is wiped.
You can keep a regular Firefox (or Chrome or whatever) for surfing to those websites where you want to keep some state.
Yes. And instead of requiring sites to ask "accept cookies", let it be a browser option when the site attempts to store cookies, like "OK for 10 minutes".
Exactly. The great thing about cookies is that they are a tool, completely in the hand of the user. The site gives you a piece of text and says "show this to me next time if you want me to remember you". And then the browser can choose to continue to use them or not.
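A minimal illustration of that hand-off from the script side (the cookie name and value here are invented):

    // A script (or the server, via a Set-Cookie header) hands the browser a
    // piece of text to keep.
    document.cookie = 'remember_me=abc123; Max-Age=2592000; Path=/';
    // The browser then decides whether to show it back on later visits. If
    // the user clears or blocks cookies, this simply comes back empty.
    console.log(document.cookie); // e.g. "remember_me=abc123"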
Such a weird choice to put the onus on the websites to ask whether to give the cookies, rather than on the browser to ask whether to save them. I'm a big supporter of privacy legislation like the GDPR, but this is asinine, since it requires me to trust every website I visit to actually honor my choice.
Really, the original sin of cookies is that they were designed to be transparent to the user. That was a mistake that needs to be promptly rectified.
A combination of Firefox Enhanced Tracking Protection and Cookie Autodelete works quite well here. Along with I Don't Care About Cookies to hide the inevitable slew of consent banners.
I do use Multi-Account Containers and Temporary Containers too, but typically when I want multiple simultaneous sessions, rather than wanting my current session to be cleaned up.
This is the exact use case for sessionStorage, or a session cookie (one set without an Expires/Max-Age attribute, so it's dropped when the browser session ends). But correct usage depends on the knowledge & goodwill of website authors.
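A short sketch of both, assuming a well-behaved client:

    // Both of these vanish on their own:
    sessionStorage.setItem('draft', 'hello'); // cleared when the tab closes
    document.cookie = 'draft=hello; Path=/';  // no Expires/Max-Age given, so
                                              // this is a session cookie,
                                              // dropped with the session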
For privacy purposes, Incognito mode achieves the same effect without any of the hassle. Maybe turning off cookies should not even be an option anymore?
What if legislators had targeted the ~3 browsers instead of the countless websites to enforce their policy? Things would actually work on a technical level, and we wouldn't be bombarded with dozens of useless cookie warnings. Would have been nice.
Assuming the website wants to do something on the user's first visit, it would otherwise start doing it on every page load. Letting the website know that the user has disabled cookies can help it avoid that and improve the user experience.
> All I am using is some innocent localStorage and IndexedDB to persist user settings like the values of the sliders or the chosen color scheme.
When you turn off cookies you're telling the browser not to let sites persist information. Otherwise, whatever goals you had in disabling cookies would just be worked around through these other technologies.
I totally understand your point and I think I agree, except—well, the setting says “disable cookies”. It should do what it says. If the goal is to disable all persistent information, the setting should be called “disable persistent storage”.
Of course, I also know why it’s not called that: a lot of people know what cookies are at this point, at least relative to the number who'd understand “persistent storage”. A toggle named “disable cookies” is better for usability.
On the other hand, trying to guess what the user actually wants based on a different preference is virtually guaranteed to cause confusion of its own. Should the setting also disable Canvas, since that’s commonly used for fingerprinting? And will Google make the same decision in Chrome v104 as they do in Chrome v110?
I can’t decide whether the primary issue is:
• The name of the setting.
• The undeserved cultural prominence we’ve given “cookies” specifically.
• The modern web in general.
They could call it "disable cookies and other persistent storage (more information)", with "more information" explaining in plain English why they are bundled. There is no reason the setting has to have a two-word name with no description.
“Cookies” is shorthand for “persistent storage” because nobody outside of web development knows other methods exist. When people, laws, banners, etc. refer to cookies, they mean “any technology that stores information on client-side systems”.
Whatever mechanism is used is irrelevant to the meaning/concept.
Block all cookies (not recommended)
- Sites can't use cookies to improve your browsing experience, for example, to keep you signed in or to remember items in your shopping cart
- Sites can't use your cookies to see your browsing activity across different sites, for example, to personalize ads
- Features on many sites may not work
That seems long enough that they could put in text about how this is storage in general and not just cookies.
Firefox just has "Cookies: All cookies (will cause websites to break)" so there's not really a place for that sort of text.
Allow websites to store data on your computer
[ ] short-term
[ ] long-term or permanently
Some websites need to store data for some features, or to work at all. Storing data can also enable them to track you.
And I'd be inclined to blame the state of the web in general.
Especially with single-page applications, I would love for there to be a way for a page to have either access to persistent store or network connections, but not both. A site could load all resources, then announce to the browser that it would like to have access to whatever it stored the previous time. The browser would grant access to the local information, and simultaneously take away access from ever initiating a network connection again. A newly loaded copy of the page would start in the same state, able to pull in new resources, but unable to read or write local information until it again gives up the right to exfiltrate.
It would be a one-way street that way. The page can take any network information with it into the private cave, but nothing from the cave may ever come out, nor may it even know if the cave is empty before taking that irreversible step.
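As a sketch, with an entirely invented API just to make the proposal concrete:

    // Hypothetical API, no browser implements this:
    await fetch('/todo/app.js');              // pull in anything needed up front
    const cave = await navigator.enterCave(); // irreversibly gives up networking
    const state = cave.read('todo-list');     // only now is stored data readable
    cave.write('todo-list', state);           // writes never leave the cave
    fetch('/exfiltrate', { method: 'POST', body: state }); // would now throw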
I disagree.
For example, a simple todo-list web-app doesn't need any cookies but stores everything in localStorage.
Cookies are made for a server, localStorage for a client.
What happens when the web app decides to send the server the entire contents of the localStorage every time it loads? Now we are back to emulating cookies with localStorage.
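For what it's worth, that takes only a few lines (the endpoint name here is invented):

    // A localStorage-backed "cookie": minted once, replayed to the server on
    // every page load, which is all a tracking cookie really is.
    let id = localStorage.getItem('uid');
    if (!id) {
      id = crypto.randomUUID();
      localStorage.setItem('uid', id);
    }
    navigator.sendBeacon('/track', id);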
For regular users "Cookies" is a catch-all term for any persistent identifiers and tracking. The exact API used to persist cookie-equivalent data shouldn't matter. Excluding some tracking methods based on a technicality is a gotcha that erodes users' trust.
I think the real issue here is that Google chose to throw errors instead of turning those APIs into no-ops.
I don’t think so, else we should change the name. Cookies are sent to the server on every request, so they have tracking implications that locally caching something like dark mode preferences does not.
One issue is that there’s a hysteria over cookies which muddies the water.
It’s trivial to emulate cookies with other Web APIs (storage + service worker, for one). You’re focusing on the label of the toggle and not user intent. If a website can send information about my visit 2 days ago to an upstream server, clearly my expectation of “Disable cookies” is broken.
Client side fingerprinting plus server side data storage and you get the same functionality in a roundabout way.
I would actually love to see a demo of this used for comedic effect. "Unlogin: use your browser fingerprint as your password. We already know who you are, so why put up with the hassle of typing a password?"
Note that you can't use the cache to work around browsers blocking third-party cookies, because all the major browsers fragment the cache by site.
I'd argue that being able to write to and read from storage for the lifetime of the session (i.e. until you close the tab) is not "persistence" in the sense that any privacy-conscious user cares about.
The web API doesn’t cater to that. If you only need storage to persist for the session you can just use memory.
If anything, making these features break loudly enables sites to detect that they can't be used for persistence and allows them to find ways to circumvent that. Contrast this with cookies which are silently discarded if the server sends them anyway.
It's not at all surprising that Google's browser would choose a way to loudly break these features in a way that a) allows sites to detect that they're unavailable and b) discourages users from using this setting.
This reminds me of when third-party Android releases would add a way to fake sensor data (e.g. GPS) when denying permissions to apps so the apps wouldn't be able to lock out users unwilling to agree to those permissions. A feature that of course Google would never add to stock Android as it is important for their business model that apps can trust their tracking data to be genuine.
This is how all browsers have handled it, for as long as localStorage has existed. See, for example, this Firefox discussion from 2006: https://bugzilla.mozilla.org/show_bug.cgi?id=341524
I always use a wrapper around local/session storage[1] to avoid this problem. Then you have your app sync settings to storage, never reading from it except during startup.
[1] https://www.npmjs.com/package/localstory
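The pattern looks roughly like this; a generic sketch, not localstory's actual API:

    // Read once at startup, keep the source of truth in memory, and treat
    // every write to storage as best-effort.
    let settings;
    try {
      settings = JSON.parse(localStorage.getItem('settings')) || {};
    } catch (e) {
      settings = {}; // storage blocked or contents unparseable
    }
    function updateSetting(key, value) {
      settings[key] = value; // the app keeps working either way
      try {
        localStorage.setItem('settings', JSON.stringify(settings));
      } catch (e) { /* persistence is merely a bonus */ }
    }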
It becomes impossible to implement basic UI features like remembering open panes, etc. when storage is disabled, though. With the current policies around cookies (no cross-domain reads, Safari's ITP) there is no real need to turn them off for privacy reasons, for the average user at least.
Basic UI features shouldn't need storage. In-memory or in the URL is enough. If you put it in storage then it is actually a (cookie) session, with some sort of configuration - that's not "basic UI".
Sure, in-memory works until the page is refreshed. Storing data in the URL is an option, but also messy and cumbersome to manage especially with bookmarks. localStorage / sessionStorage is clean and dead simple, and it actually allows an app to be truly stateful, so it’s quite unfortunate that the trend is to avoid the “evils” of storing any kind of data on the client. What, should we go back to the days of session IDs and server-side storage for even the most trivial data?
If you define "basic" as not including "this remembers how you had it set last time" then, sure.
"In the URL" works for that, sort of, though not if you want it to still work for users that are just re-finding you through Google or typing in your address.
"(On a tangent, MDN is completely broken with cookies blocked, too. I was about to report this problem (because I care and love MDN), when I discovered a PR is already under way that fixes the Issue. Thanks, @bershanskiy!)"
This would imply that "MDN" is in a state of rapid flux, potentially "breaking" and then being "fixed" (or not) over short periods of time. However, it appears from the edit history that most of it is actually static and has not changed since 2019 or 2020.^1
Perhaps the "completely broken" catchphrase invoked by the author refers to an issue with "cosmetics" (window dressing), not content. I use a text-only browser and have not found MDN to be either partially or completely "broken". I send an HTTP request for a file and I receive the contents of the file. For me, it works. No cookies or Javascript required.
1. https://raw.githubusercontent.com/mdn/content/main/files/en-...
If I want to check browser compatibility, which can change from time to time, I can use GitHub or the MDN website.
For example,
https://raw.githubusercontent.com/mdn/browser-compat-data/ma...
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cl...
Why does the browser pretend to have localStorage but then throw an exception when it's used?
Surely it would be better to simply pretend not to support localStorage; then all sites built with feature detection would work correctly without needing to special-case this?
I can see it both ways. I think there's an opportunity for the developer to identify that localStorage is unavailable at runtime, and turn off certain features in the UI as a result, or write their own wrapper layer that does the 'throw-away' behavior.
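A common detection idiom for that (a sketch; the object may exist but throw on use, so you have to actually try it):

    function storageAvailable() {
      try {
        const probe = '__storage_probe__';
        localStorage.setItem(probe, probe);
        localStorage.removeItem(probe);
        return true;
      } catch (e) {
        return false; // blocked, disabled, or full: fall back to in-memory
      }
    }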