The article lists numerous alternatives to replace functionality that third-party cookies currently provide. But is there anything that prevents third-party analytics and similar services from creating libraries or services that are deployed under their clients' domain and then transmitting the collected data from the client's site to the third-party service's own database?

For example, instead of an embed link like:

https://allurdataarebelongtous.xyz/collectitall

collecting visitor data on:

https://myhometownnewspaper.com/

it's replaced by:

https://myhometownnewspaper.com/allurdataarebelongtous/colle...

Is that possibility going away too, or is it still a loophole? I ask because, if it remains, it may be easier to implement than some of the alternatives, and it may become more difficult for browser plugins like uMatrix, NoScript, etc. to identify and block these services, especially if they allow the client to customize or obfuscate the embed link.
The point of killing third party cookies is to prevent a tracking identifier cookie that uniquely identifies your browser from being reused across different sites.
So you can of course host your own scripts and run them on your own origin, let's call it site1.com. But when your site1.com includes a third-party iframe to e.g. googleanalytics.com, and that frame sets a cookie on itself, the cookie is now silently dropped. Then when site2.com later includes the googleanalytics.com frame, the frame cannot immediately tell it's the same browser. There are other ways to “link” browsers across origins, like browser fingerprinting (see https://amiunique.org/fingerprint) or in many cases just the IP address, but they are usually not guaranteed to be 100% reliable.
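To make the mechanics concrete, here is a minimal sketch (Node/TypeScript; the analytics endpoint and cookie names are hypothetical) of the two fates of a cookie set from inside a cross-site iframe: the classic third-party cookie is silently dropped, while a CHIPS "Partitioned" cookie is accepted but keyed to the embedding top-level site, so it cannot link visits across site1.com and site2.com:

```ts
import { createServer } from "node:http";

// Sketch of a third-party analytics endpoint (embedded via iframes on
// site1.com and site2.com) responding with two kinds of cookie.
// In practice this would be served over HTTPS.
createServer((req, res) => {
  res.setHeader("Set-Cookie", [
    // Classic third-party cookie: silently dropped by browsers that
    // block third-party cookies.
    "uid=abc123; SameSite=None; Secure",
    // CHIPS partitioned cookie: stored, but under a (top-level site,
    // analytics origin) key, so the site1.com and site2.com embeds each
    // get a *different* cookie jar and cannot be linked.
    "uid_part=abc123; SameSite=None; Secure; Partitioned",
  ]);
  res.end("ok");
}).listen(8443);
```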
Blocking third-party cookies is standard, obvious privacy functionality, but Google has held out because it affects their bottom line. So, if I understand it correctly, they had to wait until they had implemented something that protects that bottom line (the Chrome-only “Privacy Sandbox”).
If I understand correctly, this requires a shift from avoiding certain domains to avoiding certain URLs.
At present, sites like myhometownnewspaper.com, i.e., newspapers, generally have a robots.txt which lists a "sitemap.xml". The sitemap should list all the articles but will not list /allurdatabelongtous.
Ads and analytics URLs do not appear in "sitemap.xml". Some of the ad domains may appear in ads.txt, if it exists. In any event, it's easy to avoid the garbage URLs. IMHO, a web browser that auto-loads resources is not the best software for retrieving the URLs in a sitemap.
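As a sketch of that approach (TypeScript; the newspaper hostname is the illustrative one from above), assuming the site lists its sitemap in robots.txt:

```ts
// Fetch robots.txt, find the "Sitemap:" line, then collect the article
// URLs from the sitemap. Ad/analytics endpoints simply never appear here.
const site = "https://myhometownnewspaper.com";

const robots = await (await fetch(`${site}/robots.txt`)).text();
const sitemapUrl = robots
  .split("\n")
  .find((line) => line.toLowerCase().startsWith("sitemap:"))
  ?.slice("sitemap:".length)
  .trim();

if (sitemapUrl) {
  const xml = await (await fetch(sitemapUrl)).text();
  // Crude <loc> extraction; a real tool would parse the XML properly and
  // also handle sitemap index files that point at further sitemaps.
  const articleUrls = [...xml.matchAll(/<loc>([^<]+)<\/loc>/g)].map((m) => m[1]);
  console.log(articleUrls);
}
```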
> But is there anything that prevents third-party analytics and similar services from creating libraries or services that are deployed under their clients' domain and then transmitting the collected data from the client's site to the third-party service's own database?
No, because preventing this unambiguously is simply impossible. It is fundamentally always technically possible for your own site to track the behavior of people on your site (who's to say what is "tracking" versus simply serving page data?), and there is no technical way to prevent a site owner from simply sharing their own tracking data with data aggregators. There are of course some heuristics that tracking blockers could use (e.g. "even though this script is served from somesite.com, it looks like the Facebook tracking code"), but that just becomes a cat-and-mouse game of obfuscating the script.
That's a big reason I feel this "third party cookie blocking" was oversold. Adding third-party cookies was technically easy because it only involved adding a simple JS snippet to a page, but serving that snippet from one's own domain is still pretty trivial, and indeed what most of the big ad brokers are moving toward.
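To illustrate why no blocker can catch the server-side variant: a minimal sketch (Node/TypeScript; the aggregator hostname and payload shape are hypothetical) where the page only ever talks to its own origin and the hand-off happens server-to-server:

```ts
import { createServer } from "node:http";

// First-party collection endpoint on the site's own origin. The browser,
// and therefore uMatrix/NoScript/uBlock, only ever sees same-origin
// traffic; the forwarding to the aggregator is invisible to them.
createServer(async (req, res) => {
  if (req.method === "POST" && req.url === "/collect") {
    let body = "";
    for await (const chunk of req) body += chunk;

    // Forward the event server-to-server to a (hypothetical) aggregator.
    await fetch("https://aggregator.example/ingest", {
      method: "POST",
      headers: { "content-type": "application/json" },
      body,
    });
    res.end("ok");
  } else {
    res.writeHead(404).end();
  }
}).listen(8080);
```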
> serving that snippet from one's own domain is still pretty trivial, and indeed what most of the big ad brokers are moving toward
But in this case it's not as trivial to link usage across two domains. Sure, each domain can serve the snippet from its own domain, but the analytics snippet can't cross-reference data from other sites which use the same script (unless, of course, the site builds some data-sharing mechanism, which is not trivial for non-tech companies).
In which case, why even serve the snippet from your own domain when you can use the container mechanism?
Yeah, all adtech players are going this way, but it's only useful when the publisher has primarily logged-in users. Otherwise there's still no way to link you across sites.
We'll probably either be forced to agree to be tracked by Google's systems or forced to sign in to every page.
There's no chance that anything positive will ever happen with the web again, it has gotten worse at every step for more than a decade. Especially when Google has any involvement with it.
All the things that led to getting rid of Flash Player have been pretty great. CSS is pretty incredible compared to what it used to be. JavaScript is still a train wreck, but WebAssembly holds some promise there. WebGL is also pretty great.
I’m confused about where we jump to the conclusion that we’ll be forced to sign in to Google’s systems or sign in to every page. How, specifically, does that come about?
No. Cookie banners cover any use of client-side storage. That includes things like shopping carts and personalization (purely first party), analytics ("first party"), and ads (usually "third party").
Part of what's confusing here is that "first party" and "third party" are being used in a technical sense to mean which domain the cookies are set on. If on an example.com page JS from example.net causes a cookie to be set on example.com that's "first party", while if the cookie is set on any other domain that's "third party".
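A small illustration of that distinction, assuming this script runs in a page at https://example.com but was loaded from example.net (the /pixel endpoint is hypothetical):

```ts
// We are in a page at https://example.com, even though this script file
// was served from https://example.net/analytics.js.

// Set on example.com, the page's own domain: a "first-party" cookie,
// regardless of where the script itself came from.
document.cookie = "seen=1; path=/";

// If the response to this request tried Set-Cookie on example.net, that
// would be a "third-party" cookie in this page context -- the kind that
// is now blocked.
void fetch("https://example.net/pixel", { credentials: "include" });
```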
Most cookies in those popups are third-party cookies, at least for websites with giant lists of third-party cookies that require Consent-O-Matic to properly disable. Websites relying on first-party cookies are usually rather unobtrusive.
No. The cookie banners have nothing to do with cookies per se; they are a consent/information popup required if companies want to store and use personal data beyond what's required to serve you the website content (GDPR), or want to store non-essential identifiers on your device (the ePrivacy Directive).
In other words, you are seeing these because marketing departments need BS metrics that measure nothing and are based on personal data. The internet can happily exist without them, as proven by GitHub [1].

[1] https://github.blog/2020-12-17-no-cookie-for-you/
Some federated identity protocols, including SAML, depend on third-party cookies. I've been watching the Federated Identity W3C community group work through some of the issues for a couple of years. You might be interested in checking out their GitHub repo: https://github.com/fedidcg/FedCM
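For a flavor of where that work is heading, here is a hedged sketch of the FedCM API that group is iterating on (the IdP config URL and client ID are placeholders, and support and exact shape vary by browser):

```ts
// FedCM lets the browser mediate a federated sign-in directly, instead of
// the identity provider relying on its third-party cookies inside a
// hidden iframe or a redirect dance.
const credential = await navigator.credentials.get({
  identity: {
    providers: [
      {
        configURL: "https://idp.example/fedcm.json", // placeholder IdP
        clientId: "my-relying-party-id",             // placeholder
      },
    ],
  },
} as any); // "identity" may be missing from default TS DOM typings
console.log(credential); // an IdentityCredential carrying a token on success
```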
So is Google single-handedly choosing which technologies get implemented and deprecated? What if my website depends on third-party cookies to function? Aren't they an IETF standard, such that I can expect them to be supported?
I get that third party cookies are just used for tracking nowadays, but shouldn't there be a less monopolistic way of doing so?
In this instance, Google is actually the last major browser vendor to deprecate third-party cookies. Apple and Mozilla disabled them in their browsers over two years ago.
I think it’s pretty well established that third-party cookies are a bad idea. They’ve been turned off by default on Firefox and probably most other browsers for a while now. I can’t see when I would recommend turning them on for any user.
Just because something is standard doesn’t mean it never gets deprecated.
“We Are Preparing To Think About Contemplating Preliminary Work On Plans To Develop A Schedule For Replacing third party cookies with chocolate chip cookies”
I've had third party cookies turned off in Firefox for over a decade now. So have many Firefox users.
The conflict of interest is obvious for Google though.
> So is Google single-handedly choosing which technologies get implemented and deprecated?

With Chrome's global market share at 60% (not even including other Chromium-based browsers), basically yes.
> What if my website depends on third-party cookies to function?

You have alternatives: logins, OAuth tokens, other site-specific API keys, ...
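For instance, a minimal sketch of the token approach (the API hostname and token are placeholders): the credential travels explicitly with each request instead of riding along as an ambient third-party cookie:

```ts
// Explicit, per-request credential instead of an ambient cookie.
const accessToken = "token-obtained-via-your-oauth-flow"; // placeholder
const res = await fetch("https://api.partner.example/v1/data", {
  headers: { Authorization: `Bearer ${accessToken}` },
});
console.log(res.status);
```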
> Aren't they an IETF standard
I don't think the standard requires them to work the way tracking used them (across sites).