I have an idea that another way of preventing being tracked is just massively spamming trash in the data layer object, pushing thousands of dollars worth of purchase events and such, pushing randomly generated user details and other such events. Perhaps by doing this your real data will be hard to filter out. A side effect is also that data becomes unreliable overall, helping less privacy aware people in the process.
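A minimal sketch of that idea, assuming the page exposes the conventional global `dataLayer` array that GTM reads (stood in here as a plain array so the snippet is self-contained); the `randomPurchase` helper and its field values are made up for illustration:

```javascript
// Sketch: flood a GTM-style dataLayer with plausible-looking junk events.
// `dataLayer` is normally a global the GTM snippet consumes; here it's a
// local array so the example runs on its own.
const dataLayer = [];

// Fabricate an ecommerce purchase event with random details.
function randomPurchase() {
  return {
    event: 'purchase',
    transaction_id: 'T' + Math.floor(Math.random() * 1e9),
    value: +(Math.random() * 500).toFixed(2), // fake dollar amount, 0-500
    currency: 'USD',
  };
}

// Push a few thousand junk events to drown out the real ones.
for (let i = 0; i < 5000; i++) {
  dataLayer.push(randomPurchase());
}
```

Whether this actually poisons anything depends on how easily the fake events can be filtered out downstream, as the reply below points out.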
Since installing it on Firefox on this computer (18 months ago or so), AdNauseam has clicked ~$38,000 worth of ads that I never saw.
Between this and TrackMeNot, I've been fighting back against ads and against connecting my "profile" with any habits since 2016 or so. I should also note I have Pi-hole and my own DNS server upstream, so that's thirty-eight grand in ad clicks that got through blacklists.
AdNauseam works against ads, but does it also work against Google Tag Manager?
I've already got most ads blocked simply by Pi-holing them, but GTM tracking my every move using first-party content is a different kind of interaction to attack.
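For the plain hosted case, the straightforward counter is blocking the loader's own domains in uBlock Origin (or the equivalent entries in a Pi-hole blacklist), with static filters along these lines:

```
||googletagmanager.com^
||google-analytics.com^
```

But that only covers the hosted case: a site using server-side tagging can proxy the GTM container through its own first-party domain, and domain-level blocks like these won't see it.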
I’d imagine that by this point in time, they are able to filter this specific type of noise out of the dataset. They have been tracking everyone for so long that I doubt there’s anyone they don’t know about, whether directly or via shadow profiles. These randomly generated users would just not match up to anything and would be fine to just drop.
I have a quite common name in my country and snatched firstname.lastname@gmail.com for that name many years ago. Many use it by accident somehow when registering for things. Possibly (hopefully!) half of all leaks containing my email address are for other people. Never thought of what it might do for ad profiling, but hopefully it is adding at least some noise to it.
Maybe I could manually improve a bit on that by deliberately registering myself for various random services and just clicking around a bit to pretend I am interested in things I have no interest in. On the other hand, with 20 years of tracking I think Google has all my interests and habits nailed down anyway.
Am I dumb, or does this article fail to explain what the tag manager actually does? And not just with a loaded word such as "surveillance" or "spying", but with an actual technical explanation of what they are selling and why it is bad.
Google Tag Manager is a single place for you to drop in and manage all the tracking snippets you might want to add to your site. When I've worked on B2C sites that run a lot of paid advertising campaigns, the marketing team would frequently ask me to add one tracking pixel or another, usually when we were testing a new ad channel. Want to start running ads on Snapchat? Gotta add the Snapchat tracker to your site to know when users convert. Now doing TikTok? That's another snippet. Sometimes there would be additional business logic about which pages should or shouldn't fire a tag, and this would change more often. Sometimes it was so they could use a different analytics tool.
While these were almost always very easy tickets to do, they were just one more interruption for us and a blocker for the stakeholders, who liked to have an extremely rapid iteration cycle themselves.
GTM was a way to make this self-service, instead of the eng team having to keep this updated, and also it was clear to everyone what all the different trackers were.
The self-service thing is such a nightmare. There are two things that you almost certainly cannot trust your marketing team with:
1. Understanding the security implications of code they add via tag manager. How good are they at auditing the third parties that they introduce to make sure they have rock-solid security? Even worse, do they understand that they need to be very careful not to add JavaScript code that someone emailed to them with a message that says "Important! The CEO says add this code right now!".
2. Understanding the performance overhead of new code. Did they just drop in a tag that loads a full 1 MB of JavaScript before the page becomes responsive? Can they figure that out themselves? Are they positioned to make good trade-off decisions between analytics and site performance?
Google Tag Manager lets you add tracking stuff to your website without needing to touch the code every time, for example when you want to track things like link clicks, PDF downloads, or people adding stuff to their cart.
It doesn't track things by itself. It just links your data to other tools like Google Analytics or Facebook Pixel to do the tracking.
This kind of data lets businesses do stuff like send coupon emails to people who left something in their cart.
There are lots of other uses. Basically, any time you want to add code or track behavior without dealing with a developer.
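To make the above concrete, here is a sketch of the kind of event a site pushes for GTM to pick up. The global `dataLayer` array is GTM's documented interface (stood in here as a plain array so the example runs on its own); the event name and item fields follow the common ecommerce shape, but the tags you configure decide what they actually listen for:

```javascript
// The site pushes structured events; tags configured in GTM decide what
// to do with them (forward to GA, a pixel, etc.).
const dataLayer = [];

function reportAddToCart(sku, price) {
  dataLayer.push({
    event: 'add_to_cart', // a tag in GTM can be set to fire on this name
    ecommerce: { items: [{ item_id: sku, price: price, quantity: 1 }] },
  });
}

reportAddToCart('SKU-1234', 19.99);
```

An abandoned-cart coupon email is then just a tag that listens for `add_to_cart` events with no matching purchase.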
I was tasked with auditing third-party scripts at a client a couple of years ago. The marketing people were unable to explain what Tag Manager concretely does without resorting to "it tracks campaign engagement" mumbo jumbo, but were adamant that they can't live without it.
The chief reason is that websites pay for advertising and want to know whether the advertising is working, and Google Tag Manager is the way to do that for Google Ads.
This is not unreasonable! People spend a lot of money on ads and would like to find out if and when they work. People act like it's some unspeakable, nebulous crime, but this is probably the most common use case by miles.
Why should an advertiser have a right to know if their ads work, regardless of privacy considerations? The EU brought out a whole legal framework around this. I can't take seriously how you've oversimplified it.
Tracking website ads has become so normalised that it doesn't seem to even cross the minds of web-only marketing people to ask: how has this always worked for advertising via TV, radio, billboards, newspapers/magazines, etc.?
Website-based advertising is a special case - the only one that makes this tracking possible. Advertisers need to understand the huge advantage they've been given, rather than taking it as a given and thinking they have more of a right to the data than the user has a right not to provide it.
It feels that way for a lot of privacy concerns. "Telemetry" is the scare word for debug logs, core dumps, and stack traces. I think it’s completely reasonable to want those.
This may have changed; I last used Tag Manager 9-10 years ago. You basically added a single JavaScript snippet to your website, and then you could inject other JavaScript into the pages using various rules. So rather than having to redeploy our site every time the marketing department wanted to add a new tracking or retargeting script, we could just add it in Tag Manager. I think it is a great tool if you insist on doing these types of things. You can also extract and transform variables, so all the customization required to adapt to each service could be done within Tag Manager, keeping your website simpler.
One major issue Tag Manager solved for us was that a bunch of these online marketing companies that have their own tracking pixels/scripts absolutely suck at running IT infrastructure. More than once we experienced poorly written 3rd-party scripts breaking our site. Rather than having to do a redeployment to temporarily disable a script, I could easily pop into the Tag Manager console and disable the offending service.
Maybe Google Tag Manager has changed, but it was a good tool if you were in the business of doing those sorts of things. I suppose it's also a clever way of blocking all tracking from a site: just stop the Tag Manager script from loading.
It’s a little bit like dependency injection for websites, used by marketing teams.
The people responsible for maintaining a site don’t want to know about all the different analytics tools the marketing team wants to use, and don’t want to be involved whenever any changes need to be made. So they expose a mechanism where the marketing team can inject functionality onto the page. Then all the marketing tools tell the marketing team how to use GTM to inject their tool.
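The mechanism described above can be sketched as data-driven injection. Everything here (the config shape, the tag names, `tagsFor`) is illustrative, not GTM's actual internals:

```javascript
// Illustrative-only sketch of "dependency injection for websites": the site
// ships one loader; which third-party tags fire is pure configuration that
// a marketing team can edit without touching the codebase.
const tagConfig = [
  { name: 'snap-pixel',   firesOn: ['purchase'] },
  { name: 'tiktok-pixel', firesOn: ['purchase', 'add_to_cart'] },
];

// Given a page event, return the tags that would be injected and fired.
function tagsFor(eventName) {
  return tagConfig
    .filter(tag => tag.firesOn.includes(eventName))
    .map(tag => tag.name);
}
```

Swapping an ad channel then means editing the config, not redeploying the site, which is the whole pitch.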
There's a section in the article titled, "WHAT DOES GOOGLE TAG MANAGER DO?":
> Whilst Google would love the general public to believe that Tag Manager covers a wide range of general purpose duties, it's almost exclusively used for one thing: surveillance.
Years ago, I worked on a site where we constantly had requests from the non-technical side of the company to make the site load faster. We were perplexed in engineering. The site loaded and was ready for us in a fraction of a second.
Eventually we realized that every dev ran uBO, and tried loading the site without it. It took about 5 seconds. Marketing and other parts of the company had loaded so much crap into GTM that it just bogged everything down.
This is why I generally keep a mostly-clean browser around for development (with only some dev extensions). I once wasted half an hour on a stray uBO filter that went off on a component I was working on (it wasn't even an ad), and that taught me a valuable lesson.
If you're testing a website, you've got to test it like your customers use it. I shake my head at the incompetence of web designers every time I encounter a website filled with scroll bars because the devs on macOS haven't bothered testing any other device.
It really isn't. I've been blocking all JavaScript for years now, selectively allowing what is essential for sites to run or using a private session to allow more/investigate/discover. Most sites work fine without their 30 JS sources, just allowing what is hosted on their own domain. It takes a little effort, but it's a fair price to pay to have a sane Internet.
The thing is - with everything - it's never easy to have strong principles. If it were, everyone would do it.
It's certainly not that bad if you have uMatrix to do it with, but I haven't found a reasonable way to do it on mobile. uMatrix does work on Firefox Mobile but the UI is only semi functional.
That’s my default as well. Self-hosted/1st-party scripts can load, but 3rd-party scripts are blocked. The vast majority of sites work this way. If a site doesn’t work because it must have a 3rd-party script to run, I tend to just close the tab. I really don’t feel like it has caused me to miss anything. There are usually 8 other, slightly less hostile sites with the same data.
Impossible to know because when I disable Javascript "the majority of the internet" works fine. As does a majority of the web.
I read HN and every site submitted to HN using TCP clients and a text-only browser, that has no Javascript engine, to convert HTML to text.
The keyword is "read". Javascript is not necessary for requesting or reading documents. Web developers may use it but that doesn't mean it is necessary for sending HTTP requests or reading HTML or JSON.
If the web user is trying to do something else other than requesting and reading, then perhaps it might not "work".
If you're spending 99% of your time on your favourite websites that you've already tuned the blocking on? Barely a problem.
On the other hand if your job involves going to lots of different vendors' websites - you'll find it pretty burdensome, because you might end up fiddling with the per-site settings 15+ times per day.
If I’m at work using a work provided computer, I don’t bother with the blocking. They are not tracking me as I do not do anything as me. I’m just some corporate stooge employee that has no similarity to me personally.
My personal devices block everything I can get away with
StackOverflow switched over from spying with ajax.googleapis.com to GTM in the past year or so. All for some pointless, out-of-date jQuery code they could self-host. I wonder how much they're being paid to let Google collect user stats from their site.
Echoing others, I've used NoScript for years and at this point it is practically unnoticeable.
Many sites work without (some, like random news & blogs, work better). When a site doesn't work, I make a choice between temporarily or permanently allowing it depending on how often I visit the site. It takes maybe 5 seconds and I typically only need to spend that 5 seconds once. As a reward, I enjoy a much better web experience.
The sites that don't work are usually the worst websites around - you end up not missing much. And if it's a store or whatever, you can unblock all js when you actually want to buy.
About as tiring as hearing about it all the time. Thank god it's a fringe topic these days but this article snuck it in. Probably the constant use of the word "surveillance" was an early tell haha.
>Use uBlock Origin with JavaScript disabled, as described above, but also with ALL third-party content hard-blocked. To achieve the latter, you need to add the rule ||.^$third-party to the My Filters pane.
This is a worse way to implement uBO's "Hard Mode" (except with JS blocked), which has the advantage that you can easily whitelist sites individually and set a hotkey to switch to lesser blocking modes.
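For reference, and going from memory of the uBO wiki (linked elsewhere in this thread) rather than a current install, hard mode amounts to dynamic filtering rules along these lines, with a per-site noop as the whitelist mechanism (`example.com` is just a placeholder):

```
* * 3p block
* * 3p-frame block
* * 3p-script block
example.com * 3p noop
```

These live in uBO's "My rules" pane rather than "My filters", which is what makes the per-site relaxing so much easier than a static filter.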
I don't think this article makes a good case for why you should.
>The more of us who incapacitate Google's analytics products and their support mechanism, the better. Not just for the good of each individual person implementing the blocks - but in a wider sense, because if enough people block Google Analytics 4, it will go the same way as Universal Google Analytics. These products rely on gaining access to the majority of Web users. If too many people block them, they become useless and have to be withdrawn.
OK - but then also in the wider sense, if site owners can't easily assess the performance of their site relative to user behavior to make improvements, now the overall UX of the web declines. Should we go back to static pages and mining Urchin extracts, and guessing what people care about?
> if site owners can't easily assess the performance of their site
I would be more than happy to opt in to performance metrics or other reports if only I could have some level of trust that improving the UX is all it's gonna be used for. I want to live in a world where that is the everyday normal, and where the non-consensual collection and sale of personal data is a high-profile public scandal with severe legal consequences.
Analytics can have good uses, but these days it's mostly used to improve things for the operator (more sales, conversions, etc) and what's best for the website isn't always the best for the user. And so I block all that.
> Meanwhile, Google Tag Manager is regularly popping up on Government sites. This means not only that governments can study you in more depth - but also that Google gets to follow you into much more private spaces.
Edit: looks like this might exist already: https://addons.mozilla.org/en-US/firefox/addon/adnauseam/
https://www.trackmenot.io/faq
https://adnauseam.io/
Chrome banned it from their add on store but it can still be installed manually
It’s used by marketing people to add the 1001 trackers they love to use.
Companies were doing this for hundreds of years before Google even existed. You can learn if your ads work without invasive tracking.
GTM from 9-10 years ago and GTM today have nothing in common.
I won't browse the Internet on my phone without it, everything loads instantly and any site that actually matters was whitelisted years ago.
https://github.com/gorhill/uBlock/wiki/Blocking-mode
https://github.com/gorhill/uBlock/wiki/Blocking-mode:-hard-m...
If the frontend automatic js is blocked, it doesn’t matter.
Yes absolutely do this please.
Why even bother with the effort of analytics only to ignore the answers? I'm honestly not sure I've ever seen a website improve.
The corruption of the system knows no bounds.
The uBlock Origin wiki references a method to block it, though I'm unsure how effective it is (it seems to be based on the first link): https://github.com/gorhill/uBlock/wiki/Static-filter-syntax#...
"*$1p,strict3p,script,header=via:1.1 google"
Perhaps some filter in your lists is already utilizing this, but I'm unable to verify.