Imagine a market in which companies charge a lot of hidden fees behind their customers' backs, and users are unhappy when they find out after the fact. The law is updated to say you are not allowed to charge users a fee unless you tell them in advance.
Companies with tons of hidden fees decide to keep them, but force you to read all the fees on every page of the menu before you can see the rest of the text, in the most annoying way possible. Then they promote the idea that the issue is not the extravagant fees, nor the fact that the companies hid them and had to be forced by law to warn you about them; no, the problem is a law that forces them to tell you what you're getting into before it's too late!
That's, essentially, what's happening. And we have people complaining that companies need to display their fees.
On this issue, in the group that complains about the cookie law, there are some people who are wrong on purpose because it's in their interest, and some who are wrong because they genuinely don't understand the position they're defending: complaining about being made aware of the fee, instead of about the fees themselves or the fact that the companies hide them unless forced by law.
To each their own belief about which category PG fits into.
Agree. How much corporate propaganda are people consuming that legislators are seen as wholly responsible for the bad behavior and malicious compliance actions of corporations?
What does it say about the relationship between businesses and consumers that the first response to this bad behavior is to shout "look what you made them do!"
Seemingly it is everyone's fault except the bad actors themselves.
It's so depressing. Many of the people who are pointing the finger at the regulators for the annoying cookie banners don't actually see the web site/app *as* a bad actor. The fact that the sites had been tracking tons of extra data via cookies without users' consent or knowledge was totally fine to them as long as it wasn't inconveniencing them in any way. The cookie banner is an inconvenience to their mindless consumption, so NOW it's a problem, and they just don't care what the solution actually is as long as the thing goes away.
I've seen this attitude from tech people, too, so it's not just a matter of tech ignorance or illiteracy.
Again, this should have been a >browser feature< instead of a website feature. I trust Safari and Firefox WAY MORE than I trust the website's owners to actually block cookies and protect privacy, as well as implement this in a better UX.
The proper way to have done this would have been to go to the W3C or WHATWG and propose an extension to HTML for sites to define an opt-in manifest or something similar.
If only it were that simple. When the GDPR came out, a lot of confusion and misunderstanding ensued, and not only regarding the damn cookie banner. Even totally legitimate health-care providers started to collect signatures to be on the safe side. I still remember receiving a basic GDPR training where we were told that opt-out/signing is only necessary if the entity is planning to do weird stuff with your data. IOW, if someone wants you to sign, they plan a bad move. Then my bank wanted a signature. And a month later, one of my healthcare providers wanted a signature. After a chat with him, I learnt that his lawyer told him to collect the signatures just in case, and made him believe that if someone doesn't sign, that is a problem.
So now we have this situation where providers were trained to play the GDPR in such a way that they will never have a problem, no matter what they actually do with the data.
And consumers are pissed because they are made to sign things which essentially reduce their rights...
And if someone (like me) thinks the EU did a half-assed job there, the downvotes rain in.
The funny thing is it's not just corporations. When you open the German state railways' website, somehow you get a GDPR overlay. When you open the German revenue agency's website, you get greeted by a cookie banner on top.
I call upon all German users of this website to write to their MPs! Obviously the German civil service is a bad actor! The German deep state is plotting to discredit our beloved eurocrats and must be shut down! Drain the swamp!
Apple is doing the same thing, passive-aggressively doing things like removing support for pinning webapps / PWAs / whatever they're called to your home screen, then backtracking after backlash. Or Microsoft with their browser choice screen or Windows releases without media player. And even those aren't as bad as the malicious compliance of cookie banners.
> How much corporate propaganda are people consuming that legislators are seen as wholly responsible for the bad behavior and malicious compliance actions of corporations
Why do I need to be "consuming corporate propaganda" when I just hate that I need to dismiss banners on every news website, when I didn't have to before the regulation?
I don't care about being tracked. But now that all websites need to cover their asses in response to regulation, I'm forced to figure out which button I need to click on to read content, and these websites don't even appear to save my preferences whether I agree to be tracked or not.
Objectively, the outcome of this regulation is that my experience is worse. Are the companies bad actors? Sure! Sounds like the EU should account for companies' bad behavior instead of forcing the internet to be more annoying.
The flaw in your analogy is that the modal consumer cares about hidden fees, wants to be made aware of them, and might even make a different decision with that knowledge. The modal consumer does not care about cookies.
Imagine you walk into a restaurant and they hand you a paper that details full allergy information for all of the foods they serve, and then they wait for you to say, "I consent to these ingredients being in the food," before they can seat you. I think that's a closer analogy. We can all agree that the restaurant shouldn't hide that information from you, and that some minority of people might want the information, but do we really have to add this inconvenient step to the process for all people? The current real-world system, where allergy information is available upon request, was working fine.
There are some things that everyone cares about and would be appalled by, that businesses should have to inform people about, and many things that a small minority of people care about. Why stop at cookies? Maybe we should mandate a popup if the website's server infrastructure was manufactured abroad, and another popup if the company that runs the website has higher than average carbon emissions, and another popup if the food in the food court that serves the headquarters of the company that runs the website is not kosher. The lobby of people who care about cookies is of similar size to the lobby of people who care deeply about binary size and about running JavaScript. Should there be mandatory popups to execute JavaScript? If the website is >10MB, should I have to consent on a lightweight page before downloading it? How do you determine which activities warrant a popup warning and which do not?
If I am not allergic, then I do not need to know about it, and I will suffer no damage from not caring. This is not the case for tracking. Tracking can hurt you.
This is a bad example because the market usually fixes this problem. The reason the market doesn't fix the cookie banner problem, and the reason this is bad law, is that users de facto do not care; it is merely annoying.
There’s a law in California that says that businesses which have chemicals that might cause cancer on the premises need to let people know. That’s great but the levels they set turn out to be lower than what you can feasibly test for and as a result all properties pretty much just put up the signs that say “there might be chemicals here”. The warning is useless and annoying because of market forces which is another way of saying the law incentivized the behavior that occurred.
The market is working perfectly here, if you remember that users are not the customers. Users are the product sold to adtech, data brokers, law enforcement, etc.
For data-harvesting companies users are like livestock, and nobody cares about livestock's opinion. It only matters how much value can be extracted from users, even if it's annoying, misleading, and relies on dark patterns.
I don’t think this is strictly accurate. There’s nothing about cookies themselves that makes them a problem. It’s the way they are used. Needing to inform people you are using cookies for sessions is like needing to inform people you are using a fork to eat. The problem is that some people are using the fork to stab people, so now we require everyone to say how they’re going to use it in advance. Instead of just prohibiting stabbing people.
A few places allow you to opt for a spoon instead, or drink right from the bowl without utensils. Note that it's not the customers who use the forks for stabbing; it's the restaurants themselves. To show their goodwill to a customer who does not trust them with a fork, they can offer a spoon.
The further we take this analogy, the more strained it becomes.
Yes, it's natural to use a cookie to track a session; this is a mechanism invented for that purpose. It's much less natural to share this tracking information with third parties, especially along with a record of your purchases or other interesting actions.
But ad revenue is much harder to obtain without targeting and thus tracking. And a lot of places depend mostly on ad revenue.
This is another case of the "buy now, pay later" pattern, stretched to "take for free now, pay in loss of your privacy later". Funnily enough, many people don't value the information they get on many ad-supported sites as highly as the marketers paying to grab their attention, so simply compensating by adding a subscription or one-time payment to go ad-free sometimes does not even work; the more generic / "doom-scrollable" the content is, the worse it works.
See for example GitHub's statement [1] about no longer displaying a cookie banner. While ironically the blog still does display them, the main site doesn't.
You don't need a cookie banner for a session cookie, neither under the ePrivacy Directive nor under the GDPR. The same applies to all cookies that are "strictly necessary" for the functional operation of the website at the technical level. A language-selection cookie, a "remember me" cookie, etc. are all perfectly fine.
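To make the distinction concrete, here is a minimal sketch of that rule. The category names and the helper function are my own illustration, not legal terms of art or any real library:

```python
# Hypothetical sketch of the "strictly necessary" exemption described above.
# Cookie names here are illustrative examples, not a legal whitelist.
STRICTLY_NECESSARY = {"session", "csrf_token", "language", "remember_me"}

def needs_consent_banner(cookies_used: set) -> bool:
    """A site only needs a consent banner if it sets cookies beyond
    those strictly necessary for its technical operation."""
    return bool(cookies_used - STRICTLY_NECESSARY)

print(needs_consent_banner({"session", "language"}))    # no banner needed
print(needs_consent_banner({"session", "_ga", "_fbp"})) # analytics/ads cookies: banner needed
```

A site that only sets the first kind of cookie simply never shows a banner; the banner is only triggered by the optional tracking cookies.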
Do you have /any/ examples of websites that don't have a bunch of 3rd party cookies that still have a cookie banner?
Middle managers absolutely love anything with charts and graphs because it makes their decisions feel more scientific. That's why they want tracking software included on their websites. And if the law requires disclosure then a cookie popup is the solution.
> Not only that, I'm not an EU citizen and I'm not browsing websites based in EU but I'm still bombarded with cookie banners non-stop.
Again, that's the fault of the companies putting those up. They could make data collection opt-in; they could just put a small notice in the footer with two simple links, "Accept all" / "Reject all". But they chose not to; they decided to pester you with those banners as annoyingly as possible, so that you have exactly the reaction you're having.
On your first point I disagree: my companies don't track, and we don't have cookie banners.
On your second point, that is again a choice made by those companies, not a problem with the law. The GDPR has shown very well that if they cared, they could segment who is affected or not; and not just big tech, lots of random local news sites and the like are doing it just fine.
I think this is a good analogy and I agree that the intent of the law was not to force websites to have a cookie banner, it was just the side effect.
What I think we are missing is a browser option/API that lets the user choose the acceptable tracking level. Similar to the do not track header but more fine grained.
As we are missing that, extensions are doing a good job ATM
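For what it's worth, the server side of such a signal already half-exists. Here is a sketch of honoring the Do Not Track header (and its successor, the Global Privacy Control `Sec-GPC` header); the header names are real, but the function is a hypothetical placeholder for whatever tracking hook a site uses:

```python
# Sketch: honor browser opt-out signals server-side before loading
# any tracking machinery. DNT and Sec-GPC are real request headers;
# the surrounding logic is illustrative.
def tracking_allowed(headers: dict) -> bool:
    """Return False if the browser signals an opt-out of tracking."""
    if headers.get("DNT") == "1":
        return False
    if headers.get("Sec-GPC") == "1":  # Global Privacy Control
        return False
    return True

print(tracking_allowed({"DNT": "1"}))      # browser opted out
print(tracking_allowed({}))                # no signal sent
```

The irony, as others note in this thread, is that sites could have treated DNT as exactly the fine-grained consent mechanism people are asking for, and collectively chose not to.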
> The law is updated to say you are not allowed to charge the user a fee unless you tell him in advance.
Why not a real regulation then to get rid of hidden fees and heavy fines/jail time for companies that are found to be doing it?
PG's argument (I hope) is that there is no point in talking about "regulation" and "customer protection" if companies STILL get away with their ridiculous and hostile practices.
There is no customer benefit in having user data collection and tracking. Companies do it only to exploit you. Even the usual BS excuses ("oh, we need user data to customize the experience") could be done completely in-device.
I don't want regulatory bodies to just set up more hoops for companies to jump through. They will jump through them anyway, because it is profitable to do so. What I want is for regulatory bodies to effectively stop predatory practices.
I mean, that would be great, but I suspect that even just here on HN you'd get a lot of people strongly disagreeing with you. Because that would infringe upon the companies' "freedom" to profit in whatever way they see fit—and the people's "freedom" to let their data be vacuumed up and sold for massive profits.
Are we all such spoiled brats that some cookie banners interrupting our web browsing is all it takes for us to give up and call the malicious companies the winners and the law(s) trying to protect our privacy "bad"?
> On this issue in the group that complain about the cookie law there are some people who are very wrong on purpose because it's in their interest, and some people who are very wrong because they genuinely don't understand the position they're defending, complaining about being made aware of the fee, instead of the fees themselves or the fact that the companies hide them if not forced by law.
The reality is that I (and others who are complaining, as well as many who have resigned themselves to their fate) am happy to have a website "track me", certainly if the cost of non-tracking is having to click away an annoying popup, and I think that people who compare a website wanting to know the number of its visitors to "hidden fees" are kind of being ridiculous.
"Number of visitors" does not constitute tracking. The tracking in question here is to discover who you are specifically and the absurd amount of detail about your online activities collected and shared with data brokers for aggregation and resale.
A few of these cookie prompts during the day and they'd be able to tell everything from where your kids go to school to the kind of prn you prefer to watch on weekdays and everything in between.
This is addressed in the article. They could track you, with your consent, in many different ways. The fact that they are choosing to force this cost upon you is what is ridiculous.
> The reality is that I (and others who are complaining, as well as many who have resigned themselves to their fate) are happy to have a website "track me", certainly if the cost of non-tracking are having to click away an annoying popup, and think that people who compare a website wanting to know the number of their visitors to "hidden fees" are kind of being ridiculous.
I agree that wanting to know the number of visitors is benign and it is not abuse.
But saying companies should be allowed to track me (for whatever purpose) across the web without my consent is also pretty ridiculous.
> people who compare a website wanting to know the number of their visitors to "hidden fees" are kind of being ridiculous
Is counting visitors all that sites are doing with tracking info?
They're not selling it to ad brokers, insurance companies, governments? They're not matching your name, address, and phone number with your web activity (including sexual interests, "anonymous" embarrassing stories, health concerns, etc)?
Well, different people want different things - I'd rather spend a millisecond to click 'refuse' rather than let them track me - out of spite if nothing else. Yes, cookie banners are annoying; the dark patterns within cookie banners (you need multiple clicks to get to the 'refuse' button while the 'accept' button is right there in your face) are even more so. But honestly - screw them.
> The reality is that I (and others who are complaining, as well as many who have resigned themselves to their fate) are happy to have a website "track me", certainly if the cost of non-tracking are having to click away an annoying popup
Then you should doubly blame the companies, because that's what Do Not Track was for. They're the ones who decided to make it not work that way, with the result that it is ignored and not considered a valid consent mechanism under the law.
> think that people who compare a website wanting to know the number of their visitors to "hidden fees" are kind of being ridiculous.
You don't need a cookie for that, and what the GDPR has shown us is that we're not talking about that, but about dozens or hundreds of tracking cookies on every major site, so trying to frame it that way is disingenuous.
The fees example is maybe apples to oranges. The fees are a problem because they subvert the pricing signals needed for the free market. The problem is not the fact that they are charged; the problem is that they are not included in an upfront price display. Were they included in the total upfront price and never itemized, users shouldn't care: it's not their business how a company spends its money.
But I suppose that was just an example you picked to illustrate the industry's malicious compliance, and not the main point, in which case fair enough. :-)
The use of secret tracking also subverts the pricing signals needed for the free market. Users aren't informed that the website is subsidized by the sale of the users' information, much less the details of the arrangements and monetary amounts.
If the total price of the website without the secret costs of tracking were presented upfront, it would be less of an issue.
I agree with almost everything you said, except for one thing:
I don't believe the euphemism "hidden fees" helps to clarify the fact that these people are taking money away from people without their knowledge or explicit consent.
We have other more precise words to describe that action. I asked ChatGPT what those could be, here's its answer:
Q: What are some english words meaning "taking money away from people without their knowledge or explicit consent"?
ChatGPT: There are several words and phrases in English that convey the idea of taking money away from people without their knowledge or explicit consent:
Embezzlement: This refers to the act of dishonestly withholding assets for the purpose of theft. It often involves someone in a position of trust, such as an employee, misappropriating funds entrusted to them.
Misappropriation: Similar to embezzlement, misappropriation involves taking something (usually money) for one's own use without permission or legal right, often in a breach of trust.
Theft: Theft is the generic term for taking someone else's property without permission, including money.
Fraud: Fraud involves intentional deception for personal gain, which can include financial deception or stealing.
Swindling: This term implies deceitful behavior to cheat or defraud someone, often involving trickery or manipulation.
Skimming: Skimming refers to the illegal practice of taking cash "off the top" of the proceeds of a business or other source of income without recording it.
Extortion: While not always directly related to taking money without explicit consent, extortion involves obtaining money, property, or services from an individual or entity through coercion or threat.
Pilfering: Pilfering involves stealing small amounts or petty theft, often done stealthily or without detection.
Conning: This refers to the act of deceiving or tricking someone, often for financial gain, through manipulation or persuasion.
Clandestine withdrawals: This phrase specifically refers to taking money from someone's account without their knowledge or consent, typically in a secretive or unauthorized manner.
We here are all interested in hearing your thoughts, so please filter raw chatbot output through them, rather than pasting the output verbatim, which isn't value-added, and can even be negative value, given chatbots' penchant for hallucinating information.
Hidden fees are bad because of the specific combination - the hiding, and the fees. Since tracking isn't hidden and isn't a fee, the analogy doesn't help to justify the EUs law.
People should have a default expectation that if they give their personal data to companies then it will be recorded. And if they don't want cookies then they should disable cookies. The EU's regulation hasn't revealed anything that is useful to know about.
People don't "give" their information to trackers, it's collected without their knowledge. I don't think most people expect the kind of things trackers collect is being collected.
Tracking is certainly hidden if you're not a programmer, and is certainly a fee if you value your time. Not all people live in low-trust societies or desire to.
>, Paul Graham came up with the thought, that the EU forces companies to have cookie banners. There is no law for cookie banners. [...] Companies could easily avoid any cookie banner. Just don’t track.
KingOfCoders/amazingcto, of course you are technically correct but Paul Graham wasn't talking about the letter of the law.
Instead, you have to interpret his complaint with the lens of game theory. I.e. The Law of Unintended Consequences that takes into account what companies actually do in response to laws instead of what we hope they will do.
Your blog post focused on good intentions of the law. PG's tweet focused on actual outcome.
Doesn't that argument work both ways? If you interpret the EU's regulation with the "lens of game theory", it is an unintended consequence of aggressive corporate data collection. Not sure why it makes sense to complain about the EU and not the companies.
Of course not. Only titans of industry and the landed gentry of the executive class are allowed to "move fast and break things", "ask for forgiveness rather than permission" and take "imperfect action rather than perfect action."
It's more morally permissible for corporate decision makers to install a global surveillance complex than for civil servants to attempt to regulate it.
Because the companies are getting what they want (data on users), but the regulation is not getting what it wants (no tracking or informed tracking).
I don't know if this mini-competition between regulators and companies is truly zero-sum, there could be some way to get everyone something they want. But with the current regulation, it is zero-sum, and the companies are winning and the EU is losing. And the EU "works for you", so of course you can complain to them.
> Not sure why it makes sense to complain about the EU and not the companies.
Unfortunately a non-negligible number of people in tech also have libertarian leanings, with a default “gubmint bad!” position, which makes them easy prey for adtech propaganda.
No, it does not work both ways. The roles of governments and corporations are not symmetric.
Good regulation is regulation that has good outcomes. If a law has bad outcomes it is a bad law. You can separately complain about what companies are doing but that doesn't change the fact that it's a bad law.
It is of course debatable whether GDPR as a whole has bad outcomes, but if we're talking about cookie banners in isolation then it certainly does.
>The blog clearly works from the actual outcome lense. [...] The companies _could_ just not track.
No, you've inadvertently stated a contradiction. Your use of the word _"could"_ is literally a hope/wish/intention of the law.
In contrast, the actual outcome is that the companies didn't stop tracking. We _wish_ they would stop tracking. (I.e. "The companies _could_ just stop tracking us!") But that hope still doesn't change the observation of reality.
Can you really say that confidently? I think a lot of these companies would go out of business if they didn't track users so it seems like under the law they have no option but to show cookie banners. Or are you claiming the law exempts companies in such circumstances?
I'm a fan of second-order thinking and unintended consequences, so I'm with you there. How would you frame a "don't track people without consent" without unintended consequences?
The article tries to make the point (perhaps fails) that companies do this intentionally to get the "consent" of people against their will, therefore walking the fine line of breaking the law without breaking it.
> How would you frame a "don't track people without consent" without unintended consequences?
Drop the consent requirement? I.e. just don't track people. No third-party cookies, first-party only, and only for the correct operation of the site.
It's not the cookies that people object to, it's the tracking. Tracking provides no benefits to visitors. If there were no tracking risk, there would be no need to require consent.
Probably the same way most laws end up. We see the unintended consequences, then revise the law to counter the consequences. Thus the cat/mouse game continues.
An idea could be that the tracking has to be opt-in AND the webpage cannot stop critical use of the page as part of the opt-in process.
Then another round of consequences.. rinse repeat...
- no fines for non-compliance (or malicious compliance)
- no legal liability for data leaks of PII
When businesses believe (correctly or incorrectly) that the benefit of tracking outweighs the cost (annoying users, regulatory noncompliance) they will do it. The fix is to make tracking too costly for businesses.
Fines for data breaches are one idea. If we want to disincentivize data hoarding, the main cost of data hoarding is data breaches, so we could perhaps penalize that.
This would have a different issue, specifically companies would no longer self-report data breaches, but it's just an idea. There are alternative approaches to getting to "don't track people without consent" that aren't a toothless stick by making it more expensive to track.
> The article tries to make the point (perhaps fails) that companies do this intentionally to get the "consent" of people against their will, therefore walking the fine line of breaking the law without breaking it.
That X button is right there at the top near the tab name. Not sure how a user could be forced against their will into staying on the site presenting them with a cookie banner.
Everyone knows that bad actors will continue to behave badly in the face of the law. This isn't the insight you seem to think it is.
Really, PG's tweet has little to do with game theory or anything else. It is a first-world-problem whinge about having to click through cookie banners. Assessing the "actual outcome" of complex regulation and legislation is a task beyond the scope of a single tweet.
It might be useful for Graham to determine what claim he is trying to make in the first place. Is he rebutting a particular EU representative for boasting about how good they are at regulation? Or is the idea that the EU shouldn't have the audacity to attempt to regulate in the first place?
Except many companies respond to the cookie law with a cookie consent popup that violates the law (by making opt-out harder than opt-in).
Could we really have predicted from the "Law of Unintended Consequences" that companies would respond not by tracking less nor by giving people an easy way to opt out, but with a cookie consent popup that is not compliant and also really annoying to their visitors?
This is better explained by business operators being ignorant of the actual law and being ignorant of the UX impact.
There are a lot of ridiculous things a company can choose to do in response to any given law. Those choices are not mandated by the law. Horrible consent UX is not the only option to choose from.
Government can, and should, analyze likely (or unlikely) unintended consequences and use those to further shape the law, but at the end of the day, those consequences come from choices that people who are subject to the law make.
I think the big mistake the EU made is they probably thought: “Surely no company would choose to abuse their customers with horrible UI just because they don’t like the law and want to take their collective frustration out on their users!” The EU was obviously wrong about the extent to which companies would throw their users under the bus while maliciously complying.
The actual outcome is, from my experience, that tracking has reduced, a lot. When this law was enacted, *we all removed "like on Facebook"* buttons. Remember those? Yeah, we don't see them anymore. Google Analytics also was forced to change, at least a little.
Is there still tracking? Sure. But it's not so blatant anymore. There are hoops one needs to jump through. And that was the point: to make tracking harder.
None of my projects have cookie banners. Why? Because I use a first party tracking system (Matomo), I anonymize all visits and I respect DNT. It's that easy.
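The anonymization step is the key part, since under the GDPR an IP address counts as personal data. Here is a rough sketch of the kind of IP masking tools like Matomo offer (zeroing the low bytes so the stored address no longer identifies a visitor); the function name and the choice of how many octets to keep are my own, not Matomo's API:

```python
# Sketch: mask the low-order part of an IP so stored analytics data
# no longer identifies an individual visitor. The /48 choice for IPv6
# is one common convention, assumed here for illustration.
import ipaddress

def anonymize_ip(ip: str, keep_octets: int = 2) -> str:
    addr = ipaddress.ip_address(ip)
    if addr.version == 4:
        octets = ip.split(".")
        masked = octets[:keep_octets] + ["0"] * (4 - keep_octets)
        return ".".join(masked)
    # IPv6: keep the routing prefix, zero the interface part
    net = ipaddress.ip_network(f"{ip}/48", strict=False)
    return str(net.network_address)

print(anonymize_ip("203.0.113.42"))  # "203.0.0.0"
```

With the identifying bits gone, the counter still answers "how many visitors?" without being able to answer "which visitors?", which is exactly why no banner is needed.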
It’s not the difficulty level that people object to.
It’s a combination of two things:
1) the law comes to the rest of the world from Europe. We (rest of the world) didn’t vote in the people who brought it. We’ve had quite enough of Europeans making rules for the rest of the world in the past few centuries thank you very much.
2) GDPR encodes an expectation that may or may not be common in the EU, but certainly isn’t common elsewhere. I don’t have any expectation of privacy when I walk in public or when I give any information at all to a business. My solution to this is: a) I wear pants outside, and b) I don’t give out private information. Whether the business ecosystem knows their age and purchasing patterns is largely immaterial to virtually everyone I’ve ever met.
And don’t show me a survey showing people don’t like it - if you prime people with the question, of course they will respond that way. They know their info is being gathered, and they just don’t think it’s as big a deal as GDPR would like it to be.
I see your point, but then to have a constructive conversation Paul Graham should also give his two cents about how the law could be improved. I don't know him, so I'll ask here: did he do that?
The outcome would be much better if the law explicitly stated that the initial cookie banner must have a one-click "Necessary cookies only" opt-out option. And that this option must mean truly necessary, not "necessary" in the way Internet Explorer was once "needed by the operating system".
I’m not surprised. This is a “hot take-centric” platform issue, and a laziness in trying to understand him too. Or.. two people on the street yelling at each other but not listening.
Part of what it means to be "good at regulation" is to anticipate the likely consequences of regulations. So a regulation that says that "businesses must now give away their products for free, unless they honk each customer's nose" will result in a lot of sore noses.
Which is basically the case here. Almost all websites make money through ads, or at least keep logs of user activity to help them optimize their website, and that's not going to change, so the EU's boneheaded regulations make the customers suffer a little extra.
Only if you maintain your own ad inventory, instead of using Google/Facebook ads like 90% of online advertisers do. And neither of those platforms work without installing their scripts on your site.
Correct me if I'm wrong, but aren't IP addresses considered "personal information", and therefore isn't collecting them "tracking" under the GDPR?
Your point is well made, and this is an unfortunate consequence of the regulation (and I enjoyed the analogy). But it isn't necessary to have cookie banners on every website. Github is a moderately complex, user-optimised website, right? https://github.blog/2020-12-17-no-cookie-for-you/
The EU regulation does not prevent ads from being shown, it specifically targets tracking. No tracking > no banner > everyone is happier > go ahead and show all the ads that are required.
And all that tracking comes down to an inability to take risk on the business side. The ad company wants to be 100% sure that ads are shown to humans, and to pay only for those shown to humans (going deeper: to specific cohorts of humans, which in the past was approximated by the content of the site showing the ads).
Whereas sites serving ads want to extract as much money as possible from advertisers based on their audience counts.
Without tracking, both sides have an incentive to one-up each other: hosts by inflating visitor numbers, advertisers by disputing them.
In a perfect world, ad companies (ads wouldn't exist in a perfect world, I know, but bear with me) would pay X per month for a site with Y visitors, where X depends on Y. No need for tracking, and over multiple sites and multiple months it roughly averages out.
Not enough conversions (the risk for the ad company: they may be overpaying)? Offer a lower rate per visitor next period.
Site gets a spike in visitors (the risk for the host: they may be undercharging)? Report a higher estimated Y for the next period.
What we got instead is an insane tracking infrastructure that costs both sides far more than any profit it could possibly generate. It's not even about profit - it's about avoiding being 'scammed', avoiding risk.
Remember that all that tracking bullshit started before targeted advertising was mainstream and widespread. It all started with bots and inflated click numbers, and inability to accept risk.
Tl;dr banning targeted advertising won't remove all tracking bullshit
A site can keep logs of user activity to help optimize without tracking my personal data. As soon as a company needs to track me, it's doing more than "optimizing its website"—it's using my data to sell me stuff or selling my data to third parties. And I'm glad it needs permission to do those things.
What if I want to optimize my site for certain classes of users? Say, less-abled users. What if I want to make my product easier to use with the various control schemes used by handicapped people, and my product is so complex that tracking this demographic's usage of it is the easiest or only way? And what if there is no intent to sell your data?
I could ask permission and delay, or I could just capture the data and run experiments or A/B tests. You should also learn that nobody knows everything: saying something isn't required is usually just showing your own ignorance, since in almost every case you come across you will find at least one valid use.
Note that this isn't a cookie law, it's also the EU's main anti-malware law. The principle is that no piece of third-party controlled software should write information to your computer/phone, or read info from it, over the Internet, without your prior informed consent (with narrow exceptions for storage/reads that are needed to provide a service you've asked for, or equally narrow functions like load balancing). This isn't just about browser cookies, but also your webcam, your mic, and the contents of your Documents folder.
The principle seems sound, but the EU is deadlocked over reforms to create some extra exemptions, e.g. for security scans/mandatory updates, or privacy-respecting audience metrics. EU regulators are already sort of turning a blind eye to those, so it's fair to say the EU isn't great at regulating - it's not fixing what society mostly seems to see as bugs/overreach in the original (now decades-old) law.
> Note that this isn't a cookie law, it's also the EU's main anti-malware law. The principle is that no piece of third-party controlled software should write information to your computer/phone, or read info from it, over the Internet, without your prior informed consent
So it is the responsibility of the browser vendor to implement this.
No more so than the OS itself. The real responsibility lies with the people causing the remote access (e.g. the website operator, the remote hacker, etc.).
The cookie banners aren't from the browser; they're really from the site.
That said, it seems fair to require the browser vendor to implement it. The browser is the one that exposes a method to store data on the machine (ex. Cookies, LocalStorage) so it seems fair that they should know the user wanted data to be stored.
The standard for browsers was called Do Not Track [0], but of course adtech killed it. There is another one now, but unless it is mandated by law or the courts it won't go anywhere. Note that there seems to be a court decision upholding DNT as a rejection of consent [1], but the signal would have to be much more powerful and broadly adopted to work.
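For what it's worth, honoring these signals server-side is trivial. The sketch below is hypothetical policy code, not any site's actual implementation; the `DNT` (Do Not Track) and `Sec-GPC` (Global Privacy Control) request-header names are real, but the consent-cookie convention is made up for illustration.

```python
# Sketch only: a hypothetical server-side check a site could run before
# loading any tracking scripts, assuming request headers arrive as a dict.

def tracking_allowed(headers, consent_cookie=None):
    """Return True only when no opt-out signal is present AND the user opted in."""
    if headers.get("DNT") == "1":        # browser sent Do Not Track
        return False
    if headers.get("Sec-GPC") == "1":    # browser sent Global Privacy Control
        return False
    # Absent any browser signal, still require an explicit opt-in (here, a
    # hypothetical previously stored consent cookie) rather than assume consent.
    return consent_cookie == "granted"
```

A site taking this approach would never even need to render a banner for users whose browser already says "no".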
In this case, it is just showing that most companies are collecting more data than they need.
You don’t need a banner for the data that is necessary for the service to work at minimum level. There is no role for the consent since the site won’t work otherwise.
This is something a lot of people seem to misunderstand about GDPR. At its core it says you should only process people’s personal data within a lawful basis. There are 6, and consent is only one.
(a) Consent: the individual has given clear consent for you to process their personal data for a specific purpose.
(b) Contract: the processing is necessary for a contract you have with the individual, or because they have asked you to take specific steps before entering into a contract.
(c) Legal obligation: the processing is necessary for you to comply with the law (not including contractual obligations).
(d) Vital interests: the processing is necessary to protect someone’s life.
(e) Public task: the processing is necessary for you to perform a task in the public interest or for your official functions, and the task or function has a clear basis in law.
(f) Legitimate interests: the processing is necessary for your legitimate interests or the legitimate interests of a third party, unless there is a good reason to protect the individual’s personal data which overrides those legitimate interests.
> You don’t need a banner for the data that is necessary for the service to work at minimum level.
We were advised by our lawyers (a top SV tech law firm) that we should include a cookie banner in the EU even if we're only using cookies for functions like login. After eventually switching legal counsel (for unrelated reasons), we were told the same thing by our new counsel.
Either EU law requires cookie banners even for routine functional cookies, or it's so (deliberately) vague that even top tech law firms would rather everyone add a banner than risk running afoul of the law. Either case validates PG's argument here.
I would put it another way. Any legislation against doing something is almost always motivated by someone's desire to do that very thing. Legislation is usually a battle of interests in which the legislator, ideally, wants to protect the overall interests of the public when they conflict with narrower private interests. When the narrower interests belong to powerful groups, you often expect to see some struggle, and if the private interests have a way of making the regulation seem more intrusive and annoying than the harm it's intended to prevent, they will take advantage of that to sway the public in their favour.
So legislators do expect such a struggle, and the shape it takes may be partly their fault, but it's clearly not all their fault. The more power the private interests have, the more likely they are to find some way to fight the regulation. They will certainly do everything they can to convince the public that the legislators are bad at regulation.
In this particular case, however, websites showing banners are also harming themselves as their competitors now have an interest in not showing banners and offering a better experience -- i.e. the regulation makes it worthwhile not to display banners in competitive situations. So we'll see how this all turns out.
No, people cannot escape responsibility by saying "the law made me be belligerent toward my users". It is an intentional choice to use cookies and to make the experience unpleasant for people.
> No, people cannot escape responsibility by saying "the law made me be belligerent toward my users".
Correction: people should not be able to escape responsibility by saying this.
The problem is that right now people do escape responsibility by saying this, because the EU is not properly enforcing these new laws.
Introducing a law and then not enforcing it has consequences, and those consequences should have been foreseen. Either the law is unenforceable due to practical constraints, in which case it's a bad law, or the EU is failing to enforce it due to inability.
Hopefully the EU starts putting more focus on enforcing its existing laws rather than creating new ones.
And the use of cookies itself doesn't demand these banners, nor that they be so obstructive. Just don't collect unnecessary cookies or PII, or put up a prominent banner that doesn't get in the way of the site's purpose.
I guess PG's original tweet assumes that cookie banners are a) bad, b) the fault of the EU, and c) unanticipated and unintended by the EU, thereby demonstrating its incompetence.
I can't really comment on what the lawmakers foresaw or intended, but I'd argue that cookie banners are actually a) good, and b) the fault of companies who can't imagine a better way to treat their users.
The reason I think they're good is that they cause a psychological nuisance to users of software whose makers don't go out of their way to implement the banners well or avoid their necessity. Over time I hope this will create an association in users' minds that sites with cookie banners are somehow seedy or unscrupulous, like pop-up ads.
It's impossible to foresee everything, including the amount of malicious compliance.
In the end we are better off with this legislation and its future iterations and additions than we are without it. The extent to which people's data is misused is simply ridiculous.
Yes and no? To some extent, sure. But if companies or people went out of their way to comply with only the letter of a law, clearly not its spirit, are you really responsible for that? Or are they, because they're doing everything they can to avoid complying? As an example:
Let's say you make a law to reduce working hours from 40 to 37, except in "emergency situations". Now a company forces employees to sign off on "emergency situations" every week or be fired. They're clearly not complying with the spirit of the law. Is it really your fault when you make a law like that? I'd say only to some degree; the people trying to abuse every loophole are much more responsible in this case.
Companies using dark patterns, hiding the "reject all" option behind an additional click (which is even illegal), and trying to collect every bit of data possible are much more responsible than the EU's law. Oftentimes they are collecting data just because, without even thinking about it, because they'll add GA to their WordPress site without a second look. That cookie banners have become the standard around the web is sad, because it just shows how much everyone is trying to track you.
It's not enough to write a law on principles alone. It must be clear and practical to comply with, and clear how it will be enforced. The EU should not have created a situation where the most practical solution for thousands of companies is a cookie banner.
The thing is, the consequences seem to be very much intended. The consequences of forcing companies to be transparent about tracking, and hopefully letting the users start voting with their wallets as they get annoyed by the omnipresent "We Value Your Privacy"-popups (which is very ironic considering all the dark patterns et al that are used to have users get tracked).
If nothing else, at least now people know just how much they've been tracked. One can only hope that this increased consciousness would help people to choose services that don't track people. For example Hacker News doesn't need tracking cookies nor a cookie popup, and it seems to be doing just fine, even in terms of the law ;)
I would expect that most companies would be ashamed to publicly state that they sell your data to hundreds(!) of data providers and they would fix this before they had to disclose it. But nope, apparently the money is too good. And blaming the government is more convenient.
On the one hand, when murders go up because you made using a gun in a crime an automatic 5-year prison term, you should have foreseen that possible outcome; on the other hand, the real bad guy is the one shooting the witnesses.
I think pg is talking about the advertising banners, and yes: congratulations, EU, you have ruined our web experience to the benefit of the even worse tracking that mobile applications do.
I think the bigger issue here is that this law did not fix anything, destroyed what little EU online advertising business existed, and focused on the wrong thing. For starters, the European people did not ask for this law; they have bigger problems. It was campaigned for by specific German interest groups, about whose concerns most EU citizens are indifferent. Ad tracking is and was not a concern for the vast majority of EU citizens (who, again, were never asked about this law). Internet and social media addiction, however, IS an issue that most citizens have, and the EU has spent so much energy and capital on this pointless cookie-banner issue that it doesn't have more to spend on solving the addiction issue. Premature legislation always does that, and the worst part is that there will never be accountability for such wrong decisions. The people who inspired the legislation never stand in any kind of election, and the upcoming MEP elections have nothing to do with EU politics and everything to do with domestic politics (show me a country where MEP election results are not considered a proxy for national elections).
But it doesn't matter how many times someone points out the political misalignments; there is no mechanism to change them until something really grave happens, when it will be too late.
A personal anecdote: I was charged with adding a cookie banner to my company’s website after having successfully resisted having one for many years. The reason given to me by the new owners of the business was that the marketing department wanted to try some new stuff, and the lawyers told them that it required consent on the part of our users. I was also told that I shouldn’t spend a lot of time on this, and should therefore use an off-the-shelf product (OneTrust) and not customize it in any way. When I remarked that the default texts for the banner sounded very scary and implied that we did a lot of things we weren’t actually doing, I was told to leave them unchanged, because we had to assume they had been vetted by (OneTrust’s) lawyers, and that it would be too legally risky to change them. My argument that OneTrust’s offering was a one-size-fits-all that had to be compliant with the sleaziest, most ad-tech-compromised media sites out there, but that we were not that, failed to make an impression.
A couple of observations:
1. Players like OneTrust, and the consultants who specialize in this, are highly incentivized to play up the risks of non-compliance. My layman’s estimation is that the legal risk for good-faith actors is actually pretty low. If the authorities find that you are not in compliance, you will most likely get a chance to rectify it, and possibly a slap on the wrist. Those scary fines measured in percent of global revenue are not what you will face for an honest mistake.
2. Those businesses that rely on invasive tracking, and therefore really must use these banners, benefit from everyone else mistakenly believing that they too must compromise their UX with them. It makes what they do seem normal and acceptable.
I've seen this attitude from tech people, too, so it's not just a matter of tech ignorance or illiteracy.
The proper way to have done this would have been to go to the W3C or WHATWG and propose an extension to HTML for sites to define an opt-in manifest or something similar.
So now we have a situation where providers have been trained to game the GDPR in such a way that they will never have a problem, no matter what they actually do with the data.
And consumers are pissed because they are made to sign things which essentially reduce their rights...
And if someone (like me) thinks the EU did a half-assed job there, the downvotes rain in.
I call upon all German users of this website to write to their MPs! Obviously the German civil service is a bad actor! The German deep state is plotting to discredit our beloved eurocrats and must be shut down! Drain the swamp!
So yes, I do blame the government as I would be fine returning to the prior state.
Why do I need to be "consuming corporate propaganda" when I just hate that I need to dismiss banners on every news website, when I didn't have to before the regulation?
I don't care about being tracked. But now that all websites need to cover their asses in response to regulation, I'm forced to figure out which button I need to click on to read content, and these websites don't even appear to save my preferences whether I agree to be tracked or not.
Objectively, the outcome of this regulation is that my experience is worse. Are the companies bad actors? Sure! Sounds like the EU should account for companies' bad behavior instead of forcing the internet to be more annoying.
Imagine you walk into a restaurant and they hand you a paper that details full allergy information for all of the foods they serve, and then they wait for you to say, "I consent to these ingredients being in the food," before they can seat you. I think that's a closer analogy. We can all agree that the restaurant shouldn't hide that information from you, and that some minority of people might want the information, but do we really have to add this inconvenient step to the process for all people? The current real-world system, where allergy information is available upon request, was working fine.
There are some things that everyone cares about and would be appalled by, that businesses should have to inform people about, and many things that a small minority of people care about. Why stop at cookies? Maybe we should mandate a popup if the website's server infrastructure was manufactured abroad, and another popup if the company that runs the website has higher than average carbon emissions, and another popup if the food in the food court that serves the headquarters of the company that runs the website is not kosher. The lobby of people who care about cookies is of similar size to the lobby of people who care deeply about binary size and about running JavaScript. Should there be mandatory popups to execute JavaScript? If the website is >10MB, should I have to consent on a lightweight page before downloading it? How do you determine which activities warrant a popup warning and which do not?
There’s a law in California that says that businesses which have chemicals that might cause cancer on the premises need to let people know. That’s great but the levels they set turn out to be lower than what you can feasibly test for and as a result all properties pretty much just put up the signs that say “there might be chemicals here”. The warning is useless and annoying because of market forces which is another way of saying the law incentivized the behavior that occurred.
For data-harvesting companies users are like livestock, and nobody cares about livestock's opinion. It only matters how much value can be extracted from users, even if it's annoying, misleading, and relies on dark patterns.
The further we take this analogy, the more strained it becomes.
Yes, it's natural to use a cookie to track a session; this is a mechanism invented for that purpose. It's much less natural to share this tracking information with third parties, especially along with a record of your purchases or other interesting actions.
But ad revenue is much harder to obtain without targeting and thus tracking. And a lot of places depend mostly on ad revenue.
This is another case of the "buy now, pay later" pattern, stretched to "take for free now, pay with your privacy later". Funnily enough, many people don't value the information they get on ad-supported sites as highly as the marketers paying to grab their attention do, so simply compensating by adding a subscription or one-time payment to go ad-free sometimes doesn't even work; the more generic or "doom-scrollable" the content is, the worse it works.
[1] https://github.blog/2020-12-17-no-cookie-for-you/
It is not about cookies.
Not only that: I'm not an EU citizen and I'm not browsing websites based in the EU, but I'm still bombarded with cookie banners non-stop.
Middle managers absolutely love anything with charts and graphs because it makes their decisions feel more scientific. That's why they want tracking software included on their websites. And if the law requires disclosure then a cookie popup is the solution.
Again, that's the fault of the companies putting those up. They could make data collection opt-in; they could just put a small notice in the footer with two simple links, "Accept all" / "Reject all". But they chose not to. They decided to pester you with those banners as annoyingly as possible to make you have exactly the reaction you're having.
On your second point, that is again a choice made by said companies, not a problem with the law. The GDPR has proven very well that if they cared, they could segment who is affected and who is not; and not just big tech: lots of random local news sites and the like are doing it just fine.
So again, you're aiming at the wrong culprit.
What I think we are missing is a browser option/API that lets the user choose the acceptable tracking level. Similar to the do not track header but more fine grained.
As we are missing that, extensions are doing a good job ATM
https://chromewebstore.google.com/detail/consent-o-matic/mdj...
https://addons.mozilla.org/ro/firefox/addon/consent-o-matic/
I found out about Consent-o-Matic pretty late, and it has saved me a ton of time handling banners. It's exactly what we should have built into the browser.
- it's not really a user choice when some browsers set it by default, and sites therefore ignore it
- it's set globally for a browser but a user might want to give away their privacy to my specific site
... and show the banner anyway
Why not a real regulation then to get rid of hidden fees and heavy fines/jail time for companies that are found to be doing it?
PG's argument (I hope) is that there is no point in talking about "regulation" and "customer protection" if companies STILL get away with their ridiculous and hostile practices.
There is no customer benefit in user data collection and tracking. Companies do it only to exploit you. Even the usual BS excuses ("oh, we need user data to customize the experience") could be satisfied completely on-device.
I don't want regulatory bodies to just give companies more hoops to jump through. They will jump through them anyway, because it is profitable to do so. What I want is for regulatory bodies to effectively stop predatory practices.
We're a pathetic lot.
The reality is that I (and others who are complaining, as well as many who have resigned themselves to their fate) am happy to have a website "track me", certainly if the cost of non-tracking is having to click away an annoying popup, and I think that people who compare a website wanting to know its visitor numbers to "hidden fees" are being kind of ridiculous.
Accept a few of these cookie prompts during the day and they'd be able to tell everything from where your kids go to school to the kind of prn you prefer to watch on weekdays, and everything in between.
I agree that wanting to know the number of visitors is benign and it is not abuse.
But saying companies should be allowed to track me (for whatever purpose) across the web without my consent is also pretty ridiculous.
https://arstechnica.com/tech-policy/2021/07/facebook-adverti...
Is counting visitors all that sites are doing with tracking info?
They're not selling it to ad brokers, insurance companies, governments? They're not matching your name, address, and phone number with your web activity (including sexual interests, "anonymous" embarrassing stories, health concerns, etc)?
Then you should doubly blame the companies, because that's what Do Not Track was for. They're the ones who decided to make it not work that way, with the result that it is ignored and not considered a valid signal under the law.
> think that people who compare a website wanting to know the number of their visitors to "hidden fees" are kind of being ridiculous.
You don't need a cookie for that, and what the GDPR has shown us is that we're not talking about that but about dozens or hundreds of trackers on every major site, so trying to frame it that way is disingenuous.
But I suppose that was just an example you picked to illustrate the industry's malicious compliance, and not the main point, in which case fair enough. :-)
If the total price of the website without the secret costs of tracking were presented upfront, it would be less of an issue.
We have other more precise words to describe that action. I asked ChatGPT what those could be, here's its answer:
People should have a default expectation that if they give their personal data to companies then it will be recorded. And if they don't want cookies then they should disable cookies. The EU's regulation hasn't revealed anything that is useful to know about.
KingOfCoders/amazingcto, of course you are technically correct but Paul Graham wasn't talking about the letter of the law.
Instead, you have to interpret his complaint through the lens of game theory, i.e. the Law of Unintended Consequences, which takes into account what companies actually do in response to laws instead of what we hope they will do.
Your blog post focused on good intentions of the law. PG's tweet focused on actual outcome.
It's more morally permissible for corporate decision makers to install a global surveillance complex than for civil servants to attempt to regulate it.
I don't know if this mini-competition between regulators and companies is truly zero-sum, there could be some way to get everyone something they want. But with the current regulation, it is zero-sum, and the companies are winning and the EU is losing. And the EU "works for you", so of course you can complain to them.
Unfortunately a non-negligible number of people in tech also have libertarian leanings, with a default “gubmint bad!” position, which makes them easy prey for adtech propaganda.
Good regulation is regulation that has good outcomes. If a law has bad outcomes it is a bad law. You can separately complain about what companies are doing but that doesn't change the fact that it's a bad law.
It is of course debatable whether GDPR as a whole has bad outcomes, but if we're talking about cookie banners in isolation then it certainly does.
The actual outcome is that they do want to track, and use adversarial patterns and malicious compliance to twist your arm and "force consent."
Paul Graham is still wrong.
No, you've inadvertently stated a contradiction. Your use of the word _"could"_ is literally a hope/wish/intention of the law.
In contrast, the actual outcome is that the companies didn't stop tracking. We _wish_ they would stop tracking. (I.e. "The companies _could_ just stop tracking us!") But that hope still doesn't change the observation of reality.
I'm a fan of second-order thinking and unintended consequences, so I'm with you there. How would you frame a "don't track people without consent" law without unintended consequences?
The article tries to make the point (and perhaps fails) that companies do this intentionally to get the "consent" of people against their will, thereby walking the fine line of breaking the law without technically breaking it.
Drop the consent requirement? I.e. just don't track people. No third-party cookies, first-party only, and only for the correct operation of the site.
It's not the cookies that people object to, it's the tracking. Tracking provides no benefits to visitors. If there were no tracking risk, there would be no need to require consent.
An idea could be that the tracking has to be opt-in AND the webpage cannot stop critical use of the page as part of the opt-in process.
Then another round of consequences.. rinse repeat...
- no fines for non-compliance (or malicious compliance)
- no legal liability for data leaks of PII
When businesses believe (correctly or incorrectly) that the benefit of tracking outweighs the cost (annoying users, regulatory noncompliance) they will do it. The fix is to make tracking too costly for businesses.
This would have a different issue (specifically, companies would no longer self-report data breaches), but it's just an idea. There are alternative approaches to "don't track people without consent" that aren't a toothless stick: make tracking more expensive.
That X button is right there at the top near the tab name. Not sure how a user could be forced against their will into staying on the site presenting them with a cookie banner.
Really, PG's tweet has little to do with game theory or anything else. It is a first-world-problem whinge about having to click through cookie banners. Assessing the "actual outcome" of complex regulation and legislation is a task beyond the scope of a single tweet.
It might be useful for Graham to determine what claim he is trying to make in the first place. Is he rebutting a particular EU representative for boasting about how good they are at regulation? Or is the idea that the EU shouldn't have the audacity to attempt to regulate in the first place?
Could we really have predicted from the "Law of Unintended Consequences" that companies would respond not by tracking less nor by giving people an easy way to opt out, but with a cookie consent popup that is not compliant and also really annoying to their visitors?
This is better explained by business operators being ignorant of the actual law and being ignorant of the UX impact.
Government can, and should, analyze likely (or unlikely) unintended consequences and use those to further shape the law, but at the end of the day, those consequences come from choices that people who are subject to the law make.
I think the big mistake the EU made is they probably thought: “Surely no company would choose to abuse their customers with horrible UI just because they don’t like the law and want to take their collective frustration out on their users!” The EU was obviously wrong about the extent to which companies would throw their users under the bus while maliciously complying.
Is there still tracking? Sure. But it's not so blatant anymore. There are hoops one needs to jump through. And that was the point - to make tracking harder.
None of my projects have cookie banners. Why? Because I use a first-party tracking system (Matomo), I anonymize all visits, and I respect DNT. It's that easy.
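A hypothetical sketch of that approach: honor the DNT header and anonymize IPs before any visit is logged. The function and field names here are illustrative, not Matomo's actual API.

```python
# Illustrative sketch only (not Matomo's real code): respect Do Not
# Track and strip identifying detail from the IP before logging.

def anonymize_ip(ip: str) -> str:
    """Zero out the last octet of an IPv4 address (a 1-byte mask)."""
    parts = ip.split(".")
    return ".".join(parts[:3] + ["0"])

def log_visit(headers: dict, ip: str, path: str, log: list) -> bool:
    """Record a visit only if the client hasn't opted out via DNT."""
    if headers.get("DNT") == "1":
        return False  # respect Do Not Track: log nothing at all
    log.append({"ip": anonymize_ip(ip), "path": path})
    return True

visits = []
log_visit({"DNT": "1"}, "203.0.113.42", "/pricing", visits)  # skipped
log_visit({}, "203.0.113.42", "/pricing", visits)            # logged, anonymized
print(visits)  # [{'ip': '203.0.113.0', 'path': '/pricing'}]
```

Because no individual is identifiable in the stored record and opted-out visitors are never logged, a setup like this plausibly needs no consent banner at all.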
It’s a combination of two things:
1) the law comes to the rest of the world from Europe. We (rest of the world) didn’t vote in the people who brought it. We’ve had quite enough of Europeans making rules for the rest of the world in the past few centuries thank you very much.
2) GDPR encodes an expectation that may or may not be common in the EU, but certainly isn’t common elsewhere. I don’t have any expectation of privacy when I walk in public or when I give any information at all to a business. My solution to this is: a) I wear pants outside, and b) I don’t give out private information. Whether the business ecosystem knows their age and purchasing patterns is largely immaterial to virtually everyone I’ve ever met.
And don’t show me a survey showing people don’t like it - if you prime people with the question, of course they will respond that way. They know their info is being gathered, and they just don’t think it’s as big a deal as GDPR would like it to be.
Which is basically the case here. Almost all websites make money through ads, or at least keep logs of user activity to help them optimize their website, and that's not going to change, so the EU's boneheaded regulations make the customers suffer a little extra.
> Almost all websites make money through ads

Doesn't require tracking of individuals.
> or at least keep logs of user activity to help them optimize their website
Doesn't require tracking of individuals.
Only if you maintain your own ad inventory, instead of using Google/Facebook ads like 90% of online advertisers do. And neither of those platforms work without installing their scripts on your site.
Building a house doesn't require power tools, but if your company tries to do it with hand tools we'll see who goes bankrupt first.
The EU regulation does not prevent ads from being shown, it specifically targets tracking. No tracking > no banner > everyone is happier > go ahead and show all the ads that are required.
The incentives are on both sides to one-up each other without tracking: hosts by inflating visitor numbers, advertisers by disputing them.
In a perfect world ads wouldn't exist, I know, but bear with me: ad companies would pay X/month for a site with Y visitors, where X depends on Y. No need for tracking, and over multiple sites and multiple months it roughly averages out.
Not enough conversions (risk for the ad company: they could pay less)? Offer a lower rate per visitor next period. Site gets a spike in visitors (risk for the host: they could charge more)? Report a higher estimated Y for next period.
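A toy model of that flat-rate scheme, with made-up numbers: the advertiser pays a per-visitor rate on the host's estimated visitor count, then nudges the rate up or down each period based on how conversions came in. Every rate and threshold below is invented for illustration.

```python
# Toy sketch of the flat-rate pricing idea (all numbers invented):
# pay rate * estimated_visitors per period, no per-user tracking,
# and adjust the rate from aggregate conversion counts.

def next_rate(rate: float, conversions: int, visitors: int,
              target_cvr: float = 0.02) -> float:
    """Lower the rate if conversions undershoot the target, raise it otherwise."""
    cvr = conversions / visitors
    if cvr < target_cvr:
        return round(rate * 0.9, 4)  # ad company's lever: pay less next period
    return round(rate * 1.1, 4)      # host delivered value: rate goes up

rate = 0.05                           # dollars per estimated visitor
payment = rate * 10_000               # host reports Y = 10,000 visitors
print(payment)                        # 500.0
print(next_rate(rate, 120, 10_000))   # 0.045: 1.2% conversions undershot the 2% target
```

Only aggregate totals (visitors, conversions) ever change hands, so neither side needs to identify any individual user; each side's cheating risk is priced into next period's rate instead.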
What we got instead is an insane tracking infrastructure that costs way more than any possible profit gained for both sides. It's not even profit - it's avoiding being 'scammed', avoiding risk.
Remember that all that tracking bullshit started before targeted advertising was mainstream and widespread. It all started with bots and inflated click numbers, and inability to accept risk.
Tl;dr banning targeted advertising won't remove all tracking bullshit
I could ask permission and delay, or I could just capture the data and run experiments or A/B testing. You should also learn that nobody knows everything, and saying something isn't required usually just shows your own ignorance, as in almost every case you come across you will find at least one valid use.
How dare they!
</sarcasm>
Just because you made money off it doesn't mean it is right.
The principle seems sound, but the EU is deadlocked over reforms to create some extra exemptions, e.g. for security scans/mandatory updates, or privacy-respecting audience metrics. EU regulators are already sort of turning a blind eye to those, so it's fair to say the EU isn't great at regulating - it's not fixing what society mostly seems to see as bugs/overreach in the original (now decades-old) law.
So it is a responsibility of the browser vendor to implement this.
The cookie banners aren't from the browser; they're really from the site.
That said, it seems fair to require the browser vendor to implement it. The browser is the one that exposes a method to store data on the machine (ex. Cookies, LocalStorage) so it seems fair that they should know the user wanted data to be stored.
[0] https://en.wikipedia.org/wiki/Do_Not_Track
[1] https://dig.watch/updates/german-court-affirms-legal-signifi...
https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=celex%3A...
You don’t need a banner for the data that is necessary for the service to work at minimum level. There is no role for the consent since the site won’t work otherwise.
(a) Consent: the individual has given clear consent for you to process their personal data for a specific purpose.
(b) Contract: the processing is necessary for a contract you have with the individual, or because they have asked you to take specific steps before entering into a contract.
(c) Legal obligation: the processing is necessary for you to comply with the law (not including contractual obligations).
(d) Vital interests: the processing is necessary to protect someone’s life.
(e) Public task: the processing is necessary for you to perform a task in the public interest or for your official functions, and the task or function has a clear basis in law.
(f) Legitimate interests: the processing is necessary for your legitimate interests or the legitimate interests of a third party, unless there is a good reason to protect the individual’s personal data which overrides those legitimate interests.
Thankfully we have EU institutions to protect us from these evil companies. But somehow the EU institution websites all have cookie banners too.
We were advised by our lawyers (a top SV tech law firm) that we should include a cookie banner in the EU even if we're only using cookies for functions like login. After eventually switching legal counsel (for unrelated reasons), we were told the same thing by our new counsel.
Either EU law requires cookie banners even for cookies used for routine functionality, or it's so (deliberately) vague that even top tech law firms would rather everyone add a cookie banner than risk running afoul of the law. Either case validates PG's argument here.
So legislators do expect such a struggle, and the shape it takes may be partly their fault, but it's clearly not all their fault. The more power the private interests have, the more likely they are to find some way to fight the regulation. They will certainly do everything they can to convince the public that the legislators are bad at regulation.
In this particular case, however, websites showing banners are also harming themselves as their competitors now have an interest in not showing banners and offering a better experience -- i.e. the regulation makes it worthwhile not to display banners in competitive situations. So we'll see how this all turns out.
Correction: people should not be able to escape responsibility by saying this.
The problem is that right now people do escape responsibility by saying this, because the EU is not properly enforcing these new laws.
Introducing a law and then not enforcing it has consequences, and those consequences should have been foreseen. Either the law is unenforceable due to practical constraints, in which case it's a bad law, or the EU is failing to enforce it due to inability.
Hopefully the EU starts putting more focus on enforcing its existing laws rather than creating new ones.
I can't really comment on what the lawmakers foresaw or intended, but I'd argue that cookie banners are actually a) good, and b) the fault of companies who can't imagine a better way to treat their users.
The reason I think they're good is that they cause a psychological nuisance to users of software whose makers don't go out of their way to do the banners well or avoid their necessity. Over time I hope this will tend to create an association in users' minds that sites with cookie banners are somehow seedy or unscrupulous, like pop-up ads.
In the end we are better off with this legislation and its future iterations and additions than we are without it. The extent to which people's data is misused is simply ridiculous.
Let's say you make a law to reduce working hours from 40 to 37 hours except in "emergency situations". Now a company will force employees to sign off on "emergency situations" every week or be fired. They're clearly not complying with the spirit of the law. Is it really your fault when you make a law like that? I'd say only to some degree; the people trying to abuse every loophole are much more responsible in this case.
Companies using dark patterns, hiding the "reject all" option behind an additional click (which is even illegal), and trying to collect all the data they possibly can are much more responsible than the EU's law. Oftentimes they are collecting data just because, without even thinking about it: they'll add GA to their WordPress site without ever looking at it, or whatever. That cookie banners have become the standard around the web is sad, because it just shows how much everyone is trying to track you.
If nothing else, at least now people know just how much they've been tracked. One can only hope that this increased consciousness would help people to choose services that don't track people. For example Hacker News doesn't need tracking cookies nor a cookie popup, and it seems to be doing just fine, even in terms of the law ;)
I think the bigger issue here is that this law did not fix anything, destroyed what little EU online advertising business existed, and focused on the wrong thing. For starters, the European people did not ask for this law; they have bigger problems. It was campaigned for by specific German interest groups to which most EU citizens are indifferent. Ad tracking is/was not a concern for the vast majority of EU citizens (who, again, were never asked about this law). Internet and social media addiction, however, IS an issue that most citizens have, and the EU has spent so much energy and capital on this pointless cookie-banner issue that it doesn't have more to spend on solving the addiction issue. Premature legislation always does that, and the worst part is, there will never be accountability for such wrong decisions. The people who inspired the legislation are not up in some kind of election, and the upcoming MEP elections have nothing to do with EU politics and everything to do with domestic politics (show me a country where MEP election results are not considered a proxy for national elections).
But it doesn't matter how many times someone points out the political misalignments; there is no mechanism to change that until something really grave happens, when it will be too late.
A couple of observations:
1. Players like OneTrust, and the consultants who specialize in this, are highly incentivized to play up the risks of not being compliant. My layman's estimation is that the legal risk for good-faith actors is actually pretty low. If the authorities find that you are not in compliance, you will most likely get a chance to rectify this, and possibly a slap on the wrist. Those scary fines measured in percent of global revenue are not going to be what you face for an honest mistake.
2. Those businesses that rely on invasive tracking, and therefore really must use these banners, benefit from everyone else mistakenly believing that they too must compromise their UX with these banners. It makes what they do seem normal and acceptable.