IANAL, but it is my understanding that those arcane menus are actually breaking the law (with respect to the GDPR in the EU, at least). The default position should always be "No", and the "exit route" for anyone who opted in should also be really, really easy. Like single-click easy. But business gonna business and bend the rules, I guess.
The laws have grown muddy due to lack of enforcement. The notion that there is such a thing as opting in creates a huge gray area, and that has made law enforcement hesitant to open any cases as they are likely to take a lot of time, and probably won't lead to any big victories.
The only compliant way to show tracking ads is to first ask the user for permission, where the default in case of non-response must be NO. You can't have it default to on, and you can't condition access to the site on accepting non-essential cookies (where "essential" means essential for actual site function, not "essential because our business model means tracking ads keep the servers running").
Once this is enforced properly, the web will be a better place. Right now it's a mess of cookie banners with no real function, that people just click OK to.
Which simply can't happen. There's a reason websites aren't manually indexed. And that same reason means you can never fully enforce these laws. What you can do, though, is score political points for selectively applying them to large unpopular players. It's always going to be a mess everywhere else, though.
The amount of stupidity surrounding this legislation is appalling. The obvious solution could have been:
- Use, or create something similar to, the ill-fated Do Not Track header for anonymous users. The user decides on their stance regarding privacy, their machine makes it known to the server, and the server acts according to the user's wishes. No further user interaction required.
- Decide what to do if the user authenticates. Use the method above, or offer a more granular way to control privacy settings, configured by the user in their account settings.
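The header-based approach above is tiny to implement on the server side. As a minimal sketch (the function name and headers-dict shape are illustrative, not any particular framework's API), the important part is the default: a missing or malformed header means no tracking.

```python
def tracking_allowed(headers):
    """Decide whether to run tracking for a request, honoring a
    DNT-style header. Absence of the header is treated as "no
    tracking" -- the privacy-preserving default argued for above."""
    value = headers.get("DNT")  # "0" = user allows tracking, "1" = do not track
    return value == "0"
```

With this default, a server never needs to show a consent popup: silence is a "no".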
Though that would basically kill analytics. The reasonable default for such a header is no tracking. The controversy around the existing DNT header, and the attempts of at least Microsoft to set it to "no track" by default, is "enlightening": https://www.fastcompany.com/90308068/how-the-tragic-death-of...
The legislation proposed the most obvious solution:
- just don’t collect anything, don’t even require cookies
From there, every significant actor in the industry, short of Firefox and Apple (perhaps Microsoft?), just went looking for other ways, workarounds, anything to keep their business as close as possible to what it is now.
That goes as far as completely cutting off whole swaths of users just to avoid bending to the rules.
Blame the players cheating and conspiring to bend the game, and ignoring or weakening any attempt to limit their reach.
I agree the cookie law was a failure. The EU does too, so it's being revised. The GDPR is a different matter altogether, it doesn't specify technical solutions at all, and covers all personal data, not just tracking.
Agreed, there should be much stricter rules around tracking.
And instead of the cookie popups it would have been much better to solve it the same way geolocation and notification permissions work: the browser asks the user with an integrated dialog, and the user can set it to not even show that prompt in the settings. If the browser doesn't support that feature, the website has to assume the user doesn't consent.
Well, publishers and advertisers made it a joke, by spending the last ten years trying to extort inappropriate consent from users (and bothering them as much as possible in the process).
And regarding the option to set your preferences at the browser level: of course this is the best possible solution, but if you are following the ePrivacy Regulation discussion, Article 10 (permitting browsers to obtain consent for websites through a standardized interface) is pretty close to getting killed. Money talks.
The user experience should improve significantly once the new EU ePrivacy regulations take effect; these will make browser settings (for example: 'accept first-party cookies, reject third-party cookies') the source of consent. There's a decent summary here: https://www.i-scoop.eu/gdpr/eu-eprivacy-regulation/#The_EU_e...
Perhaps this will lead some advertisers to attempt sketchy things like server-side application integration so that their cookies 'appear' to be first-party; either way, the policy has teeth and can apply fines the same way GDPR can, so any advertisers (or services themselves) found to be storing cookies without consent which are not strictly for site functionality may find themselves in hot water.
I'll be willing to bet that less scrupulous marketers who make a decent chunk of their revenue from users who they mislead into clicking / purchasing goods (i.e. targeting less skeptical users) will also attempt to get their audiences to lower their cookie settings. Think banners with content such as 'to get access to this special deal, we need you to update your settings'.
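The first- vs third-party distinction those browser settings rely on boils down to comparing registrable domains. A naive sketch (using a last-two-labels heuristic; real browsers consult the Public Suffix List, so e.g. `example.co.uk` would break this):

```python
def registrable_domain(host):
    # Naive eTLD+1: keep the last two labels. Real implementations
    # use the Public Suffix List to handle things like "co.uk".
    return ".".join(host.split(".")[-2:])

def is_third_party(page_host, cookie_host):
    """A cookie is third-party when its registrable domain differs
    from the page's -- the case 'reject third-party cookies' blocks."""
    return registrable_domain(page_host) != registrable_domain(cookie_host)
```

This also shows why the "make cookies appear first-party" trick works: once the tracker is served from a subdomain of the site itself, the check above can no longer tell it apart.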
Note that it's completely possible to build rich web applications that don't use any cookies at all, especially nowadays with localStorage and all the infrastructure for progressive web applications.
It's also worth noting that cookies were controversial when they were originally introduced - it's not like they're some fundamental infrastructure that we've always relied upon. Here's some privacy and cookie advice from 1998, for example: https://web.archive.org/web/19980210083135/http://internet.j...
The whole thing looked a bit strange to me. First the interview started with a basic presentation of who Edward Snowden is and what he did... at the Web Summit.
His talk brought nothing new if you follow him; maybe it brought attention to the issues for a wider public (it was all over the news in Portugal).
On the other hand, I feel his presence could serve to whitewash the whole thing. How many companies represented there would be out of business if they embraced Snowden's beliefs?
Tracking needs to be illegal. Period. As long as it's legal, even conditionally, we will be playing whack-a-mole. There's a helluva lot of trash on the internet that's there only because somebody is trying to game the advertising industry.
If you allow conditional tracking, and invite workarounds, we are in a race to the bottom. Ethical players are caught in a position of play dirty or die. When growth/profits sag a bit, they will have no riposte to the board member who questions their unwillingness to engage in cutting-edge gamesmanship to get around the law.
On the other hand, if you make it strictly illegal, and make the penalty an existential threat to the company, then everybody can play a fair game. The board member who pressures the company to do such things will be putting the company at risk, and the ethical folks have their response.
Advertisers don't need to track people. They don't need our personal information.
Case in point: Alphabet just bought Fitbit. Fitbit knows your most intimate details. They know when you sleep, when you are awake, where you are. They know when you go up and down stairs. I'm guessing they have a pretty good idea of when you make love, where you make love, and with whom (if you're both wearing one). And they just sold their data collection to a company that exists to exploit your data.
This needs to be stopped. Now.
Strongly agree. I don't care why they're doing it. That a voyeur is taking creepy photos of me in order to advertise better rather than to get off, or that someone's trying to record my conversations to advertise better rather than to hunt and kill dissidents, does not make those activities OK at all. They are still, per se, abhorrent.
The collected data is open to any imaginable abuse in the future, just as if the motivations had been as malicious as the actions themselves, because they (the big tech companies and the savvier small ones) default to collecting everything they can and never deleting it if they can avoid it. That means stopping this is also pressing for very real, very serious safety reasons.
Data protection suggests that it is okay to collect data in the first place, as long as it is going to be protected. But this is a problem. Data shouldn't be collected in the first place, because you probably can't protect it properly anyway (leaks etc.).
He's totally right imho, but to defend the GDPR at least a bit: It does have the concept of "consent" so that data should only be collected if people agree to it being collected. However, I still have my doubts that it works like that in reality for several reasons:
- Too many websites place tracking cookies first and then let you disable them
- Too many apps use some kind of analytics without any consent
- The GDPR has different mechanisms to give companies a "legitimate interest in collecting data". How this is enforced is kind of unclear.
- Fines have been issued under the GDPR in the past, but what's gone is gone. It may help that companies stop collecting more data after they've been caught, but you, as an individual, are still screwed.
So, the only way indeed is to stop SOME data collection in the first place and do it yourself. And you can certainly forget cookie banners and all that junk. The only things that work are:
- don't sign up to abusive services
- use tracking protection on the web (uBlock Origin)
- possibly use something like Pi-hole to prevent tracking for all your devices and apps
But of course, this doesn't stop data collection where you really have no choice but to agree to something.
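The Pi-hole approach in the list above amounts to DNS-level blocking: refuse to resolve domains on a blocklist, including their subdomains. A minimal sketch (the blocklist entries are illustrative):

```python
BLOCKLIST = {"tracker.example", "ads.example"}  # illustrative domains

def is_blocked(domain, blocklist=BLOCKLIST):
    """Return True if the domain, or any parent domain of it, is on
    the blocklist -- mirroring how DNS sinkholes match subdomains,
    so "metrics.tracker.example" is caught by "tracker.example"."""
    labels = domain.lower().rstrip(".").split(".")
    return any(".".join(labels[i:]) in blocklist for i in range(len(labels)))
```

Because this happens at the resolver, it covers every device and app on the network, not just browsers with an extension installed.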
The idea is that the fines are supposed to act as a deterrent, but unfortunately there have been a shitload of really bad practices stemming from the fact that many consultants have drawn the erroneous conclusion that pre-ticked opt-in boxes are legal. Fines were handed out in a few specific cases, but very few businesses know about this, and most of the ones that do claim it doesn't concern them for whatever reason. Furthermore, the Danish authorities set a very dangerous precedent when they refused to take on Google, with the sole reason that Google is too big and someone else has to do it.
But yes, until the situation has stabilized and someone has gotten very badly hurt from a financial perspective, the only sensible thing is to block non-HTML content by default and only whitelist the stuff you actually want to run.
> - Too many websites place tracking cookies first and then let you disable them
Under GDPR, unless the website owner can prove a legitimate interest (which is pretty hard to do and doesn't work for advertising), this is illegal.
> - Too many apps use some kind of analytics without any consent
Unfortunately, usage analytics can qualify as a legitimate interest, but the actual data used and the purpose matter. You might be able to drop a tracking cookie for improving your service, but AFAIK collecting health data, GPS location or other sensitive information won't fly for analytics without explicit consent.
And we are talking about first party analytics only. Having analytics cookies dropped as part of an advertising network without consent will not fly either.
> - The GDPR has different mechanisms to give companies a "legitimate interest in collecting data". How this is enforced is kind of unclear.
Not sure what you mean by "enforced", but there are rules for establishing a legitimate interest and just because companies claim they have it, doesn't mean that they actually do.
It's basically up to the data protection authorities to do their jobs. Give them some time, there are a lot of lawsuits already.
> How does this work in practice? Sites just say "we collect data, click OK to accept". Where's the option to say no?

That is an excellent question, because this is indeed how it works on many sites. However, it is actually wrong. The proper way would be to have users opt in.
In practice that's a violation of Art. 7 GDPR [0]. In particular, recital 42 [1] makes clear that consent can't be regarded as "freely given" if "the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment."

This was even reinforced with a court decision recently: https://www.technologylawdispatch.com/2019/10/cookies-tracki...

[0]: https://gdpr-info.eu/art-7-gdpr/

[1]: https://gdpr-info.eu/recitals/no-42/
That's not compliant with the GDPR, plain and simple. For example, ads on a website are not required for that site to work (even if they're the only revenue), so the site cannot store data solely to track users to show ads. The way a compliant site should work is that it can say:
"if you want to allow tracking cookies to get more relevant ads, you can do so in settings".
I.e. no must be the default in case of non-acceptance, and the site must still function.
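That rule — essential cookies always, tracking cookies only on explicit opt-in, and "no answer" treated as "no" — can be sketched as a tiny policy function (the cookie names are illustrative):

```python
ESSENTIAL = {"session_id"}           # needed for the site to function
TRACKING = {"ad_id", "analytics"}    # illustrative tracking cookies

def cookies_to_set(consent):
    """Return the cookies a compliant site may set. `consent` is
    True (explicit opt-in), False (declined), or None (no answer yet).
    Only an explicit True unlocks tracking; the site works either way."""
    allowed = set(ESSENTIAL)
    if consent is True:
        allowed |= TRACKING
    return allowed
```

The key design point is `consent is True`: both "declined" and "never answered" fall through to the essential-only default.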
> Data shouldn't be collected in the first place, because you can't probably protect it properly anyways (leaks etc.).
That's an absurd argument; any data can be leaked. Your bank data can be leaked (mine was), so should banks be banned from holding any personal data for that reason?
Of course. But not every service needs the same amount of personal data as a bank. Just think of all the apps that want access to your contacts, often for no particular reason.
Obviously, if fewer services collect data, the overall chance of lots of data being leaked goes down.
A technical solution is good, but it only protects the tech savvy. The GDPR gives us several additional guarantees compared to the technical solution:
- It protects everyone equally.
- It shifts the incentives towards not collecting, since the data becomes a liability.
- It empowers citizens to know what was collected, through the use of GDPR requests. It further allows citizens to know what that data was used for.
- It allows citizens to ask the collecting entity to remove any and all data they have about them.
- Finally, it allows legal recourse when data that was never supposed to get collected inevitably leaks.
So there are multiple things the GDPR does that a purely technical solution does not provide. As usual, the best protection is to have both. Used uBlock, didn't give consent, but still suspect foul play? Send the company a GDPR request, and if data was collected, ask for it to be deleted (and additionally report it to your GDPR regulator).
Also:
> - Too many websites place tracking cookies first and then let you disable them.
>
> - Too many apps use some kind of analytics without any consent
I’m pretty sure both of those are straight up illegal, thanks to the cookie law and the GDPR.
Love those cookie warnings. We need more popups like that. Imagine all the quality time spent clicking, knowing, for sure, that you are getting a cookie.
I agree. Power centers need rules on what we can remember and say; it's what they derive their protection and power from. Personal responsibility is the last thing they want people to have.
> Data protection suggests that it is okay in the first place to collect data, as long as it is going to be protected. But this is a problem. Data shouldn't be collected in the first place, because you can't probably protect it properly anyways (leaks etc.).
Article 5 of GDPR specifically deals with the reduction of what's collected and the destruction of data after it has served its purpose. The legislators were well aware that what isn't there cannot be lost, stolen or misused.
Not collecting personal data in the first place is always the preferred scenario.
Confused, doesn't GDPR prohibit you from collecting data in the first place unless you satisfy some criteria? Meaning it is data collection regulation?
It doesn't remove the ability to collect, but requires consent. When something is technically possible, and there's a monetary incentive, consent is easy to arrange, either with misleading popups, or by just making the product so sweet that the user forgets about privacy
Making the product sweet doesn't work (legally): if you rely on consent for capturing data, you cannot refuse access to the product if the user doesn't consent.
Yes, the GDPR prohibits most of it. But it's a law, so it doesn't physically prevent data collection in any way. The inhibition must come from the companies themselves, and the mechanism to drive that is fines and penalties. Time will tell how effective the GDPR is at that.
Mr. Snowden is right. If you don't start with the assumption that ALL SECRETS LEAK SOONER OR LATER, your information security plan is definitely flawed.
Not even state actors with unlimited resources (that's you, NSA) can prevent stored secrets from leaking. It's ALWAYS something, whether teenaged hackers or far-flung contract system administrators with too much access (that's you, Mr. Snowden).
Rule 1. Don't collect data you don't need.
Rule 2. Don't store data you don't need.
Rule 3. Assume all data you store will leak, according to Murphy's law (at the worst possible time).
Rule 4. Make sure your stored data has limited utility. EternalBlue (hi again, NSA) was not such a secret.
Rule 5. Make sure your stored data has limited useful lifetime. US Social Security numbers do not have limited useful lifetime. Strangely enough, credit card numbers do have limited lifetime.
Rule 6. Do your best to set up leak detection. For example, seed your financial secret caches with fake social security numbers that raise flags when used.
Rule 7. See rule 3.
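Rule 6's honeytoken idea can be sketched in a few lines: seed your stores with SSN-shaped fakes and flag any later use of one. This is an illustration only — the strings merely match the SSN format, and the first group is kept at 100-999 so the fakes never collide with an obvious placeholder:

```python
import secrets

def make_honeytokens(n):
    """Generate fake SSN-shaped strings to seed into stored data.
    Any later use of one signals a leak (Rule 6). Format only --
    these are not valid Social Security numbers."""
    return {
        f"{secrets.randbelow(900) + 100:03d}"
        f"-{secrets.randbelow(100):02d}"
        f"-{secrets.randbelow(10000):04d}"
        for _ in range(n)
    }

def looks_leaked(observed_ssn, honeytokens):
    """True if a value seen in the wild is one of our seeded fakes."""
    return observed_ssn in honeytokens
```

In practice the token set would live alongside the fraud-detection pipeline, so a single hit on `looks_leaked` raises an alarm long before a breach is otherwise noticed.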
Secrets should be stored under the legal concept of strict liability. They're just like bulls in a farmer's field. If the bull escapes and causes damage, the farmer pays for it. No excuses. No need to prove negligence.
We have, up and running at scale, workers' compensation and the vaccine injury fund. Both of those assume strict liability. A dangerous factory sees its premiums go high enough to put it out of business. Same for a sloppy vaccine manufacturer.
Why can't the NSA and Equifax be held to the same standard? (I'd hate to be POTUS announcing a tax increase to cover the damage caused by EternalBlue.)
At least in theory, the GDPR is meant to restrict collection, not just how the data is used and stored. It has a big loophole in allowing vague "business interests" to be taken into consideration when deciding whether collection is legal or not.
More than that, the GDPR clearly establishes ownership of PII and asserts that owners have the right to request information about how their data is handled as well as demand that data be destroyed, exported or corrected.
I think he doesn't. GDPR could outright ban the collection of certain data. Instead (and because this data is useful for govt. purposes) they chose to go with consent.
I'm sick of this whole situation. We need to ban "opt out". Everything businesses do should be "opt in" with informed consent, and noncompliance should be punished with summary closure and dissolution after the first offense.
And while we're at it, let's nuke the data brokers. Preferably literally.
Until some lawsuits happen. And the EU has been more than happy to collect fines.
In the meantime I installed the Firefox add-on "I don't care about cookies", which does a reasonably good job of removing these annoyances.
PHB: "We might need it someday for analytics, just keep storing it."
* Engineer, PHB leave company *
New Engineer: "Hey does anyone know what this data is being used for?"
New PHB: "No idea, just keep it running. Don't want to break anything."
* New Engineer, new PHB leave company *
New new Engineer: "What's all this data for?"
New new PHB: "No idea, just keep it running. Don't want to break anything."
...rinse, repeat.
But how about this: Security auditor: "What's all this data for?"
PHB: "I don't know, it's the way it's always been done here."
Security auditor: "Your data retention policy doesn't pass ISO 27001 (or PCI or whatever). No certification for you."
Cyberinsurance company: "We're tripling your rates because you aren't certified."
CEO: "PHB, deal with this problem."