"The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her."
Which means that if, e.g., a bank declines your mortgage application based on a decision from a fully automated system, you have a right to have a human being review it.
Interestingly, this made banks reluctant to use AI and helped drive the development of so-called "explainable AI".
Just did exactly that with Avant. Could see where I was initially approved but then their manager said I was "still on contract" and quashed my application.
In that case there is a human being who is legally responsible for the decision that is made. I have no problem with that. If the person is "just nodding", with possibly illegal consequences, that person can be sued.
Some countries have pushed it further: for administrative decisions where a program was used (whether alone or to assist a decision), all the steps of the algorithm leading to the decision must be provided in clear and simple language (if a request is made).
It is about whose mistake or choice it is: if a bank denies loans to $ProtectedCategory because of a bug in its software, that is one thing; if it is because employees have biases and discriminate against applicants, that is another.
Instead of taking them to court, send an email to your local Data Protection Authority and complain while CC'ing github. Results aren't guaranteed, but the effort involved is small.
I'd recommend reviewing the rules of your specific DPA. In Poland you have to provide quite sensitive data in the request (e.g. your full name and address) and it has to be digitally signed - you can't just e-mail them. All other electronic messages can be legally ignored.
Your experience is not unique. Even when data is returned, it's sometimes incomplete or (purposefully?) made near impossible to read. I’m working with a privacy lawyer and a couple other engineers on some tools that would allow users to exert their rights (under GDPR, CCPA, and similar laws) more effectively. If anyone is interested in learning more or being a test subject, feel free to reach out.
I've been putting off allowing WhatsApp to share my data with Facebook for a while now, but the latest news is that unless I give in to the extortion, I won't be allowed to send and receive messages to my contacts.
I'm going to request all my data from WhatsApp under the GDPR before switching all my conversations to Telegram, I guess.
This won't work with WhatsApp, since they do not store your messages in plaintext.
I had to use some unofficial software ( https://www.wazzapmigrator.com/ ) to extract and store the messages from an unencrypted iPhone backup. Then you have everything in an SQLite file.
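If anyone else ends up with such a file, here's a minimal sketch of poking at it with better-sqlite3 (Node/TypeScript). Note that the "messages" table and its columns are placeholders I made up; the actual schema depends on what the extraction tool writes, so list the tables first:

    // Minimal sketch: inspect an exported WhatsApp SQLite file.
    // NOTE: "messages" and its columns are placeholders; the real table and
    // column names depend on the export tool, so discover them first.
    import Database from "better-sqlite3";

    const db = new Database("whatsapp-export.sqlite", { readonly: true });

    // List the tables the export actually contains.
    const tables = db
      .prepare("SELECT name FROM sqlite_master WHERE type = 'table'")
      .all() as { name: string }[];
    console.log("Tables:", tables.map((t) => t.name).join(", "));

    // Hypothetical query, once you know the real schema.
    if (tables.some((t) => t.name === "messages")) {
      const rows = db
        .prepare("SELECT chat_name, text, timestamp FROM messages ORDER BY timestamp")
        .all();
      console.log(`Found ${rows.length} messages`);
    }

    db.close();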
I was just looking through my google drive and there's no trace of my whatsapp chat backups. Even after running another backup. And pages online confirm what you're saying.
I guess I'll have to use my GDPR rights.
Interestingly enough, the only option is exporting your account information, but the information page about that procedure explicitly says that messages are not included.
That's relevant because, since there's no other procedure to export data, it means WhatsApp is already not complying with its GDPR obligations.
A bit of a tangent, but people often criticize the GDPR because it doesn't have perfect enforcement, or because they're annoyed by the cookie banners.
However, a positive aspect I don’t hear talked about enough is how it has had a chilling effect (in the most positive, pro-consumer way possible). I’ve noticed in my industry that people are just much more careful about user data now, compared to it hardly being talked about before the GDPR. Just the threat of those fines has scared C-levels enough to put at least some engineering resources into privacy and security, where in my experience there was much less before.
That's not accurate, at least as far as GDPR is concerned.
Only necessary ones don't need consent, but the bar for "necessary" is high: the software wouldn't be able to function without it and there's no way to implement the software without it. Think: "address" is necessary for "delivery".
Even then you still need consent to store the cookie under most versions of the "Cookie law", which is a complementary but different thing to GDPR.
Very true. Enforcement is never 100%. Crimes will continue to happen, some people will always get away with it. But that doesn't mean we should give up on making laws.
I think DNT could be rescued if it could be turned into a browser-wide consent UI. Currently, with its history of being set to 1 by default in some browsers, it doesn't really distinguish between "I don't consent" and "I haven't expressed an opinion", giving sites an excuse to ask you anyway.
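For what it's worth, a site that wants to honor these signals today doesn't need much. Here's a rough sketch (Node/TypeScript) that treats either the old DNT header or the newer Sec-GPC (Global Privacy Control) header as "no consent" and skips the non-essential cookie; the analytics cookie is a made-up example:

    // Sketch: treat DNT / Global Privacy Control as "no consent to optional
    // tracking" on the server and skip non-essential cookies for those requests.
    import { createServer, type IncomingHttpHeaders } from "node:http";

    function optedOut(headers: IncomingHttpHeaders): boolean {
      // "DNT: 1" is the old Do-Not-Track header; "Sec-GPC: 1" is the newer
      // Global Privacy Control signal. Either one means: no optional tracking.
      return headers["dnt"] === "1" || headers["sec-gpc"] === "1";
    }

    createServer((req, res) => {
      if (!optedOut(req.headers)) {
        // Only set the (made-up) analytics cookie when no opt-out is signalled.
        res.setHeader("Set-Cookie", "analytics_id=abc123; SameSite=Lax; Secure");
      }
      res.end("hello");
    }).listen(8080);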
Myself, I wish another GDPR iteration would instead mandate the shape and form of the initial consent popup, requiring it to fit the following template (or something similar/equivalent):
+------------------------------------------------+
| Allow additional data collection?          [X] |
|                                                |
| This site would like to use technical means    |
| such as cookies and local storage to collect   |
| data about you and your computer. This data is |
| not necessary for the correct functioning of   |
| this site, and does not impact the service     |
| it provides.                                   |
|                                                |
| Do you consent to this opt-in data collection? |
|                                                |
| GDPR requires this message to be shown because |
| the data collection requested is not necessary |
| and may carry data privacy risks. Necessary    |
| data collection does not require consent form. |
|                                                |
| [Learn purposes and]      [>I do not consent<] |
| [configure consent ]                           |
+------------------------------------------------+
With an explicit [>I do not consent<] button, pre-selected, in the "call to action" color, doing the same thing as [X] does, i.e. declining the data collection described. Displayed in the same language as the website content, and with specific regulations guarding against the common "dark pattern" bullshit. I'm sure Brussels has some webdevs who would be happy to provide standard templates and React components and whatnot, so that site authors could just plug in a stylesheet and a JSON blob to configure the [Learn purposes...] section.
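For illustration, the kind of JSON blob I have in mind for the [Learn purposes...] section could look roughly like this (a made-up schema, sketched as TypeScript; nothing official):

    // Made-up shape of the "purposes" blob a site would hand to a standard
    // consent component -- not any real spec, just what I imagine above.
    interface ConsentPurpose {
      id: string;            // stable identifier, e.g. "analytics"
      label: string;         // short name, in the same language as the page
      description: string;   // plain-language: what is collected and why
      vendors: string[];     // third parties the data is shared with
      retentionDays: number; // how long the data is kept
    }

    const purposes: ConsentPurpose[] = [
      {
        id: "analytics",
        label: "Usage analytics",
        description: "Counts page views and clicks to see which features get used.",
        vendors: ["ExampleAnalytics GmbH"],
        retentionDays: 395,
      },
      {
        id: "ads",
        label: "Personalised advertising",
        description: "Builds an interest profile to decide which ads you see.",
        vendors: ["ExampleAds Ltd"],
        retentionDays: 180,
      },
    ];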
The ultimate solution would be for member states' DPAs to get off their collective butts and start issuing fines for the current crop of blatantly illegal consent popups, but in the interim, it would be helpful to regulate the popups, so that they clearly communicate that a) they're requesting strictly unnecessary tracking that can be safely ignored, b) showing an annoying popup is a choice by the website owners, who decided to request consent for additional tracking.
You are not wrong. Browsers could standardize the information exchange and approval that happens for cookies and then implement a sane UI for it, similar to how e.g. browser location tracking requires opting in. That would be a perfectly valid alternative to the home-grown UX that website developers add themselves, and would offer a better UX.
To do this you would need to provide the legalese in some standardized way, so the browser can pop up a UI that lets users review it and approve or reject it. It should simply refuse any kind of cookie until the user has approved. That approval should be revocable as well. Part of this should also cover a sane API, so sites can decide whether they need to fall back to displaying their own popups. Browsers that support this could even start defaulting to blocking all cookies until explicit permission is in place for a website, regardless of any existing UI. Many users have extensions that do this already.
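To make the fallback part concrete, a sketch of what the site-side code could look like if browsers ever shipped such an API. navigator.cookieConsent is entirely made up here; the point is just the shape: feature-detect, let the browser-managed UI handle it, and only fall back to the site's own popup when the API isn't there:

    // Entirely hypothetical browser API -- navigator.cookieConsent does not exist.
    interface CookieConsentAPI {
      // Resolves to "granted" or "denied"; the browser would remember the
      // answer per origin and let the user revoke it from browser settings.
      request(purposes: string[]): Promise<"granted" | "denied">;
    }

    // Placeholder for the site's existing popup flow.
    declare function showOwnConsentPopup(): Promise<boolean>;

    async function askForTrackingConsent(): Promise<boolean> {
      const api = (navigator as Navigator & { cookieConsent?: CookieConsentAPI })
        .cookieConsent;
      if (api) {
        return (await api.request(["analytics", "ads"])) === "granted";
      }
      // No browser support: fall back to the site's own consent popup.
      return showOwnConsentPopup();
    }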
Wouldn't be the worst idea. Of course the flip side is that it also makes it easier for users to say "no" a lot (I would). And there is the notion that this may be a grey area under the current legal text. And of course some browser vendors have vested interest in the whole cookie & tracking business (Google).
> I’ve noticed in my industry people are just much more careful about user data now
My company and all the sites that I use didn't decrease data collection one bit. They just added a consent form in place of the T&Cs. I would like to hear counterexamples though, of a company that actually stopped collecting data it was collecting before the GDPR.
I did this to a bunch of companies a while ago now, the results were incredibly boring with the exception of eBay who delivered the data by posting a USB drive to my house.
"The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her."
Which means that if e.g. a bank declines your mortgage application based on a decision from a fully automated system you have a right to have a human being review it.
Interestingly this made banks reluctant to use AI and helped with efforts in development of so-called "explainable AI".
Of course I document my export format too so others can do the same with data from my app: https://github.com/TheLastProject/Catima/wiki/Export-format :)
The sad part is the allowed 30-day timeframe. Stocard really abuses this to make you wait for your data as long as they legally can: https://twitter.com/SylvieLorxu/status/1389343401435439112
List of local DPAs: https://edpb.europa.eu/about-edpb/about-edpb/members_en
That said, where does the GDPR stand when there is an API already in place that allows for data extraction?
GDPR doesn't require me to become a user to get my data.
https://todoist.com/help/articles/backups
I use the free plan, don’t reside in Europe, and recently wanted a backup. If you’re in the same boat, I recommend the following project — it has good documentation and immediately worked.
https://github.com/darekkay/todoist-export
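If you'd rather script the export yourself, something like this works against Todoist's documented REST API with a personal API token (the endpoint paths here are from memory, so double-check them against the current API docs). Note it only returns whatever the API chooses to expose, which is part of why an existing API doesn't automatically settle the GDPR question above:

    // Sketch: pull your own Todoist data via its REST API with a personal
    // API token. Endpoint paths assumed from memory (v2 REST API); verify
    // against the official docs. Requires Node 18+ for the global fetch.
    const TOKEN = process.env.TODOIST_TOKEN; // personal token from Todoist settings
    if (!TOKEN) throw new Error("Set TODOIST_TOKEN first");

    async function fetchJson(path: string): Promise<unknown> {
      const res = await fetch(`https://api.todoist.com/rest/v2/${path}`, {
        headers: { Authorization: `Bearer ${TOKEN}` },
      });
      if (!res.ok) throw new Error(`${path}: HTTP ${res.status}`);
      return res.json();
    }

    async function main() {
      const [projects, tasks] = await Promise.all([
        fetchJson("projects"),
        fetchJson("tasks"),
      ]);
      // Dump everything to stdout as one JSON document for safekeeping.
      console.log(JSON.stringify({ projects, tasks }, null, 2));
    }

    main();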
edit (2): I just sent an email to Whatsapp via their contact page (https://www.whatsapp.com/contact/?subject=messenger) asking for my data in accordance with GDPR. Let's see what happens.
https://telegram.org/blog/move-history
The annoying thing though, absolutely on WhatsApp's side, is that I have to export chats one by one.
Shouldn't I be able to extract my own backups, on my own Google Drive, to read my own chats?