I cannot find it now because OpenAI blocks archive.org (oh the irony), but previously their API privacy policy said data was retained for no more than a 30-day (or 90-day, I can't recall) period for abuse monitoring. I know because I was researching this for an (EU) customer.
Now, the promise is still to not train on API inputs/outputs, but the retention promise is nowhere to be found, unless you're an Enterprise customer (ZDR).
Moreover, at least in my understanding of the order, the court ordered them to keep ALL of it, not "all except what you promised not to keep":
> OpenAI is NOW DIRECTED to preserve and segregate all output log data that would otherwise be deleted on a going forward basis until further order of the Court (in essence, the output log data that OpenAI has been destroying), whether such data might be deleted at a user’s request or because of “numerous privacy laws and regulations” that might require OpenAI to do so.
So in effect, the statement you quoted is false, and OpenAI is actually in breach of their privacy policies.
I asked ChatGPT to parse this in case I misunderstood it, and its interpretation was:
> The company is ordered to preserve all output log data that would otherwise be deleted, going forward, regardless of the reason for deletion (user request, privacy law, default retention policy, etc.).
> In other words: They must stop deleting any output log data, even if it would normally be deleted by default or at a user's request.
> This is more than just keeping data they would normally retain — it's a preservation order for data they would otherwise destroy.
As a result, OpenAI is now unusable for serious business purposes in Europe. Since the competition (Anthropic, Google) is not affected by this court order, the only loser here is OpenAI.
> Late Thursday, OpenAI confronted user panic over a sweeping court order requiring widespread chat log retention—including users' deleted chats—after moving to appeal the order that allegedly impacts the privacy of hundreds of millions of ChatGPT users globally.
When "delete" actually means "hide from my view", you can only hope that you live in a country with strong privacy and data protection laws.
Even when companies are honest about how "delete" works, they will use weasel language such as "delete from history" or "delete from inbox" instead of actually doing the thing the user intends.
Part of that is that the company doesn't even know what it does internally - sometimes data is purged instantly; other times it gets marked deleted in a database table that never gets purged, until the fifteen-billion-row table takes down the service.
Half the time this is because people want to recover things they deleted by accident.
IMO hard deleting things is generally a bad practice unless the user deletes their entire account or takes some other very explicit action.
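The "marked deleted but never purged" pattern described above is the classic soft delete. A minimal sketch with an in-memory SQLite database (table and column names are hypothetical): the UI filters on a flag, the row stays physically present, and an actual `DELETE` only happens if a purge job ever runs.

```python
import sqlite3

# Toy schema illustrating the soft-delete pattern (names are made up).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE messages (id INTEGER PRIMARY KEY, body TEXT, deleted_at TEXT)")
db.execute("INSERT INTO messages (body) VALUES ('hello'), ('secret')")

# "Delete" from the user's point of view: just set a flag.
db.execute("UPDATE messages SET deleted_at = datetime('now') WHERE id = 2")

# The UI only ever queries non-deleted rows...
visible = db.execute("SELECT body FROM messages WHERE deleted_at IS NULL").fetchall()

# ...but the row is still physically present until a purge job runs.
total = db.execute("SELECT COUNT(*) FROM messages").fetchone()[0]
print(visible, total)  # [('hello',)] 2

# A hard delete (often run later in batches, if ever) actually removes it.
db.execute("DELETE FROM messages WHERE deleted_at IS NOT NULL")
```

The upside is exactly what the comment above says: accidental deletions are trivially recoverable. The downside is that "deleted" data can sit in that table, and in every backup of it, indefinitely.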
I have no experience with this, but I imagine actually deleting files from some giant collection of data that needs to be safely backed up is borderline impossible, no?
I expect that the big tech companies have all kinds of cold storage backups and no one is going to actually go spelunking in those archives to physically delete my data when I delete an email. It's more likely that they will delete the keys to decrypt it, but even the keys must be safely stored somewhere, so it's the same problem just with less data.
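The "delete the keys instead of the data" idea above is usually called crypto-shredding. A toy sketch of the concept, using a deliberately insecure hash-based XOR stream as a stand-in for real encryption (AES, Fernet, etc., which live outside the standard library): ciphertext goes into immutable backups, the per-user key lives in a small mutable key store, and destroying the key renders every backed-up copy unreadable.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher (NOT secure - illustration only). XORing twice
    # with the same key decrypts, so one function does both directions.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Per-user key in a small, mutable key store; the ciphertext is what
# gets copied into immutable cold-storage backups.
key_store = {"user42": secrets.token_bytes(32)}
backup = keystream_xor(key_store["user42"], b"dear diary ...")

# "Deleting" the user's data = destroying the key. The backups are
# untouched, but the ciphertext in them is now unrecoverable.
del key_store["user42"]
```

As the comment notes, this just moves the problem: now the key store itself must be durably backed up, so deleting a key "everywhere" is the same problem again, only over far less data.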
> When "delete" actually means "hide from my view", you can only hope that you live in a country with strong privacy and data protection laws.
I do, but presumably that doesn’t matter, as the US thinks its legal code outweighs the legal code for Europeans living in Europe. Joke's on Europeans for allowing Americans to dominate the world stage for too long, I suppose.
Question for the crowd: if using the OpenAI service in Azure, is that included in the retention? OpenAI says API access, but doesn’t specify whether that means just their own endpoints or anyone running their models.
You’d have to check with Microsoft. OpenAI says that this doesn’t apply to customers with a Zero Data Retention endpoint policy, but my recollection is that Azure OpenAI doesn’t fall into that category unless it’s something that is explicitly paid for. That said, OpenAI also says that ChatGPT Enterprise customers aren’t impacted (aside from their standard policies around how long it takes to delete data, which they say is within 30 days), but only Microsoft would know if their API usage counts as “enterprise” or not.
https://openai.com/index/response-to-nyt-data-demands/
That official response was discussed on this website yesterday. Here is a link to the discussion:
https://news.ycombinator.com/item?id=44196850
> "This does not impact API customers who are using Zero Data Retention endpoints under our ZDR amendment."
The court order seems reasonable. OpenAI must retain everything it can, and has not promised not to, retain.
> The court order seems reasonable.
Checks out.
When "delete" actually means "hide from my view", you can only hope that you live in a country with strong privacy and data protection laws.
IMO hard deleting things is generally a bad practice until the user wants to delete their entire account or some other very explicit action is executed.
I expect that the big tech companies have all kinds of cold storage backups and no one is going to actually go spelunking in those archives to physically delete my data when I delete an email. It's more likely that they will delete the keys to decrypt it, but even the keys must be safely stored somewhere, so it's the same problem just with less data.
Dead Comment
I do, but presumably that doesn’t matter, as the US thinks its legal code outweighs the legal code for Europeans living in Europe. Jokes on Europeans for allowing Americans to take over the world stage for too long, I suppose.
You can search, browse and continue your chats 100% offline.
It’s free while in beta https://testflight.apple.com/join/RJx6sP6t
So how does this work with services using the API like Copilot or Cursor?
Is OpenAI now storing all the code sent to the API?