Readit News
qup · 9 months ago
Is this the correct use of "opt-in?"

To me, having things "opt-in" means they're off and you can turn them on if you want.

If it's "opt-out" it's automatically on, and you can turn it off.

elAhmo · 9 months ago
Likewise, I think the title says literally the opposite of what is actually happening.
alt227 · 9 months ago
I think they mean 'Enabled by default'
mejutoco · 9 months ago
Thus opt-out would be the correct term.
jyunwai · 9 months ago
You are correct. The headline author likely meant "opted in by default" or "enabled by default."

Ukv · 9 months ago
> Microsoft's Connected Experiences feature automatically gathers data from Word and Excel files to train the company's AI models. This feature is turned on by default, meaning user-generated content is included in AI training unless manually deactivated.

Not to say that Microsoft products respect privacy, but I don't see evidence that user Word/Excel files are being used for training.

The linked services agreement has had the same language (copy/transmit/etc. "to the extent necessary to provide the services") since at least 2015[0], and "connected experiences" seems to group a wide range of integrations; some like dictation/translation probably utilise ML, but that does not mean training on user content.

[0]: https://web.archive.org/web/20150608000921/https://www.micro...

itishappy · 9 months ago
To play devil's advocate, I don't see any evidence they're NOT training on user content either. Compared to how explicitly they state they're not using user content for targeted advertising, this seems like a glaring omission. Given how carefully they've put together these documents, I'm doubtful it was an oversight.
cptskippy · 9 months ago
I think it's appropriate to be concerned and seek clarification. And I don't like people immediately seeking to vilify Microsoft as if they came over to their house and shot their dog in front of their kids.
ca_tech · 9 months ago
Agreed. This was raised within our corp the other week and we read through the privacy and security documentation as it relates to Connected Experiences. Microsoft has outlined specifically what Connected Experiences covers.[1] [2] You could argue that predictive text is a product of machine learning, but there is no clause allowing for training any generalized large language models on this data.

The confusion may have arisen from reading an article about Copilot. If the user had a Microsoft 365 Copilot license, the data would be used as grounding for their personal interactions with Copilot, but still not to train any foundational LLMs. Even that data is managed in compliance with Microsoft's data security and privacy agreements.

[1] https://learn.microsoft.com/en-us/microsoft-365-apps/privacy...

[2] https://learn.microsoft.com/en-us/microsoft-365-apps/privacy...

hulitu · 9 months ago
> Not to say that Microsoft products respect privacy

"Your privacy is very important to us," yet you need to install an extension to get a blank start page (without ads) in Edge.

tjqgG · 9 months ago
A word processor stealing the user's IP by default should carry massive fines in the EU. This is pure deception. 20% of annual revenue should be appropriate.
jmclnx · 9 months ago
Hopefully full pretax revenue for Microsoft and all their subsidiaries.
HelloUsername · 9 months ago
"In the M365 apps, we do not use customer data to train LLMs. This setting only enables features requiring internet access like co-authoring a document." @Microsoft365 https://twitter.com/Microsoft365/status/1861160874993463648
binarymax · 9 months ago
It’s absurd that Microsoft 365 uses Twitter as its official support announcement platform.

Official announcements about the outage the last couple days were posted there.

Twitter is a hostile platform that requires an account to view. Why does MS365 continue to use it?

Smar · 9 months ago
I wonder whether Twitter or M365 is more hostile towards users...
lupire · 9 months ago
Twitter was where the original misinformation was posted, so Twitter seems to be a good place to rebut it.
alt227 · 9 months ago
This seems like a security shit show.

Can we disable it by group policy across entire domains?

Surely no business would ever allow Microsoft to 'reformat, display, and distribute' confidential company documents?

Or am I missing something?
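
For what it's worth, Microsoft does document Group Policy/registry controls for connected experiences, so it should be deployable across a domain. A minimal sketch of the registry fragment the policy maps to, based on Microsoft's privacy policy-settings documentation (value name and semantics are from memory — verify against the current docs before deploying):

```
Windows Registry Editor Version 5.00

; Policy: "Allow the use of connected experiences in Office that analyze content"
; 2 = disabled (assumed per Microsoft's privacy policy-settings docs)
[HKEY_CURRENT_USER\Software\Policies\Microsoft\office\16.0\common\privacy]
"usercontentdisabledstate"=dword:00000002
```

In an AD environment you'd set the equivalent policy via the Office administrative templates rather than pushing raw registry keys.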

Thorrez · 9 months ago
Well, if there's some sort of cloud feature allowing you to share documents you write with others, it would make sense you would have to allow Microsoft to "reformat, display, and distribute" for the purpose of providing you that service.

However, the terms of service says "To the extent necessary to provide the Services to you and others, [...] and to improve Microsoft products and services". So they're saying they can use your content not just to provide you service, but to provide other people service and to improve all Microsoft products.

alt227 · 9 months ago
> it would make sense you would have to allow Microsoft to "reformat, display, and distribute" for the purpose of providing you that service.

That would be me sharing a specific document with a specific person. If their terms specified that they would only "reformat, display, and distribute" to people we personally give permission to, then that would be fine, but they don't.

HPsquared · 9 months ago
The word 'necessary' can do a lot of heavy lifting.
mschuster91 · 9 months ago
> "To the extent necessary to provide the Services to you and others, to protect you and the Services, and to improve Microsoft products and services, you grant to Microsoft a worldwide and royalty-free intellectual property license to use Your Content, for example, to make copies of, retain, transmit, reformat, display, and distribute via communication tools Your Content on the Services," the clause reads.

Well, this does make sense in the context of Office 365, OneDrive and the Office web apps in general. (Still dodgy regarding the "worldwide" part but there's no way around that because people can and do expect to access their stuff even while on vacation)

Silently enabling the training of remote AI however? That's not covered under any reasonable interpretation of the above legalese.

genrilz · 9 months ago
IANAL, but I think the "to improve Microsoft products and services" bit does mean that they do legally get to train their AI (which is a Microsoft service) on your data. Still a bastard move though.
jagged-chisel · 9 months ago
>… intellectual property license to use Your Content

Seems clear to me. Use any way Microsoft wants. The “for example” list is not exhaustive nor limiting.

genrilz · 9 months ago
IANAL again, but I don't think they get to do literally anything with your data. The phrase used is "to the extent necessary". For instance, I don't think they could scrape their user data for trade secrets and then sell those to the highest bidder.
orev · 9 months ago
Title as of the time of this comment:

> Microsoft Word and Excel AI data scraping slyly switched to enabled by default — the opt-out toggle is not that easy to find

As a tech person, keeping up with disabling and avoiding all this is becoming exhausting. I can’t imagine any regular non-tech person having any chance at avoiding it.

Is it time to just give up? At what point do you have to accept that the tsunami is here and there’s nothing you can do about it?

trod1234 · 9 months ago
Worse than exhausting, this is clearly a pattern of abuse done by purposeful intent.

Security fatigue is a well-known thing in IT. Configuration fatigue, where options you deliberately disabled malevolently switch themselves back on, is just as bad, resulting in vexatious experiences.

This is the problem when antitrust is not enforced, and regulation has killed all other smaller market participants. It creates dynamics (abuses) that cause societal upheaval which inevitably lead to violence.

It's really stupid, but the people making these decisions are evil people. Every reasonable person knows that actions have consequences.

greentxt · 9 months ago
>At what point do you have to accept that the tsunami is here and there’s nothing you can do about it?

Around the late 2000s, but maybe it was earlier. The best time to buy MSFT stock is always right now.

squigz · 9 months ago
The solution isn't to give up or attempt to avoid it - it's to make this sort of thing illegal.
rurp · 9 months ago
Yes, exactly. There's no reason for the burden to be on every single user of every product to disable this crap. The law should require companies to behave more ethically with real consequences if they do not.
robin_reala · 9 months ago
I just checked and this is turned off in my installation, but I’m not sure that’s from being EU based, or because my org has disabled it.