Here's something I just don't get. Microsoft just got their ass handed to them by the US Government because of lapses in security. More than coincidentally, Satya Nadella told the entire Microsoft org that if anyone had to choose between features and security, to choose security. I'm hearing from Microsoft people that all product roadmaps are deferred for a few months while security features are addressed. Their whole corporate spiel is "Microsoft runs on trust" (see the famous Standards of Business Conduct training on YouTube).
And then someone goes and invents Recall. This is not the work of a lone engineer and a principal PM fishing for Impact or whatever they call success at Microsoft. This had to have gone through multiple levels of review. Microsoft PMs, CVPs, their corpo legal people, marketing approval. And yet no one stopped to say, "wait, this could blow up in our faces"?
I'll take a shot (I'm not at MS, so no inside info).
GenAI is hotter than anything else right now. As Satya publicly stated, "we made them dance" -- which shows how high-priority it is to maximize "AI innovation" at MS.
If you disagree, think about how badly "Tay" blew up in MS's face and yet they still went ahead and bundled OpenAI LLM tech into all of Office365, just so they could have bragging rights about beating Google to it.
At this point, it's a race (to where? who knows) and no Big Tech Corp wants to be seen as "not at the forefront".
That's my $0.02, anyway :)
You are absolutely right. This AI race has been sort of funny to watch among the big tech incumbents.
Google kept launching faked demos and hurriedly released an openly race-biased image generator, all in a bid to catch up with OpenAI.
Meanwhile, Apple has been sort of lethargic. They recently announced a deal with OpenAI to add GPT to their devices, and seem happy to keep playing catch-up in this regard.
I think Meta is probably the only big tech giant that has kind of got their execution right straight from the jump. I can't point to any slip-ups in their "AI announcements".
Untrue. Employees really are asked to focus on security. The org tells you that the company runs on trust. This is not lip service - it's not that they say otherwise internally, and it's certainly naive to think so.
What happens is that, meanwhile, business is happening. People are making (small) decisions that add up to big things, and leadership tends to trust those below them to make the right calls and implement technology correctly. Which, time and time again, doesn't happen.
Yes, maybe it needs to be more of a priority, but it also isn't a conscious decision.
The reason they did this is that, if they had succeeded in sneaking this out, MS would have been 'drinking the milkshake' of Google, Adobe, Facebook, and everyone else trying to do the same thing at an app level. It would be the holy grail of training data -- they could replace or simulate the entire user at an inter-app level for A/B testing anything they want, or worse. The idea of having the full user-behavior history of the majority of the world's computer users is just too tempting not to try for, and I'm sure they'll try again despite any backlash.
It's interesting to compare this to the Chrome/Safari/Edge browsing history, which is stored in an unencrypted SQLite database, and tracks what you do for the last 90 days. It's just a bit less visual, Incognito/Private modes work, and some users clear it more often.
But a whole lot of the surveillance attacks people imagine about Recall apply just the same to the browser. I think it's the "little brother" casual attacks that are so well enabled by Recall - it makes it faster, easier, and way more visual.
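To make the comparison concrete, here is a minimal sketch (mine, not from the thread) of reading Chrome's history store with nothing but the Python standard library. The profile path and the `urls` table layout are assumptions about current Chrome builds on Windows and may differ on other setups.

```python
# Sketch: Chrome's browsing history is an unencrypted SQLite file readable by
# anything running as the user. Path and schema are assumptions about current
# Chrome on Windows.
import os
import shutil
import sqlite3
import tempfile

history = os.path.expandvars(
    r"%LOCALAPPDATA%\Google\Chrome\User Data\Default\History"
)

# Chrome keeps the file locked while it is running, so work on a copy.
with tempfile.TemporaryDirectory() as tmp:
    copy = shutil.copy(history, os.path.join(tmp, "History"))
    con = sqlite3.connect(copy)
    rows = con.execute(
        """
        SELECT url, title,
               datetime(last_visit_time / 1000000 - 11644473600, 'unixepoch')
        FROM urls
        ORDER BY last_visit_time DESC
        LIMIT 20
        """
    ).fetchall()
    con.close()

for url, title, visited in rows:
    print(visited, title or "(no title)", url)
```

The point is not that this is a vulnerability, just that the browser precedent the comment describes is already fully legible to anything running as the user.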
Your browsing history is unlikely to contain personal information, secrets, porn images etc. And if you use Chrome, they get your full browsing history by default.
I get your point, but Microsoft's Recall can capture anything onscreen - emails, personal info, porn, passwords and the like. And it feels, bizarrely for 2024, that little thought has gone into privacy or security.
> that little thought has gone into privacy or security.
I think the thought is proportional to the amount of thought a non-tech customer will put into it. Nobody seems to care about or understand privacy these days. Everyone knows they're being tracked everywhere they go, physically and on the web. People use their real names, addresses, etc. for every junk service they sign up for, without seeing any reason not to. If you tell people that their TV is tracking and taking screenshots of what they watch [1], they say "yeah, Netflix knows too".
It's literally "how it's always been" for any non-tech person under 30.
[1] https://themarkup.org/privacy/2023/12/12/your-smart-tv-knows...
> And it feels, bizarrely for 2024, that little thought has gone into privacy or security.
No, no. They thought about the privacy and security aspect. They decided that it's better for their bottom line if Windows users don't have privacy from the mother ship. Really, they already decided that way back when Windows Vista first came out and periodically asked Microsoft HQ if you should continue being allowed to use your computer.
I think they actually did consider that - that's why they emphasized it was all on device. They thought about it, they just didn't think about how little we would trust that promise.
1. Browsing history doesn't show what the user is doing on the page. There is a big difference between logging "user visited his e-banking app", and logging his actual credentials as they are entered.
2. Browsing history watches one app. Screenshots watch everything across the entire OS.
Not just credentials - account balances, account numbers, etc. There's a big difference between your browser history recording that you opened your bank or healthcare provider's web site and Recall recording everything that appeared on the screen while you did.
People might use Incognito mode to browse porn, but I imagine it's a lot less common when looking at other sensitive sites.
The ickier parts are on the unintended-capture side: enabling "show password" on a site doesn't affect browser history, but Recall may capture the password in the clear.
Or from history you may see that you accessed a site, but not what you did on it (what comments you typed for example).
The browser history may not, but the cache and other local storage may well.
The take-away is simple though: Modern desktop operating systems need a security model where individual applications are sand-boxed and protected from each other.
Legacy systems have security models that protect users from each other, but this isn't the personal computing world we live in anymore.
No, it isn't the same. You may know I went to my health care provider's website, maybe even to make an appointment depending on the URL, but with Recall, everything that is on the page will be stored, not just the URL. It's totally different. So the message I sent my healthcare provider discussing some of my most sensitive medical issues will be available to read, and a record is kept of it... not just the URL. Do you not see the difference?
Yes, but give it one product cycle and there's metadata (like a background texture) that tells the OCR to skip the page. Or ask your local LLM whether the user is talking about medical conditions. If you like the feature at all, you can make these things work.
Continuing with the comparison, Recall applies to the entire operating system, not just one application. To avoid it, one has to avoid Windows.
Whereas to avoid browsing history, one only has to avoid the popular, graphical, advertising-corporation browser. As I am not interested in graphics, I do this every day, with ease, because there are countless clients besides "Chrome/Safari/Edge" that work with the www for consuming information.
From the submitted article, it seems like Microsoft will change/secure the access (and maybe storage) in some way, though there are no details on the specifics.
At least on macOS, I can't navigate to the directory holding Safari's data with other apps (without special full-disk read permissions).
There's also always private browsing, which exists specifically because people are aware of the implications of a browsing history and a persistent cookie jar.
That awareness will be much harder to build for an always-on screen recorder.
I hate that most browsers do not let you set them to keep history for longer than 90 days.
I want to be able to find things I've seen before. Recall would've been great if using it didn't require me to update to a version of Windows that contains "Copilot".
One difference is that Web browser history has been around for 30 years, since before most people had even touched a Web browser.
At the time, it was hardly thinkable that someone would have the audacity to take and abuse that information.
It dates from when Internet people were, on the whole, more savvy about privacy than today's users are, but it was also when the Internet was closer to a trustworthy environment, before Wall Street sociopath types took over the tech and the culture.
Lots of kinds of abuse that are routine and almost universal today, even for startup tech companies (e.g., embedding third-party trackers into Web sites, and it only gets worse from there), would, I think, have gotten them ostracized, with outraged demands for criminal charges.
During the dotcom gold rush, there was such a flood of totally new, posturing people, and so much money being thrown wildly at everything, that any remaining outrage was lost in the noise.
And now virtually no one knows any different.
But if you're trying to push some new abuse today, I think ordinary people are starting to have some awareness of what vicious sociopathic buttholes tech companies have become, and so acceptance might not be a slam-dunk.
The vulnerability is that the first thing any malware that happens to run on the PC will do is upload the Recall database, giving the attacker your entire usage history since installation (and of any other user account on the same PC). This can then be analyzed for worthwhile targets for scams and blackmailing.
I expect it does, if you're using Chrome outside of Incognito Mode. IIRC, there is an opt-out for "web history" on the Google account - which then disables some other things, so that it annoys enough people into keeping it on.
My browser history does. It's synchronized with every Edge instance that I have running. I can open up the tabs from my mobile browser on my desktop, and I can see the browser history from everything, on everything.
Browsing history doesn't contain what's displayed on the page, what you type into input boxes, or POST requests. It's sorta like telephone metadata.
On the other hand, I am always freaked out by Chrome extensions that "can read and change your data on all websites". Can't they have more granular permissions? You gotta have a lot of trust for those extensions LMAO. They can read your bank passwords, probably!! And if they are ever sold...
Exactly - knowing the content of each webpage is pretty easy if you're "big brother" surveilling millions of people, even more so if you have a Chrome extension to help.
It's "little brother" that benefits a lot here: bosses, spouses, parents, etc., who otherwise wouldn't click on 1000 links in your history.
Yes, they can change it; that's what the Manifest V2 deprecation is about. It will break a lot of ad blockers, because they rely on being able to read anything and change anything on all websites. Many people feel that Google is doing it to make more people watch more ads, not to improve security.
To be fair, for me the extensions that get that permission are uBO, Privacy Badger, and Tampermonkey.
I trust gorhill and the EFF to not fuck me over on my data, and Tampermonkey kinda needs those sorts of permissions to work. My password manager has read access to every website but I'm already trusting it with all of my passwords so...
This is a horrible comparison. Browsing history doesn't show the contents of the page. It doesn't show you what you were doing on that page. It doesn't reveal anything other than that you went there and maybe for how long.
Yeah, I think this entire debate is uninformed hysteria and manufactured outrage. "If an attacker has administrator access, they can see everything you have done on your computer!". OK? That has literally always been the case? "Attacker is root" is game over and always has been. The original writeup from DoublePulsar tried to justify that Recall is somehow different from other such scenarios, but I found it totally unconvincing.
I think it's the right move to have it off by default, but I'm just not convinced by the outrage here.
Except that before today you didn't even need admin for access to the database; any process that is allowed to read files could access the Recall database.
In a typical bigcorp environment, laptops are loaded with silently installed spyware. Certainly equivalent to taking a screenshot every second or an always-on keylogger.
The horse is out of the barn for many people during work hours. But in the OS and on by default is a different story!
Someone with root access can't see anything that has been deleted, or the contents of secure web pages I've visited, like my banks, doctor, or email. Very different.
If there's AI involved, everyone's panic level skyrockets.
No one retweets "Attacker gaining root access reveals all user information", but instead "Attacker gaining root access reveals all user information collected by AI program" will go viral for sure.
When Recall is enabled, it should have an overlay stating that it is active so that all users are aware. Something at least as obvious as the old Windows activation overlay.[0]
Otherwise, every creepy roommate, bad partner, bad friend, etc... will take advantage of this to do bad things.
[0] Ideally more obvious, like when Windows screen recording is running.
Yes, but that's a third-party app which requires an install.
Recall will be pre-installed on Windows 11. The ubiquity is what scares me, and should make MS think twice about liability, as far as treating their users right.
If you mean Win+G and then clicking screen record, it shows an overlay which clearly indicates that the screen is being recorded. I don't think that you can hide that overlay.
Will have to wait and see if the extra security measures actually improve anything or not.
However, regarding it being opt-out… what would prevent a virus from just enabling it on a bunch of machines silently? Sure, it would be caught, but the damage would be done, and most won't be bothered to go in and disable it after.
Or Microsoft just decides they need to really market the hell out of AI and it gets turned on by default anyway.
Without Recall, an attacker needs to get a program to stay resident in memory to log keystrokes, screen contents, etc. for an extended period of time without getting detected. With Recall, they can get the same end effect by exfiltrating the Recall database file whenever it's convenient (i.e. an infected version of a text editor could send it while pretending to check for updates). This significantly lowers the barrier to entry for getting a victim's data, while also making it much easier to avoid detection.
A virus turns on Recall, and the user might not notice much - a real Microsoft service is running. It can then just wait and activate later. If the user notices Recall is on, they'll just blame Microsoft. You can then just turn it on again. You can already see that many users suspect it'll go back to being on by default sometime in the future, too. It's not uncommon to see system updates change settings.
A virus doing the same things as Recall would be much noisier and much more suspicious, making it much more likely to be removed.
Not to mention that once Recall has been running, a virus only needs to extract the data. It records far more than what a password manager does and is far easier to search through. It just makes for a very large attack surface.
Basically, why would anyone develop keyloggers anymore? Microsoft did it for you. And it'll never be tripped by antivirus software because it's an official and legitimately signed program. You don't see a problem with this?
They can't even run their own infra securely - or did you forget that an advanced persistent threat actor was in their systems and minting certs to access all of Azure recently?
Please stop with these kinds of made-up fantasy scenarios.
There's no such thing as "accidental enablement" for stuff like this, as if it's a switch that every employee at Microsoft has access to, and one of them could end up flipping it by accident with their elbow one day, and it ends up in production without anyone else noticing.
Either they decide to intentionally enable it or not. There are no accidents when stuff like this needs to go through a committee of people for approval before it makes it into production.
Corporate is never on the bleeding edge of Windows feature updates. They take security updates first, but feature updates are at least one generation behind, maybe more, waiting for Microsoft to fix bugs and doing their own regression testing; plus they get to choose which features employees receive or have enabled by default via group policy. In other words, Recall was never making it into any corporation anyway.
This is generally true, but Windows is the standard for far more SMBs than larger enterprise customers, and in that context it’s not nearly so straightforward. I have a client, a health insurance benefits broker for other local businesses. They do very well for themselves, but it’s just 2-3 full-time people, so there’s never been much cause for a full-on domain with GPO policies to maintain a strict, stable state across their equipment. Traditionally, off-the-shelf systems with SMB-targeted software had been more than sufficient.
When Microsoft decided to push a feature upgrade last year that automatically enabled OneDrive backups for their home directories, it technically violated HIPAA by moving electronic patient health information contained within their scanned-files folder onto OneDrive servers without any prior consent or authorization. They literally called me when they were unable to find their files; Microsoft had (laughably, if it weren't so serious) placed a text file on the desktop titled "Where Did My Files Go.txt", which directed them to the OneDrive folders where it had moved their desktops, documents, and pictures without their knowledge or approval.
I have since moved them to Microsoft 365 accounts where I can apply GPO, but my clients were understandably unhappy about having a new annual subscription that didn't add any tangible benefit; rather, they're now on the hook for a couple hundred bucks a year for what's essentially a shakedown. Pay for the new service that adds nothing meaningful to their experience, or else face the consequences of Microsoft ruining your business on a whim.
I would have agreed with you until recently. And now everyone is throwing email, chat, code, everything into cloud-based AI tools at a highly regulated company. This happened 6 months after they just locked everything down for actual employees because of an IP leak. Very strange times...
I think on the product side it's pretty straightforward. They saw RewindAI talking up a bunch of traction and people seemingly interested. Someone assumed customers wanted this because of that data, and it's a pretty easy thing to build, so they went ahead. I am surprised it got past security reviews, but I can understand how it came to be from the product side.
They’ll probably think twice before jumping into the fray again with the Microsoft branded Informant Wire (I mean AI wearable) ;)
I bet there are a trillion companies and governments who want to know what all of their employees are doing every second of the workday. Compliance won't stop them from trying.
I don't understand how Outlook isn't a compliance nightmare, especially since it's moved to the cloud. The amount of very sensitive data Microsoft must have on just about every single business/industry thanks to Outlook and Excel is insane.
At one place I worked, when the company replaced my old machine with a new Windows 10 system, it was configured to send every single keystroke back to Microsoft. There was zero concern over privacy or compliance, just an assumption that MS would never abuse that data for any reason. I did not have their faith, so I disabled that "feature" and then changed a massive number of other policies to try to keep as much data out of Microsoft's hands as I could.
Depends on the compliance. If this monitoring sucks up any personal data (I don't mean employees' data here—personal data owned by anyone) there are erasure and data subject access requirements, for instance.
Security compliance generally does not require a third-party company, unaffiliated with the corporation, to be sent a copy of everything shown on a user's screen.
Corporate clients get whatever they want. I am certain that their Windows 10 support won't be pulled in Oct 2025 as MS has threatened for everyone else. And when they migrate to Win11, it will almost certainly be a separate OS image free of the garbage bloatware and ads that the consumer devices are plagued with.
Am I just imagining their saying that Windows 10 would be the last Windows? I had thought they would be moving to an Apple-esque model where OS updates would just become iterative and avoid the old EOL/upgrade cycle. It’s how I justified all of their tangential money-grabs on other fronts.
For any company that has compliance requirements to keep devices supported with security updates, it's the same as Win 7 to Win 10: you either update everyone to Win 11 or you pay for the security updates for Win 10 (IIRC you have 3 years before you can't pay anymore). Many will likely already be on Win 11, as the upgrade path is easier/quicker than Win 7 to 10.
Also, they will not have the gunk installed anyway, as they will almost certainly have Windows Enterprise, which has more policies that can be set, and they will also be ordering devices from an OEM or distributor that doesn't include the junk.
Heck, if they aren't doing Autopilot from the OEM or distributor, they will almost certainly be applying their own Windows image.
On LinkedIn someone in my network pointed out that, apart from the security and privacy disaster, the name Recall was a bad choice because of negative events like product recall.
It would actually be a fantastic name if this were a real concern. Imagine, a well-known feature to mask any searches of a product recall. The only problem with this theory is that computer QA is so incredibly shit that the concept of a recall more or less doesn’t exist in the first place.
"Total Recall" in quotes makes me think you're trying to get your ass back to Mars and that you're trying to remember something because you had your memories wiped. It makes me think of nothing about a friendly service being offered forcefully upon you from your friendly and malevolent OS provider.
What's funny about this and other "recalls" (pun intended) of "products" from so-called "tech" companies as a result of "feedback", or "backlash" as Wired calls it, is that the companies never asked for such input and AFAIK no one is contacting the companies to give it. AFAICT it is obtained through surveillance. To people born after the internet this might seem normal, but to me it is quite odd. The companies claim to be operating in service of "users" but there is generally no direct contact between these "users" and the company.^1 With some isolated exceptions that have increased over time, there is no customer service. And in most cases the "customer" has not paid the company for the so-called "product". Generally, no one is asking for a refund on their purchase of Windows, because generally no one pays Microsoft for it. Instead people just complain into the ether.
No need to pay for being the target of surveillance. The "products" are free.
1. Unless we count the telemetry and "auto-updates". Users never asked for this stuff, though; it is not initiated by them. This "product" is broken on delivery, hence the alleged need to keep "fixing" it by remotely installing more software - presumably software that isn't broken and will not be used for surveillance - on people's computers. All for free. There is no money to refund if the "product" does not work as expected.
Don't worry, even when you pay very high prices for products and service you're still being spied on. A company will always make more money by charging you as much as possible and then also collecting every scrap of data they can get their hands on. Every smart TV is spying on users and many are pushing ads. How many people do you think demanded their money back? Every game console sold still spies on you. Every car. Every cell phone. When there are no products that don't spy on us what will people do? Return them all and live in empty houses?
It's so weird to me that a company like Microsoft would care that much about "reputation". Everyone basically hates them already. Many of the most successful companies in the US are widely (if not universally) hated by the people who pay them. Nobody loves Comcast, or ExxonMobil, or CenturyLink, or EA, or Equifax, or Facebook. People feel trapped and unable to avoid paying some companies or using their products no matter how much they hate them. How many people have paid Microsoft for their OS at all? How much money would they really lose if they ignored the bad press? How many grandmas would start downloading Linux?
I'm glad Microsoft is making changes, but I wonder how much is out of fear for their reputation and how much is just to try and comfort people and get the news to stop talking about it so that everyone doesn't just disable it as soon as it rolls out.
If there were an HN commenter with a username taken from a DOS batch file who complains about game console surveillance, would it be safe to conclude that this might be a non-enterprise Windows user?
A reply about tangible products that people can choose to purchase or choose not to purchase is not a response to the comment I am making. These are examples of conscious choices, coupled with payment.
With few exceptions, non-enterprise Windows users are not consciously choosing to purchase a Windows license. They are choosing to purchase a computer. The decision to purchase a Windows license was made by the OEM. If we ask these computer purchasers what they bought, they are likely to describe a tangible product, e.g., "I bought a computer", not a license to use an operating system.
If something is wrong with the computer, then the purchaser can generally contact the seller/manufacturer for redress. But when the problem is with Windows, consumers generally do not contact Microsoft. Instead they complain into the ether.
Good luck enforcing warranties or products liability laws against Microsoft with respect to a free copy of non-enterprise Windows that was pre-installed on a computer. Windows can be broken six ways to Sunday, it can be a defective "product", and Microsoft can take its time fixing the problems, or even just leave them as is.
A few things Recall could do:
- Make sure Recall is per-user and not something that can be installed system-wide for all users.
- When a user enables Recall, ask them to set up a "Recall password", generate a public/private key pair, and use the password to encrypt the private key.
- Use the above public key to encrypt all the data it stores.
- When the user wants to search Recall history, ask for the password, decrypt the private key, and use the decrypted private key to decrypt the data and show it to the user (a sketch of this flow follows the list).
- Show some sort of indicator on the taskbar that Recall is running - not a tray icon (which can be hidden), but a proper big red circle kind of thing.
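To make the keypair/password idea above concrete, here is a minimal sketch (my illustration, not anything Microsoft has described) using Python's `cryptography` package; the hybrid RSA-plus-Fernet scheme and all the names in it are assumptions chosen only to show the flow the list describes.

```python
# Sketch of the suggested scheme (illustrative only, not Microsoft's design):
# each snapshot gets a fresh symmetric key, that key is wrapped with an RSA
# public key, and the RSA private key is stored encrypted under the user's
# "Recall password", so reading anything back requires the password.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa

OAEP = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def enable_recall(password: bytes):
    """Run once when the user opts in: make a key pair, lock the private half."""
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    locked_private_pem = private_key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.PKCS8,
        serialization.BestAvailableEncryption(password),  # password-protected
    )
    return locked_private_pem, private_key.public_key()

def store_snapshot(public_key, snapshot: bytes):
    """Encrypt continuously with only the public key in memory."""
    item_key = Fernet.generate_key()
    ciphertext = Fernet(item_key).encrypt(snapshot)
    wrapped_key = public_key.encrypt(item_key, OAEP)
    return wrapped_key, ciphertext  # the only things written to disk

def search(locked_private_pem: bytes, password: bytes, wrapped_key, ciphertext):
    """Prompt for the password only when the user actually searches."""
    private_key = serialization.load_pem_private_key(locked_private_pem, password=password)
    item_key = private_key.decrypt(wrapped_key, OAEP)
    return Fernet(item_key).decrypt(ciphertext)

locked_pem, pub = enable_recall(b"correct horse battery staple")
wrapped, blob = store_snapshot(pub, b"OCR text of one screenshot")
print(search(locked_pem, b"correct horse battery staple", wrapped, blob))
```

Under this split, the capture service never holds anything that can read old data back, which is the property the list is asking for; the obvious cost is that forgetting the password means losing the history.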
To me this seems like another case of MS top executives telling every team that they have to do something with AI. Typical approach used by many executives and managers - "Here is a new tech, figure out a product to build with it".
Actions speak louder than words.
We should assume that everyone is on their worst behavior going forward. Zoom, Slack, MS, Apple, etc.
> And then someone goes and invents Recall
Maybe you should read about the history of Microsoft, especially about its security.
But people forget easily.
Of all the places on your computer that might contain porn images, the browsing history would be one of the very top candidates.
Does your browser history store pictures of your family?
Recall seems to be storing its info locally in an unencrypted SQLite database as well.
At least, that's according to the instructions here on how to access and view the contents:
https://www.heise.de/en/news/First-experiences-with-Recall-9...
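If that report is accurate and the store really is plain SQLite, a few lines of Python are enough to inspect it. A minimal sketch follows; the database path is left as a command-line argument because the reported on-disk location (something like %LOCALAPPDATA%\CoreAIPlatform.00\UKP\<GUID>\ukg.db, per early hands-on write-ups) is an assumption that may well change after Microsoft's announced changes, and the sketch discovers tables at runtime instead of assuming a schema.

```python
# Sketch: dump whatever a Recall-style SQLite store contains, without assuming
# a fixed schema (table and column names are discovered at runtime).
import sqlite3
import sys

def dump_sqlite(path: str, max_rows: int = 5) -> None:
    con = sqlite3.connect(f"file:{path}?mode=ro", uri=True)  # open read-only
    tables = [r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        print(f"== {table} ==")
        cols = [c[1] for c in con.execute(f'PRAGMA table_info("{table}")')]
        print("columns:", ", ".join(cols))
        for row in con.execute(f'SELECT * FROM "{table}" LIMIT {max_rows}'):
            print(row)
    con.close()

if __name__ == "__main__":
    dump_sqlite(sys.argv[1])
```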
Most of us know that the public Internet is based on surveillance capitalism, no matter if we hate it or are just complacent or ignorant.
OS wide is far more problematic and of low value to the user.
In comparison browser history is nothing.
MS just did what every other micromanagement company did and took screenshots every second or so.
This is what will happen. And when you turn it off again, it'll be turned back on by the next update. Enjoy.
If that occurs, the malware won't have access to months or years of data to sift through.
Malware that scrapes it and malware that turns it on don't need to be the same.
Everyone else just gets a laptop, unboxes it, turns it on, uses it, and does whatever they want to it.
See: any retail location in a strip mall, any mom-and-pop business, etc.
Once you have the Recall capabilities, it doesn't take much to start collecting and searching the data.
[1]: https://www.patrick-breyer.de/en/posts/chat-control/
Compliance doesn't say "company can't watch employee" -- in many cases it mandates surveillance.
This just lets the employee leverage that too.