Since this move would make (advanced) user experience worse, it sounds very realistic to me.
Had they suddenly said "Windows 11 was a big mistake, we are moving back to the Windows 7 experience, HTML-based pseudo-UIs are banned" etc. - now that would sound unrealistic.
Microsoft has been increasingly moving to the cloud. The latest Outlook update changed it from a local application to one that runs through the Edge browser. Absolutely awful. Eventually I had to edit the registry to prevent it from force-updating to the web-based version.
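For anyone fighting the same forced migration, the per-user registry value that hides the "Try the new Outlook" toggle in classic Outlook looks roughly like the fragment below. This is a sketch: the key path assumes the Office 16.0 (Microsoft 365) install, and you should verify the value name against Microsoft's current documentation before importing it.

```
Windows Registry Editor Version 5.00

; Hides the "Try the new Outlook" toggle in classic Outlook.
; Per-user setting; key path assumes an Office 16.0 install.
[HKEY_CURRENT_USER\Software\Microsoft\Office\16.0\Outlook\Options\General]
"HideNewOutlookToggle"=dword:00000001
```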
The Mac version of this, if you switch to the "new Outlook", loses the ability to connect to an Exchange server, which is one of the most bizarre downgrades I've ever seen.
I wouldn't put it past those slimeballs to release a "Microsoft Linux" that has all the FOSS command-line tools but still runs in the cloud and/or relies on the insecure Windows codebase.
Not surprising; there is a decent value proposition in buying a thin client and leasing online compute.
Certainly you can get a used PC for less, but most new PCs start at $500 with limited storage. Getting a fully functional PC for $X/month could work for certain values of X.
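To make "certain values of X" concrete, here's a back-of-the-envelope break-even sketch in Python. The numbers are illustrative assumptions on my part, not any actual Microsoft pricing:

```python
# Back-of-the-envelope: at what monthly lease price does a cloud PC
# cost the same as buying a ~$500 machine outright? All figures are
# assumptions, not real pricing.
PC_PRICE = 500        # upfront cost of a modest new PC, in USD
LIFESPAN_MONTHS = 48  # assume the hardware lasts about four years

break_even = PC_PRICE / LIFESPAN_MONTHS
print(f"break-even lease price: ${break_even:.2f}/month")
```

Anything much above that only pencils out if the bundled extras (management, backups, beefier remote hardware) are worth the premium to you.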
One of the biggest value propositions would be that Microsoft takes care of keeping it working and backing up your data. That's a huge sell if they can make it work.
You would probably need a very good network connection. I don't think this would work for non-stationary laptops.
This and cloud gaming... there are obvious real economic efficiency gains in centralizing hardware (and software): better overall utilization, plus easier procurement, development, management, and integration. The catch is that actual usage becomes entirely dependent on the user's internet connection.
The move to cloud infrastructure in the business world sets a precedent. Increasingly I notice self-hosted things being labeled 'legacy' by influencer blogs (even if that makes no fundamental sense infrastructure- or software-architecture-wise).
Users want things to "just work", no matter the wider implications. Makes me real scared for personal computing :(
Hopefully this is the final push I need to give up video games (or at least to explore the options for running them on Linux more closely) and move to Linux on the desktop full time, rather than just running it in WSL.
I am pleasantly surprised by how good the Debian sid / Steam Beta / Proton Experimental combo has been for gaming (I converted from OS X for different reasons and then discovered the gaming experience is great too).
Valve could have tried to cater to the Android/NDK folks to also support native GNU/Linux, and do some nudging toward cross-platform APIs like Apple is doing with its Game Porting Toolkit. Instead they chose to solidify DirectX/Windows as the main APIs PC game developers care about.
[1] https://www.theverge.com/2023/10/24/23930478/microsoft-ceo-s...
> Since this move would make (advanced) user experience worse, it sounds very realistic to me.
> Had they suddenly said "Windows 11 was a big mistake, we are moving back to the Windows 7 experience, HTML-based pseudo-UIs are banned" etc. - now that would sound unrealistic.
Keeping my fingers crossed!
> Certainly you can get a used PC for less, but most new PCs start at $500 with limited storage. Getting a fully functional PC for $X/month could work for certain values of X.
> You would probably need a very good network connection. I don't think this would work for non-stationary laptops.
Latency is a huge killer though, and any small interruption becomes a problem.
Epic is going to have a seizure.
This is a large part of why my new PC is running PopOS.
> The move to cloud infrastructure in the business world sets a precedent. Increasingly I notice self-hosted things being labeled 'legacy' by influencer blogs (even if that makes no fundamental sense infrastructure- or software-architecture-wise).
> Users want things to "just work", no matter the wider implications. Makes me real scared for personal computing :(