> In 2014, the company decided it could do without many of its testers. Mary Jo Foley reported that "a good chunk" were being laid off. Microsoft didn't need to bother with traditional methods of testing code. Waterfall was out. Agile was in.
An average software dev today is expected to do the work and have the skillset that used to take a half dozen people or more.
There were of course even more roles in the prehistory, but if we think of the 2000s, I can count at least: RDB design and management; planning and specification work; interfacing with the customer; testing; merging UI and backend engineering into "full stack"; merging coding, operations and admin into "devops"… I'm pretty sure that the only reason devs aren't yet expected to make their own sales is that the sales department is a profit center and, as such, sacrosanct.
As someone who worked in the film industry for 15 years, this is why I get weary of anyone telling me a new tool “will just make things easier.” All it does is raise the expectations of what I am supposed to do; my role expands every six months without compensation.
I don’t know a single specialized camera operator anymore. Literally every shooter I know is also a competent editor, which I do think is neat and makes us better Cam Ops, but it also means people expect everyone to shoot and edit. Also capture excellent sound. Don’t forget the set has to look good. Make sure you’ve got a good Rolodex of locations ready to go as well.
We don’t need to keep every individual role just because it’s traditionally been there, but in a lot of industries we’ve clearly gone the wrong direction. And with something like QA/QC I could see that being a huge problem, because the payoff is not obvious, so upper management is going to want you to get something out the door no matter what state it is in.
Even early in its history, Microsoft was famous for merging these together into a single role: "developer". I remember reading (but can't find now) an article about how IBM had all these fancy roles like designer, architect, tester and the lowly programmer, and Microsoft's approach of integrating them is what allowed them to succeed over early competitors.
Remember Steve Ballmer chanting "Developers, developers, developers" (in about 2000)? That's why.
I'm not saying I totally agree (although I think I do at least a bit), just that this is hardly new.
The “developers” mentioned in the monkey boy song were actually third-party developers. Ballmer wasn’t talking about Microsoft’s internal teams or nomenclature.
> An average software dev today is expected to do the work and have the skillset that used to take a half dozen people or more.
I think that depended (and still depends) a lot on the organization and the nature of the product.
I distinctly remember doing backend and some frontend development, requirements specification, database design, customer interfacing and even a bit of ops, all on the same job and with the same title in the 00's. That was in a small-to-medium company and my clients were on the small side so the projects might not have even had half a dozen people to begin with.
Larger organizations and more enterprisey projects would have had more specialized and limited roles: customer/specs people, possibly frontend and backend devs, DBAs, testing people, and those in charge of ops and environments. In my experience, that's still more or less true in enterprisey development today.
I think a part of the problem is that while new technologies have emerged and reduced the need to manually work with some older or underlying technologies, they haven't replaced previous skills.
Containers have reduced the amount of work needed to deal with deployments and environments but they haven't removed the need to know servers or operating systems. Cluster management can reduce the amount of manual work on setting up containers but it doesn't remove the need to know the underlying container engine. So now you need to know Linux servers and containers and k8s and whatnot just in order to manage a local backend development setup. At the same time, frameworks have made a lot of frontend work more manageable but they haven't made JavaScript or other underlying stuff disappear.
Thus the scope of what being a fully-versed full-stack developer entails has grown.
No doubt, but the breadth of required knowledge today is vast.
Sure, we were "webmasters", but there is a huge difference between tinkering with some PHP, MySQL, HTML, and Apache, and being an expert on the latest cloud offerings, security practices, etc. One could spend six months in analysis paralysis these days without writing a line of code.
I don't think that is real. I don't believe every company was able to afford DBAs, devs, QA, business analysts, ops, etc. as fully separate FTEs.
Only the biggest companies were able to have that. If you have a single application to run, there is no work for a DBA as an FTE; in a big company with multiple projects you can have a DBA department that handles dozens of databases and the running infra. Same with ops: you can have SREs or ops people doing your infra if you have dozens of applications to run.
The problem is that having separate QA/DBA/dev/ops departments kept breaking down, because people would "do their stuff" and throw problems over the fence. So everything would go to shit, and we have seen it happen in big companies.
The other thing is: I have read about multiple companies trying "to be professional" and burning money on exactly these separate roles, but in reality you simply cannot afford an FTE, let alone a full department, for DBA or QA or ops or even plain dev - unless you are basically swimming in money.
> My guess is that this change has its roots in the move from physical media delivery of software to internet delivery.
> My instinct is that there is some general principle that relates “friction” and “quality”, although I’m not sure I have the vocabulary to describe it.
I think the principle is: the greater the impact of a mistake, the more effort you'll put in (up front) to avoid one. The more friction, the greater the impact.
When software was distributed on physical media and users had no internet you basically had only one (1) chance to get it right. Buggy software would be buggy effectively forever. So (for instance) video game companies had insane QA testing of every release. After QA, it'd get burned onto an expensive cartridge and it'd be done. People would pay the 2025 equivalent of $100+ for it, and they'd be unhappy (to say the least) if it didn't even work.
Once users had internet and patching became possible, that slipped a little, then more. Eventually managers realized you could even get away with shipping an incomplete, non-working product. You'd just fix it later in a patch.
Now, with software being delivered over the internet, often as SaaS, everything is in a constant state of flux. Sure, they can fix bugs (if they choose to), but as they fix bugs they're constantly introducing new ones (and maybe even removing features you use).
I don't mind doing some of these tasks, but if I could go my entire life without speaking to another customer, or even their engineering team, I would die happy.
It's simple supply and demand. If the average dev can do that, then that's what will be demanded. If the average dev can't do that, then there's no use demanding it since there's no one to fill that opening (at that price point).
The software dev supply market is absolutely saturated.
The auto-translation by LLM on https://learn.microsoft.com/ is horrible, because it has no idea what is explainer text and what is part of the syntax of a command, a programming language, class members, ... It translates reserved words that, taken at face value, lead to errors. E.g. for https://learn.microsoft.com/de-de/windows-hardware/drivers/d... you get /gerät-aktivieren for what should be /enable-device, which must not be translated. For this reason I made a bookmarklet to switch to English:
    javascript: (function () {
      // Toggle learn.microsoft.com URLs between the German and English locale paths.
      var url = window.location.href;
      if (url.match(/de-de/gi)) {
        window.location.href = url.replace(/de-de/gi, 'en-us');
      } else if (url.match(/en-us/gi)) {
        window.location.href = url.replace(/en-us/gi, 'de-de');
      }
    })();
To be fair, everyone is committing this crime: Amazon, Google, IBM... As a native Spanish speaker, I have never wanted to read documentation in Spanish: it is usually of lower quality, not up to date, and uses translations of technical terms that make it impossible to reconcile with the rest of the literature (I'm looking at you, Dragon Book translation. Nobody says "canaletas" instead of "pipelines").
At least before, the translations were done by professionals; now they are done with tech capable of hallucinating new stuff itself... I trust translations even less than before.
“Shell script” - “libreto de cápsula de sesión” :) (and mind you, this translation _was_ done by a supposed professional back in the 90s, when that was basically the only option.)
The only 3 languages worth writing documentation in are English, Chinese and Japanese, as far as I'm concerned. I only include Japanese because I've read some Japanese game console docs and they were done pretty well.
OK, story time. One day at work, I was so fed up with websites automatically translating stuff that I went into my browser's language settings, only to find out that I can't remove language preferences altogether; I need to have at least one language set (and I wanted to always have the originals, explicitly not English every time). A teammate came up with a solution: set the language to Latin. I found this a brilliant idea: it's a language nobody in their right mind would be using on the web or automatically translate to, and if they do, I'll applaud their dedication. So I did exactly that and went on a Latin adventure.
Turns out, the web becomes a more interesting place when you do. First of all, Google apparently makes a logical route from "Latin" to "Rome" to "Italy" to "Italian" and routinely displays login screens and such in Italian. hCaptcha breaks into a thousand pieces and displays (or displayed then) prompts in a broken mess of two to three different languages within one sentence. Several websites wouldn't load at all because they automatically request a translation from their backend server as the first order of business and after a Latin translation returns a 404, they just croak with an empty page.
I've encountered multiple websites actually translated to Latin with a language file, i.e. not live via Google Translate or similar. Probably still an automatic translation, but at least I had the chance to applaud them.
But my absolute favourite is embedded Google Maps. Turns out, every time you visit an embedded Google Maps instance, your account settings don't matter, your location doesn't matter, all that matters is your browser's language -- and Google Maps actually has Latin data for countries and cities! Granted, not for everything, but many, many things are translated with Latin names, some of which directly correspond to the Latin names used in centuries past. You can browse the world for hours, sometimes trying to remember which city is behind a Latin name.
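(If you want to reproduce the experiment without hunting through settings dialogs: in Firefox, the Accept-Language header is controlled by the intl.accept_languages pref in about:config, and as far as I know a bare

    intl.accept_languages = la

is enough; the browser needs no UI support for the language it advertises. How each site reacts is, as the story shows, anyone's guess.)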
Documentation for Visual Studio used to translate SQL keywords... But most confusing of all, Excel really does use localised words in formulas (if/then/...). It makes some sense for Excel users, but it is very confusing when you don't expect it.
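For illustration, the same formula in the English and the German UI. German localises both the function names and the argument separator (the comma is the decimal separator there, so arguments are separated with semicolons):

    English UI:  =IF(A1>0, "yes", "no")
    German UI:   =WENN(A1>0; "ja"; "nein")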
Are those actually localized, or is it just the displayed value?
In other words, would a French document not load properly if Excel is running in German?
I basically never use Office, and even though I admit my opinion of MS quality is as low as it could possibly be, I still kinda hope that they didn't stoop this low.
lol I really despise it when corporations try to be "too" smart/helpful and automatically translate their shit to the local language guessed from your IP, which is a major pain when traveling.
Google absolutely sucks ass at this: you can't set your default language without having a Google account, which I refuse to sign in to except on YouTube, and there's no easy way to change the language on some of their pages.
Even worse nowadays on YouTube itself: The auto-translated titles and machine dubbing, for which you can only pick one default language, which all other content will be forcibly translopped into. God forbid I speak more than one language and am comfortable consuming content in its original language.
Anyone who determines language solely from IP is per se incompetent and needs to be completely blacklisted from the industry. The Accept-Language header exists for a reason.
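Honoring it isn't even hard. A minimal sketch in plain JavaScript (the SUPPORTED list and the 'en' fallback are made up for illustration, not any real site's code):

    // Parse "de-DE,de;q=0.9,en;q=0.8" and pick the first supported language.
    const SUPPORTED = ['en', 'de', 'es', 'ja'];

    function pickLanguage(acceptLanguage) {
      if (!acceptLanguage) return 'en'; // header absent: fall back, still no IP guessing
      return acceptLanguage
        .split(',')
        .map((part) => {
          const [tag, q] = part.trim().split(';q=');
          return { tag: tag.toLowerCase(), q: q ? parseFloat(q) : 1.0 };
        })
        .sort((a, b) => b.q - a.q)            // highest user preference first
        .map(({ tag }) => tag.split('-')[0])  // "de-de" -> base language "de"
        .find((lang) => SUPPORTED.includes(lang)) || 'en';
    }

    // pickLanguage('ja,en-US;q=0.8')  -> 'ja'
    // pickLanguage('fr-FR,fr;q=0.9')  -> 'en' (unsupported, but an honest fallback)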
Sometimes it's impossible even with an account. I can't search in English on my phone in Japan. If I go into options and change the language, the moment I click OK, it switches everything right back to Japanese. I know multiple colleagues who've had the same issue for years.
It's more annoying because browsers literally tell you what language they want.
Developers seem to have a canned response to this, largely that they don't trust users to have set their operating system language to the correct one... somehow:
https://dijit.svbtle.com/trusting-the-user-they-know-what-la...
Microsoft does have a reputation for ungodly levels of backwards compatibility. You can still run the oldest Windows 95 programs today, still open the oldest Word documents, etc.
The issue with “quality” is that it's really subjective. As someone remembering the switch from Windows 98 (DOS-based) to Windows 2000 (NT-based), the boost in subjective quality was immense. But to someone who's already been on Linux for years, it would have looked like playing catch-up.
That's how computers and software work by default. It's an entirely different business philosophy: "we can gain more market share by having a just-works software environment for as many people as possible" vs. "we can resell people the same software over and over again".
You're impressed that they managed to fill their diaper for so long without any leaks? Linux can read the oldest Unix files and compile and run the oldest programs. Microsoft's "backwards compatibility" is an entirely self-created problem: they realized they could capture more value, in the short term, if users only ever see a binary, so they have to maintain a technically flawed solution.
Last I checked, the only easy way to run a 16-bit game from my childhood is on Linux. Very easy. Odd to congratulate Microsoft on backwards compatibility when they are not doing better than Linux.
> You can still run the oldest Windows 95 programs today, still open the oldest Word documents, etc.
No. You can't. Games requiring old DirectX versions will crash in subtle ways. A lot of programs are badly rendered on Windows 10 (for some reason Windows scales some UI elements but not others).
The NT kernel is pretty solid, and the earlier NT-based OSes such as Windows 2000 and XP are solid too. They definitely did not have the security features modern OSes have, but security always evolves.
I mean, NT is still way more advanced and modern than Linux will ever be, by definition. NT implements a lot of modern security architectures right in the kernel, while Linux inherently just lacks a lot of them.
But apart from NT I can't think of a lot more solid products that came out of Microsoft.
I think there’s a difference between a product having problems and needing a restart once in a while and the product actively behaving in an undesirable way.
A pretty thin opinion piece; I was expecting more details. But there are a bunch of comments under that article which are probably juicier than the main text.
When I think about bad quality control, Power Automate from their Power Platform comes to mind. It could be quite a useful tool to automate several things and connect different systems with each other. Perhaps a high-level serverless function with a visual editor, you might say.
But it has been a beta version at best for years. There are two versions of the editor, both of which frequently break. You cannot overtax the system or all running instances morph into endlessly running tasks. Flows regularly break if you want to update something and the flow was created in a version of the editor that is no longer available.
You can also see their attempts to monetize this test version that needs a powerhouse of a machine to run its editor in a browser.
I like it for what it is, if you are already (trapped?) in a Microsoft environment. There is some potential here, and few consultants will tell you that it basically isn't production ready. But the product manager should be quite ashamed.
Also I heavily doubt it will stay free, so perhaps plan infrastructure accordingly.
The article left out the most important question: are there any lasting negative consequences for Microsoft from all these accidents? The answer is likely no, and that's all the shareholders care about, sadly. So this will continue to happen, imo. Those quality assurance testers won't be coming back any time soon.
They seem to be very slowly losing to Apple in the laptop/productivity market, and showing the first signs of losing to Linux on gaming.
In the same way that their incompetence has been very slow to move the needle, once they lose the market it’s going to be almost impossible to get it back.
"first signs" for sure; I think / suspect the next big step for Valve would be to release a desktop gaming system. If it has a browser and "native" Discord it has the potential to take a chunk off the PC gaming market.
I don't think Linux will ever fully take over gaming, regardless of effort, unless competitive multiplayer game companies decide to give up on assuming total control over your system in order to make cheaters undetectable to the average gamer*.
It would require a Linux-based OS to be released which allows game companies, in a standardised way, to take full control over the system. And at that point, it won't be a Linux distro; it will instead be just like Android. I think calling that a market takeover would be as thin and insignificant as calling Android a "Linux takeover" of the mobile OS market.
But this isn't to say that Microsoft won't lose the market share to "Gamedroid" or whatever, it just won't be losing it to Linux.
* As has been demonstrated, KLA (kernel-level anti-cheat) and similar technologies do help make cheating more difficult and resource-intensive. But, as the cheating industry's pocketbooks make clear, cheating hasn't stopped; it has just become more discreet, such that most players simply don't notice when they're losing to cheaters.
More than anything, once people realize that they can be fine without MS, because 5-10% of the non-Apple market has done so (and the alternatives have figured out the kinks with the mass influx of users), it could move from a trickle to an avalanche.
The upside is that MS has the reserves and fallbacks to get their shit together if they realize they are faced with a bad situation, and those that can't leave will get better products.
Microsoft’s value is Excel + Azure. And as long as they can sell Excel, they can sell a bunch of the other stuff at a cost so low that leaders have trouble saying no to janky Teams and other software.
lol, you've 'talked' about switching. I'm really surprised that any startup would be on Teams in the first place. I get enterprises, but for startups I would think other tools make more sense (Slack, etc.).
My experience is that document sharing and collaborative editing work insanely well with Office. Visio is foolproof, and quality is OK even with a poor connection. The integration with Outlook is perfect. The product ecosystem is great, so it’s easy to get room booking and auto-connect. Plus, Copilot is good at minutes and transcription.
I can’t imagine going back to a time where I couldn’t just throw an excel file or ppt in a discussion and get collaborative editing straight away.
At the price point, it’s pretty much unmatched in my experience. What would people rather use instead?
I happen to use Windows on both my personal and work laptops. Some of the bugs I see exist across the Home and Enterprise versions. Sleep remains a nightmare on Windows, and yes, across laptops made by different manufacturers. I have created tickets about this, and IT doesn't have a solution.
I have decided that my next personal laptop definitely won't run Windows, and if I am allowed to ask for a Mac machine at work in the future, I'll jump at that opportunity.
That would mean two fewer Windows licenses and less usage of related products (good riddance, Edge!). And I am sure I am not the only one who is thinking about all this.
But of course I have no idea if that matters in the grand scheme of things -- after all, many people tolerate these bugs just like they tolerate all the ads by Microsoft, Google, Meta etc.
Every platform has some hardware that has issues with sleeping. I've had numerous Linux machines that often fail to resume entirely; this current MacBook I'm on fails to properly reconnect things on USB hubs after a resume, etc. The grass is always greener.
Try a sleep study on your current machine. I had an issue with one machine constantly waking from sleep. Lots of other tools couldn't clue me in to what was going on and why the system was actually waking. Sleep study pointed exactly to the device causing problems; disabling it from waking the system solved my sleep problems on that device.
https://learn.microsoft.com/en-us/windows-hardware/design/de...
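For anyone who hasn't used it: the sleep study and the related wake diagnostics live under powercfg. From an elevated command prompt, roughly (the quoted device name is a placeholder):

    powercfg /sleepstudy                        (writes an HTML report of recent sleep sessions)
    powercfg /lastwake                          (shows what woke the machine most recently)
    powercfg /devicequery wake_armed            (lists devices currently allowed to wake the system)
    powercfg /devicedisablewake "Device name"   (stops a given device from waking it)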
As a network engineer I regularly participate in and mediate 'vendor support shootouts' between vendors like Cisco, Juniper, Arista, Verizon, AT&T, and Microsoft, often amongst each other. It is at times maddeningly frustrating.
Microsoft is far and away the least responsive in nearly all cases. In one case the project involved thousands of SQL servers and MS was 100% unresponsive; moreover, this was not a small customer, but one of Microsoft's largest corporate customers. Still nothing but silence. So out with MS for that project, and the yearly license fees were well into six figures. Even though MS has a track record of unresponsiveness, the silence was surprising and noteworthy. Why bet money on a horse that doesn't show up?
As one MS director put it out of frustration: "We do test, a lot. Our testers are called end users. That's it."
More precisely: he said MSlers get paid by results, by achieved business value. Testers exist and are called "end users". Testing is mandatory and part of the core philosophy; they just do it differently.
The reason: fear of missing out if moving too slow.
I miss the times when you put in a CD without an internet connection. Today's Office is a mess: thousands of half-finished apps, subject to being cancelled at any time. Windows XP's UI was dubbed "glossy"; some of Office's app UIs are LSD trips for kids. This is ridiculous. Nothing to work with, and in no way usable for customer presentations.
That's a good point. I think it would be bearable if they actually had a good feedback platform and interacted with their users. Feedback Hub is just terrible: slow, featureless, and built on top of their buggiest UI platform.
https://www.windowscentral.com/microsoft/the-real-story-behi...
https://youtu.be/ug4c2mqlE_0?si=qtqu7tOC7Xpw67aN
No. Microsoft was famous for having a role called Software Development Engineer in Test.
> Remember Steve Ballmer chanting "Developers, developers, developers" (in about 2000)? That's why.
No. Ballmer's chant was about 3rd party developers.
My instinct is that there is some general principle that relates “friction” and “quality”, although I’m not sure I have the vocabulary to describe it.
I.e. where there is a barrier to entry, quality of results tends to improve.
I also see this in ease of publishing to social media, bias of “old music is better” (time has sorted wheat from chaff) and so on.
Perhaps there’s a well known description of this phenomenon somewhere already…
http://www.google.com/ncr
You can: https://www.google.com/preferences?lang=1
Conveniently this URL supports the hl param: https://developers.google.com/custom-search/docs/xml_results... It explicitly sets the language, so in case you switched to a language you don't understand, visit this English link: https://www.google.com/preferences?lang=1&hl=en Of course you can also delete cookies.
Interestingly, not all available languages change the UI; e.g. 'Èdè Yorùbá (jorubera)' does nothing. The dropdown has 144 options, but the official interface language list has only 74: https://developers.google.com/custom-search/docs/xml_results...
I have been Microsoft-adjacent for 30 years, and at no point in that time have I been aware of Microsoft having a reputation for "quality".
I think they went too Moneyball and figured telemetry and metrics could solve everything. The McNamara fallacy and all that.
As for Linux, I keep waiting for the return of the netbook wave, in something that isn't a constrained Chromebook or an Android tablet with a keyboard.
Ordinary people buy what they can see in local PC stores; they don't buy over the Internet, importing System76 and similar.
The question is what can obviate Excel.
No one is using products from Microsoft due to their quality.
It's been the same story for like 15+ years now.
But I think it's the worst aspect of the subscription model. In the past, people just wouldn't buy the new version if it sucked.
Maybe they should read bug reports posted by the end users, and not have half-baked solutions posted by Very Ignorant Persons.
Unfortunately their audience is probably too big.
This has been true since at least Win 95. One usually needed to wait until SP2 to get a semblance of quality from Microsoft.
Now, since Vista, they have gotten rid of service packs. That says a lot about their quality culture.