Let’s pause for a bit and dwell on the absurd amount of RAM it takes to run it even after this exercise. Anyone here remember when QNX shipped a demo in 2000 with a kernel, GUI, web browser and an email client on a single 3.5” floppy? The memory footprint was also a few megabytes. I’m not saying we should be staying within some miserly arbitrary constraints, but my goodness something that draws UI and manages processes has not grown in complexity by four orders of magnitude in 20 years.
Hasn't it, though? HDR, fluid animations, monstrous resolutions, 3D everything, accessibility, fancy APIs for easier development allowing for more features, support for large amounts of devices, backwards compatibility, browsers are almost unrecognizable in featureset to the point they resemble an OS unto themselves, email clients have stayed mostly the same at least except for the part that they also ship a browser and few of us even use 'em anymore!
Some of those features combine exponentially in complexity and hardware requirements, and some optimizations will trade memory for speed.
Not going to defend particular implementations, but requirements? Those have definitely grown more than we give them credit.
That's the job of the GPU driver, mostly.
> 3D everything
That's the desktop compositor. Windows 7 already had one and ran on 1 GB of RAM.
> accessibility
Not everyone needs it, so it should be an optional installable component for those who do.
> fancy APIs for easier development allowing for more features
That still use win32 under the hood. Again, .net has existed for a very long time. MFC has existed for an even longer time.
> support for large amounts of devices
No one asked for Windows on touchscreen anything. Microsoft decided that themselves and ruined the UX for the remaining 99% of the users that still use a mouse and a keyboard.
> backwards compatibility
That's what Microsoft does historically, nothing new here.
> browsers are almost unrecognizable in featureset to the point they resemble an OS unto themselves
No one asked for this. My personal opinion is that everything app-like about browsers needs to be undone, yesterday, and they should again become the hypertext document viewers they were meant to be. Even JS is too much, but I guess it does have to stay.
Resolutions and HDR are one area where I think the extra RAM load and increasing application sizes make complete sense. However, my monitors run at 1080p, don't do HDR, and my video files are encoded at a standard colour depth. Despite all this, the OS's standalone RAM usage has increased over the years.
Accessibility has actually gone down with the switch to web applications. Microsoft had an excellent accessibility framework with subpar but usable tooling built in, and excellent commercial applications to make use of the existing API, all the way back in Windows XP. Backwards compatibility hacks such as loading old memory manager behaviour and allocating extra buffer space for known buggy applications may take more RAM but don't increase any requirements.
I agree that requirements have grown, but not by the amount reflected in standby CPU and memory use. Don't forget that we've also gained near-universal SSD availability, negating the need for RAM caches in many circumstances. And that's just ignoring the advances in CPU and GPU performance since the Windows XP days, when DOS was finally killed off and the amount of necessary custom-tailored assembly dropped drastically.
When I boot a Windows XP machine, the only thing I can say I'm really missing as a user is application support. Alright, the Windows XP kernel was incredibly insecure, so let's upgrade to Windows 7 where the painful Vista driver days are behind us and the kernel has been reshaped to put a huge amount of vulnerable code in userspace. What am I missing now? Touchscreen and pen support works, 4k resolutions and higher are supported perfectly fine, almost all modern games still run.
The Steam hardware survey says it all. The largest target audience using their computer components the most runs one or two 1080p monitors, has 6 CPU cores and about 8GB of RAM. Your average consumer doesn't need or use all of that. HiDPI and HDR are a niche and designing your OS around a niche is stupid.
Windows didn't really see a lot of actual progress in this area since the Win2k days. Lots of activity and churn, yes, but little actual progress.
May I remind of https://www.enlightenment.org/
20 years ago, there were "live CDs" that could do most of what you mention, at maybe 512 MB of RAM.
> Hasn't it, though? HDR, fluid animations, monstrous resolutions, 3D everything, accessibility, fancy APIs for easier development allowing for more features, support for large amounts of devices, backwards compatibility,
Soo, the features Windows 7 had? I remember running a 3D desktop with compositor and fancy effects on a 1GB RAM laptop on Linux...
RAM requirements for Windows as an OS are ridiculous.
And to be honest, nowadays the biggest issue is the web browser and the sheer amount of memory and processing that modern websites use.
It's unbelievable.
I know that it installs various libraries. I do not know why those libraries are dozens of megabytes each.
That said, 2GB is acceptable considering the state of everything.
Not saying I wouldn't like to have QNX class back.
Please don't miss the malware within the OS itself: license services for software such as Microsoft Office and Adobe, and other applications without enough resource bounds.
> Let’s pause for a bit and dwell on the absurd amount of RAM it takes to run it even after this exercise.
I agree and I find the apologists to be completely wrong. I run a modern system: 38" screen, 2 Gbit/s fiber to the home. I'm not "stuck in the past" with a 17" screen or something.
The thing flies. It's screaming fast as it should be.
But I run a lean Debian Linux system, with a minimal window manager. It's definitely less bloated than Ubuntu and compared to Windows, well: there's no comparison possible.
Every single keystroke has an effect instantly. After reading the article about keyboard latency, I found out my keyboard was one of the lower-latency ones (HHKB), and yet I fine-tuned the Linux kernel's USB 2.0 polling of keyboard inputs to be even faster. ATM I cannot run a real-time kernel because NVidia refuses to support a non-stock kernel (well, that's what the driver says at least), but even without that: everything feels, and actually is, insanely fast.
I've got a dozen virtual workspaces / virtual desktops and there are shortcuts assigned to each of them. I can fill every virtual desktop with apps and windows and then switch like a madman on my keyboard between each of them: the system doesn't break a sweat.
I can display all the pictures on my NVME SSD in full screen and leave my finger on the arrow key and they'll move so quickly I can't follow.
Computers became very fast, and monitor sizes / file sizes for regular usage simply didn't grow anywhere near as quickly as CPU performance.
Windows is a pig.
I love this comment for getting at what, in my opinion, Linux on the desktop is all about: spending your time with a computer that just plain feels great to use.
It doesn't look the same for everyone, of course. It's not about some universalizable value like minimalism. But this is a great example of one of the dimensions in which a Linux desktop can just feel really great in an almost physical way.
The low-end requirements for Debian GNU/Linux (assuming a graphical install and an up-to-date version) are not that low. They're higher than the low-end for Windows XP when it first came out, and probably close to the official requirements for "Vista-capable" machines. So yes, it's a very efficient system by modern standards but it does come with some very real overhead nevertheless.
Well, other operating systems are still relatively decent at this. My main Linux install eats ~250 MiB of RAM after startup, and I've spent exactly zero amount of time on that, so it can be trimmed down further. That's on a system with 32 GiB of RAM — if you have less RAM, it will eat even less since page tables and various kernel buffers will be smaller.
FreeBSD can be comfortably used on systems with 64 MiB of RAM for solving simple tasks like a small proxy server. It has always been good at this — back in the day cheap VPS often used it (and not Linux) precisely because of its small memory requirements.
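For anyone who wants to check the same number on their own box, here's a rough sketch that just reads /proc/meminfo (Linux-specific; treating "used" as MemTotal minus MemAvailable is an approximation, not an exact accounting of what the OS itself eats):

    # rough "RAM used right after startup" readout (Linux only)
    def meminfo():
        info = {}
        with open("/proc/meminfo") as f:
            for line in f:
                key, rest = line.split(":", 1)
                info[key] = int(rest.strip().split()[0])  # values are reported in kiB
        return info

    m = meminfo()
    used = m["MemTotal"] - m["MemAvailable"]  # MemAvailable exists since kernel 3.14
    print(f"total {m['MemTotal'] // 1024} MiB, used {used // 1024} MiB")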
I've worked on several projects where performance was an afterthought. After the product scaled a bit, it suddenly became the highest priority - but at that time, it was impossible to fix. At least for everyone that created the problem to begin with.
I've taught high performance data structures to dev teams. I've tried to explain how a complex problem can sometimes be solved with a simple algorithm. I've spent decades on attempting to show coworkers that applying a little comp-sci can have a profound effect in the end.
But no. Unfortunately, it always fails. The mindset is always "making it work" and problem solving is brute-forcing the problem until it works.
It takes a special kind of mindset to keep systems efficient. It is like painting a picture, but most seem to prefer doing it with a paint roller.
And I've worked on systems where months were essentially squandered on performance improvements that never paid off, because we never grew the customer base sufficiently for them to be worthwhile...
I'm all for dedicating time and effort towards producing performant code, but it does come at a cost - in some cases, a cost of maintainability (for an extreme example there's always https://users.cs.utah.edu/~elb/folklore/mel.html). In fact I'd suggest in general if you design a library of functions where obviousness/clarity/ease-of-use are your primary criteria, performance is likely to suffer. And there are undoubtedly cases where the cost of higher-grade hardware (in terms of speed and storage capacity) is vastly lower than that of more efficient software. I'd also say performance tuning quite often involves significant trade-offs that lead to much higher memory usage - caching may well be the only way to achieve significant gains at certain scales, but then as you scale up even further, the memory requirements of the caching start to become an issue in themselves. If there were a simple solution it would have been found by now.
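To make that last trade-off concrete, a toy sketch of the memory-for-speed exchange (functools.lru_cache standing in for whatever real cache a system would use; the function and numbers are made up for illustration):

    from functools import lru_cache

    @lru_cache(maxsize=100_000)        # bounded: the cache itself is a RAM budget
    def expensive_lookup(key: str) -> int:
        # stand-in for a slow computation or an I/O-bound fetch
        return sum(ord(c) for c in key) * 31

    for k in ("alpha", "beta", "alpha"):
        expensive_lookup(k)            # first call per key pays full price, repeats are near-free
    print(expensive_lookup.cache_info())  # hits/misses/currsize: the speed you bought and the memory it costs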
How about all those sandboxes, protections and mitigations?
Nowadays people care about security waaay more than people did 20-30 years ago.
My go-to on this: I remember running Debian on a Pentium 166 with 32MB of RAM back in 98/99. It would boot to the desktop using only 6MB. It wasn't flash but it could handle the basics. Heck, Windows XP would boot to the desktop using a little under 70MB.
But this isn't just Windows, currently I am on Kubuntu 22.04 and it is using about 1.5GB to get to the Desktop! Yes it is very smooth and flash but it seems like a bit much to do this.
This is why I am interested in projects like Haiku and Serenity OS; they may bring some sanity back into these things.
I guess that with careful selection of GUI components one can fit an empty desktop into 60 MB.
Until you start a browser, anyway.
https://en.m.wikipedia.org/wiki/GEOS_(8-bit_operating_system...
Obviously there were huge limitations, but it shows what can be done. This fit on one 170K floppy and ran on a 1.44 MHz 8-bit machine with 64K of RAM.
In the 1990s I ran both Linux and Windows on less than 64M of RAM with IDEs, web browsers, games, and more.
If I had to guess what would be possible today, I'd fall back on the fairly reliable 80/20 rule and posit that 20% of today's bloat is intrinsic to increases in capability and 80% is incidental complexity and waste.
For me also the Commodore came to mind. It had 64K of RAM and a 64K address range; because other things had to fit in there, not all RAM was usable at the same time. The clock frequency of the PAL model was 985 kHz (yes, KILO), so not even a full MHz.
Yet, I could do
* word processing
* desktop publishing
* working with scanned documents
* spreadsheets
* graphics
* digital painting
* music production
* gaming (even chess)
* programming (besides BASIC and ASM I had a Pascal compiler)
* CAD and 3D design (Giga CAD [1], fascinated me to no end)
* Video creation [2]
For all these tasks there were standalone applications [3] with their own GUI [4]. GEOS was an integrated GUI environment with its own applications and way ahead of its time [5].
It still blows my mind how all this could work.
My first Linux ran on a 386DX with 4M of RAM, but that is probably as low as one can get. Even the installer choked on that little RAM, and one had to create a swap partition and swapon manually after booting but before the installer ran. In text mode it was pretty usable though; X11 worked and I remember having GNU chess running, but it was quite slow.
[1] https://youtu.be/ZEf9XMrc5u8
[2] OK, this one is a bit of a stretch but there actually was Videofox for creating video titles and shopping window animations: https://www.pagetable.com/docs/scanntronik_manuals/videofox....
[3] Some came on extension modules which saved RAM or brought a bit of extra RAM, but we are still talking kilobytes. For examples see https://www.pagetable.com/?p=1730
[4] Or sort of TUI if you like; the strict separation of text and graphics mode wasn't a thing in the home computer era.
[5] The standalone apps were still better. So, as advanced as GEOS was, I believe it was not used productively much.
Zawinski’s Law - every program on windows attempts to expand until it can be your default PDF viewer. [cloud file sync, advertising display board, telemetry hoover, App Store…]
2GB is a ridiculous amount of memory for something like an OS.
When we see egregious examples like Windows, then it's arguable having constraints might be desirable. It is well-known that "limitation breeds creativity". It's certainly true outside of "tech" companies. I have witnessed it first hand. "Tech" companies are some sort of weird fantasy world where stupidity disguised as cleverness is allowed to run rampant. No more likely place for this to happen than at companies that have too much money.
Many of them do not need to turn a profit, and a small number have insane profits due to lack of meaningful competition (cf. honest work). With respect to the latter, it's routine to see (overpaid) employees of these companies brag on HN about how they do very little work.
The standards were also a lot lower back then. Modern-day users expect high resolution and color depth for their screens, seamless hardware support no matter what they plug into the machine, i18n with incredibly complex text rendering rather than a fixed size 80x25 text mode with 256 selectable characters, etc. These things take some overhead. We can improve on existing systems (there's no real reason for web browsers to be as complex as they are, a lot of it is pure bells and whistles) but some of that complexity will be there.
You can achieve good memory footprints with Linux. Just 2 or 3 years ago I was daily driving Arch Linux with bspwm as a window manager, and it used only 300 MB, which for me is pretty darn good. But as soon as I opened VS Code with a JS project, my RAM usage was at 12 GB. We have a lot of bloatware everywhere; that's pretty sad.
edit: This reminds me of some rants from Casey Muratori about VS [0] and Windows Terminal [1].
[0] https://youtu.be/GC-0tCy4P1U
[1] https://youtu.be/hxM8QmyZXtg
I remember needing to get Windows XP under 64MB of RAM so that I could run some photo editing software. XP was relatively feature complete, I don't think Windows currently ships with 32x the features of XP (64MB vs 2048MB minimum).
Linux with a lightweight GUI for example can still run okay with just 128MB. I ran Debian with LXDE on an old IBM T22, and it worked perfectly well. Running Firefox was a problem (but did eventually work), but something more stripped down like NetSurf or Dillo is blazingly fast.
We don’t need to worry about memory efficiency until we stop getting gains via hardware improvements. For now developers can just slap a web app into some chromium based wrapper, make sure their code doesn’t have any n^2 in it and you’re good to go.
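A minimal example of the accidental n^2 being alluded to — a membership test against a list inside a loop, versus building a set once (the data and names here are made up for illustration):

    # accidental quadratic: the list is rescanned for every message
    def from_active_slow(messages, active_users):
        return [m for m in messages if m["user"] in active_users]   # list lookup is O(len(active_users))

    # same result in linear time: set membership is O(1) per message
    def from_active_fast(messages, active_users):
        active = set(active_users)
        return [m for m in messages if m["user"] in active]

    msgs = [{"user": f"u{i % 500}", "text": "hi"} for i in range(10_000)]
    users = [f"u{i}" for i in range(500)]
    assert from_active_slow(msgs, users) == from_active_fast(msgs, users)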
Tell that to the person on a fixed income who has to invest in an expensive new machine because their 2015 laptop (which still has a whopping 4 GB of memory and a CPU that would have been top-of-the-line twenty years ago) has become unusably slow.
Software efficiency is a serious equity and environmental issue, and I wish more people would see it that way.
I have a netbook from around 2010. It has 2 GB of RAM and a single core Atom processor. It boots to a full Linux GUI desktop in a minute or so. It can handle CAD software, my editor, and my usual toolchain, if a bit slowly. It even handles HD video and the battery still holds a 6 hr charge.
But it doesn't really have enough RAM to run a modern web browser. A few tabs and we are swapping. That's unusably slow. A processor that's 5 or 20x slower is often tolerable. A working set that doesn't fit in RAM means thrashing, with a roughly 1000x slowdown (a page fault serviced from even a fast SSD costs on the order of 100 µs, versus ~100 ns for a RAM access). And so this otherwise perfectly useful computer is garbage. Not enough RAM ends a machine's useful life before anything else does these days, in my experience.
That's fine for those desktop users who don't care about spinning fans, but many users are on laptops and care about battery life. An inefficiently coded app might keep the CPU at high levels even if that's absolutely not required, because it is just a chat app or such.
> For now developers can just slap a web app into some chromium based wrapper […]
making 10% of users unreachable in order to more easily reach the other 90%. yeah, it’s a fine business strategy. though i do wish devs would be more amenable to the 10% of users who end up doing “weird” things with their app as a result. a stupid number of chat companies will release some Electron app that’s literally unusable on old hardware, and then freak out when people write 3rd party clients for it because it’s the only real option you left them.
https://aiimpacts.org/trends-in-dram-price-per-gigabyte/
DRAM density and cost isn't improving like it used to.
Also memory efficiency is about more than just total DRAM usage; bus speeds haven't kept pace with CPU speeds for a long time now. The more of the program we keep close to the CPU -- in cache -- the happier we are.
You are getting a whole runtime and standard library bundled in. The whole point of python is for quick and dirty scripts because saving you 4 hours is worth more than using 20mb less ram for something that gets run a couple of times.
early expectations on code interfacing and re-usability failed catastrophically
In my previous job, rather than give people root access to their laptops, we had to do things like running a Docker image that ran 7zip and piping the I/O to/from it. I'm not kidding, we all did this, and it was only bearable thanks to bash aliases and the fact that we had 16 GB of RAM.
This removes WinSxS. That's fine for embedded, since you'd just package the DLLs you need with any executables you want to run, but trying to run this as a general purpose OS is a fools' errand. Calling WinSxS "bloat" when that "bloat" is allowing 30+ years of backwards compatibility (and a lot of stuff will break) is creative by the article's author for sure.
Nothing wrong with Tiny11 though, if you know what it is good at and use it for that. Namely, "offline" Windows for some appliance-like usage (e.g. factory controls, display screens, et al) when Linux won't do for whatever reason and licensing Windows IoT isn't possible (small business/personal project/etc).
The idea that removing WinSxS saves space is generally misguided anyway. The vast majority of content there is actually the original file that was used to create a hard link at the destination. So obviously removing the file doesn’t really save any appreciable amount of space.
The remaining content unique to WinSXS is either for cryptographic validation, app compat, or the driver stack.
WinSXS looks like a huge folder in explorer, because explorer's size estimates do not tell you about hard links. It's not that big. I need to question somebody who thinks removing it will remove a lot of bloat.
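A rough sketch of what a hard-link-aware size count looks like — each (device, inode) pair is charged only once, which is exactly what a naive per-file sum (Explorer-style) doesn't do. This should work on NTFS too, since modern Python fills in st_dev/st_ino on Windows; the path is illustrative only:

    import os

    def tree_size(root):
        """Size of a tree, counting hard-linked file contents only once."""
        seen = set()
        deduped = naive = 0
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                try:
                    st = os.stat(os.path.join(dirpath, name), follow_symlinks=False)
                except OSError:
                    continue  # locked or permission-denied files are common under C:\Windows
                naive += st.st_size
                if (st.st_dev, st.st_ino) not in seen:  # first time we meet this inode
                    seen.add((st.st_dev, st.st_ino))
                    deduped += st.st_size
        return deduped, naive

    deduped, naive = tree_size(r"C:\Windows\WinSxS")
    print(f"deduped: {deduped / 2**30:.1f} GiB, naive (Explorer-style): {naive / 2**30:.1f} GiB")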
It does more than that; it keeps a copy of every DLL you've ever installed, not just the ones in use. There's a reason it just gets bigger forever even if you don't install more programs.
how much backwards compatibility do i need nowadays?
my laptop only needs to run a few things:
browser
vscode
steam
the microsoft drawing app
some office stuff
sublime
discord
which all update pretty regularly.
the age of the desktop app has been replaced by the age of the browser and electron based apps. i can imagine businesses who built their own set-ups back in the age of the desktop app being stuck with it, but for the most part i dont think i use windows' backwards compatibility anymore
They may update regularly, but that doesn't mean they only use modern features. E.g. even just Steam itself (not just the games in it) is largely still 32-bit on Windows, requiring gigabytes of 32-bit compatibility files using interfaces going back decades, even though Windows 11 itself doesn't have a 32-bit version anymore.
* Steam (the root process, not the subsequent Chromium child processes) is 32-bit, as are a lot of games.
* Discord is 32-bit.
Do you want Steam to actually run any game? :D
Isn't it interesting that both you and I frequently use Sublime and VS Code? Why can't VS Code kill off Sublime? It's interesting to me that a text editor like Sublime can't quite be a preferable IDE, but an IDE also isn't a preferable text editor.
"De-bloated" implies that the stuff removed is "bloat", i.e. worthless. I wouldn't assume that a "de-bloated" install was any less suitable for general purpose computing tasks.
I have a media PC running Windows 10 with 2GB of RAM. It's run great with Media Player Classic, Netflix, and even Steam installed. I certainly would not assume "debloated" means "completely crippled".
I use an eeePC laptop with 2 GB of RAM as my home computer. It's quite usable with Linux.
Not that I'm planning to install Win11 on it, but the assumption that 2GB is enough only for embedded devices is incorrect.
I think that's the point - some people have the assumption that 2GB is meaningless whereas others see it as a HUUUGE amount of memory. Never mind historically; just consider what a modern phone can do with 2GB of RAM.
Calling WinSxS "bloat" when that "bloat" is allowing 30+ years of backwards compatibility (and a lot of stuff will break) is creative by the article's author for sure.
Taking up a lot of space on your drives for data to maintain backwards compatibility makes sense. Why, when not being actively used, does it need to occupy gigabytes of RAM?
There's no need, which is why it doesn't.
It really staggers me the lengths some people will go to try and preserve something that is actively against them, when there are alternatives right there.
I'm not saying Linux is for everyone, but the kind of people creating and running these scripts really should have no issue daily driving Ubuntu or even Arch. Or if they desperately need photoshop or whatever, get a mac.
It's like watching people constantly go back to an abusive relationship.
It really staggers me the lengths some people will go to try and preserve something that is actively against them, when there are alternatives right there.
The same can be said for those working on jailbreaks and the M1 Linux project, as well as all of the cracking/hacking scene. For some people, it's far more interesting and enjoyable to fight --- and possibly win --- than just "abandoning ship".
Well, maybe I'm a minority for having an EE/physical science hobby, but I also belong to the kind of people you are referring to.
I'm pretty stuck with Windows as I need it to drive my home lab. I need to run Windows to:
1. Get data from an old optical spectrometer. It was designed for optical endpointing of plasma etching, and one will have a hard time finding anything that is not running Windows in a fab (except lithography).
2. Run a 28-year-old piece of software to acquire timestamps from an HP 53310A modulation domain analyzer
3. Grab frames from an old xray detector
4. Work with two NI DAQ cards. Yes, they are supposed to work on Linux, but I always get weird errors on my Ubuntu work computer while they never failed me on my Windows laptop.
5. Use Autodesk Inventor to prepare files for the 3D printer/machine shop. Siemens NX used to work on Linux, but apart from that, there is not a single piece of non-toy 3D CAD software that I'm aware of that supports Mac or Linux.
6. LTSpice simulations and Altium Designer layout.
Windows is the only first-class citizen in many areas; software development and artistic work are two exceptions.
And so far, it seems I can still always be one step ahead of MS in the anti-consumer war, so I'm not too worried.
> Or if they desperately need photoshop or whatever, get a mac.
I'm kind of in that situation and I don't think going with a Mac and the Apple ecosystem really is better than trying to use Windows 10 as long as possible on an older Thinkpad.
Agreed. My initial response to any post beginning something like "On Windows XP..." "On Windows 7..." "On Debian..." would be like: "Well you already have Windows XP/7/Debian/whatever. If you want to use that, use that. Nobody is forcing you to use Windows 11."
For the people who do want to use Windows 11, and who see it for what it is, it's pretty great. For the people who use Windows XP/7 or who stick to some minimalistic un-featured XFCE-running underpowered Linux machine, you do your own thing. No need to force that on everyone else.
debloating does not mean "making it like XP/W7" - it means ripping out the horsecrap and unnecessary components that are a waste of space, and being able to control what goes on your system - sort of like what nix allows us to do; it also means having options to turn things on and off, etc.
for the non tech savvy - windows is still a great choice for those wanting to simply game and not learn something new like linux. these are the same folks that do notice a difference when the OS is bloated and shows ads, and who ask for help from those who know more; and a lot of us do not have the time nor energy to fully support a vast array of friends' systems. these debloated windows builds are great for those folks, and for me it means not having to /shrug and have people buy more hdd space for nothing.
was it not linus himself that mentioned that linux as a popular desktop os will not be a thing until manufacturers who provide prebuilt OSes (and support them) ship them with linux?
but in all honesty i feel that the X vs Wayland situation needs to be a bit more solidified, similarly with alsa/pulse/pipewire lol ; but those are different issues
This is not strictly true, there's tonnes of reasons to use Windows; for example I run thousands of gameservers.
If I can shave memory usage of the OS, that translates to a lot of cost savings.
Windows XP/Vista/7 and soon 10 being EOL does force me to upgrade.
I've had access to cheap Windows for years, which is why I kept it around as a secondary OS on my desktop to get around the hassle of getting games run on Linux. Games are mostly play/finish/forget for me anyway.
But since a few years back, most games I was interested in ran perfectly fine on Linux. I haven't rebooted into Windows for almost a year now. So I think, instead of upgrading to 11, I will eventually delete it, use the second SSD to hold my games on Linux, and won't look back.
I remember the days when I was building a bare-metal recovery for some of our Windows systems using WinPE, imagex and Python. There was this feeling of sane people pouring into M$ to modernize the OS a little bit, and cool stuff came out. But in the end, it's still the same inscrutable mess it always was. Nowadays with more and more ads and unnecessary fluff that gets in the way.
I'm not quite there but... my laptop is mostly just for gaming, with some email, chat and web browsing on the side. So I thought I'd allow Windows 10 to upgrade to 11 and see how it is. (It's not getting anywhere near my desktop!)
But... Windows 11 is just... annoying. The UI is worse than 10 in all the ways that matter to me. So I finally put Linux Mint on this laptop, and it's been pretty good. Not flawless, but really good. By default, I install and play games on Steam.
Notable exception is Anno 1800, which has a clunky multiplayer setup anyway, and just doesn't connect under Linux, but works (begrudgingly) under Windows.
Northgard has been awesome, but just tonight I had a bunch of server connection issues - can't 100% blame Linux, though 15 minutes into a multiplayer game, I was dropped while the two Windows players kept playing. But it's not conclusive!
At any rate, I think for many PC gamers, Linux gaming would work, though it's still not 100% "install, join, play" for every game.
Not 11 but the "Windows 10 IoT Enterprise LTSC 2021" is significantly better if you want an (official) full fledged Windows OS without the bloat. I'm using that on the Steam Deck w/ dual booting and it's perfect
LTSC is no longer debloated. The latest version comes with the same crapware, Windows app store, "recommended" Microsoft account, telemetry that can't be turned off etc.
It's just a stable version frozen in time but heralding it as the bloat-free alternative is no longer true.
I have access to it through work and I gave it a spin recently, but it's no longer what it used to be.
Which one? Or I guess it's not a public release?
I installed IoT 2021 (Windows 10 IoT Enterprise LTSC 2021, Version 21H2) about 3 weeks ago and there was nothing: no App Store, and less telemetry (there is some, but significantly less than on the "normal" Windows versions - though I reinstalled the App Store, so maybe that's why).
It looks like this as a fresh install: https://ia904606.us.archive.org/11/items/en-us_windows_10_io...
How do you activate that? I think KMS will not work for IoT versions.
I use KMS activated non-IoT LTSC 2021 on my obsolete Surface. MS will not sell that edition to "consumers" like me, so I don't feel guilty at all for pirating it.
Windows activation is a simple thing. You need a KMS emulator, and you need to point your Windows to that. If you don't want to set up your own emulator, you can just search the internet for emulators, and point your Windows to one of those. This is what most activators do anyways. But running your own is also easy, if compiling and running a software is easy for you. I personally use vlmcsd.
Just learned about this. I wish I could get some ameliorated.info scripts for that build. I might be willing to try Winders again in that case. There are some older win32 applications that I miss.
What about Windows 11 - does something like "Windows 11 IoT Enterprise LTSC" exist as well? Would that be equally good/debloated?
Thank you for the tip btw!
I just use Intel's Clear Linux so… meh.
The main difference is that IoT has longer lifecycle support, the activation method is different (but you don't even have to activate), and the IoT version is only available in English.
But it doesn't really matter; they are virtually the same.
Honestly, I just gave up after I bought my last computer: I installed POP OS, installed Steam, and have so far been able to play all my games without a single issue, except The Witcher 3, which only required a configuration change and I was golden. I will use Windows only on work machines, but on personal machines it's Linux for me from now on.
So far Anno 1800 has an issue where multiplayer games only connect if I play on Windows (but single player runs flawlessly in Linux Mint.) Every other game I've played has been great. StarCraft II (in Bottles), Conan, Valheim, Northgard (so far.)
PC gaming on Linux is not perfect, but it's really damn good.
(This is on a Ryzen 7 + GeForce RTX laptop)
I had a similar experience with GrapheneOS on my pixel 6 pro.
It got multi-day battery life out of the box, which is far in excess of what Google advertises for that hardware.
Once I installed google play services (which have zero end-user benefit, other than enabling compatibility with apps that have bundled google spyware), battery life more than halved, bringing it in line with what Google claims.
I suspect anti-trust and consumer protection lawsuits would start flying around if more people realized that over 50% of their phone battery was there to support malicious bundled code.
Maybe you are OK with paying for every app or making do with open-source ones that do not benefit from an ad revenue stream. Plus, you don't need maps, pay or cast support. But many other people like these features, and if they don't, isn't it great that there are working AOSP builds for the Pixel 6 Pro so that they can roll their own ROMs on top of that? No need to hack like for Windows 11.
Why would maps (get lat/long from GPS chip when navigating / searching), cast and pay need to burn battery when the phone is idle?
(Also, third-party implementations of maps, such as Organic Maps and HERE WeGo, can install and run fine without impacting battery life when they are not running.)
The answer is that the actually-useful features are bundled with mandatory malware that does need to run in the background in order to implement 24/7 surveillance. That bundling clearly violates US antitrust law.
Also, I suspect most people buying > $1000 phones would be willing to pay 10’s of dollars for lifetime licenses for maps, pay and cast (which is roughly what they would cost as standalone products), especially if they were privacy preserving, and doubled the phone’s battery life.
I really think Microsoft needs to take a hard look at Windows and realize that it needs the ability to switch, install, or even decide at boot as a purpose built OS.
Take gaming for example, I pretty much only use my PC for gaming (I prefer my Mac for general purpose stuff) and there is a lot there that is really unnecessary. But where this really becomes an issue is on devices like the Steam Deck.
I installed Windows 10 on mine, used a debloat script to remove anything that was not strictly necessary for gaming, downloading games, and related tasks and I was able to get better performance and battery life for the same games than I did under SteamOS.
While I imagine that this would complicate testing of updates to support these separate purposes, it feels like Windows is trying to do too much all at once.
However I also recognize that much of what I removed is also things like telemetry that I doubt they would remove.
They don't care. This won't bring them money. Showing you ads and tracking you will, so they'll continue doing it.
What debloat script did you use, and how did you decide which one to use? My experience is that there is a lot of them out there, and it's impossible to tell which ones actually do something that results in an observable difference, and what the potential drawbacks are of the things being changed.
A lot of the time I feel like you end up with having to do a lot of research for a very minor practical effect.
I used this one: https://github.com/Sycnex/Windows10Debloater and yeah, I had to heavily customize it, and then I did need to re-enable something afterwards which I found on the GitHub page.
Basically what I did was start with the default and then uncheck (or check? I don't remember what the UI called for now) anything related to Xbox and the Store, and I didn't have any issues.
I also did a comparison before and after and it was actually a pretty decent improvement. About a 10fps improvement over SteamOS and a Normal Windows 10.
For me the biggest incentive was being able to play xbox game pass games and not needing to worry about any compatibility issues with Proton which is why I went down that route.
But yeah your second part is very true. I feel, the impact is minimal if you are on a traditional PC. But on something with such limited resources like a Steam Deck, the difference can be going from 40 fps to mid 50's and a few more minutes of battery life.
But it isn't something I would recommend most people do. More just kinda pointing out that with the effort I think Microsoft could make a lean Windows really just by taking a look at what is actually necessary to be run for specific tasks.
>I really think Microsoft needs to take a hard look at Windows and realize that it needs the ability to switch, install, or even decide at boot as a purpose built OS.
Possible by making your own custom image using dism. Everything debloat scripts are doing can be done before even making an ISO.
>I really think Microsoft needs to take a hard look at Windows and realize
Why though? Microsoft is a business, Windows is a product. It works well as a product, sales is good, deployments are many. Why should they reconsider the current strategy?