Owen used to organise the Manchester Linux User Group at the MCC as well, I fondly remember those early days when I was learning Linux. Looking back it was an amazing privilege to connect with some extremely knowledgeable people in the Linux ecosystem.
What a glorious piece of history. I wonder what other "scratching my itch" solutions became so mainstream that people forgot about the original authors.
I think all of today's popular Linux distros (Debian, Gentoo, Fedora, Arch, SUSE and so on) are very much "scratching my itch" projects that somehow managed to outlive their original authors' engagement.
It's not like any of them were planning to be used by millions of people.
Yes and no. I realise that to younger members of the Linux community they're all from long ago, but they're not the same age.
There aren't really clear generations in Linux distros, but as an approximation:
Debian is pretty old, but it's a 2nd gen distro, born of dissatisfaction with the very early SLS.
So was Slackware, but it took SLS and improved it. Slackware is arguably the oldest surviving distro.
SuSE has roots as a German version of Slackware; Red Hat's package manager, RPM, was bolted on later.
Gentoo and Arch are relatively modern, being 21st century projects. Arguably, they're 3rd gen.
Fedora is a 4th gen distro, younger than any of the others here. Its ancestor was Red Hat Linux, which was contemporaneous with Debian -- but was left behind by Debian's technical enhancements: in the late 1990s, Debian introduced `apt`, a package manager with automatic recursive dependency resolution. This put it far ahead of Red Hat, which still had only RPM, with no automatic dependency resolution.
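Concretely, "automatic recursive dependency resolution" just means walking the dependency graph depth-first so every prerequisite is installed before the package that needs it. A minimal sketch (the package names and dependency map here are made up for illustration, not apt's actual data structures):

```python
def resolve(pkg, deps, installed=None):
    """Return an install order that puts every dependency before pkg."""
    if installed is None:
        installed = []
    for dep in deps.get(pkg, []):
        if dep not in installed:
            resolve(dep, deps, installed)  # recurse: dependencies of dependencies
    if pkg not in installed:
        installed.append(pkg)
    return installed

# Hypothetical dependency map: mutt needs ncurses and libc6, ncurses needs libc6.
deps = {
    "mutt": ["libc6", "ncurses"],
    "ncurses": ["libc6"],
    "libc6": [],
}
print(resolve("mutt", deps))  # dependencies first: ['libc6', 'ncurses', 'mutt']
```

With plain RPM of that era, the user had to discover and install `libc6` and `ncurses` by hand before `mutt` would install; apt did this walk for you and fetched the results.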
Red Hat went in another direction. Red Hat Linux 7 became RHEL, a commercial, paid-for, supported distro.
The free RHL went on for two more versions, reaching Red Hat Linux 9, which then became Fedora Core 1, the first version of the free, unsupported community distro.
Has the distribution model been good for Linux? It led to different approaches to things like desktop environments, packaging, and a variety of platforms, but 30+ years later, there are several sane choices for server distros, desktop distros are even more fragmented, and the most popular user distros are Android and ChromeOS.
I think so, because users seem to like having different options. For commercial software it makes sense to count how many devices run a particular distribution as the measure of “success”, but for projects like most Linux distributions, I don’t know that user count makes sense. Why should we care how many users a particular distribution has, when almost none of them are paying or contributing? Having more users doesn’t inherently make the software any better, and nobody is making money from those users. Instead, I would argue that user enthusiasm and developer interest are better measures of success for open-source projects like this, and Arch, Debian, Linux Mint, etc. are all doing fine in those regards.
The packaging model of distribution is ubiquitous. Every distribution does the same thing, just with different control files and tools. The differentiation between distributions is all in their packaging policies and platform decisions. In a loose way, Unix (SysV, HP-UX, AIX) started packaging before Linux, but Linux said "every project is its own package" and ran with it. The de-facto pattern of "download release; apply patches; configure; make; make install; collect files" is present in every distribution's packages. Everything up through deployment is the same pattern across all distributions.
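The tail end of that shared pattern — `make install` into a staging root, then "collect files" into an archive — can be sketched as follows. This is an illustrative simulation, not any distro's real tooling: the download/patch/configure/make steps are assumed to have populated the staging tree already, and all names and paths are invented.

```python
import pathlib
import tarfile
import tempfile

def collect(staged: pathlib.Path, pkgfile: pathlib.Path) -> list[str]:
    """Archive a staged install tree into a package tarball; return its members."""
    with tarfile.open(pkgfile, "w:gz") as tar:
        tar.add(staged, arcname=".")  # the staging root becomes the package root
    with tarfile.open(pkgfile) as tar:
        return sorted(tar.getnames())

# Simulate what `make DESTDIR=$staged install` would have left behind:
root = pathlib.Path(tempfile.mkdtemp())
staged = root / "pkgroot"
(staged / "usr" / "bin").mkdir(parents=True)
(staged / "usr" / "bin" / "hello").write_text("#!/bin/sh\necho hello\n")

# "Collect files" step: the archive members are exactly the staged paths.
print(collect(staged, root / "hello-1.0.pkg.tar.gz"))
```

Whether the result is then wrapped in RPM headers, a Debian control archive, or a plain Slackware-style tarball is precisely the "different control files and tools" part.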
It is tempting to imagine that replacing a lively, evolving market of good options with the single best option would make that option even better, with all the labour focused on a singular end. In fact, the likely result is that most of that labour — which exists only because people are interested in doing their own thing — simply ceases to exist and is lost.
The surviving option would never have had the benefit of ideas borrowed from all those other, now nonexistent projects, and would on the whole be mediocre.
The amount of software available for Linux vs the BSDs tells me that the distro model has not hurt Linux. If a homogeneous software stack from a single centralised source were beneficial, it would be more likely that porting to Linux from a BSD codebase would be the norm, rather than the other way around.
This really brings back memories of how painful installing any software in the early 90s was. The small company I worked for got us a Yggdrasil CD to try but we were unable to get it installed on any of the PCs we had at the time. MCC might have done better, but we hadn't heard of it.
I very clearly remember my very first version of Slackware -- pre-3.0.0 (which I actually bought on CD for a few bucks). I don't remember that first version's number, just that I downloaded the floppy disk sets over ZMODEM at 14.4 kbps (thankfully saving to hard disk, not to floppy).
That first version of Slackware I used had the Linux kernel 1.2.8; IIRC that series went to 1.2.13 before going through the a.out->ELF transition.
Anyway, to my original point: that Slackware distro with kernel 1.2.8 had a bug where, every single time I reinstalled the bootloader for a newly compiled Linux kernel (which I had to do regularly), LILO broke and hung at the `LI` prompt. Those who were there may remember that the number of letters of `LILO:` printed indicated how far boot had got before failing; stopping at `LI` meant the second-stage loader had been loaded but could not be started, typically because of a disk-geometry mismatch.
But every single time, I had to rescue boot, and try to remember what I had to fix to make LILO work again.
Could very well have been Slackware. Slackware was my first Linux distribution, it came as a set of like at least 20 floppies. All of mine were repurposed AOL disks. After spending about a solid week or so downloading the whole set of disk images over a slow and intermittent dialup connection, the next most painful thing was the fact that floppies were notoriously unreliable. Some disks would throw I/O errors when writing. Some would get caught immediately after when verifying. Many others showed no problems until install time. Getting two dozen floppies to actually read 100% of their contents successfully took a week or two on its own because I only had one computer to work with.
Was Yggdrasil that bad? My first distro was Slackware and, with the help of the book accompanying the CD, it was doable. Sure, you had to define modelines for X11 (the Xorg name didn't exist back then) to support your monitor, and supporting GPUs was quite the endeavour, but in the end we'd make it work. We'd even compile and run Emacs (in 45 minutes or so).
The MCC Interim Linux Wikipedia page notes it started out with Linux kernel 0.12: https://en.wikipedia.org/wiki/MCC_Interim_Linux
https://www.kernel.org/pub/linux/kernel/Historic/old-version...
It makes me want to play, configure, compile, tidy and optimize! https://github.com/ESP32DE/Boot-Linux-ESP32S3-Playground
I had no idea he had such a claim to fame....though I suspect he didn't either!
I somehow got it to boot up but didn't really know what to do with it after that.
And the first Linux distribution with a GUI was "TAMU Linux", three months later: https://lwn.net/Articles/91371/
Both were released by universities.