I get the nostalgia, but boy are computers so much better than back then. MacOS 7-8 (not sure about 9, never used it) would crash constantly. There was no protected memory, so as soon as one app started misbehaving, you'd have to reboot the computer and it would take forever and you'd lose your work. My recollection was that this would typically happen 3-4 times a day. Apple's project to make a new MacOS that didn't do this failed and so they had to buy NeXT, and the rest is history.
I think it depended on your workload, and how often you rebooted them. But yes, macOS reliability has come a long way since then.
UX I'm less sure about - I now use a 4K monitor to do a bunch of things I used to do quite successfully on 1024x768.
Performance of a bunch of basic apps has also taken a few hits here and there (at least in part due to that big resolution bump, and despite massive hardware improvements).
I definitely have some nostalgia for that era, but it's possibly more nostalgia for the age before the internet consumed all of computing...
That’s true. The nostalgia nerds can rightly criticise the ever-spiralling bloat and inefficiency of modern systems. But modern macOS can go for months without a crash/reboot! Almost unimaginable in the Mac OS 9 / Win95 era.
This is what pushed me to get my parents on NT4 in 1997: the system would run until you wanted to reboot it, rather than crashing twice a day because Netscape decided to have a conniption like Win9x did. Never looked back. Win2000 was absolutely awesome for them, WinXP was twice the size and half the speed but fine and very long-lived, Vista was extremely reliable despite all the hate heaped on it...and we're approaching the modern era.
Meanwhile, I was running IRIX (for which my record was 4 years without a reboot, and that only due to a power outage) and Linux; and when I got some money, macOS X: all rock solid.
edit: I just remembered that Netcraft used to have a server uptime leaderboard, and for quite a while in the 1990s, Lamborghini was at the top of the list with their IRIX servers. The earliest archive I can find of the list is from 2001, with ONLY FreeBSD and IRIX occupying the top 29 places: https://web.archive.org/web/20010226190549/http://uptime.net...
I think it used to be that the back end (hardware, kernel, services, libraries) was in many ways worse, but the front end (UI/UX) was largely better. And today we have the reverse; amazingly fast and mostly reliable hardware, solid underlying components, and ugly, hostile interfaces on top. Hence the number of folks wishing for the Windows 2k GUI on a modern NT OS, a slightly more robust version of some OS X (I think I usually hear Snow Leopard?), or those of us running CDE on bleeding edge Linux.
People aren't nostalgic for the stability of the system, they're nostalgic for a time when user interfaces were more friendly to humans and less flat/abstract
Most popular PC operating systems (CP/M, DOS, Mac, ...) from the 1980s to the early 1990s didn't have memory protection, yet people used them and still use them.
George RR Martin famously wrote on WordStar for DOS (a rewrite of the CP/M version) until at least 2014.
Turbo Pascal didn't have memory protection, but people still loved it and compare it favorably to modern IDEs. Both Turbo Pascal (DOS and CP/M) and Think C (Mac) have featured on HN repeatedly iirc.
As I understand it, Apple's Lisa had memory protection, but the Mac gave it up to reduce hardware and memory requirements. Apple's Pascal compilers (like many) supported range checking, but developers turned it off, giving up reliability for performance and code size. Then they switched to C, which laughs at memory safety* and introduced null pointers that conveniently pointed to Mac OS data in low memory/writable RAM (perhaps a hangover from Apple II/6502 programming.)
It's a lot like CPU designers going for performance at any cost and introducing isolation and security flaws along the way.
* Prof. Kernighan might point out that there are ANSI C compilers that are memory safe and that nobody uses, and that clang/LLVM might even implement a safe memory model someday
I still remember in the late oughts, when Mac software updates came out (say, 10.x.1 or whatever, Safari updates, etc.) an app called Software Update would tell you about them, allow you to click to download and install what you wanted, the installation all happened in the background while you worked, and then at the end the SWU dock icon would bounce to prompt you to restart when ready. Guess what? quick option-right-click → Force Quit on the app would shut that up, then you could go about your day and reboot when you were ready.
Today, the Mac harangues you every time you wake it about the updates, then to apply them, you must reboot immediately, and stare at various Apple logos and progress bars for an indeterminate amount of time (no estimates offered).
One of the reasons why I love the Angry Video Game Nerd is that when he came out, the state of third-generation retrogaming nostalgia involved a lot of wearing of rose-colored glasses. Remember Castlevania? Remember Mega Man? Double Dragon? Remember having to blow on the cartridges before they'd work (which never did really work)? Etc. The AVGN stood as a counterpoint to all that, reminding us that many of the games from that era -- even games we loved or wanted -- kind of sucked, and that we can appreciate modern games for the extra bit of care that went into them to make them not suck so bad. (This was 2004, though, and what goes around comes around...)
The same is kind of true of retrocomputing. We look back on these old platforms as if, by virtue of coming from a "simpler time", they never had any latency and never experienced bugs. And sure, if you fire one up for five minutes on some online WebAssembly emulator you find on Hacker News, it seems much snappier and more pleasant to use than Windows 11. But back then, we were running them on CPUs literally hundreds of times slower than even a potato-class computer from today. And due to memory protection being frickin' absent from almost all consumer-grade operating systems, they crashed. A lot. Even the Amiga was crash- and Guru-Meditation-prone depending on what you were running on it. It was such an enormous relief for me, trying out Linux or even Windows NT for the first time, to watch the OS simply yeet out a misfiring app; disconnect any access it had to the network, file system, or window system; and proceed merrily on its way as if nothing happened. In Windows 3.1, for instance, failing to follow the exact sequence of steps required to set up the message pump properly, so that your program could respond to messages as expected, could cause strange glitches within Windows itself that required restarting Windows, or even hard-lock the system and force a cold boot.
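For anyone who never wrote for it, here's roughly what that message pump looked like; this is a from-memory sketch of a Win16-style entry point (window-class registration and CreateWindow elided, names illustrative), not a faithful copy of any real program:

    /* Sketch of a Windows 3.x-era message pump. Under Win16's cooperative
       multitasking, an app that stopped pumping messages also starved every
       other app, so getting this loop wrong could wedge the whole GUI. */
    #include <windows.h>

    int PASCAL WinMain(HINSTANCE hInst, HINSTANCE hPrev,
                       LPSTR lpCmdLine, int nCmdShow)
    {
        MSG msg;

        /* ... RegisterClass() and CreateWindow() would go here ... */

        while (GetMessage(&msg, NULL, 0, 0)) {   /* pull the next message off the queue */
            TranslateMessage(&msg);              /* turn raw key messages into WM_CHAR  */
            DispatchMessage(&msg);               /* hand it to the window procedure     */
        }
        return msg.wParam;
    }

Stop calling GetMessage (say, by sitting in a long computation) and nothing else on the desktop gets serviced either, which is exactly the class of failure that NT's preemptive scheduling and protected memory finally put a lid on.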
Our computers are so powerful these days, and our software so sophisticated, that they've eliminated entire classes of problems from the old days, only to open the door to entirely new classes of problems (like adware that would have brought a Pentium II to its knees, and sparked a user complaint campaign that would have resulted in major egg on the vendor's face if not bankruptcy from the ensuing lawsuits, being routine, and even required, on commercial operating systems of today).
Offtopic, but here's how to feel old: More time has passed between the debut of the AVGN and the present, than has passed between the debut of the NES in the West and the debut of the AVGN.
While what you said is true (Mac OS really was unstable in those tumultuous years), this link specifically celebrates the user interface of the later versions of non-OSX MacOS. This doesn't really have a relation to the stability of the OS.
> this link specifically celebrates the user interface of the later versions of non-OSX MacOS. This doesn't really have a relation to the stability of the OS.
"Very pretty but can't do much" was a general take on the Mac OS cube of the day.
Between the lack of a fan or any decent cooling and the "lack of a floppy disk" (for those of us who didn't use Zip drives), it was pretty to look at but hard to work with.
We had one to run FrameMaker on, but beyond typesetting (& fonts), it was a shiny thing that was treated like a Sunday sports car.
Where I was, the TeX user group is what eventually materialized into a Linux user group, and there was simultaneously love for the Mac's screen, rendering, and fonts, and near hatred at having to use it to professionally typeset things.
Math publications quickly jumped ship out of Adobe due to OS 9, but very few came back to the OS X versions until years later when Apple started making really good laptops with fast hardware.
Does anyone else remember the early third party tool that would help you find extension conflicts by enabling half of them, asking you if the problem was still there, and then subdividing again and again? (Can't remember what it was called, but boy! was it helpful.)
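(That was Conflict Catcher; there's a Macintosh Garden link to it further down the thread.) The trick itself is just binary search over the set of enabled extensions. A rough sketch of the idea in C, with made-up names, assuming a single culprit rather than a pairwise conflict:

    /* Rough sketch (not the actual tool) of the halving strategy described
       above: enable half of the remaining suspects, reboot, ask whether the
       problem is still there, and keep narrowing. Finds the culprit in about
       log2(n) reboots instead of n. */
    #include <stdio.h>

    static int problem_still_occurs(void)
    {
        int ans = 0;
        printf("Reboot with only the listed extensions enabled. Still broken? (1 = yes, 0 = no): ");
        if (scanf("%d", &ans) != 1)
            return 0;
        return ans;
    }

    /* Returns the index of the offending extension within ext[lo..hi). */
    static int find_bad_extension(const char *ext[], int lo, int hi)
    {
        while (hi - lo > 1) {
            int mid = lo + (hi - lo) / 2;

            printf("Enable these extensions and disable the rest:\n");
            for (int i = lo; i < mid; i++)
                printf("  %s\n", ext[i]);

            if (problem_still_occurs())
                hi = mid;   /* culprit is among the half we just enabled */
            else
                lo = mid;   /* culprit is among the half left disabled   */
        }
        return lo;          /* only one suspect remains */
    }

IIRC the real tool was smarter about things like load order and extensions that had to stay enabled together, but the core conflict test was this kind of bisection.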
I still recall the workaround to avoid restarting with some of these, once to the bewildered confusion of the ageing IT teacher looking over my shoulder in the library:
At the ">" ROM debugger prompt, type the following lines,
pressing Return after each:
It's unfortunate Apple never shipped any of the modern OSes it worked on from 1988 to 1996. All of them would have had a classic Mac UI style on top of a microkernel. Resurrecting Pink would be an interesting retro move.
It's possible to bring back the usability of earlier operating systems along with the stability of later, though still not current, ones (OS X has crashed hard enough to force a reboot three times on me this year; that hadn't happened in a long time).
10.6.8 was solid (people often forget that 10.6 and the intermediate patches were a bit of a mess) but when I use it I definitely feel a few spots where some polish and QoL improvements would be welcome. Its theme’s dominant gray is also a bit too dark.
I don’t mind SIP personally. It always unnerved me that any random executable was just a single admin password entry away from doing whatever it pleased.
My ideal desktop OS environment would be something like a polished-up Snow Leopard using the brighter, more refined theme from 10.9 Mavericks (except with the flat scrollbars de-flattened). If I had that I don’t think I could use anything else.
SIP is quite terrible. An access level higher than root just to touch system files, and only processes that are signed by Apple and have special entitlements can have it?
It's my computer, I should be at the top of the permissions totem pole, not Apple. If they're dead set on using signing to enforce the higher-than-root level, then I should be the one signing the executables. Fortunately, you can disable this crap.
"General purpose computing" is a niche market. Most buyers would be more than content with a machine that lets them run the specific software they need for their specific use cases, and otherwise protects them from executing malicious code. Joanna Rutkowska's "evil maid attack" is a real security threat; and one of the edges that Apple has in the marketplace is being one of the first vendors to design practical, evil-maid-resistant personal computers.
Genuinely curious, what are you doing that requires you to turn SIP off? I'd have thought only OS developers and maybe forensics people of various stripes need to do that.
Its performance over Wi-Fi is also shockingly close to that of third-party KB+mouse sharing software running hardwired, which is crazy. Trying to use Synergy, etc. over Wi-Fi is normally a laggy mess.
It's certainly not true that it focused on stability even if Bertrand said it. Any kind of change reduces stability by introducing regressions. This includes performance improvements and it even includes other stability fixes.
Not from me. A black box filesystem that can't even accurately say how much space is free and destroys mechanical hard drives with longer term use. There were other options to deal with HFS+'s 2038 problem.
I have an ancient laptop with Snow Leopard running. It keeps my Nikon Film Scanner useful since Nikon abandoned the software for these scanners rather abruptly.
I've been a pro photographer since 1990 and the Kodachrome setting embedded within the Nikon scanner is fantastic for capturing the color and detail of this exceptionally sharp film.
There’s something about the Platinum UI that makes it timeless. I have been running XFCE with a Platinum theme on several of my machines/VMs for a while (https://taoofmac.com/space/blog/2022/04/12/2330) and I’m a bit sad that the XFCE top-level app menu plugin seems to have been deprecated in recent releases, since it really adds to the experience.
The XFCE global menubar plugin being deprecated likely happened because there’s no standardized way for applications on Linux to surface their menus to the environment for display (or to power a Unity-style HUD or whatever), which means that hackery via plugins for GTK and Qt is required to make a global menubar work. Keeping that hackery functional is probably a constant battle.
It also doesn’t help that GTK apps have been on a crusade against menubars since GTK3. Same goes for Linux builds of Electron apps, which don’t offer the user the option to display a menubar even though all the menu work is already done for macOS builds.
Whatever the case it bums me out too. I know a lot of people hate global menubars but I’ve always loved them and it kills me that they’re not a practical option on the Linux desktop, despite ultimate customizability being one of the most frequently-cited reasons to use Linux as a desktop OS. You can do anything you want, as long as what you want falls within the boxes of Win9X-like, tablet-like, or "trendy minimal tiling window manager"…
Global menu bars are a great way of making complex functionality discoverable without overwhelming the user. They should be thought of like an index at the back of a book — not for everyday use, but a universal standard way to call upon infrequent tools, and to discover commands which might be useful.
It's disappointing the degree to which the bulk of the Linux community seems to chase whatever paradigms they learned on whatever Windows machines they were first exposed to (or to pay no consideration to UX at all).
It seems that the Stallman-esque "FOSS or die" attracts a lot of people who think about engineering problems more than HCI ones, and there is a vocal subset of "FOSS or die" in every Linux community I've encountered.
> I know a lot of people hate global menubars but I’ve always loved them and it kills me that they’re not a practical option on the Linux desktop, despite ultimate customizability being one of the most frequently-cited reasons to use Linux as a desktop OS.
Isn't that the point? It's so customizable that app writers opt out of menubars. And the system menubar has no power over the apps.
The big loss for me was when many apps adopted the Windows 3.0 MDI interface style (specifically thinking of code editors here). I was doing some hacking in CodeWarrior on Mac OS 9 a few weeks ago and it was such a joy to have 5 source code windows open at once - and a separate window with build or find results that weren’t wrapped in a miserable column. There’s actually a lot about Xcode I do like - but treating content panels like the old Puzzle desk accessory was never an improvement.
Always been rather partial to the Windows 9x look myself, but Mac OS 9 is a solid runner-up to me in terms of looks. Enjoyed seeing some of the reimaginings of modern apps, quite gorgeous.
Such great work. I'm amazed by the video. I wondered for a moment if they managed to tweak these apps (I'm a bit naive I guess).
Never had a Mac but I remember as a kid when I went to a friend who had one. It was so fascinating to me.
So weird to be nostalgic of something I never owned or used.
Tbh, it's all downhill from there; every new release gets a little worse, not to mention all the hardware issues too.
It was the 90s equivalent of a kernel panic - you'd get a dialog that basically went "the computer is about to explode. click here to restart"
https://macintoshgarden.org/apps/conflict-catcher-9
At the ">" ROM debugger prompt, type the following lines, pressing Return after each:
SM 0 A9F4 G 0
:)
Then we discovered how much more money could be made by focusing on ad tech.
The relationship between people and technology was so different in 2009. I miss it.
But those of us who need that access can disable it, with a bit of a dance that makes it hard for people who _don't_ need it to do so.
"General purpose computing" is a niche market. Most buyers would be more than content with a machine that lets them run the specific software they need for their specific use cases, and otherwise protects them from executing malicious code. Joanna Rutkowska's "evil maid attack" is a real security threat; and one of the edges that Apple has in the marketplace is being one of the first vendors to design practical, evil-maid-resistant personal computers.
Here are the instructions to enable logging in as root, since privilege escalation seems inconvenient for you:
https://support.apple.com/en-us/102367
Also, the classic Mac OS "Zoom" button makes so much more sense than the OS X/macOS green button - and I still miss window shading.
What I miss is being able to set the priority of Wi-Fi networks.
Now it's whatever slop in every direction.
For what it's worth, I feel the same way about the Nintendo 64, and the Mario franchise, etc.
Makes me think that nostalgia requires a certain quality, not just familiarity.