burlesona · 5 years ago
I got really into computers in the mid-late 90s by which point we were mostly down to Wintel and Apple hanging on by a thread.

What I remember from that era is that nothing was compatible with anything else. It took a lot of work to interoperate between two PCs, let alone cross the gap between OSes. So for a long time, I have kind of taken the current world of few OSes that are highly interoperable as being a great thing: you can build your own Linux machine and still do real work with people on Windows and Mac, etc.

But the more I learn about computing in the 80s and early 90s, the more I’m impressed by the variety and diversity of ideas that came out of that era. I see now that today’s standardization has a real cost, which is that we don’t really see new ideas or new paradigms.

For the HN crowd, especially those who are older than me and can remember the earlier era of computing, what do you think about that trade off and where we’ve ended up today?

Are we better off living in a world where all computers can work together with minimal fuss? Or would it be better if we still had a wide range of vendors taking significantly different approaches and innovating at a much faster pace - albeit in incompatible ways?

pjmlp · 5 years ago
Personally I prefer the plethora-of-OSes approach, and I have never been a big UNIX fan to start with.

Yes it does have a couple of nice ideas, and it was much better to use than Windows 3.x + MS-DOS, but that is about it.

All the UNIX-based OSes that I really appreciate have moved beyond it, namely NeXTSTEP, Solaris/NeWS, Irix, and Plan 9/Inferno.

Thankfully, only BSD and GNU/Linux are stuck being a continuum of UNIX clones without much to add; when you look at their conferences, it always boils down to kernel features or filesystems.

GNU/Linux has the ingredients to make an Amiga-like desktop experience, with D-Bus and GNOME/KDE, but the fragmentation and the love for POSIX CLI applications just don't make it worthwhile to try to make it work.

Look at iOS, macOS, Android, Windows (UWP), GenodeOS, Fuchsia, Haiku, Azure Sphere for the pursuit of modern ideas in OS research. The fact that some of those have a POSIX-like kernel (or a deeply customised Linux kernel) is just an implementation detail of the overall architecture.

nix23 · 5 years ago
>NeXTSTEP, Solaris/NeWS, Irix, Plan 9/Inferno.

With the exception of Plan 9, all the OSes you named are pure Unixes. As an example, FreeBSD is nearly a mirror of Solaris. NeXTSTEP was pure BSD with a Mach kernel (OO development was the great seller on NeXTSTEP)... Irix? Well, OpenGL and XFS are available under Linux.

>iOS, macOS, Android, Windows (UWP), GenodeOS, Fuchsia, Haiku, Azure Sphere for the pursuit of modern ideas in OS research

Haiku modern?? BeOS is 25 years old.

otabdeveloper4 · 5 years ago
Looks like you missed the whole systemd and cgroups thing. The two are huge and novel developments in OS design, and much more significant than the GUI flimflam you pine for.

(Yeah, Linux is a server OS, not news.)

Ekaros · 5 years ago
What I'm personally a bit horrified by is the notion that "Unix" is the end-all of OS design and so on, with some fundamental designs going back to the 70s... like everything being text. That seems quite a mess in the current world of multimedia and increased networking. Yes, we have spent decades hacking to get it to work... but maybe more options and different designs would enrich us a lot more.

And really we do have layer that can connect all of the different systems together. Namely IP. So even if we had fundamentally different systems, there is no reason why intercommunication wouldn't be possible at this point.

TheOtherHobbes · 5 years ago
Yes, it's not either/or. The web is a kind of meta-OS with a very minimal and simple API, and as far as the web is concerned everything in userland operates as a thin-client. The OS is only of interest locally.

By the mid-90s I was furious with both Microsoft and Apple for setting computing back by at least a decade.

You could - with extreme effort - make an Atari ST multitask. And of course with the Amiga it was built in.

So why did we throw that away and go backwards to DOS and then single-process Windows - which eventually reinvented multitasking nearly a decade later and sold it as if it was the most astounding development in the history of computing?

Of course there were technical challenges - protected memory, protected processes, and so on. But the earliest versions of Windows didn't have those either.

So it was a disappointing and frustrating time - an alienating difference of philosophy between consumer computing designed for creativity and exploration, and box-shifting commodity computing designed for form-filling and bureaucracy, which might allow you to have some fun after hours if you behaved yourself and the machine didn't crash.

Considering how smart the Amiga team were, it would have been very interesting to see what they could have done with ubiquitous fast networking, high-res graphics and video, and pocketability.

I suspect the result would have been far more open and inspiring than the corporate sand trap we have today.

patterns · 5 years ago
This reminds me of Rob Pike's "Systems Software Research is Irrelevant" speech [1]. Now, 20 years after his speech, we are still stuck with the same notions (such as everything being a string). It's not that there aren't plenty of alternatives around; however, expectations are so high that it's almost impossible to make a new computer system economically viable. On the other hand, the hacker and maker scene is very active, some of them building operating systems and hardware such as tiny Lisp-based machines [2] and OSes [3]. (My only gripe is that most of the new "avant-garde" systems are still text/file-based.)

I'd love to see a next wave in personal computing, starting with a clean slate, building on the research, insights and learning from the mistakes that have been made. I have no doubt that it will happen, the question is only when.

As for interoperability: Even on the same platform there are countless problems getting software to talk to each other, so I don't think that a new system will make the situation any worse.

[1] http://www.herpolhode.com/rob/utah2000.pdf

[2] https://www.tindie.com/products/lutherjohnson/makerlisp-ez80...

[3] https://mezzanos.herokuapp.com/

mixmastamyk · 5 years ago
Good enough, cheap, source code available wins in the long term. Also worse is better. But even worse was a lot better than DOS.
swiley · 5 years ago
IMO Unix hasn't taken over as much as people think it has. If you look at an OS more closely they typically have a POSIX API layer on top of whatever unique ideas they have. Even Linux does quite a lot of its own thing.
simonh · 5 years ago
I think the issue is that, from a user perspective, the implementation details of the kernel and base OS don't matter. As long as it's sanely implemented and doesn't impose onerous limitations, it should just get out of the way. The vendor can implement it however they like, with whatever technologies they like, and as long as it supports featureful applications, users shouldn't need to care.

The reason Unix was so successful was precisely because it was simple, portable, and designed to get out of the way. It was as unremarkable as possible, by design. All the interesting stuff is left to application developers. You want to implement a relational file system? Go ahead. You want to develop your own GUI layer? Fine. Compared to the other major OSes of the day, like VMS, Pick, PrimeOS, etc., it's as unopinionated as possible. Likewise with C, which is intended to be as low-level and paradigmless as they could make it: just an abstraction over assembler, just as Unix is a thin abstraction over the hardware.

mattmcknight · 5 years ago
Whatever got us the Internet was worth it. To generalize, it's fine to have a variety of components, as long as the interface between them is roughly standardized, and the innovation can come in the components. That we can have a Playstation and an iPhone use the same network is a good thing, and they can do whatever they want with the rest of the stack.

A computer pre-internet felt a lot more like an island. I had an Amiga, but I wasn't aware of 90% of the stuff that was out there, and could barely afford to buy a compiler.

ddingus · 5 years ago
I was in a similar scenario during those times. I did end up with internet gateway access in the very late 80s, which helped a lot.

Speaking of islands though, check out this FujiNet thing!

https://fujinet.online/

Turns out, Atari 8-bit computers have device independent I/O.

Suddenly, all those little islands are connected, and people are writing games in BASIC and playing together online.

noizejoy · 5 years ago
Not always islands, if you had a modem and a phone line. Pre-Internet there were increasingly numerous BBS systems, from big ones like AOL and CompuServe to medium-sized ones like GEnie and many tiny independents. Instead of remembering URLs you had lists of phone numbers (the smaller ones requiring long-distance calls when those were still expensive) for your little modems to dial. My Atari ST (a contemporary of the Amiga) was online a lot already during those days. I actually connected to the Internet via CompuServe for quite a while before the local phone company began to offer dial-up Internet service. And pirated as well as free software was widely available already back then.
an_opabinia · 5 years ago
Prior to the iPhone we were “standardized” on resistive touch screens, which sucked compared to the iPhone’s capacitive screen. Not because a stylus was a good way to use a portable device, and not because capacitive touch screen technology was unknown - it was invented long ago. It’s because of cost. You’re just conflating standardization with cost cutting.

If there was a cheap and easy to install OS in that era they’d all use it. Oh wait that was DOS & Windows, that was the whole point, that’s what happened, it was adopted because 999/1,000 vendors are interested in cost cutting not innovation.

It’s hard to celebrate cost cutters. History never celebrates the crummy cost cutters. I feel no nostalgia for that.

madhadron · 5 years ago
> Not because a stylus was a good way to use a portable device

I personally love using a stylus, and have ever since my old Palm Pilot days.

alexisread · 5 years ago
I think there have been many missed opportunities in computing, however I'd suggest that the Internet and the GPL (open-source licensing in general) would have come regardless of the diversity of operating systems. These two innovations have really bound the software world together - compatibility these days is really at the TCP level (e.g. Docker and APIs) and allows for a massive diversity of architectures (Kafka, Istio, materialize.io, IFTTT, Macrometa, etc.) - the bar has moved, in a way.

Having said that, some of the missed opportunities of note are:

MiNT: GEM and TOS weren't really developed much by Atari, but they did buy in MiNT, which had preemptive multitasking and memory protection. With the AES being retargetable (graphics cards), GDOS vector fonts and PostScript printing, TCP and LAN networking stacks, a shell, a global shortcut system, POSIX compatibility and multi-user capabilities, it managed to evolve TOS into effectively a unix with a usable desktop (i.e. a standard OS X or Linux-on-the-desktop) well before now.

Secondly, choosing A2 instead of Android would have been huge: a compiled, multithreaded, multitasking, self-hosted OS with GC, a zooming UI, 3x faster than Linux, and small enough for one person to understand (250k kernel).

https://www.research-collection.ethz.ch/bitstream/handle/20....

One other benefit here would have been no Google-Oracle lawsuit to mess with API copyright :)

kmeisthax · 5 years ago
Oracle v. Google would have happened eventually. The underlying problem is the law, not the litigants. In any other field that deals with copyright, the idea that copyright protection extends to what is effectively a series of headings and summaries would be obvious. The problem is that copyright is a poor fit for software - our headings and summaries carry function to them, and changing them breaks code. In general, the law doesn't want to grant copyright over purely functional things, or even things where functionality and creativity are frustratingly mixed. That's why boat hulls and hardware have separate ownership regimes - copyright over them was explicitly rejected. But we also explicitly said software is copyrightable, which leaves open the question of where the unprotected function ends and the protected creativity starts.

Maybe in the world where Android never happened, some other company stumbled upon this rich vein of confusion, and we're dealing with SCO/Novell/Attachmate/Whoever v. Apple over who can implement UNIX APIs in their kernels.

jl6 · 5 years ago
This is the way of the world in microcosm. When globalization has smeared all cultures together, and the only languages left are English, Spanish, and Mandarin, the world will be far more interoperable - but at the cost of every other independent niche society. I’m not actually saying it’s not worth it.
tasogare · 5 years ago
The idea that English (and in this case Spanish and Mandarin as well) will kill all other languages is dubious and forgets a very important fact: through most of history, and in a lot of places, bilingualism is the norm. Current examples are some African countries where local languages are spoken and French is used as a vehicular language, or China, where Standard Chinese is used while people also speak another Sinitic language. What is likely to happen is more people being proficient in a second or even third language (as in Hong Kong).
pvg · 5 years ago
The incompatibilities just moved to different places, wherever the focus of development and competition is. Taken collectively, the various core services of mobile devices, desktop OSes and browsers interoperate very poorly across vendors. Think about stuff like identity, auth, sync; even very basic cases - e.g. a user with a couple of Apple devices, one of which is a macOS laptop where they run Chrome - are an incomprehensible mess of incompatibility to most non-nerd users.
bhauer · 5 years ago
A big "Yes" to proportional scroll bars. I didn't have an Amiga, but rather an ST. The ST also had proportional scroll bars and for years, I could not understand why the major platforms (Windows, MacOS) did not. It was a pet peeve that really bothered me when I would sit down to use someone's Mac or PC of the time.

I should have realized it could be worse. With the modern era's "mobile first" UX regime, we now have many desktop experiences that hide the scrollbars entirely—revealing them only on interaction—wholly removing any at-a-glance utility we enjoyed from proportional bars.

throwaway_pdp09 · 5 years ago
> revealing them only on interaction

but you can't interact with them except by bringing the mouse over the right side, and you can't see them so you literally have lost a visual cue of the very thing you want to interact with. It's just amazing.

ithkuil · 5 years ago
Yes, but the fact that so few people actually complain probably tells you a lot about the input devices that people use.

Touchpads with two-finger scroll and mice with scroll wheels have turned scrolling into a gesture that doesn't necessarily require a UI element, in pretty much the same way that a physical keyboard is just there for you to type on and doesn't pop into existence when you have a focused input field.

mixmastamyk · 5 years ago
It’s even weirder when you realize how much more resolution we have now.

A favorite implementation of mine was 4dwm on SGI. You could middle-click on the trough and it would go directly there, rather than the normal paging by left-click. Still try to use it everywhere and am disappointed.

twic · 5 years ago
In Cinnamon on Linux, the scrollbars in Firefox, Terminal, and Xreader are all proportional, and jump directly on left-click. Shift-click moves one page.
badsectoracula · 5 years ago
Middle-click to jump to the clicked point is very common in X11 toolkits, to the point where it is considered a "de facto" standard (some applications that implement their own scrollbars, like Firefox, will even use that functionality just on X11, even though they are technically able to do it anywhere).

On Windows you get the same functionality with Shift+Left Click. You also get to see which applications use Qt since it isn't implemented there :-P.

codesnik · 5 years ago
on mac: Preferences > General > "Click in the scroll bar to:" > "Jump to the spot that's clicked". Though with mac touchpads, I forgot when was the last time I ever clicked the scrollbar.
abrowne · 5 years ago
Chromium does this on Linux (at least I assume a three finger trackpad tap is middle click). IIRC, Qt apps show a menu with a "scroll to here" option.
masswerk · 5 years ago
Option + click on Mac…
jerf · 5 years ago
"I could not understand why the major platforms (Windows, MacOS) did not."

I expect that it's because, in order to have correct proportional scrollbars, you need to know the exact height of the entire scrolling area (or have decently good heuristics for it, which is not as much simpler a problem as you might think), which is an expensive thing to require of a scrollable area. The more content the system can handle, the more expensive that is. It's a big thing to ask of a heterogeneous collection of layout code.

Even today there are some scrollbars that aren't as proportional as they seem. Emacs, for instance, seems to be using some cheating heuristics rather than being precise. Under normal coding or text-editing usage you might not notice, but if you fill a buffer with a lot of very differently-sized text you may find the experience of scrolling around is less slick than you might expect. Browser scrollbars can also behave strangely while the page is loading, as I'm sure everyone has experienced even if they haven't noticed. In that case it's perhaps more obvious why a smoothly-scrolling, totally-correct scrollbar is a surprisingly hard problem in the general case.
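The arithmetic itself is trivial once the total content height is known; the hard part, as described above, is knowing it. A minimal sketch of the thumb geometry (function and parameter names are invented for illustration):

```python
def thumb_geometry(viewport_px, content_px, scroll_px, trough_px, min_thumb=16):
    """Compute (position, length) of a proportional scrollbar thumb in pixels."""
    # Thumb length is the visible fraction of the content, clamped to a minimum
    # so it stays grabbable even for huge documents.
    frac = min(1.0, viewport_px / max(content_px, 1))
    length = max(min_thumb, round(frac * trough_px))
    # Thumb position maps the scroll offset into the remaining trough space.
    max_scroll = max(content_px - viewport_px, 1)
    pos = round((scroll_px / max_scroll) * (trough_px - length))
    return pos, length

# A 400px window over 1600px of content, scrolled halfway, 200px trough:
print(thumb_geometry(400, 1600, 600, 200))  # (75, 50)
```

Every term here depends on `content_px`, which is exactly why layout code that can't cheaply measure its full extent ends up with the heuristics described above.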

wolfgke · 5 years ago
> A big "Yes" to proportional scroll bars. I didn't have an Amiga, but rather an ST. The ST also had proportional scroll bars and for years, I could not understand why the major platforms (Windows, MacOS) did not. It was a pet peeve that really bothered me when I would sit down to use someone's Mac or PC of the time.

Windows has proportional scrollbars: https://devblogs.microsoft.com/oldnewthing/20030731-00/?p=43...

asveikau · 5 years ago
TFA explains that it has them now but didn't during the heyday of the Amiga.

My light experience from a long time ago is it also takes quite a bit of code to implement this on Win32 relative to other platforms. Your link could be used as evidence.

cutitout · 5 years ago
I love how the circuit diagrams were part of the manuals. Admittedly, an Amiga 500 was vastly less complex than a modern computer, but still: what companies like Apple today want to keep under wraps, so that not everybody can just repair anything easily, everyone back then got whether they wanted/needed it or not.

Amiga nostalgia isn't just nostalgia, the Amiga was great. From age 10 to 14 most of my life revolved around my Amiga 500 and then Amiga 4000, and while it was great then, too, I feel doubly blessed in hindsight, considering what tech has become and where it's heading, to have been allowed to catch a brief glimpse of what could have been. And the music. All that music.

Deleted Comment

hvs · 5 years ago
As expected, most of the features were rare on home computers at the time, but have become ubiquitous (or unnecessary) on modern computers. HOWEVER, the `Datatypes` feature is really cool and would actually be a nice feature that still doesn't exist in Windows, macOS, or Linux.

https://wiki.amigaos.net/wiki/Datatypes_Library

pjmlp · 5 years ago
It does exist in Windows,

https://docs.microsoft.com/en-us/windows/win32/wic/-wic-lh

https://docs.microsoft.com/en-us/previous-versions//dd443207...

https://docs.microsoft.com/en-us/windows/win32/shell/customi...

and in macOS,

https://developer.apple.com/library/archive/documentation/Gr...

https://developer.apple.com/library/archive/documentation/Mu...

Although, yes, I do concede it is more complex to implement: the way to do it depends on what kind of data type we are talking about, and it only works properly if applications actually make use of the OS frameworks instead of doing their own thing.

And as expected with the GNU/Linux "desktop", nothing does exist that comes close to it.

richardjdare · 5 years ago
Wow, in 24 years as a Windows user I never knew about Windows Imaging Components, nor did I come across Image Units in the 4 years or so I was doing iOS dev on a Mac. I've never seen an end-user application that used them, or told me as a user that I could expand its functionality with them.

In contrast, on the Amiga, I knew about Datatypes out of the box as they were written up in the user manual, and pretty much the whole developer community got on board with them. If an app used DataTypes it was a feature that the user knew about.

The Amiga community seemed much more unified than today's with respect to using all the latest features of the OS. Even the smallest public-domain utilities provided ARexx ports so the user could script them.

vidarh · 5 years ago
That seems limited to specific types of media, though. Datatypes is a general purpose framework for creating a hierarchy of parsers for file formats.

It can be images or sound, but also text, or anything else. E.g. here's a small sample of datatypes from Aminet, excluding anything sound and graphics related:

* Datatypes for transparent decompression

* Datatypes for syntax highlighting

* Datatypes for parsing document formats (word etc.)

* Assorted hypertext datatypes

* Assorted datatypes for showing debug information and structure of various binary formats, like executables.
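The dispatch idea behind Datatypes can be sketched in a few lines: a registry of content sniffers, so an application asks the system to "decode whatever this is" rather than linking one decoder per format. This toy version (all names and formats are hypothetical, and it sniffs magic bytes only) hints at the idea but not at the real hierarchy of parsers:

```python
# Registry of (magic-bytes, decoder) pairs, filled by installed "datatypes".
DECODERS = []

def datatype(magic):
    """Decorator that registers a decoder for content starting with `magic`."""
    def register(fn):
        DECODERS.append((magic, fn))
        return fn
    return register

@datatype(b"\x89PNG")
def decode_png(data):
    # A real decoder would return pixels; we just return a common description.
    return {"kind": "image", "format": "png", "bytes": len(data)}

@datatype(b"GIF8")
def decode_gif(data):
    return {"kind": "image", "format": "gif", "bytes": len(data)}

def open_anything(data):
    """Sniff the content (not the file extension) and dispatch to a decoder."""
    for magic, fn in DECODERS:
        if data.startswith(magic):
            return fn(data)
    raise ValueError("no datatype installed for this format")

print(open_anything(b"GIF89a...")["format"])  # gif
```

The point of the Amiga design was that installing a new datatype upgraded every application at once, which this flat registry mimics but the real system organized as a class hierarchy.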

zimpenfish · 5 years ago
I'm possibly missing something but that Image Unit thing seems to be a filter rather than a decoder?

The Core Audio stuff is definitely datatypes though - converting things to a common backend format for apps to use.

pjc50 · 5 years ago
The Windows equivalent is/was COM, and in particular OLE ("object linking and embedding"). With appropriate DLLs you can make a new file format, drag-and-drop a file into Word, and open it inside Word.

The peak of this was ActiveX; in Microsoft's world around 2000, they really wanted you to be able to embed controls and programs in web pages and then put those web pages everywhere, including on the desktop background ("active desktop").

The problem with running third-party code inside your application is not just security but that it tends to crash badly and put the name of the hosting application on the crash. Eventually Microsoft got so fed up with that, especially in the kernel, that they built a system where video drivers could crash and restart while the OS kept on trucking, and use of COM has faded.

pjmlp · 5 years ago
For those not following Windows development, all major new Windows APIs since XP are ActiveX, aka COM, aka WinRT, aka UWP.

COM is pretty much alive and was modernised with .NET type system.

toyg · 5 years ago
> The Windows equivalent is/was COM and in particlar OLE [...] it tends to crash badly

The mistake was implementing this stuff in C++, which is way too fragile on its own. Had they built a separate VM to manage requests for these objects, it would have been no problem: VM crashes, host app just restarts it. But Java did not exist yet and VMs were not really a thing, resources were too scarce.

I do think the ideas behind COM/OLE were pretty good, unfortunately real development on desktop OSes basically stopped mid-'00s so we might never know what it could have been.

mrob · 5 years ago
My favorite feature from the past is RISC OS's "adjust" feature, bound to the third mouse button. This had three functions:

Selecting menu items without closing the menu.

Toggling selection status of selectable icons (same as control+click in modern GUIs).

Changing the length of text selections (same as shift+click in modern GUIs).

Sadly, modern GUIs seem to limit themselves to two mouse buttons. Adding "adjust" would improve productivity if you had a three button mouse, and you could ignore it if you lacked mouse buttons and not be any worse off. Maybe there are other good features from early OSs we have forgotten.

smallstepforman · 5 years ago
BeOS (and by extension Haiku) has Translators: https://www.haiku-os.org/legacy-docs/bebook/TheTranslationKi... Supports image, text, sound, streaming, etc.
tyingq · 5 years ago
I'd always hoped that extended attributes (xattr) would be used for something like that. Nobody seems to use them, though, since they are optional and no standard naming scheme exists.
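A minimal sketch of that idea on Linux using Python's `os.setxattr`/`os.getxattr` (the `user.datatype` attribute name is made up, which is exactly the no-standard-naming problem mentioned above, and not every filesystem supports `user.*` attributes):

```python
import errno
import os
import tempfile

# Hypothetical scheme: tag a file with its sniffed type once, in a user
# xattr, so other tools can read the tag instead of re-sniffing the bytes.
fd, path = tempfile.mkstemp(dir=".")
os.close(fd)
try:
    if not hasattr(os, "setxattr"):  # the xattr API is Linux-only in Python
        tag = "unsupported (no xattr API)"
    else:
        try:
            os.setxattr(path, b"user.datatype", b"image/png")
            tag = os.getxattr(path, b"user.datatype").decode()
        except OSError as e:
            # Some filesystems (FAT, older tmpfs) reject user.* attributes.
            tag = "unsupported ({})".format(errno.errorcode.get(e.errno, e.errno))
finally:
    os.remove(path)
print(tag)
```

The fallback branches illustrate the other half of the complaint: because support is optional, any real scheme built on xattrs has to degrade gracefully.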
zozbot234 · 5 years ago
One unique feature of the Amiga that I've not seen referenced here is that programs, scripts etc. could reference removable media by label and if the media was not in the drive when needed, a Kickstart-managed requester would pop up asking for it to be inserted promptly. This made it very easy to manage even a single-floppy-drive system. Linux could support this even now, as all the required pieces are there (e.g. automount support) and it would be quite useful for a number of things; however it doesn't, AFAICT.
rob74 · 5 years ago
To be fair, this feature was more useful in the nineties than today :) And for the ones not familiar with Amiga jargon, a requester is what you would call a dialog box on other OSes...
antod · 5 years ago
I have vague memories of named pipes between GUI programs that also went along with datatypes.

eg in the file dialog of GUI app 1 saving a file to something (I forget the details) like PIPE:foo, then in GUI app 2 opening that data from PIPE:foo without it having to go to disk (I suppose when you're dealing only with floppies that is a bigger deal than now).

Am I hallucinating? ;)
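The POSIX analogue of that trick still works today: a named pipe (FIFO) gives two programs a file-like rendezvous point without the data touching disk. A small sketch, assuming a Unix-like system, with threads standing in for the two GUI apps:

```python
import os
import tempfile
import threading

# A named pipe is roughly the POSIX counterpart of AmigaOS's PIPE:foo.
path = os.path.join(tempfile.mkdtemp(), "foo")
os.mkfifo(path)

received = []

def app2_open():
    # "GUI app 2" opens the pipe path like any file and reads until EOF.
    with open(path, "rb") as f:
        received.append(f.read())

reader = threading.Thread(target=app2_open)
reader.start()

# "GUI app 1" saves its output to the same path; the bytes flow through
# the kernel pipe buffer, never through the filesystem's data blocks.
with open(path, "wb") as f:
    f.write(b"IFF ILBM ...")  # pretend this is an image being handed over

reader.join()
print(received[0])
```

Both `open` calls block until the other side shows up, which is what made the Amiga version feel like magic: each app thought it was just reading or writing an ordinary file.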

toyg · 5 years ago
> it would be quite useful for a number of things

Would it really, though? CD-ROMs are basically dead; what are you going to ask, "please connect another internet cable"?

kitotik · 5 years ago
Prompting to connect to a remote server, insert a thumb drive, enter an encryption key, attach a backup drive would all be useful.
catfish · 5 years ago
Back in '89, while stationed in Hawaii, I worked for a small PC shop whose owner was a HUGE AMIGA FAN. Commodore worked their dealers hard, and just prior to Christmas that year he was required to stack 250 machines in inventory, which just about broke the bank.

Meanwhile I was hired to build PC clones for a few hours every day. While I was there he would argue constantly about the superiority of the Amiga vs. the PC, and though I knew he was right (486 PCs had just hit the scene) I would argue right back about the fact that the PCs were winning due to the velocity of change in hardware, storage, video, etc.

Anyway we got to the point with it that his wife started yelling at both of us to cut the crap, so not wanting to piss her off since she signed the checks I shut up and let him drone on.

As Christmas approached we were selling PCs 10 to 1 over Amigas, and his anxiety about not moving the Amigas hit the roof. About a week prior to Christmas I came in one day and started building the PCs on order (about 30+ on order) and he was forced to help me with building them out.

He was pissed, completely off his rocker angry, and started verbally hammering me beyond what I could take. I finally decided enough was enough.

I turned to him and I said, "I know one thing the PC can do that the Amiga can't do." And taking the bait like the fish he was, he sez to me, "Nothing can beat the Amiga". I sez, "There is definitely one thing it does better", and he SHOUTS "WHAT?"

Sez me, "It makes money!" and then I turned to the stack of nearly 200 remaining Amigas sitting on the shop shelves and pointed at them and laughed long and loud.

He fired me on the spot.

Not exactly on point for this article. But a great story in the timeline of "our thing". I should mention that I did own a Commodore 64 and the Amiga before I switched to the PC. I loved them both and if it wasn't for the C-64 assembler module I would not have built the companies I created years later. The 6502C was the best learning chip ever!

2 centavos...

rektide · 5 years ago
Speaking for the Dead is something I really, really hope we can do more of for computing. There are so many pasts that we forget.

I want the condensed goods and bads of CORBA, SOAP, NeXT, ESB, and so many others. It feels like there's only dwindling folklore about so many of these things. At least I can point newcomers to C10K and Apache forking models to discuss some of the web-serving systems-architecture work that emerged around 2000; that gives a fairly broad view of the challenges and work that was afoot. But I've found few clear stories, few clear tellings, for so many of the faded technologies. Nice to see the Amiga here somewhat avoiding that fate, having some stories told.

reaperducer · 5 years ago
One little thing that I always liked about the Amiga was that it could show human dates.

I never owned one, but my friend had an Amiga 1000, and I remember the directory listing had dates like "Last Thursday," or "Christmas, 1990," or "An hour ago."

I don't know if that was part of the stock Amiga OS, or an add-on, but it was awfully cool.

timbit42 · 5 years ago
"Last Thursday" and "An hour ago" were part of the stock Amiga OS but it didn't include "Christmas 1990".
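A relative-date display like that is easy to approximate. A toy sketch (the thresholds and the fallback format are guesses for illustration, not the actual AmigaOS rules):

```python
from datetime import datetime, timedelta

def humanize(ts: datetime, now: datetime) -> str:
    """Render a timestamp the way an Amiga directory listing might."""
    delta = now - ts
    if delta < timedelta(hours=2):
        return "An hour ago" if delta >= timedelta(hours=1) else "Just now"
    if ts.date() == now.date():
        return "Today"
    days = (now.date() - ts.date()).days
    if days == 1:
        return "Yesterday"
    if days < 7:
        return "Last " + ts.strftime("%A")  # e.g. "Last Thursday"
    return ts.strftime("%d-%b-%Y")  # fall back to an absolute date

now = datetime(1991, 7, 12, 15, 0)
print(humanize(datetime(1991, 7, 12, 13, 50), now))  # An hour ago
print(humanize(datetime(1991, 7, 11, 9, 0), now))    # Yesterday
print(humanize(datetime(1991, 7, 8, 9, 0), now))     # Last Monday
```

Special-cased dates like "Christmas" would just be extra branches on top of the same cascade of thresholds.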