peatmoss · 5 years ago
I poked around on an old Indy at a vintage computer show a couple years back, and the main takeaway I had was, “holy crap the UI elements feel instantaneous.”

I know it's been posted here many times that computers have become perceptually slower, but that Indy, after a couple of minutes of poking around, really drove the point home in a way that no numbers ever could.

Computers have gained a lot, for sure, but they’ve also lost a lot. I wonder if it’s even possible to make a modern computer fast in a way that feels fast again.

derefr · 5 years ago
It certainly is. Most of the reason modern computers don't feel instantaneous is actually a trade-off: old computers were less adaptive to change.

In old GUIs (e.g. Windows 3.1), many things—file associations, program launchers, etc.—got loaded from disk into memory once—usually at GUI startup—and then the state of those things was maintained entirely in memory, with programs that updated the on-disk state of those things either 1. also independently updating the in-memory state with a command sent to the relevant state-database-keeper; or 2. requiring that you log out and back in to see changes.

Today, we don't have everything sitting around loaded into memory—but in exchange, we have soft-realtime canonicity, where the things you see in the GUI reflect the way things are, rather than a snapshot of the way things were plus (voluntary, possibly missable/skippable) updates. Install a program that has higher-than-default-binding file associations? The files in your file manager will update their icons and launch actions, without the program needing to do anything.

There are ways to eat your cake and have it too—to have on-disk canonicity and instant updates—but this requires a very finicky programming model†, so we haven't seen any GUI toolkit offer this, let alone one of the major OSes do so.

† Essentially, you'd need to turn your Desktop Environment into a monolithic CQRS/ES aggregate, where programs change the DE's state by sending it commands, which it reacts to by changing in-memory state (the in-memory aggregate), and then persists a log of the events resulting from those commands as the canonical on-disk state (with other aggregates fed from those to build OLAPable indices / domain-state snapshots for fast reload.) This gets you "Smalltalk windowing semantics, but on a Unix filesystem substrate rather than a VM memory-image substrate."
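As a rough illustration of that idea only (not anyone's actual implementation; all names here are hypothetical), a tiny Python sketch of such an aggregate might look like this: commands mutate in-memory state, and the resulting events are appended to an on-disk log that is replayed on the next start.

    # Minimal sketch of the CQRS/ES idea described above (hypothetical names):
    # commands mutate an in-memory aggregate, and the resulting events are
    # appended to an on-disk log that serves as the canonical state.
    import json
    from pathlib import Path

    class FileAssociations:
        """In-memory aggregate: maps file extensions to launcher commands."""

        def __init__(self, log_path: Path):
            self.log_path = log_path
            self.assoc = {}        # current in-memory state
            self._replay()         # rebuild state from the event log on startup

        def _replay(self):
            if self.log_path.exists():
                for line in self.log_path.read_text().splitlines():
                    self._apply(json.loads(line))

        def _apply(self, event):
            if event["type"] == "association_set":
                self.assoc[event["ext"]] = event["launcher"]

        # Command handler: apply in memory, then persist the event.
        def set_association(self, ext: str, launcher: str):
            event = {"type": "association_set", "ext": ext, "launcher": launcher}
            self._apply(event)
            with self.log_path.open("a") as f:
                f.write(json.dumps(event) + "\n")

    # The GUI answers "what opens .txt?" from memory, instantly, while the
    # on-disk log remains the source of truth across restarts.
    fa = FileAssociations(Path("associations.log"))
    fa.set_association(".txt", "nedit")
    print(fa.assoc[".txt"])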

StillBored · 5 years ago
While you might be slightly right, my experience tuning Windows machines leads me to believe you're missing the mark.

I'm going to say the three largest contributions to general desktop lag are:

Animations and intentional delays. It's hard to overstate how much faster a machine feels when something like MenuShowDelay is set to 0, or when the piles of animations are sped up (see the registry sketch at the end of this comment).

Too many layers between the application draw commands and the actual display. All this compositing, vsync, and minimal 2D acceleration creates a low-level persistent lag. Disabling Aero on a Win7 machine does wonders for its responsiveness. But even then, pre-Vista much of the Win32 GDI/drawing API was basically implemented in hardware on the GPU. If you run an old 2D Win32 API benchmark, you will notice that modern machines don't tend to fare well in raw API call performance. Thirty seconds of poking around on YouTube should find you a bunch of comparisons like this https://www.youtube.com/watch?time_continue=25&v=ay-gqx18UTM.... Keep in mind that even in 2020 pretty much every application on the machine is still relying on GDI (same as Linux apps relying on Xlib).

Input+processing lag. USB input devices are polled rather than interrupt-driven (the default HID polling interval is 8 ms, i.e. 125 Hz). On top of that, the keystrokes/events end up queued/scheduled through multiple subsystems before finding their way to the correct window, which then has to be rescheduled so the application can retrieve and process them via GetMessage()/etc. Basically, this is more a function of modern software bloat, where all those layers of "correct" architecture add more overhead than the old-school path: take the PS/2 interrupt, post a message to the active window's queue, schedule the process managing the window's messages. (https://social.technet.microsoft.com/Forums/windows/en-US/b1...)

There are a number of other issues, but you can retune those three areas to some extent, and the results are a pretty noticeable improvement. Having someone at MS/etc go in and actually focus on fixing this might have a massive effect with little effort. But that doesn't appear to be in the cards, since they appear to be more interested in telemetry and personal assistants.
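For the first point, a minimal sketch of the MenuShowDelay tweak (Windows only, using Python's standard winreg module; the value is stored as a string of milliseconds and takes effect after logging out and back in):

    # Set MenuShowDelay to 0 so menus open without the default ~400 ms delay.
    import winreg

    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                        0, winreg.KEY_SET_VALUE) as key:
        # MenuShowDelay is a string value, interpreted as milliseconds.
        winreg.SetValueEx(key, "MenuShowDelay", 0, winreg.REG_SZ, "0")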

dialamac · 5 years ago
Not entirely sure your analysis passes muster. While it is true that pigs like Chrome and Slack are the norm, computers still have many orders of magnitude more memory than 20 years ago, and even if a large GUI app keeps state on disk, most of it should be hot in some kind of cache, whether an explicit cache or just the OS page cache.

There are other forces at work.

For instance, on my primary workstation, after a period of time from a cold boot pretty much everything is sitting comfortably in 32 GB of memory. A lot of the time goes instead to:

1) the sheer volume of crap that has to be shuttled through memory, because memory accesses still aren't free, and

2) a surprising amount of repetitive, CPU-intensive bookkeeping just to get a program started: linking, conversion, parsing.

That's in addition to all the built-in latencies already mentioned.

To prove this you can set up a system using a RAM disk and notice that responsiveness is often still subpar.

tarsinge · 5 years ago
> It certainly is. Most of the reason modern computers don't feel instantaneous is actually a trade-off: old computers were less adaptive to change.

To the point that old Macs had most of the system routines in ROM[0].

[0]https://en.wikipedia.org/wiki/Macintosh_Toolbox

Lutzb · 5 years ago
Just yesterday I started a VM of beta2 of Haiku for a good dose of nostalgia. Some observations: the responsiveness of the UI is instantaneous. So much so that it feels alien nowadays. Also, the UI does not waste space, but at the same time it does not feel cluttered. Laying out windows feels more organized than in Windows 10 or recent macOS; I can't really point out why. Lastly, having dedicated window borders is so useful for aiming the mouse pointer when resizing. At some point our UIs became more about design than utility.
keithnoizu · 5 years ago
I was really sad as a kid to see BeOS go the way of the dodo; it was a great OS.
weinzierl · 5 years ago
I worked with an Indy for some time and what was really cool - besides the smooth and fast reaction to user input you mentioned - was that the UI elements were scalable. If I remember correctly they were vector-based, not bitmaps. Back then I thought that this would soon be the norm. I'd never have imagined that we would still be struggling with this more than 20 years later.

EDIT: After googling SGI workstation models I think what I used was actually most likely an O2[1]. Great design, I remembered its distinct look even after so many years.

[1] https://en.wikipedia.org/wiki/SGI_O2

whoopdedo · 5 years ago
There was a revelation a while back that instantaneous UX can be detrimental. If the action occurs so quickly that the users can't see it happening, they have a tendency to assume it didn't happen. Programmers had to introduce intentional latency through elements such as the file copy animation.
blattimwind · 5 years ago
I disagree. What you describe is a problem of interaction design founded on bad assumptions; with good interaction design I don't have to show the user that the computer is doing something for the user to be able to tell it happened. This is a problem of the system not showing its state transparently and relying on the user to notice a change in hidden state indicated by a transient window.

Windows Explorer gets your particular example right: When you copy a bunch of files into a folder, it will highlight all of the copied files after it is done, so it doesn't matter if you saw the progress bar or not.

joshspankit · 5 years ago
Having seen all this from the inside of more offices than I can remember, I would say definitively yes: programmers did have to introduce delays for that, and for other reasons.

However, this was a trust thing, and not an innate limitation of human capacity.

Now that we’re in 2020, I would argue that almost 0 users need that crutch.

If the work is done, it’s done. Similarly, employees on early keyboard-only terminals didn’t need any delay either. They trusted that when they said do X, the system did it. How else could you process dozens (or hundreds!) of items a minute?

derefr · 5 years ago
That was indeed a while back—we had fewer ways to communicate change back then. Given all the fancy technologies in a modern laptop/mobile device, I wonder how much more we could tell the user without slowing down to give visual feedback—instead communicating by haptic and/or 3D-audio feedback after the fact (the same way that e.g. macOS plays a sound effect after-the-fact when you put something in the Trash—but with a procedurally-synthesized soundscape, rather than the static "it went to the bottom-right.")
cellularmitosis · 5 years ago
> Programmers had to introduce intentional latency

Hmm, perhaps if you replace the word "latency" with "duration"?

nineteen999 · 5 years ago
Back in 1998/1999 I worked in a mom-and-pop ISP shop with 64 dialup lines. We had an Indy that ran our mail server, website, databases, FTP and web hosting, LDAP server, etc etc. And it was still somewhat usable as a desktop as well.
cellularmitosis · 5 years ago
(straying slightly off-topic) I recently had the same experience with an Apple eMac. Grab a window, shake it around as fast as you can, or just move the mouse cursor around in circles quickly.

"Whoa, why does this seem so much more responsive than my modern machines?" Two things, I think: 1) the mouse cursor is being updated with every vertical refresh (80Hz), and 2) the latency of the CRT must be lower than an LCD.

Not bad for a 700MHz, 18 year old machine!

blattimwind · 5 years ago
CRTs are "dumb" devices: they literally just amplify the R/G/B analog signal while deflecting a beam using electromagnets according to some timing signals. As far as input lag goes, they're the baseline. For fast motion they have some advantages at least over poor LCD screens as well, since non-strobing LCDs quite literally crossfade under constant backlight between the current image and the new image; we perceive this crossfading as additional blurring. A strobing LCD, on the other hand, shifts the new image into the pixel array and lets the pixels transition while the backlight is turned off. The obvious problem: it's flickering.

LCDs that aren't optimized for low latency will generally just buffer a full frame before displaying it; coupled with a slow panel, these will typically have 25-35 ms of input lag at 60 Hz. LCDs meant for gaming offer something called "immediate mode" or similar, where the controller buffers just a few lines or so, which makes the processing delay irrelevant (<1 ms). The image is effectively streamed through the LCD controller directly into the pixel array.
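A back-of-envelope calculation for the full-frame-buffering case (assumed, illustrative numbers only):

    # Rough, assumed numbers behind the 25-35 ms figure above.
    refresh_hz = 60
    frame_time_ms = 1000 / refresh_hz   # ~16.7 ms to scan out one frame
    buffer_delay_ms = frame_time_ms     # controller holds a full frame before showing it
    panel_response_ms = 10              # slow panel pixel transition (assumed)

    total_ms = buffer_delay_ms + panel_response_ms
    print(f"~{total_ms:.0f} ms, plus up to {frame_time_ms:.0f} ms waiting for the next refresh")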

cameldrv · 5 years ago
The weird thing is that I remember the Indy being perceptually very slow at the time compared to Windows or Solaris.

I just started using Linux part time on the desktop, and it’s significantly more responsive than MacOS, to the point that I prefer using it, even though the Linux UI otherwise stinks in a lot of ways.

jordache · 5 years ago
> holy crap the UI elements feel instantaneous.

I seriously doubt it's an SGI optimization.

I had an SGI Octane (multiple levels higher than an Indy). I never felt IRIX's UI was noticeably more performant than contemporaneous Windows running on standard consumer PC hardware.

p_l · 5 years ago
There was a certain level of optimization, though arguably the graphics stack was much more complex than GDI, so yes, Windows 9x with a GDI accelerator would have a more performant interface for basic work.

Xsgi included some interesting optimizations, starting with the fact that it was a compositing X server. It also had a feature that is known to break GTK3 all the time: Xsgi would offer a low-color visual as the first choice, and this was reflected in the provided Motif and other libraries and exploited by software developed for IRIX - the latter mostly because you could not depend on the end user having a 24-bit display.

This means that the bandwidth requirements across all stages of drawing were reduced for common UI components, instead of slinging around high-resolution 32-bit RGBA bitmaps for everything as is the norm today on Linux. I haven't checked, but I strongly suspect that the more ascetic UI controls, combined with classic X11 drawing calls, also resulted in higher speed vs. slinging lots and lots of bitmaps with overdraw using XRender.
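To put rough numbers on that (assuming an illustrative 1280x1024 desktop, a common resolution for workstations of that era):

    # Per-frame data for an 8-bit pseudocolor visual vs. 32-bit RGBA (assumed resolution).
    width, height = 1280, 1024
    pixels = width * height

    for name, bytes_per_pixel in [("8-bit pseudocolor", 1), ("32-bit RGBA", 4)]:
        mib = pixels * bytes_per_pixel / (1024 * 1024)
        print(f"{name}: {mib:.2f} MiB per full-screen redraw")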

For more on that, consider checking the literature about avoiding overdraw on Android and how much of an impact it had on user interface latency. iOS, btw, actually did a lot of dirty tricks to essentially prevent developers from doing any overdraw without explicitly going for it, and supposedly a strong part of the early review process was checking for overdraw - because the actual iPhone hardware was not that powerful and they ran close to the bandwidth budget per frame.

tluyben2 · 5 years ago
Definitely felt like it to me; NT (4?) was sluggish. It was much better than the other Windows versions, but at our office the NT machines were kind of a joke (no one touched them unless they needed to compile to an .exe) compared to the SGI / Sun machines.
walkingolof · 5 years ago
It would be interesting to measure this perceived performance with a camera to get hard numbers.
rayiner · 5 years ago
This is true even on more recent vintage stuff. Little stutters are far more common on my iPhone XR with iOS 13 than they were on my old 6S Plus, much less the iPhone 5.
octorian · 5 years ago
I actually have two SGI machines kicking around (an Indigo2 and an Octane2), and I really wish I had a better idea of what to do with them beyond poking around at the desktop for 10 minutes.

One big problem with all these old "workstation" computers is that while we hobbyists still have our ways of getting the OS installed, the actual applications people ran on them seem almost lost to time. When software was so unbelievably expensive during its heyday, it tends not to make the jump over to "abandonware" repositories once its time has passed. This unfortunately makes demos of these old machines far more boring than demos of old PCs.

weinzierl · 5 years ago
I think CATIA V4 (CAD software) was big on SGI - at least that's what I used an Indy for back in the day. I believe V4 didn't run on the PC, so you needed a workstation anyway. Don't know if one can find a copy anywhere but I think it would be fun to use it again. It was solid software with a good UI, quite different from CATIA V5 which (also) ran on the PC and had a very colorful and noisy UI.

EDIT: After googling SGI workstation models I think what I used was actually most likely an O2[1]. Great design, I remembered its distinct look even after so many years.

[1] https://en.wikipedia.org/wiki/SGI_O2

mhd · 5 years ago
Heh, I was doing CAD workstation support about 15 years ago, when there was the big switch between CATIA V4 and V5. SGIs were mostly Octanes plus a few bigger irons, I think, but it also ran on HP-UX and some Sun workstations.

There wasn't a lot of movement in the 3D workstation space at the time, whereas PC 3D accelerators were taking off big time. So you ended up with a system that was faster, a lot cheaper, and where the regular PC maintenance software and infrastructure could be used (boy, heterogeneous Unix devops was a nightmare).

Having said that, CATIA was more a workhorse CAD and CAE software, so probably not the best to show off neat UIs, and amongst engineers it had a somewhat ponderous reputation.

nitrogen · 5 years ago
You probably have zero chance of getting your hands on the software, but in the late 90s TV stations ran their weather graphics off of SGI workstations, with some software based in part on something called Inventor. The software was incredibly easy to use for building 3D animations, which were baked to video with built-in loops and pauses for the weather segment.

If your meteorologist didn't want to hold a remote, the weather producer would have to sit in front of the workstation with their hand over the spacebar waiting for the right cues to advance to the next loop/pause point.

reaperducer · 5 years ago
> If your meteorologist didn't want to hold a remote, the weather producer would have to sit in front of the workstation with their hand over the spacebar waiting for the right cues to advance to the next loop/pause point.

At some stations, it wasn't about the talent not wanting to hold the remote, it was about the station not having an engineer on staff who could rig up the remote.

At many stations in the 90's, and even some today, the "remote" is nothing fancier than a garage door opener, with the relays hooked into a breakout box to a DB25 serial port.

rbanffy · 5 years ago
I did a lot of development sitting in front of my IBM 43p with its Intergraph monitor and Model M keyboard (not original). All I needed was to ssh into my Linux machine and run the software over the network. The responsiveness, however, would be mostly gone in that scenario - X doesn't come for free.

These machines are excruciatingly slow by today's standards and I wouldn't want to run modern software on them. Still, they are the dinosaurs of the PC ecosystem - evolutionary dead-ends that hint at what could be. Who wouldn't want to study a living dinosaur?

jonjacky · 5 years ago
There was a graphical programming language for SGIs called AVS. I saw impressive scientific applications made with it: elaborate physics simulations with 3D graphics. The language and the applications seem to be completely forgotten.

This page about AVS is dated 1995 and cites a 1989 paper:

https://web.cs.wpi.edu/~matt/courses/cs563/talks/avs/avs.htm...

armitron · 5 years ago
Almost all of the software worth preserving is preserved. It may not be openly available (though odds are it will eventually be dumped openly on the net) but you can find pretty much anything you want if you put in some effort. It comes down to you getting access to one of many private repositories that are dedicated to software preservation.
ddingus · 5 years ago
Yes. I regret it a little, but I worked in the industry having access to a lot of big software.

Piled up a bunker of Sgi machines, O2, Octane, Indigo, Indy, most very well equipped with the advanced memory and graphics options.

Alias, I-deas, Maya, Adobe, 3DS Max... Let's just say I could license most of that at will due to an error...

Learned a ton of high end skills that I benefit from today too. Great fun. And amazing demos. Putting those together was a total blast. People would get blown away using Showcase, the Sgi tools for video capture and audio, and Composer to mix, rip, and burn. This was the mid 90's, when most people were using Win 98, or maybe NT 3.51.

Let's also say I got rid of said error (so don't ask) and gave the whole lot away to a 20 something me just itching for those same experiences. Those machines were well loved and used. Cool.

I needed a change away from that kind of computing as it went on the wane. Didn't want to look back.

But at the peak? I was very seriously productive on Irix. The Indigo Magic Desktop took everything I ever threw at it.

And one could flat out bury those machines with a heavy workload and still the UX was golden, responsive almost as if idle!

At one point, at some conference, the head scientist at Sgi said, "We turn compute problems into I/O problems."

How the machines performed showed that ethos off well, IMHO.

Honestly, today on, say, Win 10, I can do all I did then on a laptop, but I don't enjoy it as much as I did that environment. It was responsive and fun!

Big software on Irix remains one of the peak computing experiences I have had. Damn good times.

I may have to put this on a Linux install and have some fun.

In my view, the Irix scheduler is insanely good at balancing UX with workloads. It may not always deliver the peak possible throughput, but a skilled user can continue to blast through their tasks pretty much no matter what the OS load is.

On a lark, I got to try an extreme example of that:

Irix 5.3 on an Indigo Elan, 30 MHz CPU. I forget which one; I want to say R3k. (Check Ian's SGI Depot for more info.)

I compiled "amp", which is an optimized mp3 player that formed the basis of many players after it was written.

At 30 MHz, that Indigo Elan could play mp3 files up to 192 kbps, over NFS, while the desktop remained responsive.

At 256 kbps, CPU load was about 95 percent. It would glitch on occasion.

I found that quite impressive personally. I used that little Elan as an X terminal for a while and it was a pleasure to use.

...ah, Sgi.

Damn. Say what you want, their stuff was fun, had amazing docs, and got work done.

[Looks at Android / Win 10]

Meh.

That's part of why I scaled down. Unloaded that gear and went small, embedded retro for my fun computing. The work is easier today, and fine by me, because it is just work.

ddingus · 5 years ago
Minor trivia:

By glitch, I mean the music would drop, not the desktop.

nix23 · 5 years ago
Slap OpenBSD or NetBSD on it.
octorian · 5 years ago
I hate to say it, but that's even more useless than a vanilla IRIX installation. By far.

One big problem with running a 3rd party OS on these old machines (specifically SGI, Sun doesn't have this issue) is that they tend to not really support any of the hardware that actually makes these machines interesting.

asveikau · 5 years ago
I have some old machines too, and that thought occurs. But that is also slightly boring. What can I do on those that I can't do on amd64 at higher speeds?
ddingus · 5 years ago
Honestly, those won't compare.

It is the close marriage between IRIX and the hardware and top engineering that shines.

rbanffy · 5 years ago
Ecce Homo feelings. You don't slap a generic OS on top of a classic machine that was designed for something different.
Erlich_Bachman · 5 years ago
Can someone please explain what the main hypothetical advantages of this desktop environment are? Even the article itself does a poor job of explaining this; it seems to assume that the reader already knows what IRIX/whatever is, and why people used (use?) it over other environments.
morganvachon · 5 years ago
As others have said, it's mostly nostalgia, and perhaps an effort to go back to what was a comfortable and sensible working environment from 25-30 years ago. It's the same reason the Haiku operating system is being developed: Nostalgia for an operating system that was far more advanced and beautiful than its contemporaries in the late 90s and early 2000s.
rbanffy · 5 years ago
You can check https://maxxinteractive.com/ for more information.
xvector · 5 years ago
It doesn’t really help. What is SGI? What makes MaXX better than other desktops today? The page assumes too much background and doesn’t really do anything to convince a new reader.

That said the desktop looks amazing! Love the theme.

jasoneckert · 5 years ago
IRIX was my favourite UNIX flavour in the 1990s. As a result, I was tempted to try MaXX out after reading this post for purely nostalgic reasons. I keep an SGI Fuel in my basement (running IRIX 6.5.30 and everything from Nekochan) for when I need a nostalgia kick.

However, after thinking about whether I'd actually use MaXX over GNOME for a few minutes, I decided that there was no compelling reason to do so other than nostalgia.

Has anyone tried this out and decided to use it as their daily desktop? If so, I'm interested to hear your experiences and rationale. Cheers!

easygenes · 5 years ago
Never used IRIX, but I genuinely really like the icon set in that first screenshot. Also, SGI Screen is still one of my favorite fixed width fonts.

Also, I think FVWM was heavily inspired by this, and I've used that as a very efficient UI over VNC (gradients, images, rounded corners, etc. are expensive in terms of bandwidth; the minimalist aesthetic of this interface conveys a lot of structure in little data).

hakfoo · 5 years ago
FVWM was more of an mwm (Motif) clone. It dates to the '90s, back before OpenMotif or LessTif, when having something with even that aesthetic was a step up.
tpmx · 5 years ago
Nostalgia:

At my first software development job in the mid 90s, the "cool" basement room with all of the smart/weird developers in it had a mix of Sun SPARCstation 10/20 and Sun Ultra 1 workstations.

There was also this one weird SGI O2 they had just bought to port their software to the IRIX platform, but no one really wanted to use it, because of IRIX. So I picked that workstation, just to be in that room. Smartest decision of my life - what I learned in there defined my career.

The Irix Interactive Desktop (based on Motif) felt so incredibly responsive on the O2, compared to Motif/CDE on the Sun workstations. It was almost BeOS-like in that regard. It was the little touches that mattered. A random example: the CPU usage monitor updated at like 10-20 Hz, instead of like 1 Hz on the Sun workstations.

Not that anyone really used CDE in normal work - other window managers were far more efficient. I ended up using http://www.lysator.liu.se/~marcus/amiwm.html on this O2.

cygned · 5 years ago
I had an internship at an IBM reseller and consultancy a couple of years back. When we were at a client's after finishing a project, they showed me their basement full of decommissioned computers and let me take one home for free. I picked an SGI Indigo2, purple version, running IRIX. I had no idea what to do with it, so I played around for a couple of months and then made the mistake of throwing it away (that still bugs me). I still love the interface and the way things were organized in the system, the file manager in particular. It ran nedit as its editor, and I kept using that on Linux for a couple of years afterward. Nostalgia...

edit: Remembering, I got the machine with original SGI keyboard, mouse and screen.

bluedino · 5 years ago
I had a purple one and a green one! I'd play Doom (which didn't use the 3D hardware), played around with a few of the OpenGL demos, barely surfed the internet with the ancient version of Netscape, then didn't do much else because I didn't have the CDs that came with it or the knowledge to try and build anything else.
exikyut · 5 years ago
Do you maybe have any viable excuses to say hello to that particular client again?
teleforce · 5 years ago
From reading the comments, it seems this effort is driven mostly by nostalgic sentiment for reliving the IRIX desktop.

I'm more interested in reliving the CEDAR/Tioga interactive desktop environment pioneered by the Xerox R&D team back in the 1980s/1990s for their in-house productivity tools [1]. The system had some productivity-enabling features that are still not widely used even now. Xerox also managed to port the system to SunOS at the time.

Anyone aware of any effort or clone that can enable CEDAR/Tioga to run on or emulated Linux?

[1] https://news.ycombinator.com/item?id=22375449

rbanffy · 5 years ago
I can work from an IRIX desktop (or CDE, or OpenWindows, even on original hardware), but I'm not sure I could work on any of these environments.
PaulHoule · 5 years ago
I remember a prof who bought a purple SGI workstation that didn't have enough RAM. He plugged it into the Ethernet, couldn't get work done with it, and left it plugged in anyway.

I found out two years later that the root password was the empty string.

Then there was the time I went to Syracuse for the first conference on Java for scientific computing. Geoff Fox had two identical twins from Eastern Europe run a demo on two workstations hooked up to a big SGI computer; it went unsuccessfully, which Geoff answered with "never buy a gigabyte of cheap RAM".