jart · 3 years ago
I don't want to come across as disrespectful to my elders but in many ways I feel that certain kinds of nostalgia like this are holding open source back. One of my favorite pieces of software is GNU Make. Having read the codebase, I get the impression that its maintainer might possibly be a similar spirit to the OP. The kind of guy who was there, during the days when computers were a lot more diverse. The kind of guy who still boots up his old Amiga every once in a while, so he can make sure GNU Make still works on the thing, even though the rest of us literally would not be able to purchase one for ourselves even if we wanted it.

It's a pleasure I respect, but it's not something I'll ever be able to understand because they're longing for platforms that got pruned from the chain of direct causality that led to our current consensus (which I'd define more as EDVAC -> CTSS -> MULTICS/CPM -> SysV/DOS/x86 => Windows/Mac/Linux/BSD/Android/x86/ARM).

My point is that open source projects still maintain all these #ifdefs to support these unobtainable platforms. Because open source is driven by hobbyism and passion. And people are really passionate about the computers they're not allowed to use at their jobs anymore. But all those ifdefs scare and discourage the rest of us.

For example, here's a change I recently wrote to delete all the VAX/OS2/DOS/Amiga code from GNU Make and it ended up being 201,049 lines of deletions. https://github.com/jart/cosmopolitan/commit/10a766ebd07b7340... A lot of what I do with Cosmopolitan Libc is because it breaks my heart how in every single program's codebase we see this same pattern, and I feel like it really ought to be abstracted by the C library, since the root problem is all these projects are depending on 12 different C libraries instead of 1.

wzdd · 3 years ago
> I recently wrote to delete all the VAX/OS2/DOS/Amiga code from GNU Make and it ended up being 201,049 lines of deletions.

That commit 10a766eb you linked to appears to contain significantly more than the removal of those architectures -- you also appear to have deleted all the tests, documentation, and support for multiple (human) languages. Of the portions of the diff which aren't deleted files, many of the deletions seem to be around code reformatting (i.e. they add a line for each line they remove), and removal of includes which your libc doesn't need / support. I do see that you have removed several #ifdef VMS (and similar) stanzas, but the vast bulk of the changes are either removal of, or modifications to, unrelated files.

Although I agree instinctively that we shouldn't expect to run the latest version of Make on the VAX in our basement, this diff doesn't make that argument very well and IMO borders on disingenuous.

Or, to put it another way, after looking at your diff, I feel much happier about your hypothetical make-on-Amiga aficionado, because I know that they also care about i18n, documentation, and testing!

rbanffy · 3 years ago
Identifying the moment we need to stop supporting a platform is frequently non-obvious. Unisys still supports MCP (as Clearpath OS), VMS is supported and was ported to x86, Atos supports GECOS, and some people are making CP/M fit inside dedicated word processors. A couple months back there was a report of ncurses failing on Tandem NonStop OS (still supported, IIRC, by HPE). As long as something works, we'll never hear about all those wonderful exotic platforms people still use for various reasons. There must be a lot of PCs controlling machinery doing GPIO through parallel ports while emulating PDP-8's with some poor intern having to figure out how to make changes to that code.
jart · 3 years ago
Here's a simple criterion I propose: is the platform disbanded?

For example, in GNU Make's dir.c file. There's a lot of stuff like this:

    #ifndef _AMIGA
        return dir_file_exists_p (".", name);
    #else /* !AMIGA */
        return dir_file_exists_p ("", name);
    #endif /* AMIGA */
There should be a foreseeable date when we can say, "OK the Amiga maintainers have added a feature that lets us use '.' as a directory so we can now delete that #ifdef". But that day is guaranteed to never come, because the Amiga project is disbanded. So should we keep that until the heat death of the universe?

I would propose that we instead say, if you use Amiga, there's great support for it in versions of GNU Make up until x.y.z. So if you love old Amigas, you'll be well served using an older version of GNU Make. I think it's straightforward.

dwidget · 3 years ago
While I agree that having one library could be a good solution, I don't think all those #ifdefs are wasted. There are a lot of legacy tech programs that use systems way older than I ever imagined would still be in use. There was a minor crisis at an org I was working at one time: they were going to need to flip a multimillion-dollar system because the only source of replacement parts was a hobbyist in his garage, and for new gov compliance purposes that guy was going to need to become a cleared contractor supplier... which can be problematic if the person in question is an open source advocate whose main purpose in running this business in retirement is supplying enthusiasts rather than government departments or contractors.

I'm sure some of those systems and ones like them make plenty of use out of those #ifdefs though, and it's not just a handful of old fogey enthusiasts cramping everyone else's style. Established systems can't always evolve as fast as the general market.

causi · 3 years ago
> Because open source is driven by hobbyism and passion. And people are really passionate about the computers they're not allowed to use at their jobs anymore. But all those ifdefs scare and discourage the rest of us.

Isn't this the same process you yourself referenced? There's nothing stopping people from forking and building leaner versions of these programs, but it turns out that projects with those passionate, nostalgic developers are more successful even with the support burden than that same project without them. That backwards-support might be a cost rather than a waste.

pengaru · 3 years ago
Scaring new talent away from spending their precious time on a solved problem like GNU make is a feature not a bug. Work on something more relevant to today's challenges.

There's plenty of things "holding open source back", this isn't a significant one of them IMNSHO.

jart · 3 years ago
Saying make is a solved problem is a real failure of imagination. I used to do a lot of work on Blaze and Bazel. I intend to add support for a lot of the things it does to GNU Make. Such as using ptrace() to make sure a build rule isn't touching any files that aren't declared as dependencies. I can't do that if our imagination is stuck in the 80's with all this DOS and Amiga code.
rodgerd · 3 years ago
> this isn't a significant one of them IMNSHO.

"You can't have systemd in Debian, what about kFreeBSD" "You can't use Rust until it supports DEC Alpha"

...there are no shortage of examples where open and free software is held back by hyper-niche interests, where our pet twenty and thirty year old, long-dead projects and processor architectures create absurd barriers to improve anything.

anthk · 3 years ago
Hey, NetBSD and MacPPC are still used, and people still backport Nethack/Slashem/Frotz to those archs.

Old hardware is always useful.

jart · 3 years ago
All Actually Portable Executables run on NetBSD. I love NetBSD. I helped fix a bug in their /bin/sh. I even put the little orange flag on my blog. https://justine.lol/lambda/#binaries See also https://github.com/jart/cosmopolitan#support-vector
the_only_law · 3 years ago
NetBSD is so cool, and I have so many machines sitting around that I need to get it running on (SGI, Alpha, Dreamcast, etc.)

Sadly I’ve heard it can still be rough on older architectures. I’ve been told that on VAX, for example, it's not in the best of states because of userland dependencies on Python. From what I was told, Python currently doesn’t have a VAX port due to the architecture's floating-point design.

grishka · 3 years ago
The weird thing I keep seeing is that many C libraries still define their own integer types for some reason instead of just using the ones from stdint.h. Even new ones that certainly didn't ever need to support ancient platforms and ancient compilers, like libopus.
cesarb · 3 years ago
> instead of just using the ones from stdint.h. Even new ones, that certainly didn't ever need to support ancient platforms and ancient compilers, like libopus.

But stdint.h is from C99, and AFAIK there are non-ancient compilers for non-ancient platforms that still don't fully support C99.

Nursie · 3 years ago
> many C libraries still define their own integer types for some reason instead of just using the ones from stdint.h

Many C developers in my experience (as recent as 5 years ago) haven't really adapted to C99. This is in part because some platforms (Microsoft, I'm looking at you) resisted adopting it for quite some time and partly because a lot of C development is carried out by older developers who have long given up on keeping up with new developments. And I say that as a dev in my 40s.

I think stdint etc are great, but for some people the start of any 'serious' C codebase still requires a whole load of int and pointer type redefinitions.

flembat · 3 years ago
The ROM on my Amiga is from March 28th this year.
pjmlp · 3 years ago
An example of holding on to old stuff is still making use, in 2022, of a systems programming language designed in 1972.
jart · 3 years ago
I agree. That's why the Cosmopolitan Libc repository includes support for C++ as well as a JavaScript interpreter. It has Python 3. You can build Python 3 as a 5mb single-file Actually Portable Executable that includes all its standard libraries! Then you put your Python script inside the executable using a zip editing tool and it'll run on Mac, Windows, Linux, you name it. You can also build an Actually Portable Lua too. More are coming soon.
cbmuser · 3 years ago
Make is a very old codebase that you shouldn’t change in dramatic ways anyway. It’s in itself an outdated piece of software which has far better and more modern replacements.

No need to break it for older systems.

jjtheblunt · 3 years ago
> I feel that certain kinds of nostalgia like this are holding open source back

i'm misunderstanding what the post had to do with promoting open source


oppositelock · 3 years ago
Oh, it's a much longer story than that. I worked at SGI from just around its peak to its downfall, seeing the company shrink to a tenth of its size while cutting products.

At the time, I was a fairly junior employee doing research in AGD, the advanced graphics division. I saw funny things, which should have led me to resign, but I didn't know better at the time. Starting in the late 90's, SGI was feeling competitive pressure from 3DFx, NVIDIA, 3DLabs, Evans and Sutherland (dying, but big), and they hadn't released a new graphics architecture in years. They were selling Infinite Reality 2's (which were just a clock increase over IR1), and some tired Impact graphics on Octanes. The O2 was long in the tooth. Internally, engineering was working on next gen graphics for both, and they were both dying of creeping featureitis. Nothing ever made a deadline, they kept slipping by months. The high end graphics pipes to replace infinite reality never shipped due to this, and the "VPro" graphics for Octane were fatally broken on a fundamental level, where fixing it would mean going back to the algorithmic drawing board, not just some Verilog tweak, basically, taping out a new chip. Why was it so broken? Because some engineers decided to implement a cool theory and were allowed to do it (no clipping, recursive rasterization, hilbert space memory organization).

At the same time, NVIDIA was shipping the GeForce, 3DFx was dying, and these consumer cards processed many times more triangles than SGI's flagship Infinite Reality 2, which was the size of a refrigerator and pulled kilowatts. SGI kept saying that anti-aliasing is the killer feature of SGI and that this is why we continue to sell into visual simulation and oil and gas sector. The line rendering quality on SGI hardware was far better as well. However, given SGI wasn't able to ship a new graphics system in perhaps 6 years at that point, and NVIDIA was launching a new architecture every two years, the reason to use SGI at big money customers quickly disappeared.

As for Rick Belluzzo, man, he was a buffoon. My first week at SGI was the week he became CEO, and in my very first all-hands ever, someone asked something along the lines of, "We are hemorrhaging a lot of money, what are you going to do about it?" He replied with, "Yeah, we are, but HP, E&S, etc., are hemorrhaging a lot more and they have less in the bank, so we'll pick up their business". I should have quit my first week.

panick21_ · 3 years ago
Trying to be a seller of very high-end computer products while also doing your own chips and graphics at the same time is quite the lift. And at the same time their market was massively attacked from the low end.

The era where companies could do all that and do it successfully kind of ended in the late 90s. IBM survived, but nothing can kill them; I assume they suffered too.

What do you think, going back to your first day, could have been done if you were CEO?

I always thought that for Sun, open-sourcing Solaris, embracing x86, becoming Red Hat, and eventually the cloud could have been the winning combination.

mrpippy · 3 years ago
Do you know anything about the rumor that an O2 successor was prototyped that used NVIDIA graphics? (I think I read that on Nekochan long ago).

The slow pace and poor execution of CPU and graphics architectures after ~1997 is crazy to think about. The R10000 kept getting warmed over, same for IR, and VPro, and the O2.

The Onyx4 just being an Origin 350 with ATI FireGL graphics (and running XFree86 on IRIX) was the final sign that they were just milking existing customers rather than delivering anything innovative.

rasz · 3 years ago
> NVIDIA was launching a new architecture every two years

At the peak of the NVIDIA/3dfx war, new chips were coming out every 6-9 months.

Riva 128 (April 1997) to TNT (June 15, 1998) took 14 months, TNT2 (March 15, 1999) 8 months, GF256 (October 11, 1999) 7 months, GF2 (April 26, 2000) 6 months, | 3dfx dies here |, GF3 (February 27, 2001) 9 months, GF4 (February 6, 2002) 12 months, FX (March 2003) 13 months, etc ...

anamax · 3 years ago
In many cases, an executive's behavior makes sense after you figure out what job he wants next.
unixhero · 3 years ago
Thank you so much for your inside story. Hilbert space memory organization sounds great :)
buescher · 3 years ago
I have no clue what hilbert space memory organization could possibly be - arbitrarily deep hardware support for indirect addressing? - but it sounds simultaneously very cool and like an absolutely terrible idea.
jart · 3 years ago
That reads like a tabloid, the way it attacks individuals and t-shirts. I heard the fall of SGI summed up in one sentence once. It went something like, "SGI had a culture that prevented them from creating a computer that cost less than $50,000." That's probably all we need to know.
digisign · 3 years ago
--> The Innovator's Dilemma
the_only_law · 3 years ago
I always find the story of DEC interesting as well.
rbanffy · 3 years ago
It was the pinnacle of tech tragedy to see them being acquired by Compaq.

At least until Oracle, of all companies, acquired Sun...

ulzeraj · 3 years ago
Also Itanium.
panick21_ · 3 years ago
Podcast seems to be gone.
rbranson · 3 years ago
I love all the nostalgia, but the post doesn't really answer the most interesting part of the title: why do workstations matter? I was really hoping there was some revelation in there!
corysama · 3 years ago
Alan Kay attributes a big part of the advances at PARC to the custom workstations they built for themselves. They cost $20k(?) but ran much faster than off-the-shelf high-end machines at a time when Moore’s Law was accelerating CPU speed dramatically. He says it let them work on machines from the future, so they had plenty of time to make currently-impossible software targeting where the common machines would be when they finished it.
justinlloyd · 3 years ago
I am currently working with a hardware start-up, that happens to have "some monies" in the bank to deliver what we need. And if I was asked to describe how the culture inside the company feels, I would say "like the early days of NeXT." There's money here to do what we want, there's technically smart guys in the room, nothing is off the table in terms of what we're willing to try, we have a vision of what we want to build, nobody is being an architecture astronaut, all of us have shipped product before and know what it takes.

Where I am going with all this is that what we're trying to build, the consumer grade hardware to run it won't exist for two more years so we're having to use really beefy workstations in our day-to-day work. Not quite PARC level of built-from-scratch customization, but not exactly cheap consumer grade desktops either.

buescher · 3 years ago
It also helps if you are Alan Kay or the other talents that were at PARC back then. What future would you create if you had a custom $100K (2022 dollars) workstation?
blihp · 3 years ago
They allow you to spend much less time thinking about resource constraints and/or performance optimization and just focus on what you're trying to get done and/or do more than would be possible with conventional systems. Workstations let you buy your way past many limitations.

The closest example today would be people like developers, AI researchers, 3D designers and video editors buying high-end video cards (quite possibly multiple) running in Threadripper systems. They're paying up for GPU power and huge amounts of cores/RAM/IO bandwidth/whatever to either do something that isn't feasible on a lower end system or to complete their work much more quickly.

Wistar · 3 years ago
This is correct. I do video and 3D with a Threadripper 3990X with 128GB RAM and a 3090 because I don't want to even think about computational constraints. It is overkill for 95% of my work, but for that other 5% where I am rendering something arduous, it pays off.
EricE · 3 years ago
I think an analogy to supercars is pretty relevant. They are a minuscule percentage of cars developed/sold but have a disproportionate influence on the car market overall.

I'm sure there are analogies for a lot of other industries as well.

Also - there is no cloud, just someone else's computer. Which is why I will never rely on something like a Chromebook, the web or other modern day equivalents of dumb terminals :)


zozbot234 · 3 years ago
Let's be clear, those workstations were hella expensive. (The Amiga was not in the true workstation range, but rather more of a glorified home computer. Their workstation equivalents would probably be the stuff from NeXT.) Their closest modern equivalent would probably be midrange systems like whatever Oxide Computer is working on these days. A workstation was simply a "midrange" level system that happened to be equipped for use by a single person, as opposed to a shared server resource. The descendant of the old minicomputer, in many ways.
sleepybrett · 3 years ago
I'd say when you get into a fully kitted 2k video toaster you get into 'workstation' territory for my potentially personal definition of 'workstation'. For me a 'workstation' is a machine built and optimized for a task that primarily runs that task and that task only. It is sometimes the 'core hardware' that is interesting, but often many of the peripherals are more interesting. Things I consider workstations include Avid and other video editing systems, machines built for cad, and yes many of the 'desktop' sgi machines which generally did nothing but run software like softimage all day every day.

The 'workstation' largely died because general off-the-shelf machines became fast enough to perform those tasks almost as well. You now see a more open market for the peripherals that help 'specialize' a general-purpose computer: Wacom tablets, video capture devices, customized video editing controllers, MIDI controllers, GPUs, etc.

jmwilson · 3 years ago
Yep, the closest I ever got to a SGI was drooling over their product brochures as a kid. The cost of a modest Indy was about the same as a mid-range car. It's hard to grasp as a modern PC user that these workstations could handle classes of problems that contemporary PCs could not, no matter what upgrades you did. Today, it would be like comparing a PC to a TPU-based (or similar ASIC) platform for computing.

From what I've read, Oxide is making racks of servers and has no interest in workstations that an individual would use.

sleepybrett · 3 years ago
When a game company I worked at went out of business and couldn't unload their aging Indigo Elans and Indys, I picked up one of each for about a hundred bucks. I now have some regrets simply because their monitors have strange connectors, so I keep them around, and they are heavy and annoying to store. That said, I could probably pay off my initial purchase and then some by unloading one of their 'granite' keyboards (Alps boards, collectors love them).
LinuxBender · 3 years ago
Anecdotally, a friend had a computer store that sold Amigas and had his entire inventory bought out by the CIA, who never paid him, so they must have been useful for something. This was in the late 90's. No idea what they were using them for. I used one to help a friend run a BBS. I could play games with incredible graphics whilst the BBS was running in the background.
vidarh · 3 years ago
If it was late 90's, as much as I love Amiga, it would have been for niche stuff like replacing a bunch of information screens or something like that where they could have replaced it with PCs but would then need to change their software setup. In terms of "power" the Amiga was over by the early 90's, even if you stuffed it full of expensive third party expansions. It still felt like an amazing system for a few years, but by the late 90's you needed to seriously love the system and AmigaOS to hold onto it, and even for many of us who did (and do) love it, it became hard to justify.
notreallyserio · 3 years ago
No kidding:

https://daringfireball.net/linked/2019/12/17/sgi-workstation...

> The Octane line’s entry-level product, which comes with a 225-MHz R10000 MIPS processor, 128MB of memory, a 4GB hard drive, and a 20-inch monitor, will fall to $17,995 from $19,995.

Really makes the M1 Ultra look affordable.

usefulcat · 3 years ago
The Indy, which predated the Octane, started much lower ($5k according to Wikipedia, presumably in mid-nineties dollars), but yeah your point very much stands.
api · 3 years ago
> Really makes the M1 Ultra look affordable.

The amount of power you can buy today for under $1000 let alone under $10000 is insane compared to back then. The M1 Ultra is not that expensive compared to mid-range workstations or even high-end PCs of previous eras.

guyzero · 3 years ago
That's just over $31,000 in 2022 dollars. I don't think I can even imagine what kind of modern desktop you could build for that much money.
blihp · 3 years ago
A loaded up Amiga (i.e. add a CPU accelerator board, more RAM than most PCs could handle, specialized video processing cards etc) could get into the low end of workstation territory. But you are right that architecturally, they had more in common with high end PCs than workstations of their day. The Amiga's main claim to fame from a hardware standpoint was their specialized chipset.
Sohcahtoa82 · 3 years ago
My late dad was a huge Amiga fan back in the day. I was just a little kid at the time and didn't see what the big deal was.

Looking back at what it was capable of though...they were doing 256 colors and sampled audio at a time when x86 was still pushing 16 colors and could only produce generated tones through the speaker built into the case.

There was some really good music on the Amiga, too. Some of my favorites:

Hybris theme: https://youtu.be/Siwd7b0iXOc

Pioneer Plague theme: https://youtu.be/JSLcN6GBzO0?t=17

Treasure Trap theme: https://youtu.be/n5h_Wu7QRpM

And of course, you can't mention Amiga music without also mentioning Space Debris: https://youtu.be/thnXzUFJnfQ

reaperducer · 3 years ago
> they were doing 256 colors and sampled audio at a time when x86 was still pushing 16 colors

4,096 colors.

https://en.wikipedia.org/wiki/Hold-And-Modify

Sohcahtoa82 · 3 years ago
I knew it could do 4,096 colors and I played around with it in Deluxe Paint, but I don't recall any games that used it. Being a kid at the time, the games were all I cared about.
btbuildem · 3 years ago
Yeah, Amiga was light years ahead of PC and others in the music scene. It was truly incredible at the time. ProTracker blew my mind -- you could just.. make music, by hand. So many games had rich soundtracks. Not to mention the demo scene, arguably an art form, blending audio and visual effects in a constant race of one-upmanship while squeezing every last bit of performance from these ancient chipsets. Or the whole genre of cracktros (short intros put in bootloaders before games by the crackers / pirates distributing them). I still have this as a ring tone lol: https://www.youtube.com/watch?v=JAmTHCgvtnQ

Edit: I'm just going to throw this in here for good measure: the Lotus Turbo Challenge II soundtrack https://www.youtube.com/watch?v=vETonlaTZ4c

I mean, somebody made an instrumental cover of it years and years later, that's how it stuck in some people's minds! https://www.youtube.com/watch?v=b6Ijk17osUc

vintermann · 3 years ago
The Hybris music is kind of peculiar for the Amiga, since it uses a ton of fast arpeggiated chords, like you had to do on the 8 bit computers to play chords at all (but on the Amiga, you could just sample the chord and still have three hardware channels left for the rest of the music). It was essentially retro even when it was made!

Hybris was a gorgeous game too. It had top-notch arcade-quality graphics back when it came out in 1987.

bitbckt · 3 years ago
I still boot up my maximum-spec Octane2 (2x600MHz R14k, 8GB RAM, VPro V12, PCI shoebox) every so often to bask in the good ol’ days.

After nekochan went offline, there isn’t really a central gathering place for SGI fans anymore, but we are out there.

im_down_w_otp · 3 years ago
That's a beast of a machine!

In my home office I have a little mini museum that consists of display of esoteric 90's workstations:

* Apple Quadra 610 running A/UX (Apple's first UNIX)

* NeXT NeXTstation Turbo

* SGI Indigo2 IMPACT10000

* Sun Ultra 2 Elite3D

* UMAX J700 (dual 604e) running BeOS

* HP Visualize C240

All working and all fun to fire up and play around with from time to time. Tracking down software to play with is a challenge at times, since most of what I want to fiddle around with is proprietary and long since abandoned (Maya, CATIA, NX, etc.). If by some chance we were to end up on a conference call, you'd see them displayed in the background. :-)

em-bee · 3 years ago
neat, here is my museum:

* Apollo Domain 4500

* HP 9000

* m68k/25 NeXTstation

* NeXTstation Turbo

* NeXT Cube with NeXTDimension card

* SPARCstation 5

* SGI OEM machine from Control-Data with mips R2000A/R3000

* SGI Indy

* IBM RS6000/320H

* IBM RS6000/250

* Cobalt Qube 2700D

* Sun JavaStation1

* Sun Ray1

* SPARC IPC

* Alpha-Entry-Workstation 533

they are in storage at my grandmother's now, and i don't know if any of them still run. some of these i was using actively as my workstation at home. some were just to explore. as i got more and more into free software, dealing with the nonfree stuff on those machines got less and less appealing. though i was also running linux on the machines that were supported.

classichasclass · 3 years ago
My 900MHz V12 DCD Fuel says hi. I miss Nekochan.
bitbckt · 3 years ago
Forgot to add the DCD. :)
jacquesm · 3 years ago
That's like driving a classic in modern day traffic, it's a bit slower but it does the job with elegance. Nice rig!
justinlloyd · 3 years ago
Sounds like it will run circles around my Indigo2 R10K in the workshop. What do you do with all that power?
sleepybrett · 3 years ago
Sometimes I miss 3dwm, though Apple stole a lot of its best ideas and put them into the original OS X.
jacquesm · 3 years ago
There were some attempts at getting 3dwm to be ported to Linux, but I'm not sure what came of them.
pjmlp · 3 years ago
It was thanks to SGI's hosting of the C++ STL documentation (pre-ISO/ANSI version) that I learned my way around it.

Being a graphics geek, I also spent quite some time around the graphics documentation.

For me, one of the biggest mistakes was only making IrisGL available while keeping Inventor for themselves.

To the subject at hand, this is one difference I find with most modern computers: the lack of soul that comes from a vertically integrated experience blending hardware and software.

toddm · 3 years ago
I have fond memories of the SGI machines - workstation and larger - I worked on in the 1990s and early 2000s. Octanes, O2s, Origins, Indigos, and so on.

They were best-in-class for visualization, and when used with Stereographic Crystal Eyes hardware/glasses, 3D was awesome. We also rendered high-quality POV-Ray animations on an O2 in 1996, when the software was barely 5 years old!

My last big computing efforts were on a SGI Origin 2000 (R12000) in 2002, and the allure of that machine was being able to get 32 GB of shared RAM all to myself.

reaperducer · 3 years ago
> 3D was awesome

Around 1999 or 2000, I was able to see how some of the big energy companies in Houston were using SGI machines' 3D capabilities.

They'd have rooms about 10 feet square with projectors hanging from the ceiling that would take seismic data and render it on the walls as colorful images of oil deposits and different strata of rocks and gas and water and such. Using a hand controller, the employees could "walk" through the earth to see where the deposits were and plot the best/most efficient route for the drilling pipes to follow.

Pretty much today's VR gaming headset world. Except, without a headset. And this was almost a quarter of a century ago.

I can't imagine what the energy companies are doing now, with their supercomputers and seemingly limitless budgets.

sleepybrett · 3 years ago
I saw a driving simulator built with an actual car and a couple of reality engines driving projectors that were projecting on screens all around the car. It was a pretty impressive setup.

Now you can probably build that out of Forza, a decent gaming PC, and some hobbyist electronics.