Just to add my experience to the pile: when I went to college I was able to convince my parents to get me a custom PC from a company called GamePC. Among the specs in 1998:
400Mhz Pentium 2
128MB
Nvidia Riva TNT
3DFX Voodoo2
CDRW (4x4x24 I think)
Syquest SparQ (Awesome, but had major issues)
Internal Zip Drive
Just a ridiculous system for the time. Quake 2 and Starsiege Tribes were really popular in our dorm and that system was just perfect for it. Also popular was burning lots of pirated games, so we'd order CDRs in bulk from this really random site overseas. High quality "gold" CDRs and they were far more reliable than any of the ones you'd find in stores in the US for about half the cost.
Halfway through my freshman year I decided to swap the motherboard and CPU for a crazy motherboard/CPU combo. There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366mhz Celerons overclocked to ~433mhz (sometimes up to 533mhz, but that was less stable) and started playing around with OSs like Linux and BeOS to actually take advantage of them.
edit: corrected the amount of memory
/end reminiscing about a simpler time
> Halfway through my freshman year I decided to swap the motherboard and CPU for a crazy motherboard/CPU combo. There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366mhz Celerons overclocked to ~433mhz (sometimes up to 533mhz, but that was less stable) and started playing around with OSs like Linux and BeOS to actually take advantage of them.
Half of HN alive at the time probably had that motherboard - ABIT BP6 was the reason I got my hands on a copy of W2K, and also started playing with Linux.
https://en.wikipedia.org/wiki/Universal_Abit
I'm still bummed that CPU manufacturers basically killed off the third party chipset industry. Maybe it was inevitable when memory controllers moved on-die, but I remember when there was actual competition in the chipset business.
It was a cool board. I didn't technically have one, but I built my dad a W2K server on a BP6. I always wanted to hack on it and overclock with it. But after I handed it over, I wasn't allowed to touch it, "you'll burn up my processors." Since he didn't care about overclocking he had dual P2-400s or maybe 450s. It was a beast. He could run SQL Server and compile Delphi apps so hard.
I got my kicks though with a BF6 and a 300A. Those were the times; at least until the AthlonXPs (AXIA -- anybody?) were released.
I think for a while Intel started labelling the socket 370 celeron boxes as “for single core systems only”, but it was a lie.
Had pretty much the same thing... but only one Celeron, overclocked to 433. It was an amazing upgrade from my Pentium 133 with a Matrox Millennium, which I somehow used to complete Half-Life in low-FPS agony.
Agree -- that dual Celeron setup (often with a Peltier cooler) was suuuper common. I knew so many people who rushed out to get them and run them at 500?
It was my second exposure to SMP though: the first was a dual-socket Pentium Pro 200MHz which ran NT 4.0 for the longest time (I still keep that hefty CPU around on my desk for laughs).
I'm slightly confused, how would games of that era benefit from a dual CPU setup?
Old games were decidedly single-threaded and built for a single-core world. It was only in the mid-to-late 2000s that games started to be more optimized for multi-core CPUs. And multi-CPU is even more difficult because there isn't any cache sharing.
I wonder if there is something like this today that you'd have in college? Some low-cost graphics card rigs? Or is it more like some Cloudflare-style setups today?
I am starting to feel old.
Cathode Ray Dude's channel has a great video about the history of this amazing motherboard: https://www.youtube.com/watch?v=UE-k4hYHIDE
It was just single-CPU, but I had the ABIT BH6 with a Celeron 300A, one of the most overclockable setups ever. Mine was stable at 450mhz without any special cooling.
Similar experience, I had a Cyrix PR200 which really underperformed the equivalent Intel CPU.
Convinced my parents to buy a new PC; they organized with a local computer store for me to go in and sit with the tech and actually build the PC. Almost identical specs in 1998: 400MHz Pentium 2, Voodoo 2, no Zip drive, but had a Sound Blaster Live! ($500 AUD for this at the time).
I distinctly remember the invoice being $5k AUD in 1998 dollars, which is $10k AUD in 2024 dollars. This was A LOT of money for my parents (~7% of their pretax annual income), and I'm eternally grateful.
I was in grade 8 at the time (middle school equivalent in the USA) and it was the PC I learnt to code on (QBasic -> C -> C++), spent many hours installing Linux and re-compiling kernel drivers (learning how to use the command line), used SoftICE to reverse engineer shareware and write keygens (learning x86 assembly), and created Counter-Strike wall hacks by writing MiniGL proxy DLLs (learning OpenGL).
So glad there weren't infinite pools of time-wasting (YouTube, TikTok, etc.) back then, and I was forced to occupy myself with productive learning.
/end reminiscing
(also, shazbot)
I could share almost exactly the same story. So grateful my parents could afford, and were willing to spend, the money on a nice PC that I entirely monopolised.
That and high speed internet. I played for a couple of years on 28.8K. The day I got a better graphics card was great. No more choppiness. The day I got cable internet was life changing in Tribes (and Tribes 2)!
I think I still have a pic somewhere of the infamous NoFix scolding "LPBs"
"Shazbot!" "The enemy is in our base!" "Woohoo!"
vgh! Except that texture transparency worked with glide (voodoo) cards and not with opengl or software rendering. So if you made a custom skin with transparency texture there was a brief time in the Tribes 1.1 to 1.2 era where you could be somewhat invisible to people with voodoo cards (if your skin was in a skin pack that everyone had).
Ah, a fellow Tribes player. Just so you know, we still play Tribes. Join us! http://playt1.com/ - the community maintains the master server and clients these days. There are good pick-up games on Fridays and weekends.
I love this game. It's also amazing to me how the concept of "skiing" was foreign to me when I first played T1 and T2, and now it's a core game mechanic.
> There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366mhz Celerons overclocked to ~433mhz
Was that the BP6 motherboard from Abit?
I had that board, those processors and used to overclock them too.
Also ran Linux and BeOS on it (though IIRC you had to patch BeOS for SMP support).
Quake 3 ran so smoothly on that machine, even without Q3's experimental SMP support enabled.
That was actually my all time favourite computer, even to this day.
I also had a TNT2 in an earlier machine, but the BP6 machine had a GeForce 3.
I think any BX chipset motherboard would do it… but you may have to resort to messing with covering/hotwiring CPU pins.
Dual 300As overclocked to just over 500Mhz each on a BP6 with Geforce 256 here too! Fastest, smoothest machine I ever used until the M1 MacBook. Quake 3 multiplayer demo ran so fast it gave me motion sickness ^^ Years later I "upgraded" to a 1Ghz Athlon and it felt like a downgrade a lot of the time.
> though IIRC you had to patch BeOS for SMP support
The board might have required a driver or patch, but SMP was BeOS's entire reason for being! The drawing of each window on the screen ran in a separate thread. It was their main selling point.
Oh man the Celeron A, which was basically a Pentium II with on-die L2 cache. Intel attempted to handicap it by limiting its FSB to 66 MHz, but any half-decent motherboard would allow you to bump that up to 100 MHz so long as you had the rest of the hardware to support it (i.e., PC-100 memory). This resulted in a pretty significant bump in CPU frequency.
Overclocking Celerons - those were the days. Intel binning down a bunch of processors capable of reaching higher clock rates but selling them as a lower-end part was a boon for college students everywhere.
300MHz PII - it came in a black cartridge thing.
NVidia RIVA TNT which used the AGP bus on the Intel 440LX mobo.
A whopping 128MB of RAM and an 8GB HDD.
I recall using a program called WinSplit to split the Nvidia driver over several floppy disks on my boss's Win3.1 machine in the office. I didn't have internet at home and really wanted to play Jedi Knight and Battlezone.
I recall the legendary Celeron being the 300A. It was 300MHz, but was easily overclocked to 450MHz. There were higher clocked versions, but regardless of which CPU you got, they ultimately were only able to overclock to about the same frequencies.
Also, the celerons of that generation did not have unlocked multipliers. The only way to overclock them was to overclock the front side bus, which also controlled memory bandwidth. The "standard" FSB speed was 66MHz. By overclocking a 300MHz CPU to 450MHz, you got a 100MHz memory speed. By overclocking a 366MHz CPU to 466MHz, you "only" got 78MHz of memory bandwidth.
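If anyone wants the multiplier arithmetic spelled out, here's a quick back-of-the-envelope sketch (the 4.5x and 5.5x locked multipliers are the real values for the 300A and 366; which FSB settings you could actually pick varied by board, so treat the in-between numbers as illustrative):

    # Core clock = locked multiplier x FSB; memory ran at the FSB speed.
    for name, mult in [("Celeron 300A", 4.5), ("Celeron 366", 5.5)]:
        for fsb in (66, 75, 83, 100):
            print(f"{name}: {fsb} MHz FSB x {mult} = {fsb * mult:.0f} MHz core")

The 300A at 100 MHz FSB lands exactly on the famous 450 MHz, with memory running at 100 MHz as a bonus; the higher-multiplier parts needed an odd FSB to hit similar core clocks, which is why the memory speed lagged.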
My friend in college had one. Windows 98 didn't support SMP, so he had to run Windows 2000, which was based on Windows NT, and would be the basis for XP. Compatibility with games was sometimes...interesting. Windows ME came out about that time, but was absolute garbage. All of us either stuck with 98SE or experimented with 2k. None of us actually bought it of course...
Fun times.
So the story originally started with the cacheless 266 MHz Celeron. CPUs were delivered as AICs (add-in cards) at the time, with separate cache chips, so to deliver a budget processor, they shipped the same silicon, but without the cache chips added. Removing the cache drastically tanked the performance, especially on integer workloads (typically productivity software), but didn't really affect floating point workloads. However, it had the side benefit of removing the part of the AIC that was most sensitive to overclocking (the cache). It used a 66MHz clock with a fixed 4x multiplier, and upping the clock to 100MHz got the Celeron running at 400MHz, which had performance roughly equivalent to a 266 MHz Pentium II with cache for integer workloads, but for games, it was almost as fast as the fastest Pentium II of the time (which topped out at 450MHz).
In order to stop the overclocking, Intel decided to add some cache back to the CPU, but to save money, rather than using cache chips, they stuck a relatively tiny amount of cache directly on the CPU die, and released the now infamous Celeron 300A.
Because the cache was on-die, it could overclock just as well as the previous Celeron, but this time the 300A was faster than the equivalent Pentium II because the on-die cache ran at twice the clock speed of the external caches.
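A rough comparison of what that on-die cache bought you (the Pentium II's 512 KB off-die L2 ran at half the core clock, the Celeron A's 128 KB on-die L2 at full core clock):

    # Rough L2 cache clock comparison: half-speed off-die vs full-speed on-die.
    parts = {
        "Pentium II 450 (512 KB off-die L2)": (450, 0.5),
        "Celeron 300A @ 450 (128 KB on-die L2)": (450, 1.0),
    }
    for name, (core_mhz, l2_ratio) in parts.items():
        print(f"{name}: core {core_mhz} MHz, L2 at {core_mhz * l2_ratio:.0f} MHz")

Smaller cache, but running twice as fast, which is why the overclocked 300A punched so far above its price.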
> By overclocking a 366MHz CPU to 466MHz, you "only" got 78MHz of memory bandwidth.
I think the PCI bus probably also typically ran at some fraction of the front-side bus. The common FSB frequencies around those times were 66 or 100 MHz which gave a standard ~33 MHz PCI bus frequency with a multiplier of 1/2 or 1/3. FSB frequencies that weren't close to a multiple of 33 MHz might have caused trouble with some PCI cards. Might have depended on how the motherboard or chipset handled the bus frequencies, too.
Of course the PCI bus should probably always run at 33 MHz but I think I saw it being modified with the FSB speed at least on some motherboards.
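A quick sketch of the divider issue (1/2 at 66 MHz and 1/3 at 100 MHz were the usual BX-era choices; the in-between FSB settings dragged the nominally 33 MHz PCI bus out of spec, and the exact dividers offered depended on the board):

    # PCI clock = FSB / divider; the spec target is ~33.3 MHz.
    for fsb, divider in [(66, 2), (75, 2), (83, 2), (100, 3), (112, 3)]:
        pci = fsb / divider
        status = "in spec" if abs(pci - 33.3) < 2 else "overclocked PCI"
        print(f"FSB {fsb} MHz / {divider} -> PCI {pci:.1f} MHz ({status})")

That 41.5 MHz PCI at an 83 MHz FSB is exactly the kind of thing that made some sound cards and IDE controllers flaky.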
It was crazy how fast things moved back then. A friend of mine had a 233MHz P2 with 32GB and a 2D card, and within two years it was a dinosaur, being shown up by machines like yours, 400-450MHz, 3D cards, way more memory....
Even HDDs were smaller back then.
Big difference.
I think the other commenter is right...you're thinking of DVD-R vs DVD+R, possibly even DVD-RW and DVD+RW.
Based on the specs listed, OP was in college just before me or may have overlapped. The big gold CD-R stacks (you could buy them in jewel cases, on spindles, or just gross stacks, which were nice and cheap) were a huge thing with my group, who encoded to FLAC & MP3 -V0 and burned audio CDs relentlessly. We felt we were archiving our liberal arts college music library and radio library for the future! Who knows. Some of that "future" is still backed up and on hard disks, and I should migrate them to SSD or tape just on principle.
At that point CD-Rs were cheaper than CD-RWs, and because most archiving/distributing didn't require rewriting (not return-on-investment wise anyway), we just shared programs on CD-R as well. In some ways it was a beautiful technology! Particularly its fidelity to a spec that everyone tried to bend and break for a profit angle when, honestly, there was no point for many of us using CD-R.
It was truly jaw dropping firing up quake 1 for the first time on 3dfx voodoo1. Double the resolution of software and super smooth framerate, and much better quality texture mapping too. I recall tweaking some setting (gl_flashblend?) so that I could see pretty glowing orbs around my rockets (and strategically, also everybody else's rockets).
It's hard to convey just how revolutionary the original voodoo cards were. There aren't many times in my life where there was a clear line of before and after, but this was one of those times.
They also had the most recognizable unified box art style from all HW makers[1]. When you saw those eyes staring into your soul off the shelves, you knew it was a 3dfx GPU. They also had the best ads. [2] HW vendors today don't have the balls anymore to use designs like that, it's all flat sterile corporate nonsense.
[1] https://www.ixbt.com/img/r30/00/02/08/90/boxes.jpg
[2] https://www.reddit.com/r/pcmasterrace/comments/41r1wj/3dfx_w...
Still blows my mind that it was just a flash in the pan. At the time it felt that 3dfx was certainly going to be a dominant force in computing for years. And although they lingered a bit, the whole event was barely over 2 years.
I was the unfortunate owner of an S3 ViRGE card at the time - the (in)famous "3D decelerator". I managed to get Quake running on it, and it looked nice, but was slower than with software rendering...
I had an S3 ViRGE too. It really was a decelerator, and the number of games that actually supported it was minuscule. I managed to run GLQuake, but without any textures - just shades of gray - and even that ran at most a couple of frames per second.
But there was another game - Terminal Velocity - that actually looked a lot better with hardware rendering, although it was still slower than software rendering. So, I would run it with hardware rendering to enjoy flying and then restart the game with software rendering to actually fight the enemies. :)
When I got a real job next summer, bought an AGP Matrox Millenium G200 and after that the NVIDIA GeForce2, and never strayed from NVDA since!
Same here. I can still vividly remember the experience of loading in with a Voodoo2 for the first time. It was night and day -- mind completely blown. The late '90s really established a new version of the gamer; consoles were fun, but computer gaming was a different world. It made me a junkie for reading about hardware, overclocking and gaming.
Replaying Heretic 2 back in 1998 with my first Voodoo (a Banshee) was a borderline otherworldly experience, compared to my first playthrough of the game using software rendering. Nothing has blown my mind the same way since.
I have had three experiences like this in my life:
1. PC Speaker -> Sound Blaster: Most games that I had were instantly upgraded
2. Doom: my first "real" fluid 3D experience, with stairs, etc, compared to maze-like maps in Wolfenstein
3. Software Rendering -> 3dfx (Canopus Pure3D): Transparent water surfaces in Quake (if you re-vis'd the maps), smooth and fluid visuals, it was amazing.
The closest thing to this, in modern gaming, has been the boss fights in Elden Ring: https://i.imgur.com/gzLvsLw.mp4 -- visually, they are quite epic.
1. Sprite-based -> 3D sandbox world (in my case: Stunts, F29 Retaliator, Gunship 2000, Wolfenstein 3D)
2. Hardware 3D rendering (I had the NVidia RIVA 128ZX)
3. Fast-paced real-time multiplayer (Delta Force: Black Hawk Down)
The 4th might be the usage of LLMs or similar technology for (mostly-)unattended content generation: NPC dialogue etc.
Debatable. I always preferred the crisp look of the software renderer to the washed-out GLQuake. Same with Quake 2. I think it's because textures back then were too low resolution, so filtering just makes them look muddy.
It’s also because the VGA signal quality from the 3dfx Voodoo wasn’t very good.
It didn’t have a traditional 2D graphics core at all, so it needed another graphics card for rendering the desktop (any non-accelerated apps really), and this was connected to the Voodoo using VGA passthrough. There was a noticeable image quality loss from this arrangement.
A Matrox card would give you crisp VGA with nice saturation, but the 3D acceleration was nearly worthless. Choices…
I agree that the washed-out textures haven’t aged well.
But at the time, not having pixelated textures was the first thing people looked at when judging graphics quality. I remember that being a HUGE selling point of the N64 and something that made the console look almost a generation ahead of the PlayStation and Sega Saturn to kids back then.
Today, I think I prefer the PSX look, though. Maybe with perspective-correct texture mapping to avoid the warped textures of the PlayStation.
Overall it looked better, but a lot of Quake 2 players weren't aware of a lot of the small details that were put into the textures.
Even today I think a lot of Doom clones look better (or more nostalgic) with software rendering and texture mapping rather than OpenGL. There's an intensity of saturation to the colors that's lost. Fireblu is never quite so eye burning as when it's in software.
I came here to comment similarly: the lower-resolution, pixelated, software-rendered Quake seems to work well with the textures. They have a bumpmappy, fuzzy feel that gets lost in the sharp-cornered, super-flat, texture-mapped-and-filtered version that one got from the 3D accelerators of the time. I guess my brain just adds context to the low-res images.
Before Unreal, I had an S3 ViRGE for 2D and a PowerVR 3D accelerator pair, and I was always flipping between software, ViRGE and PowerVR depending on the game. Which at the time were largely Hexen/Heretic. The PowerVR was higher resolution and clean/sterile but never seemed like a much better experience.
But then there was Unreal, the first game that I think was absolutely better on an accelerator (a Voodoo2 in my case). It's also pretty much the last of the serious software renderers, and outside of the Voodoos it definitely did a better job with software lighting/texture mapping/etc than any of the previous (affordable) accelerators. Which is why I ended up finally replacing the PowerVR with the Voodoo2. The results were 'unreal'. Some of that might just be bias; I played insane amounts of Doom/etc but never really got into Quake. Quake just seemed like Doom rehashed to me, so I was off playing Warcraft/Diablo/Hexen/etc.
And frankly, outside of FEAR, I stopped playing first-person shooter games for 20 years; the graphics improvements were so incremental, I just kept seeing flat low-polygon models everywhere. And I don't think that looks good. Even after all the tessellation/bump mapping/endless tricks I kept seeing frames where I could play "count how many polygons are onscreen right now" games. It's gotten better the past few years, particularly some of the lighting; at least the screenshots/cut scenes are no longer obviously not in-game rendering. The UE5 demo is slowly becoming reality in actual games, so maybe it's time to revisit a few of them.
Always thought the original software renderer looked much better. It didn’t have the bilinear filtering, so the textures didn’t look all smooth and ‘washed out’, which suited the environment more imho
I can't speak for the original GLQuake on 3dfx hardware, but on OpenGL-compatible Quake engines (which include modern Quake source ports such as Quakespasm, Ironwail, and vkQuake), bilinear texture filtering is an option that can be turned off.
I play on vkQuake with nearest-neighbor texture filtering, square particles, and the "classic" water warping effect and lighting effects, alongside 8x MSAA, 16x anisotropic filtering, high-resolution widescreen, etc. This keeps the art style consistent with the look of the original Quake, while still allowing for the benefits of hardware 3D acceleration.
The software renderer has this gritty feel that is integral to the art I feel.
That said, the 3dfx was impressive at the time, and I was very jealous of my buddy who got one.
In terms of pixels, it was 4x the resolution. And for fun, one of the window textures (visible in the difficulty choice lobby IIRC) was implemented as a mirror in glquake - IIRC John Carmack said it was so easy to do in OpenGL he did it as a kind of test.
I dreamt about having the Voodoo but I could not afford it. Went with a Rendition Verite-based one instead. It was underpowered compared to the Voodoo, but I really consider it the first real GPU as it was a RISC processor.
For me it was Carmageddon. I bought it later on an iPad, and it may have just been rose-tinted glasses from being completely blown away back in the day, but the iPad version never seems quite as crisp...
I had a similar experience seeing Quake2 running with the Glide renderer (on a Voodoo2) for the first time. It was amazing.
I can very clearly remember installing the card and then loading up 2Fort4 in the Team Fortress mod and suddenly being able to see THROUGH THE WATER.
Sniper's paradise!
If I remember correctly, to get transparent water the level also had to be reprocessed through the "vis" program with a transparent water flag set.
Vis did a precalculation for where a level segment (the partition in binary space partitioning) could be seen from any other level segment. The end effect was that while GLQuake did have an option for transparent water, the geometry would not draw on the far side, making the effect useless without a bit of extra work. But I have to admit I have no idea if entities (other players) would draw or not.
update: found this https://quakeone.com/forum/quake-help/general-help/4754-visp...
Apparently there is a no_vis option to run without the visible set optimizations.
Adding: the server and client had to both be running vis patched maps to be able to see other players in the water due to the way entity visibility was calculated server-side.
The downside to running vis patched maps on a server is it used slightly more CPU than unpatched maps IIRC. Perhaps someone that ran more servers than I did (I ran two nodes on an Intergraph InterServe with dual P6-200s) could weigh in on what the impact was at scale.
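For anyone who never dug into the engine, a toy sketch of the idea being described (not id's actual code; the leaf names and the two map builds are made up for illustration):

    # Toy potentially-visible-set (PVS) lookup. vis precomputes, per BSP leaf,
    # which other leaves could ever be seen from it; the renderer skips the rest.
    pvs_opaque_water = {      # original map: vis treated the water surface as solid
        "above_water": {"above_water"},
        "under_water": {"under_water"},
    }
    pvs_transparent_water = { # same map re-run through vis with transparent water
        "above_water": {"above_water", "under_water"},
        "under_water": {"above_water", "under_water"},
    }

    def visible_leaves(pvs, camera_leaf):
        # Geometry in leaves outside the camera leaf's PVS is never submitted at all,
        # which is why translucent water shows nothing behind it on an un-re-vis'd map.
        return pvs[camera_leaf]

    print(visible_leaves(pvs_opaque_water, "above_water"))       # only the near side draws
    print(visible_leaves(pvs_transparent_water, "above_water"))  # far side draws too

It also hints at the server CPU cost mentioned above: a transparent-water PVS marks more leaves as mutually visible, so there's simply more to consider per frame and per entity check.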
There was also a perverse effect on some games. With a graphics card, your gameplay could be altered and you had to unlearn all the reflexes you built on CPU rendering alone. Moto Racer (1997) was like that. The gameplay with a graphics card was completely different, even trajectories (I assume lag made the cpu accept a little bit more rounding errors).
> I can very clearly remember installing the card and then loading up 2Fort4 in the Team Fortress mod and suddenly being able to see THROUGH THE WATER.
Searching for "2Fort4" on YouTube yielded some interesting videos for people curious what the original Quake mod version of the map looked like:
https://www.youtube.com/watch?v=bJh36LuKwVQ&pp=ygUGMkZvcnQ0
As someone who still spends at least 3 hours a week playing 2Fort on the current Team Fortress 2, it's fascinating to see how the core of the map is still basically the same after 20 years.
EDIT: Another video which goes into more detail about the history of each 2fort version, going back to its original inspiration from a Doom 2 level:
https://www.youtube.com/watch?v=Tid9QwAOlng&t=375s
The video also misses that there was a pretty popular 2fort for Half-Life 1.
Even having a solid dial-up connection with a ~180-185ms ping was a ridiculous advantage when most HPBs were ~350ms, particularly in clan invitationals for Q1CTF. We were playing as LPBs in the dorm at ~45-60ms and 180ms wasn't that much of a concern, aside from sliding around corners more than expected, but at 350ms you were basically shooting predictively at where you assumed they'd be next, not where they 'were'.
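Rough numbers on how much you had to lead by, assuming the classic 320 units/sec Quake run speed and that without prediction you effectively pay the whole round trip between seeing someone and your shot registering (a back-of-the-envelope sketch, not the actual netcode):

    # Lead distance ~= target speed * round-trip latency.
    run_speed_ups = 320  # classic Quake sv_maxspeed, in map units per second
    for ping_ms in (45, 180, 350):
        lead_units = run_speed_ups * ping_ms / 1000.0
        print(f"{ping_ms} ms ping -> lead a strafing target by ~{lead_units:.0f} units")

At 350 ms that's on the order of a hundred map units of guesswork per shot, which is why HPBs had to aim at where people were going to be, not where they appeared.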
On a very different scale, but I recall playing bzflag decades ago and discovering that I simply could not jump my tank to the second level. My graphics card was so slow that something wasn't working correctly, and no matter how many times I tried from different distances I would almost make it, but not quite.
I used to play games like Starsiege (the mech game) on dialup. With our 250ms pings, your brain just learned to compensate for the delay and click that much earlier to shoot.
But yeah, those lucky people with their DSL modems and 60ms pings would wipe the floor with us.
Nowadays, everyone has a < 10ms ping.
When qtest was released, I was there in the IRC channel and one of the first to play.
I remember connecting to someone's machine and just destroying everyone. Afterward I got a message from someone congratulating me, but being incredulous about my ping time being 26ms.
I happened to be on the first floor in the first dorm on campus to get wired internet access, and they had an OC3 dedicated to it. Two months earlier there were 16 of us splitting that line, with public IP addresses. (Going back to dial-up after leaving the dorm was.. difficult).
So I told him, yeah I kinda have a ridiculous internet connection here at school. He was like, "no, you don't understand - it is my machine. I am playing on the server and your ping time is lower than mine!"
Crazy illustration of "nothing happens anymore." 3dfx seemed just as dominant in the 1990s as NVIDIA does today. But from founding to selling to asset sell-off, the company lasted just six years. Meanwhile NVIDIA has been king of the hill since the GeForce was released in 1999, which was 25 years ago.
AMD overtook Nvidia at times in the gaming space. I'd say that Nvidia has been king of the hill since the introduction of CUDA, since that's what really cemented their position in the tech sector.
Pre-AMD acquisition ATI also often had better hardware specs than NVIDIA, but their drivers were so often buggy and terrible. By the time they'd been fixed the reviews were long since done on the initial release versions.
AMD seems to run a better software shop, at least.
The 90s were an absolutely crazy period for PC hardware. So many flash-in-the-pan companies making novel devices and then dying entirely as their niche became obsolete. There used to be tons of display board manufacturers and very few of them survived the introduction of 3D acceleration.
Sometime in the late 00s, I put my Voodoo card on Craigslist. I got pinged immediately; he told me he'd pay double if I reserved it for him. The cards were precious for keeping some arcade game cabinets running, and with the company no more, my used card was a lifeline. I wanna say it was a golf game like Golden Tee? I was delighted to make the sale and happy to charge him the original (not double) price.
I recall Unreal had an option to switch to use the 3dfx card, and IIRC it had some additional features like more colourful lights and such.
Unreal was such a beast back in the day that it completely beats Quake 2 and other contemporary FPSes even on software rendering. TBH it still looks beautiful even by today's standards, if you ignore the low polygon counts.
I'm not a person who cares too much about graphics, even for FPS (I don't really enjoy most modern FPSes except Wolfenstein, which has interesting gameplay), and I'd argue that too much graphical eye candy simply decreases the overall quality of a game, but 3dfx was definitely a huge bang back in the day.
Which Wolfenstein?
The performance boost also made a significant difference in how well the game played: I remember when the Voodoo 1 came out I had a 100MHz Pentium, and running Quake (in low resolution) was "fine" but ran at like 20-25fps. With a Voodoo that game ran at a smooth 60fps, which made it so much more fun for such a fast-paced game (while also running at a higher resolution with smooth textures and improved lighting effects). It made a huge difference on multiple axes.
The percentage change in resolution you ran the games at was absolutely mind-blowing too.
For the most part we went from running the game at 320x200 or 320x240 to 640x480 in that first round of acceleration. I think in terms of % change it is a bigger leap than anything we've really had since, or certainly after 1920x1080.
So you suddenly had super smooth graphics, much better looking translucency and effects, and the # of pixels quadrupled or more and you could just see everything so much more clearly.
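Just the raw pixel-count arithmetic, for comparison:

    # Ratio of total pixel counts between "before" and "after" resolutions.
    jumps = [
        ("320x200 -> 640x480", 320 * 200, 640 * 480),
        ("320x240 -> 640x480", 320 * 240, 640 * 480),
        ("1920x1080 -> 2560x1440", 1920 * 1080, 2560 * 1440),
        ("1920x1080 -> 3840x2160", 1920 * 1080, 3840 * 2160),
    ]
    for label, before, after in jumps:
        print(f"{label}: {after / before:.1f}x the pixels")

The first-generation accelerator jump was a 4-5x increase in pixels in one go, on top of the filtering and framerate gains; even the move from 1080p to 4K is "only" 4x.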
Unreal had a small menu where you could switch between different renderer backends precisely because of things like different cards having different... Driver quality let's say.
I remember how I was amazed when I got my first 3D card, a Voodoo 2. It was like having an arcade at home.
The 3dfx logo spinning up when you launched a Glide game was something.
Unreal in particular was amazing; I remember as a kid just watching the lighting and the water.
At that time every light in a game had to be colored, just because it could be done. Small rooms with green, red and blue lights moving all over the place, so 90s.
I never had that "Wow" factor again; from there everything felt incremental instead of revolutionary. How an absolute market leader disappeared in 2 years is incredible.
I think I only got the same wow factor the first time I tested a VR headset.
My first 3D card was an Orchid Righteous 3D. It had a mechanical relay in it to switch between 2D and 3D modes, so it made a distinctive click[0] when you loaded Quake3D.
Or, too many times, it didn't, and I had to try again.
[0] https://riksrandomretro.com/2021/06/07/the-righteous-click