I believe the main reason for the focus on these earlier decades is the openness of both the hardware and software of that time. Manuals published by computer manufacturers contained detailed schematics of all the circuits and assembly listings of the BIOS and system programs. As a hobbyist I can build my own SBC based on those schematics and probe physical pins on the chips to debug the board. As things became smaller and more integrated, chips started including more functionality and closed-source firmware, and actually integrating them into your own designs became increasingly difficult.
I think it's even simpler than that. The cohort with the largest amount of spare cash flow and moderate amounts of hobby time right now (roughly 35 to 55 years old) remembers that era of computing fondly from their childhood. It's the same reason junk cars from the 1950s were all the rage from roughly 1990 to 2010. Whatever the cohort currently richest in spare cash flow remembers positively from its youth gets a resurgence.
Others have answered why not later, but why not earlier is also a question. Earlier computers were a lot more work. CPUs weren't single chips; they were collections of dozens or hundreds of ICs. Before that they were thousands of individual transistors; Ben Eater has a YouTube series on designing and building such a machine. Before that came vacuum tubes, of types that are no longer produced and aren't at all common to find in working order. Before that came electromechanical systems with machined cams, cogs, wheels, lots of relays, etc. So the earlier in computing history you go, the more expensive and difficult it gets.
Early 80s parts are a bit of a sweet spot combination of low price, enough challenge to be interesting but not so much as to be frustratingly difficult, and an interesting turning point in computing history as the first single-IC CPUs became popular.
I'm fascinated by the early PDP-11 range - and even if I saw one readily available, reasonably priced, and close at hand (shipping is not trivial for those machines), I don't know if I'm actually competent enough to handle a machine where normal configuration duties include changing the wire-wrapped backplane!
Yeah, exactly this. I've been toying with the idea of making a computer out of individual transistors, but building something capable enough to run an interactive CLI would probably take years, thousands of dollars, and some nontrivial engineering.
Hooking up a 68k to some peripherals? Quite doable. Building something out of 7400 ICs? Tricky, but if your expectations aren't too high definitely possible. Transistors, vacuum tubes, mechanical stuff? You better be really dedicated to it!
> Hooking up a 68k to some peripherals? Quite doable.
I did that for my project in the microprocessor lab class I took in college in the early '80s. It is indeed quite doable [1][2]. It was a bit scary, because the 68k was quite new and Motorola only gave the school a small number of samples. I was told that if I fried it or broke it by not being careful enough when removing it from a socket I would not be given another one. I could not afford to buy one either. And I was taking this class in the last term of the school year as a senior, so failing it would mean not graduating.
> Building something out of 7400 ICs? Tricky, but if your expectations aren't too high definitely possible.
I considered doing that. EdX has a version of MIT's 6.191, "Computation Structures", which goes through how logic gates work at the MOSFET level on a chip, then covers combinational and sequential logic, followed by how to design a 32-bit RISC processor, which you then build and test in a logic-level simulator. (You don't do all of the processor; you are given pre-defined modules for a RAM bank, a 32 x 32-bit register file, and ROM.) That's the first two parts of the course. The third part adds pipelining and caching to the processor.
I took the first two parts at EdX and afterwards seriously considered actually building my processor out of 74xx series chips.
My parts list came to 350 chips. And that's not counting whatever parts I'd need for the RAM, register file, and ROM. That's way too big for my breadboard! :-)
My ALU design includes a shift unit that can do left or right shifts or rotates from 0 to 31 bits in one cycle and that uses around 90 chips. I could drop that, and change things so that shift and rotate instructions trap, and then emulate them in software. That cuts the chip count down to around 260.
Still too big for me. Even changing from 32 bits to 16 bits, or even 8 bits, would be too big, and so the idea to build it was discarded.
[1] https://imgur.com/Ts9wcfW
[2] https://imgur.com/3D4rvdC
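For anyone wondering what "trap and emulate" means in practice: the illegal-instruction handler decodes the faulting word and performs the shift in software, the same trick many machines used for missing floating-point hardware. A minimal sketch in Python, assuming an invented instruction layout (the field positions, opcode values, and the emulate_shift helper are illustrative, not the actual course ISA):

    # Hypothetical trap handler that emulates shift/rotate instructions in
    # software, so the ~90-chip hardware barrel shifter can be left out.
    # The opcode values and field layout below are invented for illustration.

    MASK32 = 0xFFFFFFFF

    def emulate_shift(regs, instr):
        """regs: list of 32 register values; instr: 32-bit instruction word."""
        op     = (instr >> 26) & 0x3F   # 6-bit opcode
        rd     = (instr >> 21) & 0x1F   # destination register
        rs     = (instr >> 16) & 0x1F   # source register
        amount = instr & 0x1F           # shift amount, 0..31
        val = regs[rs] & MASK32

        if op == 0x20:                  # SHL: logical shift left
            result = (val << amount) & MASK32
        elif op == 0x21:                # SHR: logical shift right
            result = val >> amount
        elif op == 0x22:                # SRA: arithmetic shift right (sign-extend)
            signed = val - (1 << 32) if val & 0x80000000 else val
            result = (signed >> amount) & MASK32
        elif op == 0x23:                # ROL: rotate left
            result = ((val << amount) | (val >> ((32 - amount) % 32))) & MASK32
        else:
            raise ValueError("not a shift/rotate instruction")

        regs[rd] = result

    # Example: rotate the value in r1 (0x80000001) left by one bit, result into r2.
    regs = [0] * 32
    regs[1] = 0x80000001
    emulate_shift(regs, (0x23 << 26) | (2 << 21) | (1 << 16) | 1)
    assert regs[2] == 0x00000003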
I think it is a form of nostalgia. But, the early 80s was a fast changing time for tech where new and much more powerful hardware was coming out almost every day.
Plus, tech was not controlled by large corporations as it is now. Back then there were many small hardware/software vendors competing and in many cases helping each other advance. Now, everything is vanilla and controlled by Large Corps.
Me, I miss the days before GUIs. I always thought plain text interfaces were, and still are, better in many cases than point-and-click.
> But, the early 80s was a fast changing time for tech where new and much more powerful hardware was coming out almost every day.
I feel like the nineties are more like that. Deep into the 80s, most people would probably still get something with a Z80 or MOS 6502/6510. If you were rich, maybe you got a 4.77 MHz XT somewhere in the 80s. The 80286 and later the 80386 were out of reach for most consumers.
From the late 80s to the late 90s, things change rapidly. The 386, 486 (remember going from SX to DX) and Pentium all came within reach of normal households. At the start of the 90s most people would still have a PC speaker, but then within 1-2 years everyone got Sound Blasters (or clones). Around 1992 or so CD-ROM drives start to become popular. In 1996 3D games go to the next level as 3dfx blasts onto the market. People start getting internet at home.
In the 90s, if you got a PC, it would already be outdated in 6-12 months. This was certainly not the case in the 80s (or 00s, 10s, or 20s).
Also on the software side there are huge shifts. Outside Apple, PCs and home computers were mostly command-line interfaces in the 80s. But in the early to mid-90s graphical interfaces took over the world (Windows 3.x becoming a major hit), true multi-tasking becomes more important (OS/2, Windows 95, Windows NT), etc.
> But, the early 80s was a fast changing time for tech where new and much more powerful hardware was coming out almost every day.
This. It was an amazing time. Like, it was a trip to read the ads in the monthly computer magazines. There would be several each month that would be something I never even dreamed of doing with a computer. Groundbreaking advances every month, over and over.
That feeling - that revolutionary advances were being made all the time - was amazing while it lasted. While it lasted. A lot of people may want to feel that again.
But I think you can't go home again. From the vantage point of 40 years of (slower) advances later, those things look pretty primitive. It won't feel the same.
> Plus, tech was not controlled by large corporations as it is now. Back then there were many small hardware/software vendors competing and in many cases helping each other advance. Now, everything is vanilla and controlled by Large Corps.
Yes. My mother’s first computer was made by a company in Northern Kentucky called Thoroughbred Computers. It was tremendously expensive but locally made.
> "Plus, tech was not controlled by large corporations as it is now."
IBM would beg to differ. They were monopolists in a way that Microsoft and Google are only pale imitations of. "Nobody ever got fired for buying IBM" and the term FUD were coined to describe them. The entire x86 hardware ecosystem has its roots in IBM's dominance through the original IBM PC hardware architecture.
One aspect of why the '80s-'90s are interesting to the retro enthusiast is that it was the era with the most diversity and competition in home computing. By 2000 or so, the IBM PC standard had steamrollered every other competitor, leaving only the Mac occupying a small niche.
When I went to college in the late '00s I was able to find several other Linux users. The counterculture to the Windows/Mac duopoly was alive and well. In some ways we won by being accepted as mainstream for certain niches. Between WSL, the Steam Deck, and Docker, Linux has seen more adoption than I was expecting.
Computers haven't really changed much since the 1990s; I mean, I have been running Linux since 1994. Even in the late 1970s the DEC VAX had an architecture basically similar to modern computers.
What else could you be into? Computers before the IBM 360 sucked. Personally I have some nostalgia for the 1970s PDP-11 and PDP-10, but not enough to really put a lot of time into setting up emulators. (The PDP-11 has the awkward limitation of a 16-bit user address space; running RSTS/E you basically get a BASIC environment (or something like a CP/M environment) which is just slightly better than what you get with a 16-bit micro. I've used installations where 20 people could enjoy that at once, but for an individual hobbyist you might as well use or emulate a micro... But those GIGI terminals beat the pants off the Apple ][ and such.)
Mass producing 8-bit micro hardware has tough economics today (can’t beat RasPi) but it is possible to make a computer of that era with mainly or entirely discrete components. (Even a 6502 or display controller!). Forget about making an IBM PC clone or an Amiga. (An FPGA can get there if you have enough $$)
If I wasn't already overloaded with side projects I'd like to deeply explore the 24-bit world of the eZ80: like the old 360, it's a place where micros never really developed a comfortable ecosystem. You could have more memory than the IBM PC, and memory that is easier to code for; you could run Linux with X Windows and Win 95 in 16 MB of memory back in the day, so such a system could kick ass. The AgonLight 2 has a programmable display controller based on the ESP32, and it should be very possible to make a more capable sprite-based graphics system than most contemporary game consoles, maybe even as good as the coin-op systems that supported games like Street Fighter.
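As an aside on "more memory that is easier to code for": the contrast is between the IBM PC's segmented real-mode addressing and the eZ80's flat 24-bit ADL mode. A toy comparison in Python (function names and example addresses are mine, purely illustrative):

    # Why a flat 24-bit address space is "easier to code for" than the
    # segmented 8086 scheme used by the IBM PC. Addresses are examples only.

    def x86_real_mode_address(segment: int, offset: int) -> int:
        """8086 real mode: physical address = segment * 16 + offset.
        Two 16-bit fields per pointer, about 1 MB reachable, and the same
        byte can be named by many segment:offset pairs (aliasing)."""
        return ((segment << 4) + offset) & 0xFFFFF

    def ez80_adl_address(addr: int) -> int:
        """eZ80 ADL mode: one flat 24-bit pointer, 16 MB reachable."""
        return addr & 0xFFFFFF

    # The same physical byte reached via two different segment:offset pairs:
    assert x86_real_mode_address(0xB800, 0x0010) == x86_real_mode_address(0xB801, 0x0000)

    # A flat pointer has exactly one spelling:
    assert ez80_adl_address(0x0B8010) == 0x0B8010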
It represents a unique time in the history of computers -- an important gap between the mainframe era that came before it and the standard PC and internet eras that came after it. The wide variety of inexpensive and incompatible machines offered at that time has never been seen before or since.
This maverick era produced all kinds of innovation that simply doesn't happen today with one-size-fits-all computers that are controlled remotely by operating systems and cloud services. Modern machines bear no resemblance to '80s home computers except in their purity as computing machines, Turing machines, with simple I/O and programming.
Once you find out "how it all works" and that all computers are the same, you develop a certain interest in how we got there, and perhaps a wistfulness for what things might have become. Home computers of the early '80s are where that answer lies.
1) Completely stand-alone machines with just enough expansion to then get some form of connectivity (or added data storage)
2) Self-contained, instant boot systems without an "OS" as we know it today (instant on is much more fun than it might seem), and that you explored until you could learn literally what every single register/port did.
3) Single-box systems with almost everything under or behind the keyboard, that you had to plug into a TV to get anything done with.
There was a purposefulness and immediacy to using those machines that is absent from today's bloated, over-connected, highly distracting and ADHD-inducing systems.
It was, in short, a more civilised time :)
Also, what came after this era was either boring (standard x86 PCs) or too complex to easily maintain over decades, e.g. Suns or SGIs with failing [UW]SCSI drives that are hard to replace these days.
I think it's because the 90s, which are far more interesting, are a continuum. The PC is partially backwards compatible over decades, so it is hard to isolate any single segment, unlike the "8-bits", which were mostly one and done. You can pick some points, like the end of DOS and the beginning of Windows 95/98, but after that it kind of blurs. Tech got better and the old stuff mostly kept working, so you can't really plant yourself at a single point.
Not that there aren't opportunities, like the peak of DOS gaming with a Gravis UltraSound and some Sound Blaster, or early 3D gaming.
Firstly - is it really true that this is the main focus? I see a bunch of interest in the beige box era PCs as well.
Assuming it's true, though, then I would imagine there are several contributing factors:
The 80s is when computing arrived for the masses - and most of those masses were children at the time; the first computer I owned was a ZX81 and I was 9 years old when I got it. That lends it powerful nostalgia value. For later generations computers were likely more part of the background.
That generation of people is also now entering their late 40s or 50s. They probably have some income (especially if they got into IT) and their outgoings are likely tailing off - if they have kids then those kids are leaving the nest or have already. So there's spare cash to spend on all the bits and pieces that they couldn't afford back when they owned them the first time!
It's all far enough in the past that you can see it through rosy spectacles. Ram Pack wobble, slow tape loading, limited memory and primitive graphics all become features instead of limitations.
Then for younger generations who are getting into this the above points mean that there's a background of somewhat knowledgeable people to propagate information about these machines.
Add on top of that the limited nature of the machines meaning that one can have a complete-ish (or illusion of such) understanding of the machines. That's always been appealing.
Personally I find the 1970s minicomputers far more fascinating! But my dad worked with some of those and I adore Unix culture so I'm probably atypical.
There is a 30-year lag in collectibles; people in their 50s tend to buy what they coveted in their late teenage years. From what I understand, this trend is also present in the car market.
That's my bet; and so this 30-ish (I'd give it +/-5) year lag is the same reason that, when Back to the Future came out in the mid-80s, they traveled back in time to the mid-50s; why Forrest Gump centered on the 60s during the 90s; why That 70s Show was popular during the early 00s; why the now already decade-old Stranger Things was set in the 80s; and how 90s nostalgia has become "a thing" semi-recently.
Happy Days was big, and was set in the 50s while being from the mid-70s... but I'd argue it could have been the early 50s, the show ran for over a decade, and AFAIK it wasn't an immediate hit (though it also didn't have the Fonz at first); also, FWIW, there was a big nostalgia boom in the 70s that was focussed on the 40s (see the video linked below), so I would just lean on +/-5 years as why we see little discrepancies.
https://youtu.be/l0ZXItfw5r4
"[A behavioral scientist we asked about this] determined that we are not really all that happy about the present, we are terribly uncertain about the future, and when we talk about the old days we always refer to them -- I suppose because of our collective memory -- as the good ol' days; and, I guess, really, when you go back, you really can select the better parts of the good ol' days... like the music." -- a radio executive in the 70s
I don't think it is that simple. For example, take a look at the Connections Museum in Seattle: despite most of the technology being 50-100 years old, a decent bunch of their volunteers are in their 20s! Some people just love the technology itself for what it is.
And honestly, I don't think there's a lot to be nostalgic about with semi-modern tech. The hardware is a bit faster now, but looks and works the same as a machine from 15 years ago. Windows Vista is almost 20 years old by now, and is functionally virtually identical to Windows 11: some stuff got moved around and the whole UI got a dash of paint, that's about it. Games like GTA, CoD and FIFA got flashier graphics, yet are identical gameplay-wise. Want to play something novel like Portal? Just install it on your 2024-era machine via Steam.
People aren't going to be nostalgic about 2000s or 2010s era computing for the same reason very few people are nostalgic about toasters.
There's definitely nostalgia around MP3 players, iPods, or early smartphones, though! Those were still novel, had a lot of variation, and were genuinely world-changing to people.
The cars that a current fifty-year-old saw driving around in their late teenage years were mostly hot garbage. There's a lot for a person to be nostalgic about when it comes to the eighties, but not cars.
As an 80s kid, I want a Lamborghini Countach, a Ferrari Testarossa and a DeLorean. My computer collectible would be a Pentium i586 in glass mounted on the wall; to me it represents the inflection point when we had enough compute for anything.
Today it's hard enough getting OEMs to explain what their features do to anyone smaller than an OS vendor, much less how they work.
> Early 80s parts are a bit of a sweet spot combination of low price, enough challenge to be interesting but not so much as to be frustratingly difficult, and an interesting turning point in computing history as the first single-IC CPUs became popular.
That's part of the fascination too though.
"[A behavioral scientist we asked about this] determined that we are not really all that happy about the present, we are terribly uncertain about the future, and when we talk about the old days we always refer to them -- I suppose because of our collective memory -- as the good ol' days; and, I guess, really, when you go back, you really can select the better parts of the good ol' days... like the music." -- a radio executive in the 70s
And honestly, I don't think there's a lot to be nostalgic about with semi-modern tech. The hardware is a bit faster now, but looks and works the same as a machine from 15 years ago. Windows Vista is almost 20 years old by now, and is functionally virtually identical to Windows 11: some stuff got moved around and the whole UI got a dash of paint, that's about it. Games like GTA, CoD and FIFA got flashier graphics, yet are identical gameplay-wise. Want to play something novel like Portal? Just install it on your 2024-era machine via Steam.
People aren't going to be nostalgic about 2000s or 2010s era computing for the same reason very few people are nostalgic about toasters.
There's definitely nostalgia around MP3 players, iPods, or early smartphones, though! Those were still novel, had a lot of variation, and were genuinely world-changing to people.