Readit News
mrandish · a year ago
I just love this kind of thorough, well-researched digital archeology delving into the whys and hows of 70s and 80s computers. Having lived through that era as a fledgling teenage computer hobbyist when clock speeds were sub-megahertz and 4K of RAM seemed like an impossible amount of space to fill, we had no idea we were living in the Cambrian explosion of personal computing.

Every platform represented a unique branch in an evolutionary tree undergoing incredibly rapid evolutionary permutations. Each with their own clubs, offset-printed zines and software libraries, none compatible with the others. Few people outside some universities and corporations had ever touched a self-contained computer and no one in my family's extended social circles even had a computer at home! I remember it striking most people as simply being weird, prompting questions with a tone akin to "I heard you have a personal dirigible at your house. Um... what would you even do with that?"

No one knew what the future of computing would look like and we certainly never imagined that by the late 90s, the wild explosion of early diversity would have encountered an asteroid-scale die-off, shrinking to a single major survivor with a distant second place in household computers - leaving behind a landscape which felt shockingly empty compared to the wildly diverse frontier only a decade earlier.

II2II · a year ago
> 4K of RAM seemed like an impossible amount of space to fill

It seems impossible to imagine the impossible these days. The first home computer I remember my family owning is a VIC-20 (5 kB RAM), where the screen could contain about 1/2 kB of text. Granted, getting anything onto and off of the computer was a lot more difficult/expensive. I can barely even imagine earlier home computers.

And that Cambrian Explosion is no joke. The '80s were crazy. The '90s stepped it up even further.

mrandish · a year ago
> getting anything onto and off of the computer was a lot more difficult/expensive.

Yes, that's why 4K seemed so huge (really about 3.5K after zero page and some buffers were allocated by the BASIC interpreter). Initially, I didn't have any method of storage. The tape cassette recorder was another $50 I had to save up for. Having to retype BASIC programs from hand-written notes every time the power was turned off strongly incentivized short programs!

Taniwha · a year ago
Parts of this remind me of an issue I had around the same time. I worked on porting A/UX, the Unix port for the Mac II. We reportedly got half of the early production run; the machines came without plastics, in cardboard boxes, with schematics and PAL equations (PALs were relatively new then). I found, and they fixed, a bug in the PAL equations (they didn't handle the 24-bit writes some new 68020 instructions could make), so I had some cred with them when we ran into a weird bug: very occasionally, when you worked with the IWM chip (which drove the floppies), the keyboard would freeze up - but only on some machines.

Managed to figure out that the machines that worked had an ADB (keyboard) chip with markings on it; the ones that didn't work had no markings. Apple swore they were the same, and eventually told us the unmarked ones were the ones they were doing the manufacturing run with ..... a bug from hell ..... It turns out there was some circuitry in there they were quite proud of: when you accessed the parallel IO (VIA) chip, it tweaked the clock phase a little to give you faster access. Part of working with the IWM chip involved setting up the timer in the VIA chip to time when the next sector would go by. We'd poll that timer fast enough that the clock was being tweaked on every clock cycle, so the ADB chip, which was connected to the VIA, ended up being clocked too fast .... we replaced that code with a simple delay loop
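A rough sketch of the shape of that fix, in modern C rather than the actual A/UX driver code (which I don't have): the problem was a tight poll that touched the memory-mapped VIA on every iteration, and the fix was a plain calibrated busy loop that never touches the chip. All names and the iteration-count idea here are invented for illustration, not the real Mac II memory map.

```c
#include <stdint.h>

/* The problematic version was, in spirit, a tight poll of the VIA's
 * sector timer -- touching the chip on every iteration, which on those
 * boards retimed the clock each access and ended up overclocking the
 * ADB chip hanging off the same VIA:
 *
 *     while (*VIA_TIMER2 > deadline)
 *         ;   // every read tweaks the clock phase
 */

/* The fix: wait out the sector gap with a calibrated delay loop that
 * never accesses the VIA. The iteration count would be tuned to the
 * drive's rotation; the volatile sink keeps the compiler from
 * optimizing the loop away entirely. */
static volatile uint32_t spin_sink;

void wait_for_sector(uint32_t iterations)
{
    for (uint32_t i = 0; i < iterations; i++)
        spin_sink++;
}
```

The point of the rewrite isn't elegance: a dumb delay loop has slightly worse timing granularity than polling the hardware timer, but it has no side effects on the bus, which is exactly what mattered here.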

juliangamble · a year ago
> Acorn did ship a computer with the 65816—the Acorn Communicator—but when Sophie Wilson visited WDC in 1983 and saw Mensch and crew laying out the 65816 by hand, it struck her: if this motley crew on a card table could design a CPU, so could Acorn. Thus Wilson and Steve Furber forged their own CPU: the Acorn RISC Machine.
8A51C · a year ago
The first commercially available Acorn RISC processor was released as a co-processor for the BBC Micro. Acorn always had co-processors in mind, it seems, as the Tube interface and protocol [1] exists solely for co-processors.

There's an excellent Raspberry Pi based project, PiTubeDirect, which emulates the ARM and many other co-processors on original Acorn 6502-based hardware: Atom, Electron, Micro and Master [2]. The original expansion hardware is, as expected, incredibly rare and valuable.

[1] https://mdfs.net/Info/Comp/Acorn/AppNotes/004.pdf [2] https://github.com/hoglet67/PiTubeDirect

KingOfCoders · a year ago
Watching some Sophie Wilson videos over the years, she is a tremendous person.
nazgulsenpai · a year ago
I watched his YouTube video on this a few days ago. It goes WAAAAAY more in-depth than you'd expect and is a great hour of second-monitor viewing.
nsxwolf · a year ago
I can't listen to that much alliteration. I had to stop pretty early on.
cainxinth · a year ago
I still have my childhood IIGS. It sat in my folks’ poorly insulated attic for two decades before I rescued it, and it booted up on the first try! They don’t make ‘em like they used to.
kevin_thibedeau · a year ago
Their power supplies have RIFA-brand capacitors, whose cases are 100% guaranteed to be cracked from moisture ingress at this age. They will fail catastrophically at some point when you plug it in. If you want to keep it running, you need to replace the supply or recap it (just the X & Y RIFAs).
rbanffy · a year ago
Apple IIs were designed for the most computer-hostile environments in existence: primary schools.
buildsjets · a year ago
Replace the 3.7v lithium backup battery for the RTC before it blows up and spews acid all over your board! ROM01 versions have it soldered in; on ROM03 it's in a battery holder. Replacements for both are readily available.
cainxinth · a year ago
I did not know that and will investigate. Thanks!
bongodongobob · a year ago
As someone else said, google Apple RIFA caps and replace them. They are 100% going to fail on you and damage the machine. It's the first thing you do when purchasing an old Apple.

Reason077 · a year ago
Man, I wanted a IIGS so bad back in the day.

I remember our school computer lab had a whole bunch of Apple IIe's, a single Mac, and a single IIGS. The GS was by far the most coveted because (unlike most Macs of that era) it had a COLOUR screen and could play relatively advanced games. Eventually they upgraded to mostly Macs.

Dad ended up buying a 386 PC, which was probably for the best. Those SVGA graphics!

icedchai · a year ago
I had an Apple IIc, but wanted a IIgs. A flame war on a local BBS convinced me the Amiga was better, so I went with an A500.
codr7 · a year ago
The A500 was a sweet machine though...
gbeeson · a year ago
I rocked the IIe but wanted a IIgs like no other. I'd 'visit' it at our local 'Apple store' before the Apple Store existed (it was Tokamac in Palo Alto, if I remember correctly, and I may not). I also collected an Amiga A500 and an Atari 1040ST. Love. Those. Times.
twoodfin · a year ago
To me the most fascinating nugget of history was how close a young Tony Fadell (later General Magic, iPod, Nest) came to supplying high-speed 65816 chips!
jmbwell · a year ago
This was a fascinating and ... detailed ... story. I appreciate that it went a little further into the history of Apple's involvement with ARM than the recent spate of blog posts that didn't go back past the Newton.
rbanffy · a year ago
I too was surprised at how detailed it was. Learned a lot in those 50 minutes or so.

Plus, the Apple IIx mock-up is a beauty. I'll need to order a 50cm x 50cm 3D printer and find the right filament color for the Snow White look.