Younger people on here may not realize what a mindbending experience it was to see a GUI for the first time. The user experience of most pre-Mac computers was "black screen with glowing green bracket". If you typed something, the computer would respond "SYNTAX ERROR", which was even more intimidating. You in your ignorance had committed not just an error, but a syntax error, whatever that was.
Seeing a Mac for the first time, from the smiling icon at bootup to the way the pointer moved over things, and the way you could click to make things appear on the screen, was a transformative moment similar to the first time people a generation later encountered an image-capable web browser. It was one of those moments (like seeing VisiCalc in action) that made large numbers of people realize that a world was coming when computers might be usable by, and useful to, regular mortals.
I was nine years old when I saw my first Mac in 1984, and my only previous experience with computers was teaching an IBM PC to print 'BUTT' in an infinite loop using BASIC. I'd love to hear the impressions people older than me got the first time they used a GUI.
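(For anyone who never wrote one: the program in question was presumably the classic two-line BASIC loop. A sketch of the same idea follows, with the presumed BASIC in a comment and a C equivalent below it, purely for illustration -- not anything from the original machine.)

    /* The BASIC original was presumably something like:
     *   10 PRINT "BUTT"
     *   20 GOTO 10
     * The same idea as a C sketch: */
    #include <stdio.h>

    int main(void)
    {
        for (;;) {          /* loop forever, like GOTO 10 */
            puts("BUTT");
        }
    }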
> the impressions people older than me got the first time they used a GUI
I wrote a whole book about it: Inventing the Future (starts end of 1976)
The PARC people already had several independent efforts (Bravo, Markup, Draw) and Smalltalk to build them with. Our task was to unify all the functionality into a single GUI.
(and by the way, Jobs and his minions didn't sneak in and steal it, contrary to what you've read.)
Not everyone was sold on it, a fact which some people would love to forget now. "We'll just slap a GUI on all this great functionality" was something marketing-droids actually did say.
The book is written with a strict "no hindsight" rule, but a friend and I did reconsider it all, with hindsight definitely allowed:
https://www.albertcory.io/lets-do-have-hindsight
Same here. We had a VIC-20, which couldn't even display 80col. Then I got to go on a trip to the local uni over Xmas break and play with the first Macs. Everything changed. It was what computers were supposed to be, a kind of 2d VR where what you created could be printed IRL.
The next closest watershed moment to that would probably be getting to uni and being able to telnet around the world.
Then after that, probably getting to use an SGI Onyx to do "military grade" flight sims.
It cannot be overstated what the Mac did to the computing industry. Yeah, the Star did it first, and the Mother of All Demos showed it off in 1968, but those things were never in people's living rooms. The Mac was.
I was in a funny in-between generation where I had a mix of GUIs and command lines first, through Atari/Amiga, and PC DOS and Windows (early 2.0, 3.x versions). The common thing about these machines was the unholy hacks one had to resort to to do more than one thing concurrently. The DOS TSRs or hooking things up to interrupt handlers on the Atari... heh. Anyway one day I watched someone type a command followed by an ampersand on some university Unix system. And again a few times. And they all actually executed concurrently, without any tricks, and without all but one getting suspended. I still don't think I ever experienced the same feeling of awe as that day :)
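What makes that ampersand feel like magic is how little machinery it needs on a preemptive kernel: backgrounding a job is just forking and not waiting. A rough C sketch of what a shell might do for "cmd &" (illustrative only, using standard POSIX calls -- not any real shell's source):

    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    static void run(char *const argv[], int background)
    {
        pid_t pid = fork();              /* clone the shell process */
        if (pid < 0) { perror("fork"); return; }
        if (pid == 0) {                  /* child: become the command */
            execvp(argv[0], argv);
            perror("execvp");            /* only reached if exec failed */
            _exit(127);
        }
        if (background)
            printf("[%d]\n", (int)pid);  /* "cmd &": don't wait, prompt again */
        else
            waitpid(pid, NULL, 0);       /* "cmd": block until the child exits */
    }

    int main(void)
    {
        char *job[] = { "sleep", "2", NULL };
        run(job, 1);                     /* like typing "sleep 2 &" */
        run(job, 0);                     /* like typing "sleep 2" */
        return 0;
    }

The kernel's scheduler does all the real work; the shell just chooses whether to wait.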
I saw the Mac and was intrigued, but never touched one at the time. BUT, about a year later, I DID check out an Amiga. Same CPU inside of it, but this one was in color. And it had a GUI too, but also a CLI if you wanted to use that. AND it preemptively multitasked out of the box (no protected memory though, so it did crash). I got involved with the Amiga before the Mac... and I couldn't understand why people liked the Mac so much, given how limited it was comparatively.
Just shows the power of presentation and a company behind a product that could launch it to bigger things. Commodore never really understood what it had with the Amiga, but Apple did...which is why the Mac has stuck around and the Amiga is just a footnote now.
I had access to both Macs and Amigas, and the Mac was simply way more polished. It was a complete product that did useful things. It didn’t have a command line because it didn’t need^ one.
The Amiga was very flashy but, ultimately, it felt like a toy to me. It was more fun to hack on than the Mac, but most people don’t care about that.
^ I’m so happy it has one today.
Wasn't desktop publishing a big early "killer app" for the Mac? It had the LaserWriter, and I am under the impression that its management and rendering of fonts were better as well.
The Amiga had a big impact on audio and art, if I remember correctly. I took an art course in 1993 and the lab was full of Amigas. Not only did they have color screens but there was a rudimentary video capture system as well. We couldn't actually edit videos but we could incorporate video into projects.
We did have a Mac in the electronic audio lab though. Mind you, it was a single Mac Quadra. We had just gotten a 60 MB hard drive, which really opened up audio editing capabilities. We would dump our projects onto a DAT and reload them the next time we had the lab booked. All that gear was really expensive. I ended up using the analog stuff more (a Moog something or other, an Otari 1/4" 4-track, razor blades, tape, etc.) because it was so much easier to get up and going, and tape loops were a lot of fun.
I only got to use an Amiga much later, when it was already "retro", but the gigantic pixelated icons meant to be displayed on a TV over composite, the ugly fonts, and the gaudy colors just gave me the impression that it was a toy. Reinforced by all the pirated games the one I had to use came with.
It's a shame Commodore never properly iterated on the product to let it reach its potential.
The first time I saw a personal computer was at the home of a friend. His father was an engineer for Ping and they had a computer at their house. We weren't allowed to touch it but we watched his dad play a Star Trek game on it for a little while.
A couple years later I'd get my hands on a TRS-80 at school. I loved it.
When I went to high school in 1983 we had Apple IIe machines in our lab. I spent hours on them doing BASIC and later Pascal. By then I had a VIC-20 at home.
I was part of a nerdy group that hung out in the computer lab any time we could, and we had a pretty good relationship with the head of the math department. When they got their first Mac, that was the first time I saw a GUI in person.
We were given a few minutes to play with it and I really liked it. I don't think it was so shocking though. While the command line was my normal initial interface, I was used to playing games that provided a GUI of their own to interact with the game. The GUI on the Mac was another program, like a game, running on top of the OS.
After high school I didn't have a PC of my own for years. When I did get one it was a Packard Bell 386SX running DOS. I had that for a decent amount of time before I got my first copy of Windows. For me it all just fit together, and to this day I think of a computer and its operating system as the CLI it provides. The GUI is an abstraction that runs on top. I like a nice desktop (KDE is my favorite) but I get very frustrated any time I can't get underneath it and do what I want. My first instinct any time I hit a problem that feels difficult is to drop to the shell.
> The gui on the mac was another program like a game running on top of the OS.
I presume you speak metaphorically, but just in case: the Macintosh wasn't like that. The GUI was the OS, which is what made it so different from Microsoft's attempts; there was no DOS to drop down to.
> Younger people on here may not realize what a mindbending experience it was to see a GUI for the first time.
Indeed. When the Mac came out, my friend and I went to a local computer store to look at one.
My friend sat there for several minutes, mostly just playing with the menu bar, watching the menus come and go, flickering as he went from one to another with the mouse. He just sat there slack jawed going “Wow!”.
Folks have to appreciate the simple marvel of having that beautiful, crisp display with square pixels. This was not routine.
It was amazing.
We both bought one with the student discount. $1480 for a 128K Mac. Over $4300 inflation-adjusted today. At the time a box of floppies was $40. Man did we blow vast amounts of money on computer hardware back then.
And, yes, a 128k, single floppy Mac was borderline useless. We got back alley RAM upgrades and external floppy drives. There went another $800.
But with that we could develop in C on the machine.
It was my first usable C compiler with the awesome MPW Shell. I was developing commercially and had a hard disk :-)
Luxury!
I’ve been a commercial Mac coder ever since, going via NeXT, which turned out to be lucrative a few years later.
I wasn't that impressed with the Mac GUI the first time I saw it. There was no color, it could barely do more than beep, and there wasn't much software available yet so it seemed pretty useless compared to my Commodore 64. I had Magic Desk on the C64 in 1983 and while it wasn't a bitmapped desktop, it was a desktop with many similarities to the Mac. It even had a pointer.
When the Amiga came out, there was no way I was choosing a Mac over the powerful Amiga with 4096 colors, stereo sound, multitasking and amazing games. While I had it I was able to digitize and edit images, record and edit audio, build and use a MIDI interface, and even go online with it before the web was a thing. I also ran a BBS on it and was still able to use the Amiga for all these other things at the same time.
The Mac was the first mass-market bitmapped GUI, but the Amiga was the first system with a bitmapped GUI that was like modern OSes, with color, sound, and multitasking. The Mac didn't have those things until later.
It’s funny, my father was quite dismissive of GUIs. He thought they were just toys, I don’t think he even used Windows until he was forced to around ‘98.
(the iPhone transformed him into the biggest Apple fan on the planet, somehow)
I remember the transition to GUIs and, although they looked nice, they didn't feel so productive. By the time the GUI had started up and you had found what you were looking for, you'd have been well into your task on DOS and WordStar or Turbo C or whatever.
The very first GUI I ever used was a thing called Fleet Street Editor on a BBC Micro. It was slow, but since the task at hand was about graphics, it made sense to use it. All other things you could do on a micro were a lot faster on the CLI.
I was too. I often referred to them as point-and-grunt. I stuck with DOS plus DESQview (no mouse) until I started using OS/2 around 1994 or so. Having multiple windows was fine, but I preferred the command line. I kind of still do.
I'm older. But I also saw my first Mac in 1984. ;-)
My first exposure to a computer was in 1981, when I took a high school programming class. So I was certainly not intimidated by text entry or errors. While in college, someone came through and demonstrated an Apple Lisa in the big lecture hall, and of course the Mac came out soon thereafter.
What I can say is that complete familiarity with text based systems didn't make me any less impressed with the GUI. It might have made me in less of a rush to go out and get one of my own, but my bank account had its own opinion about that too.
The first time I used a GUI was with Windows 3.0, but it did not impress me much, because at that time it was just a useless toy.
On the same computer I was using many MS-DOS programs which were not only much more useful than anything that existed for Windows at that time (1990), but also had much more clever user interfaces, some of which have never been completely matched by later GUI programs, e.g. Lotus 1-2-3, AutoCAD, XTree, the Brief text editor and many others.
While it is said that GUI programs make it easier for the user to discover the available commands and their behavior, that was not really a problem for the good MS-DOS programs: they all came with huge, well-written user manuals where the user could find everything there was to know about the user interface, so even when the UI consisted of obscure keyboard shortcuts, learning it was not difficult.
The only thing that I really liked in the Windows 3.0 GUI were the included fonts (Times, Helvetica and Courier), which were much nicer than the SuperVGA bitmap fonts that were used by the MS-DOS programs.
Later, Windows 3.1 and 3.11 for Workgroups were still not competitive with MS-DOS or DR-DOS. Only IBM OS/2 2.0 looked like a real improvement over the operating systems with a text-based UI, but it lacked useful applications.
Everything changed with the launch of Windows 95, together with the availability of cheap 486 CPU clones, which finally enabled a decent speed for GUIs, which previously had been annoyingly slow for any professional use.
Only with Windows 95 did I, and everybody I knew, switch to using GUIs.
First used a GUI on my Amiga 500 back in 1987. The interface was relatively fast and responsive with simple window animations. I mostly used it to play videogames due to the graphics and S T E R E O sound (which was amazing). In fact, the graphics capability was much more impressive than the novelty of the GUI, and this was coming from a ZX-81 and an Atari 800.
I can remember reading about the Lisa in 7th grade; the claim of a computer so easy to use that a two-year-old could use it seemed nonsensical given my experience.
My mom, an English major (who made a “living” as a potter), saw a Mac 128K at the mall, went on a frenzy to buy one, and experienced the birth of desktop publishing at the age of 46.
Not to mention that until the Apple //e was released in 1983, there weren’t even arrow keys on the Apple ][ keyboard. Even in the Apple //e era, word-processing was not only not WYSIWYG, but you had to insert printer-specific control sequences in your output which meant that on some printers, if you had a line break in the middle of an underlined phrase, the underline would continue from the left margin on the second line.
A bitmapped GUI is a different beast. You can use WordPerfect with a mouse but it's not a GUI. Borland's Turbo Vision, which had a TUI windowing system, didn't come out until 1990. GEM was 1984, after the Lisa and during the lead-up to the first Mac sales. (And, honestly, how many Atari STs ever sold?)
My computers were always Macs at home. It’s no wonder I saw IBM compatibles and even Apple IIs and thought of them as ridiculous. I think it probably pushed me into thinking of computers as tools for accomplishing a task instead of as a hobby, as many of my peers did. It took me a long time to appreciate what early hobby computers were.
A [1984] tag would provide some context, as that is when this interesting article was published. (This is mentioned in a callout near the beginning, but it is easy to skip over those.)
The next time I experienced that same feeling was during Jeff Han's TED talk where he gave his multitouch demo.
https://www.youtube.com/watch?v=ac0E6deG4AU
Definitely recommended reading, especially if you weren't around in those days.
I like this passage which explains some computing terms we take for granted now:
> The model was, of course, the Lisa workstation with its graphic “windows” to display simultaneously many different programs. “Icons,” or little pictures, were used instead of cryptic computer terms to represent a selection of programs on the screen; by moving a “mouse,” a box the size of a pack of cigarettes, the user manipulated a cursor on the screen. The Macintosh team redesigned the software of the Lisa from scratch to make it operate more efficiently, since the Macintosh was to have far less memory than the 1 million bytes of the Lisa.
The article also reminded me how expensive computers were in those days. Accounting for inflation, the $2500 Macintosh would be about $7500 today, and the $10,000 Lisa with its 1 million bytes of memory would be $30,000.
Back in 1982, I decided the just-introduced IBM PC was not a real computer because it didn't have a front panel with lights and switches. So I bought an Ithaca Intersystems DPS-1, a high end S-100 system with a Z80 processor and a real front panel, for $8000. Later I added a 20 megabyte hard drive for another $6000. So I guess that $14,000 computer would be about $42,000 today!
Read the article and, as it happens, I am also wrapping up reading "Soul of a New Machine".
Something about both that strikes me: that wonderful era when the engineers were (and pardon me) left the fuck alone.
To be sure there was management (and I am under no illusion that Jobs would in fact have been one of the more complicated managers to report to), but there was an attitude of "let me wire-wrap it up tonight and I'll show you we can handle the mouse interrupts without a dedicated driver chip." Or, "I think I figured out a fast way to handle rendering to overlapping windows within our memory constraints. I coded it up last night."
It strikes me that a device like the Macintosh (or the Data General Eclipse for that matter) will never be created again in the same way.
I'm also old enough to see parallels in how software development was done at the beginning of my career and how it is done today. I admit that I miss those days.
Soul is one of my favorite non-technical books about technical products. I read it cover to cover, burning the midnight oil, and am now reading it for a second time. As a software person, building physical products always intrigues me.
This, plus the 1984 Macintosh article, showed me what a great time the 70s/80s were for computer engineers -- you could design the hardware with a very small team, and there was more diversity in computer manufacturers back then.
Nevertheless, thanks to people like Ben Eater, I can still live that era with much more information and hand-holding. His 6502-computer series is a gold mine. I'm going to watch all of the videos and build one afterwards.
In the back of my mind, sometimes I feel that building a custom computer, ideally a 32/64-bit one, is an essential skill to fight the dystopian future we are inevitably walking into.
Maybe not something at the micro scale of the Macintosh, where everything was designed from scratch, but a few other people and I designed and made our own rocket flight computer out of sensors and an NXP 32-bit microcontroller in college.
Also see https://www.folklore.org/ for all kinds of short stories about the early days of the Macintosh.
I’d been doing some retro-mac’ing the past couple days, and one thing I noticed is how impossibly “tight” the mouse cursor movement always seems to the physical input. It’s probably just a few dozen instructions from taking the interrupt to scaling and placing the cursor for the next vbl task, but it’s like sticking a 10 point landing.
Cursors on Sun workstations (and not just because their optical mice needed those grid mousepads) or Windows boxes always felt like “oh, there’s an abstraction between the physical hardware and the pointer, that’s just the way it is” - and nobody cared to make it better.
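For readers who never poked at classic Mac OS: a hedged C sketch of the split described above, with hypothetical names -- the interrupt handler only banks deltas, and the once-per-frame VBL task places the cursor, so only a handful of instructions sit between the hardware and the pointer. An illustration of the idea, not Apple's actual code:

    #include <stdint.h>
    #include <stdio.h>

    static volatile int16_t dx_accum, dy_accum;   /* written at interrupt time */
    static int16_t cursor_x, cursor_y;

    enum { SCREEN_W = 512, SCREEN_H = 342 };      /* classic Mac display */

    void mouse_interrupt(int16_t dx, int16_t dy)  /* runs on each mouse pulse */
    {
        dx_accum += dx;                           /* just bank the movement */
        dy_accum += dy;
    }

    void vbl_task(void)                           /* runs once per video frame */
    {
        cursor_x += dx_accum;  dx_accum = 0;      /* apply and clear deltas */
        cursor_y += dy_accum;  dy_accum = 0;
        if (cursor_x < 0) cursor_x = 0;           /* clamp to the screen */
        if (cursor_x >= SCREEN_W) cursor_x = SCREEN_W - 1;
        if (cursor_y < 0) cursor_y = 0;
        if (cursor_y >= SCREEN_H) cursor_y = SCREEN_H - 1;
        /* draw_cursor(cursor_x, cursor_y); -- hypothetical blit routine */
    }

    int main(void)                                /* tiny simulation harness */
    {
        mouse_interrupt(3, -2);
        mouse_interrupt(1, 0);
        vbl_task();                               /* next frame picks up both deltas */
        printf("cursor at (%d, %d)\n", cursor_x, cursor_y);
        return 0;
    }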
It's been getting better in the past decade but similarly, one of the papercuts of the Linux desktop experience that used to bug me was how you could kinda feel the numerous layers it was composed of slipping around a bit. I think all modern OSes suffer this to at least a small extent today.
That "tightness" you mention is quite tangible on my 500Mhz PowerBook G3 when it's booted into OS 9. It would be nice to get that back in its entirety somehow.
There's a related tale to this in Andy Hertzfeld's Macintosh folklore collection: when the team was demoing to Microsoft, Bill Gates assumed it required dedicated hardware to draw the cursor (bonus outburst from Steve too):
https://www.folklore.org/StoryView.py?project=Macintosh&stor...
Great article which I do recommend. Brought back memories.
My friend DJ and I both attended the summer program for high schoolers at CMU in 1981. She accepted the early admission that we were both offered (I couldn't see skipping my senior year), so she was starting her sophomore year when I started as a freshman. DJ was a woman - a very rare thing in engineering at that time. She was MechE, but my ECE class of over a hundred had just two women.
Anyway with that story background, DJ bought an IBM and I bought a Mac. They were probably the only two computers in our dorm for a while. We joked with each other about their respective pros and cons. Funny thing is that many of those same jokes and pros and cons are mentioned today.
I still have that Mac. An unmodified Mac, early off that assembly line :)
What is interesting to me reading this is to see the unmentioned influence of Steve Wozniak in the efforts to keep chip counts and costs down in the Macintosh. That seems very much a Woz thing (his designs for the Apple ][ were works of art in this respect).
https://www.businessinsider.com/steve-wozniak-thought-the-fi...
https://en.wikipedia.org/wiki/Integrated_Woz_Machine
I know that. It’s not a direct influence, but a cultural one. Woz was always looking for ways to keep things simple and cheap in the hardware and it’s clear that this influenced those who worked on the Mac.
They forgot the part where Jobs ordered that Woz’s IIgs (also being designed at the time) be kneecapped (and have its clock rate halved), so it wouldn’t overshadow the Mac.
I have also heard that the Mac people (though possibly not Jobs specifically, as he left in 1985) didn't want the //gs to compete with it, and were unhappy with having a Mac-like GUI in color on the //gs when the original Mac was black and white. It also beat the Mac to the punch with better sound capabilities and ADB.
Jobs certainly had no interest in making the Mac compatible with the Lisa, and intentionally positioned it as a competitor - effectively killing the Lisa line. Sadly, preemptive multitasking didn't become standard in mainline Mac OS until OS X in 2001 - some 18 years after it appeared on the Lisa.
Apple also didn't want the Mac and Newton lines to compete with each other, much as they are currently trying to differentiate MacBooks and iPads. Kind of a shame since the eMate seems like it was an interesting system, and pen-based and palmtop Macs could have been interesting as well. Today iPadOS still has many limitations compared to macOS, in spite of running on the same silicon.
I think that’s a myth. (a) The timing is wrong. (b) The people who worked on the IIgs seem to think WDC couldn’t deliver enough of the faster 65816 parts to meet Apple’s volume, and that’s why they never shipped a faster IIgs.
(sorry, can’t find the reference now)