The source code is such a fun read (for the comments). I found some source code for GW-BASIC, and here are two of my favorites:
;WE COULD NOT FIT THE NUMBER INTO THE BUFFER DESPITE OUR VALIENT
;EFFORTS WE MUST POP ALL THE CHARACTERS BACK OFF THE STACK AND
;POP OFF THE BEGINNING BUFFER PRINT LOCATION AND INPUT A "%" SIGN THERE
;CONSTANTS FOR THE RANDOM NUMBER GENERATOR FOLLOW
;DO NOT CHANGE THESE WITHOUT CONSULTING KNUTH VOL 2
;CHAPTER 3 FIRST
Fun fact: GW-BASIC was a descendant of the original Altair BASIC. The "Translation created 10-Feb-83" headers on each source file refer to tooling Microsoft had that automatically translated the 8080 assembly to 8086 (it shouldn't be taken as a build date, since the files were manually modified after that point). Besides GW-BASIC, source code for the 6502 and 6809 rewrites of Microsoft BASIC was already available (see https://www.pagetable.com/?p=774 and https://github.com/davidlinsley/DragonBasic), but I believe this is the first public release of the original 8080 BASIC code.
The design is fun and gave me a lot of nostalgia, but I admit they overdid it. They could have made that piece feel the same without so much distraction. And please people, support reader mode. It's not hard and it shouldn't be optional.
EDIT: Good god they animated EVERYTHING. It's not even readable... also... not one inline code sample? This is the designer trying to get an awwwards site of the day without any interest in the actual content. It's like a guitar player that solos over everyone else's solos.
On top of the poor readability, my 2-year-old laptop can't even navigate through the page without CPU and GPU going insane, and my fans blasting at max speed. It's the poorest, choppiest web performance I can recall, all for what should be a simple blog post.
Yeah, some of the Apple pages are pretty bad too, and I feel Jobs would've kept it more minimal. But to me, this blog is worse because it's supposed to showcase a whole article, and it's hard to read (and not even compatible with browser reader-mode).
I do tend to agree with your sentiment; his business practices have not been ethically stellar. That said, if you wish to bring forward criticism and be taken seriously, you'd do well to first familiarize yourself with the basic difference between copyright and patents.
I've written an Intel 8080 emulator that was portable between the DEC-10, VAX, and IBM VM/CMS. That was easy: the 8080 can be emulated quite simply with a 256-value switch on the opcode byte. I did mine in FORTRAN77.
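That 256-value switch really is the whole trick: fetch the opcode byte, dispatch on it, repeat. A minimal sketch of the idea in C (mine was FORTRAN77; this toy version implements only a handful of the 256 cases, with flags, most registers, and I/O elided):

    #include <stdint.h>
    #include <stdio.h>

    /* Minimal 8080-style core: fetch one opcode byte, dispatch on its value.
       Only a few of the 256 cases are shown; a real emulator fills them all in. */
    typedef struct {
        uint8_t a;          /* accumulator */
        uint16_t pc;        /* program counter */
        uint8_t mem[65536]; /* 64K address space */
        int halted;
    } cpu_t;

    static void step(cpu_t *c) {
        uint8_t op = c->mem[c->pc++];
        switch (op) {
        case 0x00: /* NOP */         break;
        case 0x3E: /* MVI A, imm8 */ c->a = c->mem[c->pc++]; break;
        case 0x3C: /* INR A */       c->a++; break;
        case 0x76: /* HLT */         c->halted = 1; break;
        /* ... the remaining ~250 opcodes go here ... */
        default:
            fprintf(stderr, "unimplemented opcode %02X\n", op);
            c->halted = 1;
        }
    }

    int main(void) {
        cpu_t c = {0};
        uint8_t prog[] = {0x3E, 0x41, 0x3C, 0x76}; /* MVI A,0x41; INR A; HLT */
        for (unsigned i = 0; i < sizeof prog; i++) c.mem[i] = prog[i];
        while (!c.halted) step(&c);
        printf("A = %02X\n", c.a); /* prints A = 42 */
        return 0;
    }

The real work is filling in the other ~250 cases and the flag logic, but the structure never gets more complicated than this.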
Writing a BASIC interpreter, with floating point, is much harder. Gates, Allen, and their collaborators' BASIC was pretty damned good.
>Writing a BASIC interpreter, with floating point, is much harder. Gates, Allen, and their collaborators' BASIC was pretty damned good.
The floating point routines are Monte Davidoff's work. But yes, Gates and Allen writing Altair BASIC on the Harvard PDP-10 without ever actually seeing a real Altair, then having it work on the first try after laboriously entering it with toggle switches at MITS in Albuquerque, was a remarkable achievement.
Then, their BASIC was debugged by running it on the emulator.
The genius was not the difficulty of doing that; it wasn't hard. The genius was the idea of writing an 8080 emulator in the first place. Wozniak, in comparison, wrote Apple code all by hand in assembler and then hand-assembled it to binary, a very tedious and error-prone method.
In the same time period, I worked at Aph, and we were developing code that ran on the 6800 and other microprocessors. We used full-fledged macro assemblers running on the PDP-11 to assemble the code into binary, and then downloaded the binary into an EPROM, which was then inserted into the computer and run. Having a professional macro assembler and text editors on the -11 was an enormous productivity boost, with far fewer errors. (Dan O'Dowd wrote those assemblers.)
(I'm doing something similar with my efforts to write an AArch64 code generator. First I wrote a disassembler for it, testing it by generating AArch64 code via gcc, disassembling that with objdump, and then comparing the results with my disassembler. This helps enormously in verifying that the correct binary is being generated. Since there are thousands of instructions in AArch64, this is a much scaled-up version of the 8080 job.)
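The comparison step is easy to automate. A hypothetical harness in C, assuming both tools have been coaxed into emitting one identically formatted line per instruction; the file names here are invented for illustration:

    #include <stdio.h>
    #include <string.h>

    /* Compare two disassembly listings line by line and report the first
       divergence. Assumes both files use the same normalized line format. */
    int main(void) {
        FILE *ref  = fopen("objdump.txt", "r");  /* ground truth from objdump -d */
        FILE *mine = fopen("mydisasm.txt", "r"); /* output of the disassembler under test */
        if (!ref || !mine) { perror("fopen"); return 1; }

        char a[256], b[256];
        long line = 0;
        while (fgets(a, sizeof a, ref) && fgets(b, sizeof b, mine)) {
            line++;
            if (strcmp(a, b) != 0) {
                printf("mismatch at line %ld:\n  ref:  %s  mine: %s", line, a, b);
                return 1;
            }
        }
        puts("listings agree");
        return 0;
    }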
Allen had to write the loader in machine code, which was toggled in on the Altair console. The BASIC interpreter itself was loaded from paper tape via the loader and a tape reader. The first BASIC program Allen ran on the Altair was apparently "2 + 2", which worked - i.e. it printed "4". I'd like to have such confidence in my own code, particularly the I/O, which must have been tricky to emulate on the DEC-10.
Fun fact: according to Jobs, for some unknown reason Wozniak refused to add floating point support to Apple BASIC, so they had to license a BASIC with floating point numbers from Microsoft [1].
[1] Bill & Steve (Jobs!) reminisce about floating point BASIC: https://devblogs.microsoft.com/vbteam/bill-steve-jobs-remini...
Writing a floating point emulator (I've done it) is not too hard. First, write it in a high level language, and debug the algorithm. Then hand-assembling it is straightforward.
What is hard is skipping the high level language step, and trying to do it in assembler in one step.
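The algorithm you'd debug at the high level is mostly exponent alignment and mantissa arithmetic. A toy sketch in C of just the addition path - unsigned values only, no rounding or zero handling, and not any particular historical format:

    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Toy soft-float: value = (mant / 2^23) * 2^exp, with mant normalized
       so bit 23 is set (mantissa in [1, 2)). Assumes the two exponents are
       within 31 of each other, so the alignment shift stays defined. */
    typedef struct { uint32_t mant; int exp; } sfloat;

    static sfloat sf_add(sfloat x, sfloat y) {
        if (x.exp < y.exp) { sfloat t = x; x = y; y = t; } /* x gets the larger exponent */
        y.mant >>= (x.exp - y.exp);   /* align binary points */
        x.mant += y.mant;             /* add mantissas */
        if (x.mant & (1u << 24)) {    /* carry out: renormalize */
            x.mant >>= 1;
            x.exp++;
        }
        return x;
    }

    int main(void) {
        sfloat a = {0xC00000, 1};     /* 1.5 * 2^1 = 3.0 */
        sfloat b = {0x800000, 0};     /* 1.0 * 2^0 = 1.0 */
        sfloat s = sf_add(a, b);
        printf("%g\n", ldexp((double)s.mant, s.exp - 23)); /* prints 4 */
        return 0;
    }

Once that logic is verified, translating each line to assembler is mechanical; doing the design and the assembly in one step is where people get into trouble.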
Floating point math was a key feature on these early machines, since it opened up the "glorified desk calculator" use case. This was one use for them (along with gaming and use as a remote terminal) that did not require convenient data storage, which would've been a real challenge before disk drives became a standard. And the float implementation included in BASIC was the most common back in the day. (There are even some subtle differences between it and the modern IEEE variety that we'd be familiar with today.)
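For a concrete example of those subtle differences: the BASIC-era Microsoft Binary Format (MBF) used an exponent bias of 128 and a mantissa in [0.5, 1), versus IEEE 754's bias of 127 and [1, 2), and it had no infinities or NaNs. If I have the 32-bit layout right, conversion amounts to relocating the sign bit and subtracting 2 from the exponent; a sketch in C (ignoring exponent underflow):

    #include <stdint.h>
    #include <string.h>
    #include <stdio.h>

    /* Convert a 32-bit Microsoft Binary Format (MBF) float to IEEE 754 single.
       MBF as a dword: exponent in bits 31-24 (bias 128, mantissa in [0.5, 1)),
       sign in bit 23, mantissa fraction in bits 22-0. */
    float mbf_to_ieee(uint32_t mbf) {
        uint32_t exp  = (mbf >> 24) & 0xFF;
        uint32_t sign = (mbf >> 23) & 1;
        uint32_t frac = mbf & 0x7FFFFF;
        if (exp == 0) return 0.0f;      /* MBF zero: exponent byte is 0 */
        uint32_t ieee = (sign << 31) | ((exp - 2) << 23) | frac;
        float f;
        memcpy(&f, &ieee, sizeof f);    /* reinterpret the bit pattern */
        return f;
    }

    int main(void) {
        /* 3.0 in MBF: exponent 0x82 (2 + bias 128), mantissa 0.75 -> frac 0x400000 */
        printf("%g\n", mbf_to_ieee(0x82400000)); /* prints 3 */
        return 0;
    }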
What stands out to me about Gates and Allen is the serious technical chops: writing an 8080 emulator on the PDP-10 and then an interpreter, line editor, and I/O system all in 4KB of memory. The code is worth reading, and in addition to that they had a very solid business sense and a pretty serious work ethic for people who were 20 years old.
To me it stands in real contrast to the "fake it till you make it", "if it works you shipped too late" hustle culture that took hold of the industry, with entire products just being API wrappers. Really hope we see more companies that start out like Microsoft again.
To be fair, they definitely faked it: they said they had source code for a program they hadn't even written yet! They were just also very serious about the "making it" part.
True, but "fake it and then immediately proceed to make it" is definitely more appreciated than just burning through deals by lying for a long time, which is what "fake it till you make it" usually boils down to.
IMO, although it was complex, the human brain could still manage the complexity back then. Reading Woz's autobiography, it feels like he knew what every logic gate in the original Apple computer did.
The PDP-10 probably worked at "human speed" too...
> To me it stands in real contrast to the "fake it till you make it"
They are the all-time greatest at fake-it-till-you-make-it. They got the IBM PC OS contract without having an OS, which they then bought from someone else (iirc).
> What stands out to me about Gates and Allen is the serious technical chops: writing an 8080 emulator on the PDP-10 and then an interpreter, line editor, and I/O system all in 4KB of memory.
Is that really so impressive? Everything was in 4K back then, for all coders.
Some luck and a willingness to take risks paid off in ways that could never have been anticipated. Not sure I'll see something like the PC era in my lifetime. Perhaps mobile phones, or the Internet.
Having lived through PCs, internet, mobile, social, crypto, and AI, I'd say mobile or social has been the biggest so far, and AI is likely to have a vastly larger impact. Of course they build on each other. But the global impact of mobile and social vastly exceeds that of the PC era.
That came out of millions of dollars and man-hours of investment by Google and OpenAI.
VS
Some college students selling software they didn't have and getting it from 0 to sellable in 2 months, which led to a behemoth that still innovates to this day.
Consider that nobody ever sat in countless meetings asking "How can we use the PC?" They either saw the vision and went for it, or eventually ran up against the limitations of working without a PC and bought in.
Flipping through the source code is like a time machine tour of tech's evolution over the past 50 years. It made me wonder: will our 2025 code look as ancient by 2075?
That's interesting to consider. Some of the GNU code is getting quite old, and looking through it is a blast from the past. I'm frankly amazed that it continues to work so well. I suspect there is a relatively small group of GNU hackers out there rocking gray and white beards who are silently powering the foundations of our modern world, and I worry about what's going to happen when they start retiring. Maybe we'll get Rust rewrites of everything and a new generation will take over, but frankly I'm pretty worried about it.
I think to most (90+%?) software developers out there in the world, Assembler might as well be hieroglyphics. They/we can guess at the concepts involved, of course, but actually being able to read the code end to end and hold a mental model of what is happening is not really going to happen. Not without some sort of Rosetta Stone. (Comments :) )
I think 2075 developers will feel the same way about modern Java, C#, TypeScript, etc.
They will think of themselves as software developers but they won't be writing code the same way, they'll be giving guided instructions to much higher level tools (perhaps AIs that themselves have a provenance back to modern LLMs)
Just as today, there will still be those who need to write low-level critical code. There are still lots of people today who have to write Assembler, though they end up expressing it via C or Rust. And there will be people still working on AI technology. But even those will be built off other AIs.
Thanks for the Democracy Now interview! His description of "tech layoffs" is the most concise framing I've heard to describe what I've felt about it:
"Cosmetic offering to the financial markets to show that Silicon Valley still can control its labor costs... It's less the future flow of funds is improved ... than that they're signaling something to the markets ..."
They weren't integrated into programming-oriented editors, and it would have been unusual to run them against code.
The font-shimmering effect on scroll immediately reminded me of that; it is really distracting. And you can't use reader mode to disable it.
(FWIW, I’m a fan of Bill Gates and all he’s done for the world)
The voice of this blog post does sound a little corporate, tbh
https://pastraiser.com/cpu/i8080/i8080_opcodes.html
And Bill Gates complaining about the pirating of $150 Altair BASIC inspired the creation of Tiny BASIC, as well as the coining of "copyleft".
https://tinyurl.com/2jttvjzk
The computer came with some pretty good books with example BASIC programs to type in.
https://images.gatesnotes.com/12514eb8-7b51-008e-41a9-512542...
Curiously this isn't the oldest extant version of the source code. The Harvard archives have a copy of version 1.1, printed on 30 April 75. http://altairbasic.org/other%20versions/ian.htm
And, btw, great infographics within the post.
So my money is on the code I wrote today being the joke of tomorrow - for all involved.
Also, I for one don’t want to go back to punch cards ;)
I am guessing the generation that transitioned from the Pax Romana to the early Middle Ages in Europe.
I noticed his interview on Democracy Now: https://www.youtube.com/watch?v=j7jPzzjbVuk
This guy's mental map is impressive, as are the colors of his book titles: https://www.goodreads.com/author/show/16872611.Malcolm_Harri...
https://youtu.be/j7jPzzjbVuk?si=YSbUW8h2mNktzj_9&t=634