stkai · 5 months ago
The source code is such a fun read (for the comments). I found some source code for GW-BASIC, and here are two of my favorites:

  ;WE COULD NOT FIT THE NUMBER INTO THE BUFFER DESPITE OUR VALIENT
  ;EFFORTS WE MUST POP ALL THE CHARACTERS BACK OFF THE STACK AND
  ;POP OFF THE BEGINNING BUFFER PRINT LOCATION AND INPUT A "%" SIGN THERE

  ;CONSTANTS FOR THE RANDOM NUMBER GENERATOR FOLLOW
  ;DO NOT CHANGE THESE WITHOUT CONSULTING KNUTH VOL 2
  ;CHAPTER 3 FIRST
Edit: GW-BASIC, not QBASIC (https://github.com/microsoft/GW-BASIC)

ndiddy · 5 months ago
Fun fact: GW-BASIC was a descendant of the original Altair BASIC. The "Translation created 10-Feb-83" headers on each source file refer to tooling Microsoft had that automatically translated the 8080 assembly to 8086 (they shouldn't be taken as a build date, since the files were manually modified after that point). Besides GW-BASIC, source code for the 6502 and 6809 rewrites of Microsoft BASIC was already available before this point (see https://www.pagetable.com/?p=774 and https://github.com/davidlinsley/DragonBasic), but I believe this is the first public release of the original 8080 BASIC code.
deathtrader666 · 5 months ago
Shouldn't it be "valiant" ?
roryirvine · 5 months ago
Sure, but in those days spellcheckers were separate apps - the most popular at the time being CorrectStar from MicroPro.

They weren't integrated into programming-oriented editors, and it would have been unusual to run them against code.

jimbob45 · 5 months ago
The best programmers I’ve known have all been deficient at spelling. I don’t know why it so uniformly appears among them.
nilsbunger · 5 months ago
Steve Jobs used to say the problem with Microsoft is they don’t have taste.

The font-shimmering effect on scroll immediately reminded me of that; it's really distracting. And you can't use reader mode to disable it.

(FWIW, I’m a fan of Bill Gates and all he’s done for the world)

toddmorey · 5 months ago
The design is fun and gave me a lot of nostalgia, but I admit they overdid it. They could have made that piece feel the same without so much distraction. And please people, support reader mode. It's not hard and it shouldn't be optional.

EDIT: Good god they animated EVERYTHING. It's not even readable... also... not one inline code sample? This is the designer trying to get an awwwards site of the day without any interest in the actual content. It's like a guitar player that solos over everyone else's solos.

nerevarthelame · 5 months ago
On top of the poor readability, my 2-year-old laptop can't even navigate through the page without CPU and GPU going insane, and my fans blasting at max speed. It's the poorest, choppiest web performance I can recall, all for what should be a simple blog post.
SpaceNoodled · 5 months ago
That's the fault of modern websites being massive JavaScript ad-playing behemoths instead of sub-1kB served HTML as god intended.
zombot · 5 months ago
But it does distract from the insufferable self-aggrandizing slime he's shitting out.
zelon88 · 5 months ago
Yes, I was shocked that Bill Gates's personal blog seems to have that "500 WordPress plugins" kinda vibe. Kinda reminds me of my old MySpace profile.
kevincox · 5 months ago
FWIW the spinning scrolling effects of Apple release announcements are nearly as bad.
nilsbunger · 5 months ago
Yeah, some of the Apple pages are pretty bad too, and I feel Jobs would've kept it more minimal. But to me, this blog is worse because it's supposed to showcase a whole article, and it's hard to read (and not even compatible with browser reader-mode).
graton · 5 months ago
Personally I like it :) Tastes differ.
fsckboy · 5 months ago
Get your hands on DONKEY.BAS; you will love it!
mimischi · 5 months ago
Makes me wonder: did Bill write all of this text? Did he decide this effect is cool and must go in? Did he even know about that text effect?
chubot · 5 months ago
Yeah, totally; the fact that it has all this extra design makes me imagine a mid-sized paid team behind it, with ghostwriters.

The voice of this blog post does sound a little corporate, tbh

phatskat · 5 months ago
I think it’s important to remember how he got to where he is and what kind of person it takes - Behind the Bastards is a fairly good reality check for most people we give kudos to https://podcasts.apple.com/us/podcast/part-one-the-ballad-of...
spookie · 5 months ago
I think it's pretty cool
microtherion · 5 months ago
Also, the text font on that page has an incredibly irritating lowercase "f".
zulu-inuoe · 5 months ago
I think it's really cute and endeared me to it immediately
piyuv · 5 months ago
“All he’s done for the world” by copyrighting Covid vaccine, eh?
Timwi · 5 months ago
I do tend to agree with your sentiment; his business practices have not been ethically stellar. That said, if you wish to bring forward criticism and be taken seriously, you'd do well to first familiarize yourself with the basic difference between copyright and patents.
zabzonk · 5 months ago
I've written an Intel 8080 emulator that was portable between Dec10/VAX/IBM VM CMS. That was easy: the 8080 can be done quite simply with a 256-value switch. I did mine in FORTRAN77.
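
A minimal sketch of that 256-way dispatch, written in C rather than FORTRAN77 for brevity (only a handful of opcodes shown, no flags or I/O, and the struct and names are illustrative, not taken from any real emulator):

  #include <stdint.h>

  typedef struct {
      uint8_t  a, b;         /* a couple of the 8080's registers */
      uint16_t pc;
      uint8_t  mem[65536];   /* 64 KB address space */
  } Cpu;

  static void step(Cpu *cpu) {
      uint8_t op = cpu->mem[cpu->pc++];     /* fetch */
      switch (op) {                         /* one case per opcode value */
      case 0x00: /* NOP      */ break;
      case 0x3E: /* MVI A,d8 */ cpu->a = cpu->mem[cpu->pc++]; break;
      case 0x80: /* ADD B    */ cpu->a = (uint8_t)(cpu->a + cpu->b); break;
      case 0xC3: /* JMP a16  */ {
          uint8_t lo = cpu->mem[cpu->pc], hi = cpu->mem[cpu->pc + 1];
          cpu->pc = (uint16_t)((hi << 8) | lo);
          break;
      }
      /* ...the remaining ~250 cases follow the same pattern... */
      default: break;                       /* unimplemented opcode */
      }
  }

Fill in all 256 opcode values and that's essentially the whole dispatch; a FORTRAN77 version would presumably do the same thing with a computed GOTO.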

Writing a BASIC interpreter, with floating point, is much harder. Gates, Allen and other collaborators' BASIC was pretty damned good.

TMWNN · 5 months ago
>Writing a BASIC interpreter, with floating point, is much harder. Gates, Allen and other collaborators' BASIC was pretty damned good.

The floating point routines are Monte Davidoff's work. But yes, Gates and Allen writing Altair BASIC on the Harvard PDP-10 without ever actually seeing a real Altair, then having it work on the first try after laboriously entering it with toggle switches at MITS in Albuquerque, was a remarkable achievement.

WalterBright · 5 months ago
What Allen did was write an 8080 emulator that ran on the -10. The 8080 is a simple CPU, so writing an emulator for it isn't hard.

https://pastraiser.com/cpu/i8080/i8080_opcodes.html

Then, their BASIC was debugged by running it on the emulator.

The genius was not the difficulty of doing that, it wasn't hard. The genius was the idea of writing an 8080 emulator. Wozniak, in comparison, wrote Apple code all by hand in assembler and then hand-assembled it to binary, a very tedious and error-prone method.

In the same time period, I worked at Aph, and we were developing code that ran on the 6800 and other microprocessors. We used full-fledged macro assemblers running on the PDP-11 to assemble the code into binary, and then download binary into an EPROM which was then inserted into the computer and run. Having a professional macro assembler and text editors on the -11 was an enormous productivity boost, with far fewer errors. (Dan O'Dowd wrote those assemblers.)

(I'm doing something similar with my efforts to write an AArch64 code generator. First I wrote a disassembler for it, testing it by generating AArch64 code via gcc, disassembling that with objdump and then comparing the results with my disassembler. This helps enormously in verifying that the correct binary is being generated. Since there are thousands of instructions in the AArch64, this is a much scaled-up version of the 8080.)

zabzonk · 5 months ago
Allen had to write the loader in machine code, which was toggled in on the Altair console. The BASIC interpreter itself was loaded from paper tape via the loader and a tape reader. The first BASIC program Allen ran on the Altair was apparently "2 + 2", which worked - i.e. it printed "4". I'd like to have such confidence in my own code, particularly the I/O, which must have been tricky to emulate on the Dec10.
teleforce · 5 months ago
Fun fact: according to Jobs, for some unknown reason Wozniak refused to add floating point support to Apple BASIC, so they had to license a BASIC with floating point numbers from Microsoft [1].

[1] Bill & Steve (Jobs!) reminisce about floating point BASIC:

https://devblogs.microsoft.com/vbteam/bill-steve-jobs-remini...

WalterBright · 5 months ago
Writing a floating point emulator (I've done it) is not too hard. First, write it in a high-level language and debug the algorithm. Then hand-assembling it is straightforward.

What is hard is skipping the high-level language stage and trying to do it in assembler in one step.
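
A toy sketch of that high-level-first step, in C, using an unpacked format invented here for illustration (no sign handling, and this is not the Altair BASIC or IEEE algorithm):

  #include <stdint.h>

  /* value = mant * 2^exp; real formats also pack a sign and a biased
     exponent into fixed bit fields, omitted to keep the algorithm visible */
  typedef struct { int exp; uint32_t mant; } SoftFloat;

  SoftFloat sf_add(SoftFloat x, SoftFloat y) {
      /* 1. align exponents: shift the smaller-exponent operand right */
      if (x.exp < y.exp) { SoftFloat t = x; x = y; y = t; }
      int d = x.exp - y.exp;
      uint32_t ymant = (d < 32) ? (y.mant >> d) : 0;

      /* 2. add mantissas */
      SoftFloat r = { x.exp, x.mant + ymant };

      /* 3. renormalize if the add carried out of the top bit */
      if (r.mant < x.mant) {                /* unsigned wraparound = carry */
          r.mant = (r.mant >> 1) | 0x80000000u;
          r.exp += 1;
      }
      return r;
  }

Once something like this passes its tests, translating it to 8080 (or any other) assembly by hand is a mostly mechanical exercise.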

zozbot234 · 5 months ago
Floating point math was a key feature on these early machines, since it opened up the "glorified desk calculator" use case. This was one use for them (along with gaming and use as a remote terminal) that did not require convenient data storage, which would've been a real challenge before disk drives became a standard. And the float implementation included in BASIC was the most common back in the day. (There are even some subtle differences between it and the modern IEEE variety that we'd be familiar with today.)
musicale · 5 months ago
I agree - it's a useful BASIC that can do math and fits in 4 or 8 kilobytes of memory.

And Bill Gates's complaint about people pirating the $150 Altair BASIC inspired the creation of Tiny BASIC, as well as the coining of "copyleft".

phkahler · 5 months ago
I still have a cassette tape with Microsoft Basic for the Interact computer. It's got an 8080.
thijson · 5 months ago
I remember my old Tandy Color Computer booting up and referencing Microsoft BASIC:

https://tinyurl.com/2jttvjzk

The computer came with some pretty good books with example BASIC programs to type in.

thesuitonym · 5 months ago
You should upload the audio to the Internet Archive!
vile_wretch · 5 months ago
I have a MS Extended Basic cassette for the Sol-20, also 8080 based.
Barrin92 · 5 months ago
What stands out to me about Gates and Allen is the serious technical chops. Writing an 8080 emulator on the PDP-10 and then an interpreter, line editor, and I/O system, all in 4KB of memory. The code is worth reading, and in addition to that they had a very solid business sense and a pretty serious work ethic for people who were 20 years old.

It stands to me in real contrast to the "fake it till you make it", "if it works you shipped too late" hustle culture that took hold of the industry, with entire products just being API wrappers. Really hope we see more companies that start out like Microsoft again.

mindwok · 5 months ago
To be fair they definitely faked it, they said they had source code for a program they hadn't even written yet! They were just also very serious about the "making it" part.
cybrox · 5 months ago
True but "fake it and then immediately proceed to make it" is definitely more appreciated than just burning through deals by lying for a long time, which "fake it till you make it" usually boils down to.
netsharc · 5 months ago
IMO, although it was complex, the human brain could still manage the complexity back then. Reading Woz's autobiography, it feels like he knew what every logic gate on the original Apple computer did.

The PDP-10 probably worked at "human speed" too...

mmooss · 5 months ago
> It stands to me in real contrast to the "fake it till you make it"

They are the all-time greatest in fake-it-til-you-make-it. They got the IBM PC OS contract without having an OS, which they bought from someone else (iirc).

> What stands out to me about Gates and Allen is the serious technical chops. Writing an 8080 emulator on the PDP-10 and then an interpreter, line editor, and I/O system, all in 4KB of memory.

Is that really so impressive? Everything then was in 4K, from all coders.

jwnin · 5 months ago
Some luck and a willingness to take risks paid off in ways that could never have been anticipated. Not sure I'll see something like the PC era in my lifetime. Perhaps mobile phones, or the Internet.
vessenes · 5 months ago
Having lived through PCs, the internet, mobile, social, crypto and AI, I'd say mobile or social has been the biggest so far, and AI is likely to have a vastly larger impact. Of course they build on each other. But the global impact of mobile and social vastly exceeds that of the PC era.
rvba · 5 months ago
Android? Internet? Reddit?
LeFantome · 5 months ago
The Internet?
wrobelda · 5 months ago
I mean… The AI?
Izikiel43 · 5 months ago
That came out of millions of dollars and man-hours of investment by Google and OpenAI.

VS

Some college students selling software they didn't have and getting it from 0 to sellable in 2 months, which led to a behemoth that still innovates to this day.

thesuitonym · 5 months ago
Consider that nobody ever sat in countless meetings asking "How can we use the PC?" They either saw the vision and went for it, or eventually ran up against the limitations of working without a PC and bought in.
jer0me · 5 months ago
The source code is linked at the end (warning: it's a 100 MB PDF).

https://images.gatesnotes.com/12514eb8-7b51-008e-41a9-512542...

pdw · 5 months ago
The printout is dated 10-SEP-75 and is labeled "VERSION 3.0 -- MORE FEATURES TO GO".

Curiously this isn't the oldest extant version of the source code. The Harvard archives have a copy of version 1.1, printed on 30 April 75. http://altairbasic.org/other%20versions/ian.htm

Aardwolf · 5 months ago
The printout also contains dates of 6-SEP-64 below it; any idea what those are?
seabass-labrax · 5 months ago
Thank you for the warning. I once used up my Internet package's entire monthly quota by following a similar link on Hacker News.
mysterydip · 5 months ago
Ironic for something designed to take up only 4KB on its target machine :)
paulddraper · 5 months ago
(It's a high-res image of the printed code.)
masfuerte · 5 months ago
Nice one. Has anyone OCRed this back into text?
pronoiac · 5 months ago
I attempted OCR with OCRmyPDF / Tesseract. It's not great, but it's under 1% the size, at least. https://github.com/pronoiac/altair-basic-source-code
n0rdy · 5 months ago
Flipping through the source code is like a time machine tour of tech's evolution over the past 50 years. It made me wonder: will our 2025 code look as ancient by 2075?

And, btw, great infographics within the post.

freedomben · 5 months ago
That's interesting to consider. Some of the GNU code is getting quite old and looking through it is a blast from the past. I'm frankly amazed that it continues to work so well. I suspect there is a relatively small group of GNU hackers out there rocking gray and white beards that are silently powering the foundations of our modern world, and I worry what's going to happen when they start retiring. Maybe we'll get Rust rewrites of everything and a new generation will take over, but frankly I'm pretty worried about it.
dasudasu · 5 months ago
They’ve already retired for the most part. Stallman for example is 72.
deanCommie · 5 months ago
I think to most (90+%?) software developers out there in the world, Assembler might as well be hieroglyphics. They/we can guess at the concepts involved, of course, but actually being able to read the code end to end and hold a mental model of what is happening is not really going to happen. Not without some sort of Rosetta Stone. (Comments :) )

I think 2075 developers will feel the same way about modern Java, C#, TypeScript, etc.

They will think of themselves as software developers but they won't be writing code the same way, they'll be giving guided instructions to much higher level tools (perhaps AIs that themselves have a provenance back to modern LLMs)

Just as today, there will still be those who need to write low-level critical code. There are still lots of people today who have to write Assembler, though they end up expressing it via C or Rust. And there will be people still working on AI technology. But even those will be built off other AIs.

Towaway69 · 5 months ago
Has there ever been a moment in human history where we've (as a society, not as individuals) looked back and been envious?

So my money is on the code I wrote today being the joke of tomorrow - for all involved.

Also, I for one don’t want to go back to punch cards ;)

bojan · 5 months ago
> Has there ever been a moment in human history where we've (as a society, not as individuals) looked back and been envious?

I am guessing the generation that transitioned from the Pax Romana to the early Middle Ages in Europe.

azemetre · 5 months ago
It's interesting reading this after finishing Palo Alto by Malcolm Harris.
jgord · 5 months ago
Added to my must-read list.

I noticed his interview on Democracy Now: https://www.youtube.com/watch?v=j7jPzzjbVuk

This guy's mental map is impressive, as are the colors of his book titles: https://www.goodreads.com/author/show/16872611.Malcolm_Harri...

jh00ker · 5 months ago
Thanks for the Democracy Now interview! His description of "tech layoffs" is the most concise framing I've heard to describe what I've felt about it:

"Cosmetic offering to the financial markets to show that Silicon Valley still can control its labor costs... It's less the future flow of funds is improved ... than that they're signaling something to the markets ..."

https://youtu.be/j7jPzzjbVuk?si=YSbUW8h2mNktzj_9&t=634