Readit News
Posted by u/kovac 3 years ago
Ask HN: What was being a software developer like about 30 years ago?
I'm curious what it was like to be a developer 30 years ago compared now in terms of processes, design principles, work-life balance, compensation. Are things better now than they were back then?
fd111 · 3 years ago
It was great. Full stop.

A sense of mastery and adventure permeated everything I did. Over the decades those feelings slowly faded, never to be recaptured. Now I understand nothing about anything. :-)

Starting in 1986 I worked on bespoke firmware (burned into EPROMs) that ran on bespoke embedded hardware.

Some systems were written entirely in assembly language (8085, 6805) and other systems were written mostly in C (68HC11, 68000). Self taught and written entirely by one person (me).

In retrospect, perhaps the best part about it was that even the biggest systems were sufficiently unsophisticated that a single person could wrap their head around all of the hardware and all of the software.

Bugs in production were exceedingly rare. The relative simplicity of the systems was a huge factor, to be sure, but knowing that a bug meant burning new EPROMs made you think twice or thrice before you declared something "done".

Schedules were no less stringent than today; there was constant pressure to finish a product that would make or break the company's revenue for the next quarter, or so the company president/CEO repeatedly told me. :-) Nonetheless, this dinosaur would gladly trade today's "modern" development practices for those good ol' days(tm).

bombcar · 3 years ago
> In retrospect, perhaps the best part about it was that even the biggest systems were sufficiently unsophisticated that a single person could wrap their head around all of the hardware and all of the software.

This was it: even into the 90s you could reasonably "fully understand" what the machine was doing, even with something like Windows 95 and the early internet. That started to fall apart around that time, and now there are so many abstraction layers you have to choose what you specialize in.

And the fact that you couldn't just shit another software update into the update server to be slurped up by all your customers meant you had to actually test things - and you could easily explain to the bosses why testing had to be done, and done right, because the failure would cost millions in new disks being shipped around, etc. Now it's entirely expected to ship software that has significant known or unknown bugs because auto-update will fix it later.

origin_path · 3 years ago
It isn't right to consider that time as a golden age of software reliability. Software wasn't less buggy back then. My clear recollection is that it was all unbelievably buggy by today's standards. However things we take for granted now like crash reporting, emailed bug reports, etc just didn't exist, so a lot of devs just never found out they'd written buggy code and couldn't do anything even if they did. Maybe it felt like the results were reliable but really you were often just in the dark about whether people were experiencing bugs at all. This is the origin of war stories like how Windows 95 would detect and effectively hot-patch SimCity to work around memory corruption bugs in it that didn't show up in Windows 3.1.

Manual testing was no replacement for automated testing even if you had huge QA teams. They could do a good job of finding new bugs and usability issues compared to the devs-only unit testing mentality we tend to have today, but they were often quite poor at preventing regressions because repeating the same things over and over was very boring, and by the time they found the issue you may have been running out of time anyway.

I did some Windows 95 programming and Win3.1 too. Maybe you could fully understand what it was doing if you worked at Microsoft. For the rest of us, these were massive black boxes with essentially zero debugging support. If anything went wrong you got either a crash, or an HRESULT error code which might be in the headers if you're lucky, but luxuries like log files, exceptions, sanity checkers, static analysis tools, useful diagnostic messages etc were just totally absent. Windows programming was (and largely still is) essentially an exercise in constantly guessing why the code you just wrote wasn't working or was just drawing the wrong thing with no visibility into the source code. HTML can be frustratingly similar in some ways - if you do something wrong you just silently get the wrong results a lot of the time. But compared to something more modern like JavaFX/Jetpack Compose it was the dark ages.

PaulHoule · 3 years ago
This was what passed for an "AAA" game in 1980

https://en.wikipedia.org/wiki/Ultima_I:_The_First_Age_of_Dar...

it was coded up in about a year by two people who threw in just about every idea they got.

dleslie · 3 years ago
This is why I drifted towards game development for most of my career. Consoles, until the penultimate (antepenultimate?) generation, ran software bare or nearly bare on the host machine.

I also spent time in integrated display controller development and such; it was all very similar.

Nowadays it feels like everything rides on top of some ugly and opaque stack.

treis · 3 years ago
>This was it, even into the 90s you could reasonably "fully understand" what the machine was doing, even with something like Windows 95 and the early internet. That started to fall apart around that time and now there are so many abstraction layers you have to choose what you specialize in.

This doesn't really track. 30 years ago computers were, more or less, the same as they are now. The only major addition has been graphics cards. Other than that we've swapped some peripherals. Don't really see how someone could "fully understand" the modem, video drivers, USB controllers, motherboard firmware, processor instruction sets, and the half dozen or so more things that went into a desktop.

samstave · 3 years ago
The crazy thing to me is just how many different workflows/UI/UX you need to learn across so many platforms today. AWS, GCP, Azure: you need to learn so much about each in depth to be "marketable", and the only way you'll learn all of them is if you happen to work at a company that happens to rely on said platform.

Then there is the low-level training on iLO bullshit that I've done weeks of for HPE, and I have been building and dealing with HPE servers since before they bought Compaq...

And don't even get me started on Sun and SGI... how much brain power was put into understanding those two extinct critters. Fuck, even Cray.

There is so much knowledge that has to evaporate in the name of progress...

thfuran · 3 years ago
Yeah, it's definitely great but also terrible that bugs can be patched so easily now.
jerf · 3 years ago
I've been trying to teach my young teenage kids about how things work, like, washing machines, cars, etc. One of the things I've learned is that it's a looooot easier to explain 20th century technology than 21st century technology.

Let me give you an example. My father was recently repairing his furnace in his camper, which is still a 20th century technology. He traced the problem to the switch that detects whether or not air is flowing, because furnaces have a safety feature such that if the air isn't flowing, it shuts the furnace off so it doesn't catch on fire. How does this switch work? Does it electronically count revolutions on a fan? Does it have two temperature sensors and then compute whether or not air is flowing by whether their delta is coming down or staying roughly the same temperature? Is it some other magical black box with integrated circuits and sensors and complexity greater than the computer I grew up with?

No. It's really simple. It's a big metal plate that sticks out into the airflow and if the air is moving, closes a switch. Have a look: https://www.walmart.com/ip/Dometic-31094-RV-Furnace-Heater-S... You can look at that thing, and as long as you have a basic understanding of electronics, and the basic understanding of physics one gets from simply living in the real world for a few years, you can see how that works.

I'm not saying this is better than what we have now. 21st century technology exists for a reason. Sometimes it is done well, sometimes it is done poorly, sometimes it is misused and abused, it's complicated. That fan switch has some fundamental issues in its design. It's nice that they are also easy to fix, since it's so simple, but I wouldn't guarantee it's the "best" solution. All I'm saying here is that this 20th century technology is easier to understand.

My car is festooned with complicated sensors and not just one black box, but a large number of black boxes with wires hooked in doing I have no idea what. For the most part, those sensors and black boxes have made cars that drive better, last longer, are net cheaper, and generally better, despite some specific complaints we may have about them, e.g., lacking physical controls. But they are certainly harder to understand than a 20th century car.

Computers are the same way. There is a profound sense in which computers today really aren't that different than a Commodore 64, they just run much faster. There are also profound senses in which that is not true; don't overinterpret that. But ultimately these things accept inputs, turn them into numbers, add and subtract them really quickly in complicated ways, then use those numbers to make pictures so we can interpret them. But I can almost explain to my teens how that worked in the 20th century down to the electronics level. My 21st century explanation involves a lot of handwaving, and I'm pretty sure I could spend literally a full work day giving a spontaneous, off-the-cuff presentation of that classic interview question "what happens when you load a page in the web browser" as it is!

teddyh · 3 years ago
> This was it, even into the 90s you could reasonably "fully understand" what the machine was doing

That was always an illusion, only possible if you made yourself blind to the hardware side of your system.

https://news.ycombinator.com/item?id=27988103

https://news.ycombinator.com/item?id=21003535

kabdib · 3 years ago
It was a mix of great and awful.

I wrote tons of assembly and C, burned EPROMs, wrote documentation (nroff, natch), visited technical bookstores every week or two to see what was new (I still miss the Computer Literacy bookstore). You got printouts from a 133 column lineprinter, just like college. Some divisions had email, corporation-wide email was not yet a thing.

No source code control (the one we had at Atari was called "Mike", or you handed your floppy disk of source code to "Rob" if "Mike" was on vacation). Networking was your serial connection to the Vax down in the machine room (it had an autodial modem, usually pegged for usenet traffic and mail).

No multi-monitor systems, frankly anything bigger than 80x25 and you were dreaming. You used Emacs if you were lucky, EDT if you weren't. The I/O system on your computer was a 5Mhz or 10Mhz bus, if you were one of those fortunate enough to have a personal hard drive. People still smoked inside buildings (ugh).

It got better. AppleTalk wasn't too bad (unless you broke the ring, in which case you were buying your group lunch that day). Laserprinters became common. Source control systems started to become usable. ANSI C and CFront happened, and we had compilers with more than 30 characters of significance in identifiers.

I've built a few nostalgia machines, old PDP-11s and such, and can't spend more than an hour or so in those old environments. I can't imagine writing code under those conditions again, we have it good today.

jjav · 3 years ago
> No source code control

30 years ago is 1992, we certainly had source control a long time before!

In fact in 1992 Sun Teamware was introduced, so we even had distributed source control, more than a decade before "git invented it".

CVS is from 1986, RCS from 1982 and SCCS is from 1972. I used all four of those at various points in history.

> No multi-monitor systems, frankly anything bigger than 80x25 and you were dreaming.

In 1993 (or might've been early 1994) I had two large monitors on my SPARCstation, probably at 1280×1024.

cylinder714 · 3 years ago
>(I still miss the Computer Literacy bookstore)

I used to drive over Highway 17 from Santa Cruz just to visit the Computer Literacy store on N. First Street, near the San Jose airport. (The one on the Apple campus in Cupertino was good, too.)

Now, all of them—CL, Stacy's Books, Digital Guru—gone. Thanks, everyone who browsed in stores, then bought on Amazon to save a few bucks.

strangattractor · 3 years ago
Agree with the poster. Much better IMHO and more enjoyable back then.

Because of the software distribution model back then, there was a real effort to produce a quality product. These days, not so much. Users are more like beta testers now; apps get deployed at the press of a key. The constant UI changes for apps (Zoom comes to mind) are difficult for users to keep up with.

The complexity is way way higher today. It wasn't difficult to have a complete handle on the entire system back then.

Software developers were valued more highly. The machines lacked speed and resources, so it took more skill/effort to get performance from them. Not so much of an issue today.

Still a good job, but I would likely seek something different if I were starting out today.

hirvi74 · 3 years ago
> Still a good job but I would likely seek something different if I was starting out today

I'm only 6 years in, and I am starting to feel this.

I went into computer science because I knew, at some level, that it was something I always wanted to do. I've been fascinated with technology ever since I was a child: how things work, why things work, etc.

While studying computer science at my average state school, I met a few others who were a lot like me. We'd always talk about some cool new technology, work on things together, etc. There was a real passion for the craft, in a sense. I felt something similar during my time studying music with my peers.

Perhaps, in some naive way, I thought the work world would be a lot like that too. Of course, this is only my experience so far, but I have found my peers to be significantly different.

People I work with do not seem to care about technology, programming, etc. They care about dollar signs, promotions, and getting things done as quickly as possible (faster != better quality). Sure, those three things are important to varying degrees, but they're not why I chose computer science, and I struggle to connect with those people. I've basically lost my passion for programming because of it (though that is not the entire reason; burnout and whatnot have contributed significantly).

I'm by no means a savant, nor would I even consider myself that talented, but I used to have a passion for programming, and that made all the "trips" and "falls" while learning worth it in the end.

I tell people I feel like I deeply studied the ins and outs of photography only to take school pictures all day.

mattgreenrocks · 3 years ago
> A sense of mastery and adventure permeated everything I did.

How much of that is a function of age? It is hard to separate that from the current environment.

Personally, I don't feel as inspired by the raw elements of computing like I once did, but it is probably more about me wanting a new domain to explore than something systemic. Or at least, it is healthier to believe that.

> knowing that a bug meant burning new EPROMs made you think twice or thrice before you declared something "done".

The notion of Internet Time, where you're continuously shipping, has certainly changed how we view the development process. I'd argue it is mostly harmful, even.

> perhaps the best part about it was that even the biggest systems were sufficiently unsophisticated that a single person could wrap their head around all of the hardware and all of the software.

I think this is the crux of it: more responsibility, more ownership, fewer software commoditization forces (frameworks), less emphasis on putting as many devs on one project as possible because all the incentives tilt toward more headcount.

psychphysic · 3 years ago
Yes, indeed, it could be the Dunning-Kruger effect.
gautamdivgi · 3 years ago
There wasn't HN, so no distraction to digress to every now and then.

I second this: systems were small and most people could wrap their brains around them. Constant pressure existed, and there wasn't "Google" & "SO" & other blogs to search for solutions. You had to discover things by yourself. Language and API manuals weighed quite a bit; just moving them around the office was somewhat decent exercise.

There wasn't as much build-vs-buy discussion. If it was simple enough you just built it. I spent my days & evenings coding and my nights partying. WFH didn't exist, so if you were on-call you were at work. When you were done you went home.

My experience from 25 years ago.

convolvatron · 3 years ago
I actually used to do 'on call' by having a vt100 at the head of my bed and I would roll over every couple hours and check on things over a 9600 baud encrypted modem that cost several thousand dollars.

the only time I ever had to get up in the middle of the night and walk to the lab was the Morris worm. I remember being so grateful that someone brought me coffee at 7

doug_durham · 3 years ago
I have one word for you "Usenet".
chinchilla2020 · 3 years ago
A lot of our modern software practices have introduced layers of complexity onto systems that are very simple at a fundamental level. When you peel back the buzzword technologies you will find text streams, databases, and REST at the bottom layer.

It's a self-fulfilling cycle. Increased complexity reduces reliability and requires more headcount. Increasing headcount advances careers. More headcount and lower reliability justify the investment in more layers of complicated technologies to 'solve' the 'legacy tech' problems.

PontifexMinimus · 3 years ago
> A sense of mastery and adventure permeated everything I did.

My experience too. I did embedded systems that I wrote the whole software stack for: OS, networking, device drivers, application software, etc.

> Over the decades those feelings slowly faded, never to be recaptured. Now I understand nothing about anything.

These days programming is more trying to understand the badly-written documentation of the libraries you're using.

Jeema101 · 3 years ago
I'm younger than you, but one of my hobbies is messing around with old video game systems and arcade hardware.

You're absolutely right - there's something almost magical in the elegant simplicity of those old computing systems.

UncleOxidant · 3 years ago
> A sense of mastery and adventure permeated everything I did. Over the decades those feelings slowly faded, never to be recaptured. Now I understand nothing about anything. :-)

Are you me? ;) I feel like this all the time now. I also started in embedded dev around '86.

> Nonetheless, this dinosaur would gladly trade today's "modern" development practices for those good ol' days(tm).

I wouldn't want to give up git and various testing frameworks. Also modern IDEs like VSCode are pretty nice and I'd be hesitant to give those up (VSCode being able to ssh into a remote embedded system and edit & debug code there is really helpful, for example).

HankB99 · 3 years ago
And it had its downsides too:

- Developing on DOS with non-networked machines. (OK, one job was on a PDP-11/23.)
- Subversion (IIRC) for version control via floppy: barely manageable for a two-person team.
- No Internet. Want to research something? Buy a book.
- Did we have free S/W? Not like today. Want to learn C/C++? Buy a compiler. I wanted to learn C++ and wound up buying OS/2 because it was bundled with IBM's C++ compiler. Cost a bit less than $300 at the time. The alternative was to spend over $500 for the C++ compiler that SCO sold for their UNIX variant.
- Want to buy a computer? My first was $1300. That got me a Heathkit H-8 (8080 with 64 KB RAM), an H19 (serial terminal that could do up to 19.2 Kbaud) and a floppy disk drive that could hold (IIRC) 92KB of data. It was reduced/on sale and included a Fortran compiler and macro-assembler. Woo!

The systems we produced were simpler, to be sure, but so were the tools. (Embedded systems here too.)
airbreather · 3 years ago
Yeah, I am almost identical: lots of 6805, floating point routines and bit-banging RS232, all in much less than 2K of code memory, making functional products.

Things like basketball scoreboards, or tractor spray controllers to make the application of herbicide uniform regardless of speed. Made in a small suburban factory in batches of a hundred or so, by half a dozen to a dozen "unskilled" young ladies, who were actually quite skilled.

No internet; the odd book and magazines; for the rest of it, work it out yourself.

In those days it was still acceptable, if not mandatory, to use whatever trick you could come up with to save some memory.

It didn't matter about the direct readability, though we always took great pains in the comments for the non-obvious, including unspecified addressing modes and the like.

This was around the time the very first blue LEDs came out.

When the web came along, with all the frameworks etc., it just never felt right to be relying on arbitrary code someone else wrote whose pedigree you did not know.

Or that you had at least paid for, so that you had someone to hassle if it was not doing what you expected, and some sort of warranty.

But there was also a lot of closed source, and libraries you paid for, if you wanted to rely on someone else's code and needed to save time or do something special: an awful lot compared to today.

Microsoft C was something like $3000 (maybe $5k, can't remember exactly) at a time when that would buy a decent second-hand car and a young engineer might be getting 20-25k a year tops (AUD).

Turbo C was a total breakthrough, and the 286 was the PC of choice, with a 20MB hard drive, with the Compaq 386-20 just around the corner.

Still, I wouldn't go back when I look at my current 11th-gen Intel CPU with 32GB of RAM, 2 x 1TB SSDs and a 1080Ti graphics card, with multiple 55-inch 4K monitors; not even dreamable at the time.

convolvatron · 3 years ago
don't forget the community. it was very much the case that you could look at an IETF draft or random academic paper and mail the authors and they would almost certainly be tickled that someone cared, consider your input, and write you back.

just imagine an internet pre-immigration-lawyer where the only mail you ever got was from authentic individuals, and there were no advertisements anywhere.

the only thing that was strictly worse was that machines were really expensive. it wasn't at all common to be self-funded

deathanatos · 3 years ago
> knowing that a bug meant burning new EPROMs made you think twice or thrice before you declared something "done".

> Schedules were no less stringent than today;

So … how did that work, then? I know things aren't done, and almost certainly have bugs, but it's that stringent schedule and the ever-present PM attitude of "is it hobbling along? Good enough, push it, next task" never connecting the dots to "why is prod always on fire?" that causes there to be a never ending stream of bugs.

ipaddr · 3 years ago
With no PMs you dealt directly with the boss and managed your own tasks, so you had a hard deadline, showed demos, and once it was done did support/training. It was waterfall, so not finishing on time meant removing features, and finishing early meant adding additional features if you had time. Everything was prod. You needed to fix showstopper bugs/crashes, but bugs could be harmless (spelling, for example), or situational and complex, or showstoppers. You lived with them, because bugs were part of the OS or programming language or memory driver experience at the time.
HeyLaughingBoy · 3 years ago
As my old boss once said (about 30 years ago actually!) when complaining about some product or the other "this happens because somewhere, an engineer said, 'fuck it, it's good enough to ship'."
jokethrowaway · 3 years ago
I wonder how much of this is due to getting old vs actual complexity.

When I started I was literally memorising the language of the day and I definitely mastered it. Code was flowing on the screen without interruption.

Nowadays I just get stuff done; I know the concepts are similar, I just need to find the specifics and I'm off to implement. It's more akin to a broken faucet and it definitely affects my perception of modern development.

Rediscover · 3 years ago
Thanks. I'd forgotten how much the 68705 twisted my mind.

And how much I love the 68HC11 - especially the 68HC811E2FN, gotta get those extra pins and storage! I never have seen the G or K (?) variant IRL (16K/24K EPROM respectively and 1MB address space on the latter). Between the 68HC11 and the 65C816, gads I love all the addressing modes.

Being able to bum the code using zero-page or indirectly indexed or indexed indirectly... Slightly more fun than nethack.


fgatti · 3 years ago
https://en.wikipedia.org/wiki/Rosy_retrospection

I am sure everything was great back then, but... I've been coding for 20 years, and a lot of problems of different types (including recurring bugs) have been solved with better tooling, frameworks and tech overall. I don't miss too much.

chkaloon · 3 years ago
Exactly my experience coming out of school in 1986. Only for me it was microcontrollers (Intel 8096 family).

Thanks for bringing back some great memories!

mech422 · 3 years ago
I miss everything being a 'new challenge'... Outside of accounting systems - pretty much everything was new ground, greenfield, and usually - fairly interesting :-)


abraxas · 3 years ago
I started my first dev job in early 1997 which is more like 25 than 30 years ago but I think the milieu was similar.

The internet was mostly irrelevant to the line of work I was involved in although it was starting to have impact. We had one ISDN 2x line for the entire office. It was set up to open on demand and time out a few minutes later as it was billed by the minute.

I worked on an OpenGL desktop application for geoscience data visualization running on Irix and Solaris workstations.

The work life balance was great as the hardware limitations prevented any work from home. Once out of the office I was able to go back to my family and my hobbies.

Processes were much lighter with far less security paranoia as cyber attacks weren't a thing. Biggest IT risk was someone installing a virus on a computer from the disk they brought to install Doom shareware.

The small company I worked for did not have the army of product managers, project managers or any similar buffoonery. The geologists told us developers what they needed, we built it and asked if they liked the UI. If they didn't we'd tweak it and run it by them again until they liked it.

In terms of software design, OO and Gang of Four Patterns ruled the day. Everyone had that book on their desks to accompany their copies of Effective C++ and More Effective C++. We took the GoF a little too seriously.

Compensation was worse for me though some of that is a function of my being much more advanced in my career. These days I make about 10x what I made then (not adjusted for inflation). That said, I led a happier life then. Not without anxiety to which I'm very prone but happier.

doug_durham · 3 years ago
Effective C++ was an amazing book. I bought copies for the entire team out of my own pocket. The Gang of Four on the other hand was an unfortunate turn for the industry. As you say we took it too seriously. In practice very few projects can benefit from the "Factory pattern", but I've seen it used in way too many projects to the detriment of readability. I worked in one source code base where you had to invoke 4 different factories spread across many different source files just to allocate one object.
munificent · 3 years ago
> As you say we took it too seriously.

The real problem is that many people didn't actually read the book or, if they did, they only took part of it seriously.

Each pattern chapter has a pretty long section that details when you should and should not use the pattern. The authors are very clear about understanding the context and not mis-applying patterns.

But once it became popular (which happened because these patterns are quite useful), it got cargo culted and people started over-applying them because it sent a social signal that, "Hey, I must be a good developer because I know all these patterns."

The software engineering world is a much better one today because of that book now that the pendulum has swung back some from the overshoot.

raspberry1337 · 3 years ago
I am a newly employed engineer, and as of today I am assigned to learn design patterns (the book and all). Needless to say, I am very intrigued. Could you expand on what you mean by taking it too far, beyond over-applying the patterns?
feoren · 3 years ago
> In practice very few projects can benefit from the "Factory pattern"

The factory pattern in C#:

    public IMeasuringDevice CreateMeasuringDevice(Func<IUnitConverter> unitConverterFactory)
In TypeScript:

    function createMeasuringDevice(unitConverterFactory: () => UnitConverter): MeasuringDevice
Very few projects can benefit from this!?
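To make the signatures above concrete, here is a minimal runnable sketch in TypeScript. The `UnitConverter`/`MeasuringDevice` names follow the snippet; the inches-to-centimetres converter and the shapes of the interfaces are invented for illustration, not from any real library:

```typescript
// Minimal sketch of the factory-as-parameter idea from the snippet above.
interface UnitConverter {
  toMetric(value: number): number;
}

interface MeasuringDevice {
  read(raw: number): number;
}

// The caller supplies the factory, so the device never hard-codes
// which converter it uses.
function createMeasuringDevice(
  unitConverterFactory: () => UnitConverter
): MeasuringDevice {
  const converter = unitConverterFactory();
  return { read: (raw) => converter.toMetric(raw) };
}

// Swapping converters means swapping one argument, nothing else.
const inchDevice = createMeasuringDevice(() => ({
  toMetric: (v) => v * 2.54, // inches to centimetres
}));
console.log(inchDevice.read(10)); // roughly 25.4
```

The benefit only shows at the call site, where the construction decision is deferred to the caller; which is also exactly why the pattern is easy to over-apply when there is only ever one converter.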

mikrl · 3 years ago
Would you say the GoF are more descriptive than prescriptive?

That is, not “do these to write good code” but “well written code looks like this”

lowercased · 3 years ago
> The internet was mostly irrelevant to the line of work I was involved in although it was starting to have impact. We had one ISDN 2x line for the entire office. It was set up to open on demand and time out a few minutes later as it was billed by the minute.

Early gig I had in '97 was working on building an internal corp intranet for a prototyping shop. There were around 50-60 folks there, probably 20 "upstairs" doing the office/business work. I was upstairs. I was instructed to build this in FrontPage. Didn't want to (was already doing some decent PHP on the side) but... hey... the IT guy knew best.

Asked for some books on FP. Nope, denied. So I spent time surfing through a lot of MS docs (they had a moderate amount of online docs for FP, seemingly) and a lot of newsgroups. I was pulled aside after a while and told I was using too much bandwidth. The entire building had, as you did, a double ISDN line: a whopping 128k shared between 20+ people. I was using "too much" and this was deemed "wrong". I pointed out that they had decided on the tool, which wasn't a great fit for the task, and then refused to provide any support (books/etc). I left soon after. They were looking for a way to get me out; I think they realized an intranet wasn't really something they could pull off (certainly not in FP) but didn't want to "fire" me specifically, as that wasn't a good look. Was there all of... 3 months, IIRC. Felt like an eternity.

Working in software in the 90s, a bookstore with good tech books became invaluable, as did newsgroups. No Google, no Stack Overflow, often very slow internet, or... sometimes none at all.

AstralStorm · 3 years ago
For some things, it was possible to substitute grit, experimentation, practice and ultimately mastery.

But for others, especially being closer to hardware, a good book was necessary. These days, it still might be.

pjc50 · 3 years ago
Similarly I started in 2000, just as internet applications were starting to become a thing. I could list a whole load of defunct tech stacks: WebSphere, iPlanet, NSAPI, Zeus Web Server (where I worked for about a year), Apache mod_perl, Delphi etc. And the undead tech stack: MFC.

Compensation: well, this is the UK so it's never been anywhere near US levels, but it was certainly competitive with other white-collar jobs, and there was a huge spike briefly around 2001 until the "dotcom bubble" burst and a whole load of us were laid off.

Tooling: well, in the late 90s I got a copy of Visual Studio. I still have Visual Studio open today. It's still a slow but effective monolith.

The big difference is version control: not only no git, but no svn. I did my undergraduate work in CVS, and was briefly exposed to SourceSafe (in the way that one is exposed to a toxin).

Most of the computers we used back in 2000 were less powerful than an RPi4. All available computers 30 years ago would be outclassed by a Pi, and the "supercomputers" of that day would be outclassed by a single modern GPU. This... makes less difference than you'd expect to application interactive performance, unless you're rendering 3D worlds.

We ran a university-wide proto-social-network (vaguely similar to today's "cohost") off a Pentium with a 100MB hard disk that would be outclassed by a low-end Android phone.

Another non-obvious difference: LCD monitors weren't really a thing until about 2000 - I was the first person I knew to get one, and it made a difference in reducing eyestrain. Even if at 800x600 14" it was a slight downgrade from the CRT I had on my desk.

ComputerGuru · 3 years ago
I kept buying used higher-end CRTs for almost a decade because their refresh rate and resolution so greatly outstripped anything LCD that was available for sale.
rejectfinite · 3 years ago
My parents got an early 2002 LCD display... I never knew what I lost... by not gaming on a CRT. Low rez too... sad. All for what, space and the "environment"?

Like, look at this shit: https://imgur.com/a/FiOf7Vw

https://youtu.be/3PdMtwQQUmo

https://www.youtube.com/watch?v=Ya3c1Ni4B_U

evilbob93 · 3 years ago
I went to a PC expo of some sort in NYC in 1999 because I was in town for an interview. LCDs had just come out, and every exhibit in the hall had them, partly because they were new but also because you could ship a whole bunch of flat screens for the same weight as a decent CRT.
doktorhladnjak · 3 years ago
I was working at an internet startup in 1996. We basically built custom sites for companies.

It’s hard now to appreciate how “out there” the internet was at the time. One of the founders with a sales background would meet with CEOs to convince them they needed a website. Most of those meetings ended with a, “We think this internet web thing is a fad, but thanks for your time”.

shuntress · 3 years ago
It's interesting to consider this viewpoint 30 years later and wonder what will bring about the next age. Is it something in its infancy being dismissed as a fad? Have we even thought of it yet?
gwbas1c · 3 years ago
> I started my first dev job in early 1997 which is more like 25 than 30 years ago but I think the milieu was similar.

My first internship was in 2000, and I feel like, overall, not a lot has changed except the deck chairs. Things still change just as fast as back then.

markus_zhang · 3 years ago
Taking inflation, and especially rocketing housing costs, plus work-life balance into account, maybe you actually earned more in 1997 than today for the same kind of job?
evo_9 · 3 years ago
The thing I remember most about those days was how often I went to Barnes & Noble. You simply couldn't find the information online at that point. I'd go and buy a coffee and sit with a stack of books on a given topic I needed to research; then after ~45 minutes I'd decide and buy one or two books, and head back to the office to get to work.
randcraw · 3 years ago
Or to Computer Literacy bookstore. Each time I attended a conference in the Bay Area I made sure to drop into their store in San Jose to spend hours poring over the multitude of recent books on all the new stuff happening in computing. I then had to lug 3-5 heavy books back on the plane with me. Then CL opened a store near me (Tyson's Corner in northern Virginia) which I visited at least weekly. I musta spent thousands on books back then, especially from O'Reilly. The world of computing was exploding and just keeping up with it was a challenge but also a major blast.

No source on the changes afoot then in computing was more compelling than WiReD Magazine. Its first 3-5 years were simply riveting: great insightful imaginative stories and fascinating interviews with folks whose font of creative ideas seemed unstoppable and sure to change the world. Each month's issue sucked all my time until it was read cover to cover and then discussed with others ASAP. That was a great time to be young and alive.

But Wired wasn't alone. Before them, Creative Computing and Byte were also must reads. Between 1975 and maybe 1990, the computing hobbyist community was red hot with hacks of all kinds, hard and soft. No way I was going to take a job that was NOT in computing. So I did. Been there ever since.

thewileyone · 3 years ago
Awesome to see CL listed here ... worked at computerliteracy.com which eventually became fatbrain.com. Good times!
getpost · 3 years ago
Or the library! The GIF spec was published in a magazine IIRC. I wrote a GIF viewer that supported CGA, EGA, VGA, ... displays.
bombcar · 3 years ago
The library had some things, but man things were moving so fast in the late 80s early 90s that you often had to buy the books you needed directly; because by the time they appeared in the library you'd be on to something else.

The right magazines were worth their weight in gold back then, for sure.

TigeriusKirk · 3 years ago
The MSDN library CDs were indispensable for Windows developers in the 90s. Amazing resource all at your fingertips! What a time we were living in!
secondcoming · 3 years ago
A pirated copy of Visual Studio 2005 started my career.

We didn't have internet at home, and I was still in school, so the Knowledge Base articles on the MSDN CDs pretty much taught me.

massinstall · 3 years ago
Oh, you just made me completely melancholic with that atmospheric description! Makes me miss these times a lot. The abundance of information is truly a blessing, but also a curse.
ja27 · 3 years ago
Yeah it was huge for me when books started to come with CD-ROM copies (c. 1997?) and I could fit more than one "book" in my laptop bag.
suzzer99 · 3 years ago
The O'Reilly Cookbooks were always the best.

I still have most of my dev books. I figure if I ever get a huge bookshelf they'll help fill it out, and give the kids something to talk about.

JKCalhoun · 3 years ago
Or the “Computer Literacy” bookstore in the Silly Valley.
Yhippa · 3 years ago
I miss those days. The books weren't perfect but I feel like enough quality was put into a lot of them because it was hard to errata a print run. Of course there is a lot more information out there for free nowadays but it's harder to sift through. I think the nicer thing is that eventually you'll find content that speaks to you and the way you learn.
datavirtue · 3 years ago
God damn, I miss those days!
davidrupp · 3 years ago
My go-to was Softpro Books in Denver. I would scan the shelves at B&N and Borders too, just in case, but Softpro had a much better selection.
mikewarot · 3 years ago
I was a programmer back in the MS-DOS and early Windows days. My language of choice was Turbo Pascal. Source control consisted of daily backups to ZIP files on floppy disks. The program I wrote talked to hand-held computers running a proprietary OS that I programmed in PL/N, their in-house variant. The communications ran through a weird custom card that talked SDLC (I think).

I was the whole tech staff, work-life balance was reasonable, as everything was done during normal day-shift hours. There was quite a bit of driving, as ComEd's power plants are scattered across the Northern half of Illinois. I averaged 35,000 miles/year. It was one of the most rewarding times of my life, work wise.

The program was essentially a set of CRUD applications, and I wrote a set of libraries that made it easy to build editors, much in the manner of the then-popular dBASE II PC database. Just call with X,Y,Data, and you had a field editor. I did various reports and for the most part it was pretty easy.

The only odd bit was that I needed to do multi-tasking and some text pipelining, so I wrote a cooperative multi-tasker for Turbo Pascal to enable that.

There weren't any grand design principles. I was taught a ton about User Friendliness by Russ Reynolds, the Operations Manager of Will County Generating Station. He'd bring in a person off the floor, explain that he understood this wasn't their job, and that any problems they had were my fault, and give them a set of things to do with the computer.

I quickly learned that you should always have ** PRESS F1 FOR HELP ** on the screen, for example. Russ taught me a ton about having empathy for the users that I carried throughout my career.

carlgreene · 3 years ago
> It was one of the most rewarding times of my life, work wise.

Did you feel this way in the moment, or did you realize it when looking back?

mikewarot · 3 years ago
I was standing outside the gates of Crawford Generating Station, when I realized that no matter what was wrong, when I was done with my visit, they were going to be happy. It was that moment of self actualization that doesn't often come around.

Looking back in retrospect I see how dead nuts simple everything was back then, and how much more productive a programmer could be, even with the slow as snot hardware, and without Git. Programming has gone far downhill since then, as we try to push everything through the internet to an interface we don't control. Back then, you knew your display routines would work, and exactly how things would be seen.

criddell · 3 years ago
Not the person you replied to, but I definitely felt that way in the moment. My success and enjoyment writing Clipper (a dBase III compiler) and Turbo Pascal applications for local businesses while I was in high school is the reason I went on to get a computer science degree at university.
LastTrain · 3 years ago
Also not the person you replied to, but yes, in the moment. The feeling was that I couldn't believe someone would pay me for that.
petilon · 3 years ago
It was awful. And it was great.

The awful part was C++. There were only two popular programming languages: C++ and Visual Basic. Debugging memory leaks, and memory corruption due to stray pointers and so on in C++ was a nightmare. Then Java came out and everything became easy.

The great part was everyone had offices or at least cubicles. No "open floor plan" BS. There was no scrum or daily standup. Weekly status report was all that was needed. There was no way to work when you're not at work (no cell phone, no internet), so there was better work-life balance. Things are definitely much worse now in these regards.

All testing was done by QA engineers, so all developers had to do was write code. Code bases were smaller, and it was easier to learn all there is to learn because there was less to learn back then. You released product every 2.5 years, not twice a week as it is now.

dingaling · 3 years ago
> There were only two popular programming languages: C++ and Visual Basic.

And COBOL. Vast, vast plurality of the business economy ran on COBOL. We also had mainframe assembler for when speed was required, but COBOL had the advantage of portability to both mainframe and minicomputer. Anything fast on the mini was written in C.

When I started we had a PC to use for general office tasks ( documents, e-mails and such ) and a 3270 or 5250 green-screen terminal for actual work. The desks groaned under the weight and the heat was ferocious. Overhead lockers were jam-packed with code printouts on greenbar and hundreds of useful documents. "Yeah I have that in here somewhere" and Bob would start to burrow into stacks of pages on his desk.

Cubicle walls were covered with faded photocopies of precious application flowcharts and data file definitions.

Updates to insurance regulations would arrive in the post and we were expected to take ownership and get them implemented prior to compliance dates. There was no agile, no user stories, no QA teams, no 360 reviews. Just code, test, release.

You knew who the gurus were because they kept a spare chair in their cubicles for the comfort of visitors.

Good times.

jrochkind1 · 3 years ago
And don't forget Perl. :)
hnfong · 3 years ago
Pretty sure Pascal/Delphi was also popular until the early 2000s...
maxshm · 3 years ago
I remember Turbo Pascal 3.0, the one that generated COM files for MS-DOS (like EXE files, but about 2KB smaller).

I loved that Turbo.com and COM files 30 years ago!

Later I started to use Turbo Pascal 5.5 with OO support and a good IDE.

aliqot · 3 years ago
Still popular! Where's all my FreePascal nerds?
coldpie · 3 years ago
> The great part was everyone had offices or at least cubicles. No "open floor plan" BS. There was no scrum or daily standup. Weekly status report was all that was needed. There was no way to work when you're not at work (no cell phone, no internet), so there was better work-life balance. Things are definitely much worse now in these regards.

FWIW I have had all of these at every place I've worked, including my current job. Places like that are out there. If you're unhappy with your current job, there's never been a better time to move.

toolslive · 3 years ago
C++ was different on different operating systems (every compiler rolled its own template instantiation model). Portability was hard work.
imron · 3 years ago
And you downloaded the sgi stl.

Deleted Comment

BurningFrog · 3 years ago
Moving from C++ to Java in 1998 instantly made me twice as productive, as I was no longer spending half my time managing memory.

Together with starting pair programming in 2004, that is the biggest improvement in my work life.

jjav · 3 years ago
> There were only two popular programming languages: C++ and Visual Basic.

Not really. Back in 1992 I was doing mostly C and second was perl, with shell scripting thrown in the edges.

blablabla123 · 3 years ago
I didn't start that long ago but at my first fulltime job I also had my own office. An unthinkable luxury compared to now. Also figuring out requirements on my own was nice. On the other hand I think work was much more isolated; the office was in the middle of nowhere. Also during that time it was still normal that every second project failed or became some sort of internal vaporware. Functioning management seemed almost non-existent.
mech422 · 3 years ago
PL/1 on Stratus and RPG on sys/[36|38] and AS/400 checking in! :-D
granshaw · 3 years ago
My first job was in 2010, not that long ago but still long enough to experience offices and no standups... definitely good times
anovikov · 3 years ago
Was Valgrind already a thing?
relaxing · 3 years ago
Valgrind was huge when it became available early in the 21st century, for finding leaks but also because it gave us engineers ammunition to use against management to keep our development systems running on Linux.

There were other profiling tools before then, but they were extremely pricey.

DanielBMarkham · 3 years ago
Good times. Most of us hadn't gotten "high on our own supply" yet, and the field was wide open.

You got the feeling of a thousand developers all running off in different directions, exploring the human condition and all of the massively cool things this new hammer called "programming" could do.

Compare that to today. Anywhere you go in the industry, it seems like there's already a conference, a video series, consultants, a community, and so on. Many times there are multiple competing groups.

Intellectually, it's much like the difference folks experienced comparing going cross country by automobile in say, 1935 versus 2022. Back then there was a lot of variation and culture. There were also crappy roads and places you couldn't find help. Now it's all strip malls and box stores, with cell service everywhere. It's its own business world, much more than a brave new frontier. Paraphrasing Ralphie in "A Christmas Story", it's all just crummy marketing.

(Of course, the interesting items are those that don't map to my rough analogy. Things like AI, AR/VR, Big Data, and so on. These are usually extremely narrow and at the end of the day, just bits and pieces from the other areas stuck together)

I remember customers asking me if I could do X, figuring out that I could, and looking around and not finding it done anywhere else. I'm sure hundreds, maybe thousands of other devs had similar experiences.

Not so much now.

jedberg · 3 years ago
Lots of books! We had the internet but it wasn't very useful for looking up information about programming. We had usenet but it would take a while to get an answer, and often the answer was RTFM.

But what we did have were O'Reilly books! You could tell how senior an engineer was by how many O'Reilly books were on their shelf (and every cubicle had a built in bookshelf to keep said books).

I remember once when our company fired one of the senior engineers. The books were the property of the company, so they were left behind. Us junior engineers descended on his cubicle like vultures, divvying up and trading the books to move to our own shelves.

I still have those books somewhere -- when I got laid off they let me keep them as severance!

evgen · 3 years ago
In addition to the ubiquitous ORA books (really, did anyone ever understand Sendmail config files before the bat book?) there were also a lot of print-outs. Huge swaths of code printed on 132-col fanfold paper. You might have a backup copy of the source code on tape somewhere, but nothing made you feel secure like having a copy of the previous working version printed out and stashed somewhere on your desk or in a drawer.
bombcar · 3 years ago
Lots of coding was done on those printouts also - you'd print out your function or your program and sit back and mark it up - especially if you were discussing or working with someone. Screens were small back then!
hinata08 · 3 years ago
oh these books

they're still pretty much all around 'old' IT companies, displayed on shelves and in bookcases, as artifacts that explain what the old languages and systems were.

I love the retro-futuristic vibe of the covers of some of these. And of their content. They invite the reader to leap into the future with bash, explain how Linux used to work, how past versions of .NET and Java were breakthroughs, how to code with XML,...

As a junior who has hardly read any of these, I find them pretty poetic, and I like the reflection they bring on IT jobs. The languages and technologies will change, but good looking code is timeless

nonrandomstring · 3 years ago
Fun!

Precarious. Very slow. Like a game of Jenga, things made you nervous. Waiting for tapes to rewind, or slowly feeding in a stack of floppies, knowing that one bad sector would ruin the whole enterprise. But that was also excitement. Running a C program that had taken all night to compile was a heart-in-your-mouth moment.

Hands on.

They say beware a computer scientist with a screwdriver. Yes, we had screwdrivers back then. Or rather, developing software also meant a lot of changing cables and moving heavy boxes.

Interpersonal.

Contrary to the stereotype of the "isolated geek" rampant at the time, developing software required extraordinary communication habits, seeking other experts, careful reading, formulating concise questions, and patiently awaiting mailing list replies.

Caring.

Maybe this is what I miss the most. 30 years ago we really, truly believed in what we were doing... making the world a better place.