What would you change about software development if your apps must last 50 years?
What must software developers do differently to build apps that will still work 50 years from now?
Can software be durable?
Stop chasing trends and changing for the sake of change. At my job, management is constantly forcing us to reinvent the wheel. The systems we used 15 years ago would still work fine, and would be very robust and mature by now. Instead, with each rewrite new gaps and bugs come to the surface. For some reason we choose to live in Groundhog Day instead of choosing to prioritize cheap, boring stability. Each new solution feels more fragile than the last. When we know someone will tell us to throw it all out in 2 years, there is little incentive to prioritize durability.
Modern software development started for me in 1985, when the Choplifter arcade game was mistakenly placed in a college dorm housing several student engineers, and we disassembled it. Within 15 minutes of it being dropped off, we had the back door open and the EPROMs lifted from the motherboard. Within a couple of hours we had reverse engineered the code, modified it for a free-game hack, burned new EPROMs, and put it back in service.
In honor of the 40th anniversary of this hack, I recently played the stock version in an emulator on a web page. Code lives on, I suppose.
There's fifty-year-old simulation software in use today. I ported some to semi-modern x64 and, in the process, learned things I've gratefully forgotten about floating point. Also that "VAX" and "SUN" Unix behaved differently, for whatever that's worth.
I would say that was viable because it had zero dependencies and could be built with the equivalent of gfortran *.f77, provided one changed the source in plausible looking ways first.
If your software relies on fetching things from the Internet it is probably doomed within a year or so and surely within a decade.
I wouldn't bet on today's up-and-coming language still existing either. C89? It will probably still build fine with some compiler, given appropriate flags.
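To make that concrete, here's a minimal sketch of the sort of thing I'd bet on: strictly conforming C89, built with explicit standards flags (GCC's flags are shown; other compilers have equivalents):

    /* hello50.c -- strictly conforming C89, no extensions,
       no dependencies beyond the hosted standard library. */
    #include <stdio.h>

    int main(void)
    {
        printf("still building in 2075, maybe\n");
        return 0;
    }

    /* build: gcc -std=c89 -pedantic -Wall hello50.c */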
Hardcoding x64 or aarch64 assumptions is likely bad for longevity too, as both will surely be gone before 2075 ticks around; though equally, I couldn't find a VAX and still got that code running. So that's more about minimising the cost of porting than anything fundamental.
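One cheap habit that keeps the porting cost down, sketched here in C89 (put_u32 is a made-up name for illustration): never dump structs to disk or assume word sizes; serialise explicitly instead.

    #include <stdio.h>

    /* Write a 32-bit value byte by byte (little-endian here),
       rather than fwrite()ing a struct, which would bake this
       machine's endianness and padding into the file format. */
    static void put_u32(unsigned long v, FILE *f)
    {
        putc((int)( v        & 0xffUL), f);
        putc((int)((v >>  8) & 0xffUL), f);
        putc((int)((v >> 16) & 0xffUL), f);
        putc((int)((v >> 24) & 0xffUL), f);
    }

    int main(void)
    {
        put_u32(123456789UL, stdout);
        return 0;
    }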
>> I would say that was viable because it had zero dependencies and could be built with the equivalent of gfortran *.f77, provided one changed the source in plausible looking ways first.
Came to say this. Minimize your dependencies. Software can last forever, but everything around it changes and can break or otherwise cause incompatibilities.
An example worth considering is TeX, which is now 43 years old (considering only TeX82; the earlier TeX78 was a substantially different piece of software). There has been some maintenance over the years, it's true, including a few feature additions in 1990 (TeX 3.0), but I would suggest it has shown itself to be extremely durable.
At the heart of this are two wildly different technologies:
- Literate Programming, which was developed to work around limitations of the Pascal development stack as it existed when the project began: http://literateprogramming.com/
- web2c, which converts .web source into a form that can be compiled by pretty much _any_ C compiler
LP was described by Knuth as more important than TeX, but it suffers a bit from folks misunderstanding it: it's not so much documentation (if it were, then _The TeXbook_ would be the typeset source of plain.tex) as code commentary, useful only to developers working to extend or make use of the application. There really does need to be some sort of system for writing manuals, but I suspect that it will continue to be a talented technical writer for the foreseeable future.
C and Java are the most durable in different ways.
- C is the least likely to be broken by the evolution of OSes without added effort on your part, because OSes need backwards compatibility.
- Java will be even more durable, as the runtime can be implemented or fixed after you compile your software. I ran games I compiled 20 years ago on a RISC-V computer flawlessly: not a single glitch! That said, it requires the JVM to keep being maintained.
Oracle is heading in the wrong direction in every respect (removing fundamental features like the sandbox without a clear path to a replacement), but the bytecode standard will prevail, ahead of clones like C# and WASM.
That said, if you make a VM, why not have it JIT/AoT-compile all 3 bytecodes?
Either way, the JVM is the final emulator, as long as it's maintained.
I think it depends on the project and the programmer. Code I wrote for a nuclear power plant is still there. Ditto for the code I wrote for the steel mill. And you can't easily find ATMs with my code in them, but you see them from time to time. Code I wrote to go into the DoD's CAC card is still there, even though the hardware went through a new rev a decade ago, so it surprises me to say this, but "yay, Java?" Some of the code I put in Firefox to support TLS 1.1 & 1.2 is still there, but most of it got refactored out a while ago (and thank $DEITY Brian(?) replaced the old libpkix library that had been in Netscape Navigator and then Firefox for about 20 years). Much of the POTS network has been replaced since the 80s, but I'm told one or two switches I contributed to while at DSCCC are still around.
But on the other hand, my 1993 one-character patch to the Linux kernel was replaced around '96 or '97. I hope to whatever benevolent Supreme Being exists that the crap pyth*n code I added to Second Life has been replaced. No one uses Palm Pilots or Handspring Treos anymore, so I doubt that code has much life in it. Virtually every web app I wrote is dead, but they were fun to write, so whatever. And the code I added to a couple of satellites is dead (though my ground station code is still alive). I bet that some of the avionics code in the cockpit is hard to update as well.
So... it depends... my nuke plant code still has another decade, probably, and my old roommate's anti-lock braking code will probably outlast us all. Embedded systems are probably more long-lived than the Facebook front page. Some are just hard to update because you can't easily get to the machine; others are hard due to regulatory or compliance reasons.
What kind of company do you work for that does so much embedded/controls-type work? I program hydro power plants and governors with a 30-year mindset, and there's no guarantee I'll ever get access to the plant again once I sign off as complete.
Programs run on an operating system; the operating system runs on real hardware.
The real hardware gets old and wears out; parts become difficult, perhaps even impossible, to source.
The operating system accumulates known vulnerabilities until it's no longer safe to connect to anything.
You can work around the latter two problems with emulation, but it's never the same: display technology, if nothing else, is different and presents differently. Emulation also depends on its fidelity, and it's much harder to make it exactly cycle- and timing-accurate, though in most cases (like Word 2007) that doesn't matter.
The instructions might exist, but they are not runnable without other supporting infrastructure.
This also ignores programs that are wholly reliant on third-party compute and instructions you have no access to, which can be shut down and become unavailable, like your MMOs.
The sole purpose of a “set of instructions” is to serve end users, to fulfill some business function. Without that, it is a useless bunch of symbols.
The set itself contains bugs that get revealed over time. But more importantly, end users and business functions evolve and change, and if people have no choice but to adapt to such a “hammer”, it's a piece of crap, not software.
To be fair, the surface area of software has gone through the roof. Unix utils like `grep` or `find` can be hammers, while a retailer's website, with varying promotions, inventory, and overall content, needs to be maintained more like a car.
Fifteen years ago was 2010. I've been in the industry since 1996 and programming as a hobbyist since 1986. Computers change, operating systems change. It's not like I was using the original AppleWorks 3.0 on my Apple //e in 2010, or ClarisWorks from 1996 on my LCII.
While you can still buy Microsoft Office once and use it “forever”, I much prefer the $129-a-year, 5-user deal with 1GB of online storage per user, where each user can use Office across their computers, online, and on mobile, regardless of operating system.
A desktop-only office suite would do me no good, as I go back and forth between platforms.
Software is a design. In the same conditions it never wears out. As conditions change it tends to work less and less well unless modified, as expected.
Use the fewest dependencies possible, only old, mature, stable APIs, and obviously offline only. That way you can be pretty sure the environment can be easily emulated.
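As a sketch of what “only old mature stable APIs” can mean in practice, here's a little line-counting tool written against nothing but ISO C stdio, so any hosted C implementation (or an emulated one) can build and run it:

    /* countlines.c -- counts lines in a file using only ISO C
       stdio: no OS-specific calls, no third-party libraries,
       nothing fetched from anywhere. */
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        FILE *f;
        int c;
        long lines = 0;

        if (argc != 2) {
            fprintf(stderr, "usage: %s file\n", argv[0]);
            return 1;
        }
        f = fopen(argv[1], "rb");
        if (f == NULL) {
            perror(argv[1]);
            return 1;
        }
        while ((c = getc(f)) != EOF)
            if (c == '\n')
                lines++;
        fclose(f);
        printf("%ld\n", lines);
        return 0;
    }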
What would I change about software development now so that an app lasts 50 years? I would program like they programmed 50 years ago: assume it has to work. Assume updates will be risky and expensive. Build in failsafes and watchdogs and redundancy. Be able to replicate the build every year for 50 years. Train people to know what the logs really mean, every year for 50 years. And launch it before the bikeshedding can begin!
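To sketch the watchdog idea (a minimal illustration, not any particular system's design; the names are made up): the work loop records a heartbeat, and a supervisor check trips if the heartbeat goes stale. Real systems would back this with a hardware watchdog timer that forces a reset.

    #include <stdio.h>
    #include <time.h>

    static time_t last_heartbeat;

    /* The main work loop calls this once per cycle. */
    static void pet_watchdog(void)
    {
        last_heartbeat = time(NULL);
    }

    /* The supervisor calls this to detect a stalled loop. */
    static int watchdog_tripped(double timeout_seconds)
    {
        return difftime(time(NULL), last_heartbeat) > timeout_seconds;
    }

    int main(void)
    {
        pet_watchdog();
        /* ... real work loop would run here, petting each cycle ... */
        if (watchdog_tripped(5.0))
            fprintf(stderr, "watchdog: main loop stalled, resetting\n");
        return 0;
    }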
It's called sustaining the gravy train. Why fix something once when you can "fix it" over and over again...? Grifters and posers.
Consider why something like Space Invaders is still playable today:
- it’s self-contained: it works without dependencies, and with the hardware it was designed for
- there’s an ongoing need: people want to continue playing Space Invaders
- it’s transposable: the environment it runs in can be recreated through additional layers (emulators here)
- it’s recreatable: the core concepts behind Space Invaders can be reimplemented in the prevailing contexts of the time
Modern SaaS apps can't be run once the company shuts them down. You don't have the code, not even the executable.
Software is more like plumbing. It (a) wears out, (b) requires maintenance, and (c) the people maintaining it are an integral part of the whole system.
Almost 50 years old now, and still sending data.