Replace 'computers' with 'open source software' and you get something closer to an uncomfortable truth. Open source has been critical to the advancement of our world, but people who go out of their way to use and encourage open source past the basics (even F-Droid or Firefox, not even getting to hardware and full software stacks) are treated as oddballs even by the tech community.
Embracing open source and freedom ideologically is very difficult. You can install Firefox and still feel claustrophobic when you see paid-for default bookmarks and sponsored content everywhere. You can install an open source operating system, but when you learn your system is ACTUALLY running MINIX at the hardware level, connected to a 3G chipset listening for updates to do whatever the OEM wants, it all starts to feel like a losing battle.
Let's say you're really determined to have freedom, so you buy a fancy new RISC-V single board computer. Fully open source architecture, bleeding edge stuff. You'll learn quickly the GPU relies on a closed source kernel that the OEM refuses to provide updates for after as little as a year or two, effectively acting as hardware DRM, when your device was supposed to be as free as it gets.
There are many roadblocks in the way of being a free software/open source fanboy.
Please enumerate the demonstrable advancements of society attributable to open source software. Edit: lots of downvotes, no credible responses. Guess y'all are stewing in your own cognitive dissonance?
The Human Genome Project heavily used open source software, as did the detection of the Higgs boson on the Large Hadron Collider at CERN. I'm not sure what your standard for advancing society is though, but the Internet probably helps investment banks and short sellers all over the world try to price in various current and future events and prevent massive and sudden price corrections that can have major impacts on real people's lives. For example your pension going bust.
I am comfortably working on hardware 10 to 15 years old. So do all the people in my family and the non-technical friends I help. This is possible with Linux and software resulting from the free and open source communities.
With a proprietary OS, all this hardware would be trashed. It would go to waste dumps or poor countries to be dismantled at high cost to health and the environment. And I would buy new hardware, requiring more destruction of nature through the mining of rare materials. Finally, this additional spending would decrease my financial capacity to do other things.
So to make a long story short, the many (millions?) of people preserving their hardware investment with open source software are saving the environment, health, and money to a large extent.
And last but not least, they weaken software company monopolies, which can only be good for society as well.
The open-source community significantly intersects with what is most accurately described as a "cult" dedicated to the irrational defense of open-source by any means necessary.
I've watched over time as the desktop stopped being the primary computing device of choice for many people. I've grown far too accustomed to the power and flexibility of a nice keyboard and 32" monitor, and virtually endless storage and general purpose compute that I own to fall into that usage pattern, but I understand the appeal.
No Wednesday morning headaches from Microsoft, no applications breaking with the new Python revision, etc. Just apps from the app store that work (mostly).
So, the Server Farms that comprise the cloud aren't ancient, but they might as well be remote temples just like those we had in the 1960s.
I don’t think much “computing” is done, in our sense of the word, on tablets and phones. Personally, I’ve found that almost any slightly complex, has-to-be-done-once-and-done-right action is better performed on an actual computer. And that’s not even considering writing new programs.
This hasn't been my experience. I'm able to be surprisingly effective on my smartphone. I've constructed trip itineraries, balanced outing budgets, shared shopping lists, designed room layouts, computed food and drink recipe ratios, built DJ sets, laid out wood pieces, and read books using my smartphone. Coding is probably the only thing for which I find a keyboard and a seated position unmatched by my smartphone. I used to game more often when I was younger, and I still generally prefer to game in a seated position, though I have gamed on my smartphone before.
Hah, I'm also "accustomed" like you and always prefer a proper monitor/keyboard first, then a laptop, then lastly a phone even for simple tasks. Obviously things are different if I'm on the street, but if I'm in front of a computer I'd rather use that to check something rather than pull out my phone.
It's partially an eyestrain issue (I find the text too small) as well as the fact that the phone shows much less information than the computer screen. For instance on a mobile web page I have to scroll a lot, whereas on the computer I can just take in the text and images in a single glance. The keyboard and mouse also seem much more intuitive to me. So I guess I find the phone more cumbersome while many people I know think the other way :D
This kind of speculative fiction doesn't work. Usually the template is: take some part of the world, tweak it, then press play and see what the world looks like now.
Instead this is just "invert some part of culture", which makes no sense. Culture is the emergent, externalized collective intelligence of social creatures. You can't invent a culture which would preserve this property without asserting a global belief in the community isomorphic to the alteration. You haven't altered the culture. You've just asserted a group of unthinking zombies.
So when I read:
> Imagine a world where computers are inherently old. Whatever you do with them is automatically seen as practice of an ancient and unchanging tradition. Even though new discoveries do happen, they cannot dispel the aura of oldness.
I can't conceive of such a world. It's incoherent. I also can't imagine a world where no one likes metals for whatever reason. You can't just assert a fact like that and press play. It's nonsensical.
It's not just speculative. It's a retrospective on the early mainframe era.
The nitty gritty punch card programming was something stereotypically done by middle aged/older women. Many early computers were massive and housed in what could be considered temples in big universities surrounded by gothic architecture. They were expensive to replace and kept around for long periods of time because of how difficult it was to upgrade (some large mainframe systems still exist today for that reason). Mechanical computers like the Babbage difference engine would last a lifetime. The idea of an average Joe spending all day using expensive compute time was insane. Those who focused on them did so for academic reasons, and spent most of their time meditating on computation, math, and the structure of language and meaning.
Imagine a world where you can just assert any fact as the starting point of speculative fiction. Then imagine that someone would write a piece where computers were inherently old.
This breaks immersion for a lot of people, me included. It's one of the biggest issues I have with lots of sci-fi novels (more so than most other types of speculative fiction). Just asserting a fact this fundamental and building your world around it only works if societies, culture, technologies, and history are built around it. Humanity's history with tools and automation is millennia old. To alter this would require a lot of changes. We've been automating, forgetting, copying, and maintaining technologies for as long as our history.
Alien technology encounter stories satisfy some of this for me, as the injection of a foreign, advanced technology makes sense in this framework, much like how ancient peoples would discover technologies built by neighbors or rivals they didn't communicate regularly with.
It's like asking "would you believe in homeopathy if doctors all told you that it was correct?"
To which the answer is "Doctors believe things for reasons. You can't just flip a switch and change what doctors think without changing the entire world so that homeopathy is actually true, and that would be such a weird world that science in it would be unrecognizable."
(And since I'm not ChatGPT, I can answer your entire question: "Such a story would have an incoherent world. I could imagine someone writing a story that takes place in an incoherent world, but you wouldn't be able to get useful insights from it.")
Hell, the Living Computer Museum in Seattle, before its (at least physical) closure by Paul Allen's sister, is/was a bit like this. They have a bunch of vintage big iron that they have restored to working condition and allow members of the public to use. Some of these are still available on the internet.
Hell, deep in the chip shortage I talked with more than one person who was buying up electronics at thrift shops and on eBay because they knew it contained chips they could not source at the time.
Step 1 is really hard if computers are useful (why do you stop making something that's useful?), but step 3 is simply not believable: a society that has machines that could, at the minimum, help solving complex calculations and engineering problems much better than humans decides to completely ignore them. That's not how societies work.
The issue is with Step 1 - there is some reason why the world chooses to stop using computers, despite computers (as far as I can see in the 'thought experiment') still being available and the homo sapiens needs/desires for which we use(d) computers seemingly being the same. That reason isn't provided, isn't even implied or hinted at, yet it's something major with far-reaching consequences for human behavior.
So it's kind of frustrating - in effect, the author describes a thought experiment of how society would change if X happened, but without saying anything at all about that X other than asserting that one of the things X causes is a mysterious lack of interest in using computers to fill needs/desires which (at least according to the original article) aren't filled by some better alternative.
You're basically missing the whole point of speculative fiction in this style, which is "we start with imagining a world which has these conclusions. Trying to imagine it forces us to try to make sense of what could lead to those conclusions."
You assert it's incoherent. That's part of the point: it's incoherent with your current understanding of our current world. There might be other circumstances that lead to such a world without being incoherent. If the conclusions in a piece of speculative fiction are appealing, it may be worth thinking about what it takes to reach those conclusions.
The thought exercise of trying to make it coherent is part of the point.
No, the point is it can't be done. It's like cleaning your house with a 100-year-old tortoise: you absolutely would do that, if tortoises were plentiful and did a good job cleaning. Why would anyone clean their house themselves, when there were all these old tortoises willing and able to do the job for them?
Now if the tortoise moves all your crap around so you can't find it, leaves streaks on your mirrors and windows, fire his scaly butt!
Good speculative fiction at least tries to give some reason why something might come to be the case. This article just says "pretend computers are old; old things are like X today; therefore, computers are going to be just like X," while ignoring all the things that are different.
Computers are inherently old at this point, the technology harkens back at least the better part of a century.
The old guard in particular cannot and will not let go of CLIs and esoteric flags and arguments, and will berate anyone who would dare suggest anything concerning a GUI.
To them, the only human interfaces that exist are the keyboard and the monochrome monitor, computing is still grey-on-black (or green/orange-on-black) monospace text and running emacs or vim. The rest of the world disregards them as living relics of ancient history.
Imagine a world where everybody simply has an AI assistant in their pocket, that simply tells them anything they might need to know. Design programs for you. Draw you any image. Perhaps even construct virtual worlds. We are almost there already.
Only weird old people would bother to learn how to program dumb desktop computers.
There are a lot of people who feel disquieted by the pace of software systems and aspire to do work in software that feels similar to the more methodical, slower-paced work of more mature fields like HVAC engineering or structural engineering. These folks find solace in this kind of speculative fiction.
I feel their disquiet is misplaced. Any young, fast-moving field is going to be full of the same issues that software is in now. You can look back at the history of mechanical engineering during the industrial revolution to see many similar problems we had with unsafe projects, hyped-up snake oil, or iteration for the sake of iteration. The history of automobiles and aviation was also marked by similar issues. Slower paced engineering fields are more mature and have gone through decades or centuries of iteration before coming up with tried and true solutions. But fundamentally fiction speaks to the soul more than it speaks to any measurable outcome. Truly the only fix for this disquiet is to search inside rather than look out. Fiction can be a great tool for that.
> Computers are seldom privately owned – they are considered essentially communal rather than personal
This struck me as true of most people's use of computers now. Almost everyone I know who isn't either an old fashioned developer or a gamer uses a mobile phone or tablet merely as a graphical terminal; the actual computer is some mysterious entity elsewhere on the Net.
The golden age of computing when 'everyone' had one of their own that was useable offline is now long past.
Likewise, Gen Z and Alpha are not very technical. Many teens and 20-somethings have a hard time with seemingly basic tasks like finding files. As a 20-something myself, I can say my public school did not offer any 'technical' courses beyond office programs and keyboarding.
Each generation consists of roughly 90% of people who are completely clueless about anything computer related. Those of us that work in the field usually can't even conceive of it but it's true.
I wouldn't say it's "long past", but rather in its last gasps outside of niche hobbyist communities. There are plenty of people whose primary computer use is still local; they're just considered old-fashioned.
> Solid-state computer components, on the other hand, have no mechanical decay, so they are practically eternal.
I wish.
Electrolytic capacitors seem to have a functional lifespan of about 10-20 years, transistors do wear out, electromigration is a problem in chips in the "hundreds of years" span discussed, and there are all sorts of other interesting failure modes of "solid state" electronics that mean they're not going to last hundreds of years without some pretty massive heroics - and that's when you can repair it at all.
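As a rough sketch of why 10-20 years is a plausible ballpark, the common rule of thumb for electrolytic capacitors is that rated life roughly doubles for every 10 °C the part runs below its rated temperature. The part ratings below are illustrative assumptions, not figures from any specific component:

```python
# '10-degree rule' estimate for electrolytic capacitor operating life:
# L = L0 * 2 ** ((T0 - T) / 10), where L0 is the rated life at rated
# temperature T0, and T is the actual operating temperature.

def estimated_cap_life_hours(rated_hours, rated_temp_c, operating_temp_c):
    """Estimate operating life in hours using the '10-degree rule'."""
    return rated_hours * 2 ** ((rated_temp_c - operating_temp_c) / 10)

# A hypothetical 2000 h @ 105 C part running at 45 C:
hours = estimated_cap_life_hours(2000, 105, 45)
print(f"{hours:.0f} h, or about {hours / (24 * 365):.1f} years")  # 128000 h, about 14.6 years
```

Run the same part at 65 C instead and the estimate drops to roughly a third of that, which is one reason hot, always-on equipment re-caps so much sooner than shelved gear.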
I would wager that it's not been running powered on for 33 years without any repairs. If it has, probably long past due for a re-cap job, and you probably don't want to put a scope on the voltage rails to see what the ripple is.
For a broad handwave, "sitting powered off" isn't too bad for solid state equipment (it does bad things for hard drives, see "stiction"), but operating (and operating at temperature) is where the wear occurs from a range of effects. I've reworked [0] a Core 2 Duo board that stopped booting after a decade or so, because the capacitors filtering power for the IDE controller got so bad it wouldn't boot reliably (it would load the kernel off the drive, and then insist the drive wasn't present later).
Older hardware tends to be more resilient due to wider traces, which means lower susceptibility to ESD and electromigration. But eventually the last atom will get eroded out of a critical trace and the thing will fail. Nothing lasts forever, especially when made to be as cheap as possible.
I recently had to replace an SSD that was only 3 years old; it's dead-dead, as in, slowed down one day and then wouldn't boot. I've never had an HDD die this fast in my life. I know this is only one example but I'm curious if anyone else has a similar experience.
Had a brand new Seagate 2.5" hard drive fail after ~60 days of laptop use, back in 2010. I was a reasonably heavy desktop user, but scarcely a stress-tester, and I didn't have any memorable drop incident or accident that'd explain the failure, either (I was pretty careful with my machines). Retailer or manufacturer, I can't recall, replaced the drive (with a refurb) and I was able to recover some of the most important new data off the old drive as it failed, but it left me pretty unhappy.
still have a few mis-named files deep in my personal directories, 2 PCs and 3 Mac migrations later.
Here's a pretty good document about HDD life expectancy [0]. Depending on the model, HDDs can have a failure rate of up to 12% in 3 years. I'm unsure how many drives you've used, or how much of a workload you put them through, but I wouldn't assume SSDs are significantly more reliable than HDDs based on one failed drive.
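For context on what a figure like that implies, an annualized failure rate (AFR) can be converted into a multi-year cumulative failure probability, assuming a constant, independent per-year risk; the 4% AFR below is an illustrative assumption, not a measured value:

```python
# Convert an annualized failure rate (AFR) into the cumulative
# probability of failing within `years`, assuming each year's
# risk is constant and independent.

def cumulative_failure(afr, years):
    """P(drive fails within `years`) for a constant annualized failure rate."""
    return 1 - (1 - afr) ** years

# An assumed 4% AFR compounds to roughly the 12%-in-3-years figure:
print(f"{cumulative_failure(0.04, 3):.1%}")  # 11.5%
```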
All the comments here are so negative for such a beautiful essay!
This paragraph at the end in particular really struck a chord with me:
> In the real world, people associate computers with many different things: corporate dehumanization, overwhelming consumer capitalism, alienation from the material world, shortened attention spans, ridiculously short obsolescence cycles, etc. etc. It is often difficult to tell these cultural biases apart from the "essence" of computing, and it is even more difficult to envision alternatives due to the lack of diversity. Thought experiments like this may be helpful for widening the perspective.
Computing is a fundamental part of the world and crazy exciting to play around with. It allows us to experience whole new spectrums of reality! At its core, isn't this the essence of the hacker ethos?
This post so wonderfully blends a hypothetical with where we really are, what computing has in fact become. Computing is in fact vanishing, hidden inside massive data centers (temples) & behind firewalls. Applications offer packaged consumer experiences, but actual computing recedes, gets further off, & few people experience it. Real connection to computing is slow & takes concentration & will to develop; it's not fast.
This project to invert common conceptions actually reveals a lot of truth.
The final summation is great.
> So, what is the "essence" of computing then? I'd say universality. The universality of computing makes it possible to bend it to reflect and amplify just about any kind of ideology or cultural construct.
They are just describing how computing works in an academic system if you’ve run out of HPC cluster credits and haven’t gotten a grant for a new system recently.
Web2 and Web3 websites do everything possible to waste as much screen real estate as possible with as little content density as possible.
Step 2: 100 years pass
Step 3: computers are now an old (and dying) tradition, kept going only by a small group of people who understand how to care for the machines.
Was that so hard?
https://www.livingcomputers.org/
https://www.livingcomputers.org/Computer-Collection/Online-S...
One word: Terminals.
[0]: https://www.sevarg.net/2018/04/15/on-art-of-repair-re-capaci...
https://en.wikipedia.org/wiki/Electromigration
Did you double check if the files could be read by another system? A drive that goes read-only will also fail to boot, I think.
[0] https://www.backblaze.com/blog/hard-drive-life-expectancy/
The vast majority of people never write enough for this to become a problem, though.
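A back-of-envelope sketch of why: dividing a drive's rated write endurance (TBW, terabytes written) by a daily write volume gives the years until the rating is exhausted. Both numbers below are illustrative assumptions, not the specs of any particular drive:

```python
# Years until a drive's rated write endurance (TBW) is exhausted
# at a steady daily write rate.

def endurance_years(tbw, gb_written_per_day):
    """Years to reach the TBW rating, writing gb_written_per_day each day."""
    return tbw * 1000 / gb_written_per_day / 365

# Hypothetical 1 TB drive rated for 600 TBW, heavy user writing 50 GB/day:
print(f"{endurance_years(600, 50):.0f} years")  # 33 years
```

Even at a fairly heavy sustained write rate, the rated endurance outlasts the typical useful life of the machine, which is why endurance failures are rare for ordinary users.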
To be an experiment there must be some sort of question to explore, after specifying some sort of condition.
This is a set of assertions from beginning to end.