Can the mods un-dupe this? That was 2 months ago, and it's generated fruitful discussion on a hacker-relevant matter. Certainly more relevant than the Atlas Obscura "look at this quirky history" articles that get upvoted so much.
I appreciate re-posts when they receive new insightful comments, but 60ish days is perhaps a bit on the short side? However, this re-post now has 300+ comments and has justified its existence.
I remember turning on the computer and waiting for the motherboard company logo to finish flashing. After that was the Windows logo. I also remember waiting for the dial-up modem to connect to the internet and waiting hours for a video to download. Heck, last week I played a game streaming on GeForceNow and I was surprised by how seamless the whole experience was.
Maybe some web apps are inconvenient, but you can easily set up a command line interface or Linux system and move blazing fast.
> you can easily set up a command line interface or Linux system and move blazing fast
Well... not really. Terminal response times are way slower than they used to be in the 80's. Sure, we have greater throughput, but we also have greater latency. https://danluu.com/term-latency/
> Terminal response times are way slower than they used to be in the 80's.
Do you have data / a citation to back up that statement? Most things seem faster to me. As long as the command line responds as fast as I can type, I don't care about it being faster.
That being said, overall speed and what can be handled by the system certainly does matter in many cases, even if it is _just throughput_. I've recently been running commands to search/process data from the command line:
1. Search through a directory structure with millions of files (find)
2. Filter down to ones containing a certain thing (grep)
3. Find certain values in the files based on pattern matching (sed)
4. Filter to files where one value equals another (test)
I can't imagine that would _ever_ finish running on a computer in the 80s. It works just fine on my current computer (albeit taking a minute to run).
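For illustration, a rough sketch of that kind of pipeline; the paths, patterns, and field names here are hypothetical, not from the original comment:

    # Hypothetical sketch: walk a big tree (find), keep files containing a
    # marker (grep -l), pull two values out of each match (sed), and keep
    # files where the values agree (test). Paths and patterns are made up.
    find /data/archive -type f -name '*.log' -print0 \
      | xargs -0 grep -l 'ERROR' \
      | while IFS= read -r f; do
          want=$(sed -n 's/.*expected=\([0-9]*\).*/\1/p' "$f" | head -n 1)
          got=$(sed -n 's/.*actual=\([0-9]*\).*/\1/p' "$f" | head -n 1)
          [ -n "$want" ] && [ "$want" = "$got" ] && printf '%s\n' "$f"
        done

Nothing fancy, but it matches the four steps above.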
"""
It's caused by conhost.exe excessively hammering the registry to query the following two values:
HKCU\Software\Microsoft\Windows\CurrentVersion\ImmersiveShell\TabletMode
HKCU\Software\Microsoft\Windows\DWM\ColorPrevalence
Having previously reported this issue through Feedback Hub (to no avail), let me offer my observations from debugging this issue:
It's not specific to Dell's XPS series - I've been able to reproduce on any Windows 10 installation from version 1703 and up
It only occurs when the console application writes output that causes the console to scroll
It only occurs when the console application in question is in foreground/focus
Each reg value mentioned above is queried 6 times, per scroll!!!
"""
The solution I use is to minimize the terminal when it's scrolling. When I compile my projects, if I don't minimize the terminal, it takes forever to finish.
"Terminal response times are way slower than they used to be in the 80's." In 1983 the top selling computers were the TRS-80, Apple II and IBM PC (among others). I started on those computers. And what you said is crap... straight out BS. Any person from that era should remember what happened when you typed dir (or equivalent) into a folder with a couple hundred items. It was time to go make some coffee while each...item...was...written...to...the...screen. The OP has good points but the start was WEAK because it is factually inaccurate like your comment.
Consider a VT100. It could only go up to 19200 baud, and couldn't actually handle constant streams at that speed.
However, I do agree with the general message of the article. Current UX is a mess; websites are bloated with tons of JavaScript. There's no reason there should be loading spinners in this day and age with the power we have at our fingertips.
...I'm a little confused. My computer still flashes the motherboard manufacturer's logo until the fans have all been checked, and then the Windows logo.
And while my connection is fast, it's a lot less reliable than the one I had 20 years ago, with about three 15-minute disconnects per day on average (although that has more to do with the general shit-quality of the Comcast monopoly than anything else).
Desktop machines may POST at a reasonable speed, but enterprise servers are still damn slow to start. Whether it's the HP servers we deployed a few years back, or the more recent UCS servers we are building our new platform on, it takes ages for them to get to booting the OS (ESXi).
Check the BIOS settings; a lot of them let you configure that time. It's worth keeping it at a few seconds so you can hit the appropriate key to access the BIOS again, but you can certainly turn it down.
A single longer wait is much better than constant tiny interruptions. I'd gladly wait for 10 minutes for my computer to boot up if it means that I get a seamless experience afterwards.
Of course, your point about connection speeds is entirely right - but I don't think anybody would honestly advocate just entirely going back to 56k. Rather, we really should be trying to answer the question of why using a computer feels much more sluggish (probably a better word to use than slow) than it did back in the day.
Yes, some of it is rose-tinted glasses and nostalgia, but most of what people discuss is worth taking into consideration. Nobody would seriously go and use a 20 year old computer unless they were making a statement of some sort - and that statement is not "20 year old technology is entirely better than what we have today" but rather "why can't modern technology do things this 20 year old device can?"
Most computers weren't that bad, but there was a time when Windows 95/98 and the mainstream hardware it ran on weren't a good match. That's what you remember.
Even then, once the thing did boot up, applications were rather responsive.
> ...you can easily set up a command line interface or Linux system and move blazing fast.
No, you can't. The majority of people use software that is imposed upon them. The majority of work cannot efficiently be expressed in CLIs.
It's the responsibility of software developers to write responsive software and in many cases, they have failed. Of course, with the inefficient web platform, their job is made difficult from the start.
I feel bad for the metaphorical user. They were robbed of their tools.
We get to keep our shell utilities and CLIs, but they were robbed of tiny single-purpose GUI tools.
We get to keep our shell pipelines and perl scripts, but they're getting locked out of their Excel spreadsheets and VBA scripts.
Hell, with how things are going even automation tools like AutoHotKey are going to perish.
"People don't know how to code" or "users aren't tech-savvy enough for data processing and programming" are bullshit statements. They just happen to not code the way Silicon Valley does. And now these users are getting corralled into caged web apps and experiences that give them no flexibility at all and just treat them like children, all the way from the feature set to the presentation.
> Most computers weren't that bad, but there was a time when Windows 95/98 and the mainstream hardware it ran on weren't a good match. That's what you remember.
That's been the issue with every Windows OS. The cheap consumer hardware available at the time of the release was not enough to provide a snappy OS experience.
Unfortunately, hardware frequencies have sort of stalled, so now we're back to square one again, where W10 doesn't feel snappy on median consumer hardware, but the hardware isn't getting faster at the rate it was during the first decade of the 2000s.
I have a Windows 7 machine for some legacy games and software, and it feels much faster on a 6-year old CPU (Haswell) than W10 does on a 1 year old CPU.
> you can easily set up a command line interface or Linux system and move blazing fast.
Maybe the HN crowd can, but not your average or median computer user.
> I remember turning on the computer and waiting for the motherboard company logo to finish flashing
I still have computers that take some time to boot. The thing is, I just rarely power them off and on. Instead the computers go to sleep, which is much faster to come out of.
- predictability: it can be non-instant, but I know what's coming
I had an old pentest distro with Linux 2.4 on a key and a few recent ones using systemd and a tailored DE. Whenever I used the old 2.4-era distro, maybe I had fewer capabilities overall, maybe a few glitches here and there, but somehow I felt in contact. Everything reacted as-is, throughput felt large (console output, file IO). Even the old sequential boot had a 'charm' to it. Even though systemd is so vastly superior in all aspects... there was something that appealed to me. Weird.
Unfortunately enabling this means everybody within a 15 meter radius knows when my laptop has rebooted because this option also enables a BEEEEP that cannot be disabled or turned down. :(
Can disable the low-battery beeping; cannot disable the POST beep unless you switch to vendor logo mode.
(It's your standard old-style beep - just rendered through stereo speakers that have uncannily decent audio reproduction at that particular frequency for some reason. In a serene, quiet room.)
404 sense not found; please check the garbled and try again
I can still remember some of the text displayed when booting up my family's Win95 beige box because it stayed up for so long. Now if I'm trying to read an error message when booting even a slow modern computer, I have to take a video and step through it frame by frame.
As the OP who wrote this while drunk two years ago and thinks that about 30-50% of it is objectively wrong, it's extremely funny to me that every five months someone reposts it here and everyone gets in a fight about it again. Many of the details are entirely inaccurate - the thesis is still completely valid, and it's Computer Person Thinking that wants to attack the details while refusing to stand back and look at the overall picture, which is that using computers is now an incredibly messy experience where nothing quite does what you want, nothing can be predicted, nothing can be learned.
I hypothesize that this is because programmers and other various Computer People consider "being on the computer" to be a satisfying goal in itself that sometimes has positive side effects in the real world, while everyone else simply has stockholm syndrome and has long since given up on even imagining a better experience.
That tendency to rathole and nitpick, while discarding the larger point, is increasingly frustrating.
It seems like a lot of this boils down to people getting stuck in a local maximum. If someone isn't used to zooming out to the big picture, are they more likely to solve problems in isolation and end up with layers of dependencies, npm-installing all the things, resulting in an unstable stack and a less predictable ecosystem at large?
Yes, the ratholing vs. generalizing is an interesting phenomenon. I get the sense that it's largely personality-based. Some people (most computer people) are wired up to get every detail exactly correct, and if anything is out of place they break out the tweezers and start tweaking. That's exactly what's required in most engineering work. It can also lead to perfecting something that's about to get thrown away.
The other personality deals with abstractions, analogies and top-down thinking. When faced with an issue they'll start by defining goals and values. And it's easy to be blind to the details when you're thinking of the big picture and ask for things counter to the laws of physics.
Even if you're a two-minded kind of person and can deal with both generalities and specifics, it's incredibly hard to deal with both at once and quite a task to switch from the one mindset to the other.
If you were a hammer manufacturer, wouldn't you nitpick if someone said "I feel hammers have gotten worse since the 80's, and handles aren't what they used to be"? To many people, computers are a tool; to many IT people, they are a way of life. It's not about the big picture, it's about proximity - a rant about some minor detail of something you value captures more attention than something you don't care about.
I appreciate your perspective a lot. And I agree that 90% of our use of computers is complete trash, and for those of us in the industry the addiction of being able to understand and maneuver through that trash can overpower the desire to remove the trash entirely.
Studying the history of CS and following it to its roots has been simultaneously invigorating because of the elegance of the fundamental ideas, but also disappointing, because of the line I can trace from the hopes and dreams of those pioneers to the current state of computing.
I'm hopeful that a new wave of software is just over the horizon. Perhaps I am naive, but I'm holding out against pessimism about the future for as long as I can.
I've been working on Google Maps for close to five years now. I somehow missed the previous postings of this thread, but a little hyperbole never hurt anyone :) Your larger point about the inability in our profession to take a step back and consider the overall situation is spot-on. The mindset I most commonly encounter among other engineers when discussing any kind of systemic problems in our practices is a kind of "it's $CURRENT_YEAR, of course things are the best they've ever been and we've solved all known problems. Also what is $KNOWN_SINCE_DECADES_TRIVIAL_SOLUTION_TO_MAJOR_PROBLEM?"
> the thesis is still completely valid, and it's Computer Person Thinking that wants to attack the details while refusing to stand back and look at the overall picture, which is that using computers is now an incredibly messy experience where nothing quite does what you want, nothing can be predicted, nothing can be learned.
FWIW, I don't care about the details; there are plenty of individual examples of bad design today and slower software. I disagree completely with your larger thesis that things have gotten worse, and with the title in particular as a summary of your point. Why do you feel it's completely valid, and what does that mean exactly if you agree that many of the supporting examples are wrong?
And I also disagree with characterizing people who disagree with you as "Computer Person Thinking" and "Stockholm Syndrome". That charge seems doubly ironic, quite hypocritical, given the content of your tweet-storm, but I understand if you're feeling a bit attacked or defensive about this discussion. It's okay that it wasn't all true, and if this causes discussion that seems annoying, at least it's a discussion. We can continue to improve software and hardware and UX, and it's good to discuss it. It's just not true that it's gotten worse, and you don't really need to crap on today's software or the people who disagree with you to prove the point that we still have room for improvement. There's always room for improvement.
Anyway, computing is objectively faster today than in 1983 (I was there), and not by a little, by orders of magnitude, especially if you are fair and compare functionality apples to apples, but even if you only tally wait times for activities that seem similar on the surface but are completely different today (such as in-memory queries vs internet queries).
I don't see much objective data or measured results in your UX argument on the whole. Arguing that function keys are intuitive and that the mouse is useless is pure opinion based on what you like and being used to something, not something that can be shown by any large-scale user studies to date, and history has already somewhat demonstrated the opposite: that many people prefer mouse navigation to keyboards, and that function-key workflows are for experts, not Twitter users, by and large.
A thesis cannot be "completely valid" when 50% of the reasoning behind it is "objectively wrong". Some people would rather point out in which ways your idea is misguided and naive than attempt to force sense out of an idea that's based on wrong information.
This post again with its ridiculous ranting examples.
"This text searching program that searches text against text is way faster and simpler than a fully rendered, interactive map of the entire world with accurate roads and precision at the foot / meter level."
No. Shit, really? Get out of town.
Yes, some very popular apps have bad UX. But some apps have incredible UX. Cherry picking the bad ones while ignoring the good ones to prove a point is boring and not reflective of the actual industry.
These posts fondly remember just the speed, but always seem to forget the frustrations, or re-imagine them to be something we treasured.
Remember autoexec.bat files? Remember endless configuration to get one program working? Remember the computer just throwing its hands up and giving up when you gave it input that wasn't exactly what it expected? Remember hardware compatibility issues and how badly it affected system stability? Remember when building a computer required work and research and took hours? I do, and it wasn't fun then, it was a detriment to doing what you wanted. It wasn't a rite of passage, it was a huge pain in the ass.
So yeah, things are slower now, and I wanna go fast. But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either. I don't need to teach her arcane commands to get basic functionality out of her machine. Plug and play and usability happened, and while things feel slower, computers are now open to a much wider audience and are much more usable now.
These posts always have a slight stench of elitism disguised as disappointment.
Huh. I'd say the examples are perfectly good and on-point. While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.
It also doesn't change the fact that Google Maps is very bad at being a map. Its entire UI flow is oriented toward giving turn-by-turn directions for people who know where they are and where they are going; it gives almost no affordances for exploration and cross-referencing.
> Remember when building a computer required work and research and took hours.
As someone who builds their own PC every couple years: it still does. It's actually worse now, due to the amount of products on the market and price segregation involved. Two PCs ago, I didn't have to use two parts compatibility tools, several benchmarking sites and a couple of friends, and didn't have to write CSS hacks for electronics stores, just to be able to assemble a cost-effective PC.
> But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either.
You don't? Printer drivers are only slightly less garbage than they were, but now there are also fewer knobs to turn if things go wrong. When my Windows 10 doesn't want to talk to a printer or a Bluetooth headset, all I get to see is a stuck progress bar.
Bottom line: I agree 100% with the author that one of the primary functions of a computer is enabling easy cross-referencing of information. This ability has been degrading over the past decades (arguably for business reasons: the easier it is for people to make sense of information, the harder it is for your sales tactics to work).
> These posts always have a slight stench of elitism disguised as disappointment.
That I don't get. Is it "elitist" now to point out that the (tech) "elite" can actually handle all this bullshit, but it's the regular Joes and Janes that get the short end of the stick?
> It also doesn't change the fact that Google Maps is very bad at being a map. Its entire UI flow is oriented toward giving turn-by-turn directions for people who know where they are and where they are going;
As it turns out, that is probably the most popular use case for maps in the world.
Note also that for most smartphone users of Google Maps the use-case is actually much broader than that. The UI flow also totally accounts for users who only know where they are going—thanks to GPS and Google Maps knowing where you are often isn't necessary.
I'm confused by the complaint that the "Maps" app only caters to the 90th-percentile use case for maps, but doesn't cover the other use cases well.
> I agree 100% with the author that one of the primary functions of a computer is enabling easy cross-referencing of information. This ability has been degrading over the past decades
I just find this not the case at all. For expert-users the tools that existed decades ago are still there and still usable. Or you can craft your own!
For non-expert users the information in the world is orders of magnitude more accessible than it used to be.
> Huh. I'd say the examples are perfectly good and on-point. While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.
I know of an ERP system that somehow manages to take about 15 seconds to search an inventory of ~100k items. If you export all those items to CSV, with all their attributes (most of which are not searched), the resulting file is about 15 MB.
It is boggling how they managed to implement search this slowly (in a C++ application using MS SQL as the backend). 3.5 GHz computers, performing plain text string search at about 1 MB/s.
It is even more surprising that users feel this is not a completely unreasonable speed.
(They managed to pull this stunt off by completely not using the SQL database in the intended way, i.e. all tables are essentially (id, blob) tuples, where the blob is a zlib-compressed piece of custom TLV-encoded data. All data access goes through a bunch of stored procedures, which return data in accordance with a sort of "data extraction string". Search works by re-implementing inverted indices in tables of (word, offset, blob), where the blob contains a zlib-compressed list of matching IDs, again processed by stored procedures. The client then is wisely implemented using MS SQL's flavour of LIMIT queries, which effectively causes a classic quadratic slowdown because the database engine literally has no way to fetch result rows n...m except by constructing the entire result set up to m.
Unsurprisingly the developers of this abomination claim to be competent. They also invented a funny data exchange format involving fixed field lengths and ASCII separators - some time in the 2010s.)
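For a sense of scale, a hedged, hypothetical example: assume the ~15 MB CSV export mentioned above is sitting on disk as inventory.csv. A plain, index-free scan of a file that size is a sub-second job on any modern machine, which is what makes the 15-second search so damning:

    # Hypothetical: inventory.csv stands in for the ~15 MB export above.
    time grep -ic 'widget' inventory.csv        # count case-insensitive matches
    # Or restrict the match to one CSV column (say the item name in column 3):
    time awk -F',' 'tolower($3) ~ /widget/ { n++ } END { print n }' inventory.csv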
What makes Google Maps bad? My computer in 1983 didn't have a map application at all, how can Google Maps possibly be bad compared to that?
And if I had had a map application (I'm sure they existed) it would have taken several minutes to load from cassette tape. That's not faster than Google Maps, either perceptually or objectively.
> Huh. I'd say the examples are perfectly good and on-point. While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.
What are you talking about? How in the world is a DOS that can only run a single app at a time better than a system that can run dozens of apps at once? How is non-multitasking a better experience? I remember DOS pretty well. I remember trying to configure Expanded Memory vs Extended Memory. Having to wait for dial-up to literally dial up the target machine.
Edit: I didn't realize the poster was talking about Point-of-Sale devices. So the above rant is aimed incorrectly.
> It also doesn't change the fact that Google Maps is very bad at being a map. Its entire UI flow is oriented toward giving turn-by-turn directions for people who know where they are and where they are going;
That's exactly what it's made for.
> As someone who builds their own PC every couple years: it still does. It's actually worse now,
No way. Things just work nowadays. Operating systems have generic drivers that work well. It's so much easier now to build a machine than it was years ago. I remember taking days to get something up and running, but now it's minutes. Maybe an hour?
I really hate these "good old days" posts, no matter what the subject. The days of the past weren't better, there were just fewer choices.
In no way shape or form does DOS-era UX beat current-era UX (there may be specific programs that do so, but writ large this is incorrect). Getting away from command lines is one of the core reasons that computing exploded. Getting farther away from abstraction with touch is another reason that computing is exploding further.
Command-line systems simply do not map well to a lot of users' mental models. In particular, they suffer from very poor discoverability.
It is true that if you are trained up on a command-line system you can often do specific actions and commands quickly but without training and documentation, it is often quite hard to know what to do and why. Command-line systems also provide feedback that is hard for many people to understand.
Yes, it is true that many current-era systems violate these guidelines as well. This is because UX has become too visual design focused in recent years, but that's a symptom of execs not understanding the true value of design.
> While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.
These aren't orthogonal in any way. A significant chunk of modern performance hit relative to older Von Neumann architectures is the cross-validation, security models, encapsulation, abstraction, and standardization that make BSODs an extremely unexpected event indicative of your hardware physically malfunctioning that they have become (as opposed to "Oops, Adobe reverse-engineered a standard API call and botched forcing bytes directly into it when bypassing the call library's sanity-checking; time to hard-lock the whole machine because that's the only way we can deal with common errors in this shared-memory, shared-resource computing architecture!").
> a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.
As someone who helped with the transition from crappy text-only DOS interfaces on POSes to graphical interfaces on POSes, I have to disagree. Learning those interfaces was terrible. I don't know where the author gets the idea that even a beginner had no trouble learning them. I worked at NCR during the switchover from the old system to the new one that used graphical representations of things and the new system was way easier to use and for beginners to learn. And that's not just something we felt, we measured it. (Interestingly, it was still DOS-based in its first incarnation, but it was entirely graphical.)
You can never really make a map for "exploring" because what people want to explore is domain specific. Do you want to explore mountains? Do you want to explore museums, do you want to explore beaches?
There is too much stuff in the world to present on a map without absolutely overwhelming people, in the same way that the internet is too vast to "explore". You can't explore the internet from Google search; you've got to have some vague starting point as a minimum.
> offers orders of magnitude better UX than current-era browser
And they also did orders of magnitude less. If you've ever done UX, you would know that it gets exponentially harder as you add more features. People love to complain, but not a single person would realistically give up on all the new features they've got for the slightly better UX in certain edge cases.
If it truly was worse at being a map, people would still use physical maps, at least in some scenarios. I have never met a person who still uses a physical map for anything more than a wall decoration.
> That I don't get. Is it "elitist" now to point out that the (tech) "elite" can actually handle all this bullshit
It's easy to critique and complain, and at the same time doing so puts someone in a position of superiority. Saying "all the effort and knowledge out there is bullshit because my 1983 terminal searched text faster" in a way says that the person writing this knows better, and thus is elite. OP says it's disguised as disappointment, which I agree with, because the author describes (and disguises) the situation from a frustration point of view.
But I also think that "elitism" could be swapped for "snobbery" in this case.
>But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either.
You still have to waste a lot of time doing this crap. Last time I set up a printer for my mom, it was a PITA: I had to download drivers, install some giant pile of crap software, go through a bunch of screens of garbage, etc. Needless to say, her computer runs Windows 10.
By contrast, when I wanted to print to that same printer with my own laptop (running Linux Mint), it was simple: there was absolutely no extra software to install, and all I had to do was go to my Settings->Printers, search for printers on the network (took a few seconds), notice that it found the (WiFi) printer, verify it was an HP and whatever model, and then it was done.
Things could be much faster and simpler than they are now. They aren't, because it's not profitable for various actors for it to be, and because most users accept this. Most users are happy to have gigabytes of malware and spyware on their systems, wasting CPU cycles reporting their behavior back to Microsoft or their printer vendor or whoever, and that's a big reason things are so much slower than in 1983.
So if I'm being "elitist" to point out that spyware is a bad thing, then I'll happily accept that moniker.
Personally I would recommend Brother every time for a printer. They're not fancy, but they are well built and don't require you to install a software suite. My roommate and I are still using the same laser printer from college. It's not wireless, but if we really wanted to we could attach a Pi and set up a printer pool; for now a long cable serves us just fine for the few times we need to print.
The last printer I had to configure on my Linux machine required finding and downloading a hidden mystery bash script from the manufacturer's website and running it with root permissions. Not exactly "plug and play" or "safe".
>Last time I set up a printer for my mom, it was a PITA: I had to download drivers, install some giant pile of crap software, go through a bunch of screens of garbage, etc. Needless to say, her computer runs Windows 10.
Weird. Printer drivers usually auto-install for me on Windows 10.
> So if I'm being "elitist" to point out that spyware is a bad thing, then I'll happily accept that moniker.
The article isn't talking about spyware. It's a chuntering rant about "Kids these days" with no analysis of why things are the way they are. Spyware doesn't fit, because it's doing exactly what it's meant to do, and the question of whether it pisses you off is secondary at best as long as you're convinced there's no alternative to the kinds of software which bundles spyware as part of you getting what you pay for.
Today basically every piece of external hardware is compatible with pretty much every computer. Better than that, they are plug and play for the most part. Thanks for reminding me to properly appreciate this.
I'm not convinced by your argument, because the examples you cite of things having improved since those times have hardly anything to do with a text interface feeling sluggish. There's no contradiction here; you could have the best of both worlds (and if you eschew web technologies you generally get that nowadays).
Are these modern Electron apps chugging along using multi-GB of RAM and perceptible user input latency because they're auto-configuring my printer in the background? Can twitter.com justify making me wait several seconds while it's loading literally megabytes of assets to display a ~100byte text message because it's auto-configuring my graphic card IRQ for me?
No, these apps are bloated because they're easier to program that way. Justifying that with user convenience is a dishonest cop-out. If you rewrote these apps with native, moderately optimized code, there's nothing that would make them less user-friendly. On the other hand they'd use an order of magnitude less RAM and would probably be a few orders of magnitude faster.
It's fine to say that developer time is precious and it's fine to do it that way but justifying it citing printer drivers of all things is just absurd.
> Remember endless configuration to get one program working? Remember the computer just throwing its hands up and giving up when you gave it input that wasn't exactly what it expected? Remember hardware compatibility issues and how badly it affected system stability? Remember when building a computer required work and research and took hours?
There are examples, but it's no longer the de facto experience. I have put PCs together from parts since about the 386 era, and there's absolutely no question that it's far smoother now than it ever was before.
You forgot to mention that computers have become at least ten-thousand times faster in the meantime. There really isn't a single objective reason why we should wait for anything at all for longer than a few milliseconds on a modern computing device to happen. I actually think that this is just a sign that things are only optimized until they are "fast enough", and this "fast enough" threshold is a human attribute that doesn't change much over the decades.
PS: the usability problems you mention have been specifically problems of the PC platform in the mid- to late-90s (e.g. the hoops one had to jump through when starting a game on a PC was always extremely bizarre from the perspective of an Amiga or AtariST user, or even C64 user).
> Remember autoexec.bat files? Remember endless configuration to get one program working? Remember the computer just throwing its hands up and giving up when you gave it input that wasn't exactly what it expected? Remember hardware compatibility issues and how badly it affected system stability?
Maybe my memory is bad but I seem to have the same issues today, just in different shapes.
The operating system chooses to update itself in the middle of me typing a document, and then 15 minutes later, while I'm in a conference meeting, the app decides that a new version is immediately required. Then I open up a prompt and the text is barely readable because the screen DPI is somehow out of sync with something else and the anti-aliasing (?) renders stuff completely broken. According to Google there are 31 ways to solve this, but none of them works on my PC. Then all the Visual Studio addins decide to use a bit too much memory, so in the middle of the debug process the debugger goes on a coffee break and then just disappears. In the afternoon I need to update 32 NPM dependencies due to vulnerabilities, and 3 have simply disappeared. Weird.
> Cherry picking the bad ones while ignoring the good ones to prove a point is boring and not reflective of the actual industry.
Can you give us examples of the "good ones" for the "bad" examples he cited?
> These posts fondly remember just the speed, but always seem to forget the frustrations, or re-imagine them to be something we treasured.
No, he is saying that certain paradigms made computers fast in the past, and instead of adopting and progressively improving them, we have totally abandoned them. He is not advocating embracing the past, warts and all, but only what was good about it. He is not asking us to ditch Windows or Macs or the web and go back to the DOS / Unix era.
The original Kinetix 3D Studio MAX was NOT as slow as the current AutoDesk products. It had a loading time, but I can live with that.
With the speed of the M.2 SSDs we have today and everything else I really wonder why it got like this.
Maybe it's the transition to protected mode that did all this? Now everything has to be sanitized and copied between address spaces. But then again, Win95 was also protected mode. I don't know... :)
>But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either. I don't need to teach her arcane commands to get basic functionality out of her machine. Plug and play and usability happened
The slowness is orthogonal to all those things that have improved. We can put the fast back just fine while keeping those things.
I don't think you can, not really. Practically all the slowdown you see comes from increasing levels of abstractions, and basically all good abstractions have a performance cost.
Abstractions in general buy you faster development and more reliable/stable software. UI abstractions buy you intuitive and easy to use UX. All of these benefits define the modern era of software that has spawned countless valuable software programs, and it's naive to think you can skip the bill.
I would actually say that the Microsoft Office suite was really good (up until recently, when they started trying to funnel you into their cloud storage). But even with that blemish, the UX is still pretty good. Hitting the Alt key lets you access any menu with just a few keystrokes, and the keys show up on screen so they don't have to be memorized. I can also work a hell of a lot of magic in Excel without ever touching my mouse, and without having to memorize every keyboard shortcut I need. I wish every app and desktop in every OS followed a similar standard for keyboard shortcuts.
Macros/VBA is still awful, but overall, despite the horrifying computational abuse I put office suite through, it's actually very stable!
And, it's fast enough once it loads. It's still pretty slow to load though.
Pretend the modern web made computers worse and then try Figma in the browser. It's glorious. Granted it's not written in React but it really shows what the web can be. It's BETTER than any desktop equivalent I've tried. It requires zero installation. Plugins are added instantly. It has native collaboration. It's scary fast. I'm not at all nostalgic about the old days.
Yes, it is a rant. Yes, the examples are terrible and there are plenty of counter examples. Yet there is also some truth to what is being said.
There is clearly going to be added complexity to address what we expect of modern computers. It is going to have a negative performance impact. If it isn't carefully planned, it is going to have a negative impact on UX.
Yet there is also a reasonable expectation for things to improve. The extra overhead in hardware and software should, in most cases, be more than compensated for by the increase in performance of hardware. The UX should improve as we learn from the past and adapt for the future.
In many cases, we do see tremendous improvements. When a limitation is treated as a problem that needs to be addressed, we have seen vast improvements in hardware and software performance. Arguably, the same applies to UX. (Arguably, because measuring UX is much more opinionated than quantitative.)
Yet so much of what we get is driven by other motivations. If inadequate consideration is given to performance or UX, then the experience will be degraded.
I don't know if autoexec.bat was the most annoying thing from the 90s. (Although it was certainly annoying...)
My example of choice would be ISA, specifically configuring IRQs.
That's why UARTs back in the day were faster than USB: your CPU would take an interrupt INSTANTLY, as soon as that ISA line changed. Today, USB 2.0 is poll-only, so the CPU effectively only asks the USB device once per millisecond whether "any data has arrived". (I think USB 3.0 has some bidirectional signaling, but I bet most mice remain USB 2.0.)
--------
For the most part, today's systems have far worse latency than the systems of the 80s. But two points:
1. Turns out that absolutely minimal latency wasn't very useful in all circumstances. The mouse updated far quicker in the 80s through the serial port than today... but the difference between a microsecond delay from an 80s-style serial port and a modern-style USB millisecond update routine (traversing an incredibly complicated, multilayered protocol) is still imperceptible.
2. The benefits of the USB stack cannot be overstated. Back in ISA days, you'd have to move jumper pins to configure IRQs. Put two different pieces of hardware on IRQ5 and your computer WILL NOT BOOT. Today, USB auto-negotiates all those details, so the user only has to plug the damn thing in, and everything works automagically. No jumper pins, no IRQ charts, no nothing. It just works. Heck, you don't even have to turn off your computer to add hardware anymore, thanks to the magic of modern USB.
-------
Adding hardware, like soundcards, new Serial Ports (to support mice/gamepads), parallel ports (printers), etc. etc. before plug-and-play was a nightmare. PCI and USB made life exponentially easier and opened up the world of computers to far more people.
Every now and then when I visit my parents... I find my dad's old drawer of serial-port gender-changers, null modems, baud-rate charts, 7n1 / 8n1 configuration details and say... "thank goodness computers aren't like that anymore".
Pick a web app that can render without JS. Then remove all the JS and it will render 5x faster.
Some aspects of computers got better, sure. But the web is in a super shitty state. I mean, we're here posting on a website that has bare-minimum design and features. Why is that? Twitter and Reddit are absolutely horrible in terms of usability and performance. Knowing that they could do better makes them even worse. It is an attack on humanity. You can try to find excuses in unrelated things like those noted, but that won't change anything.
I do wish shortcuts on web apps would be more universally implemented and more universally discoverable. I wish browsers had a common hotkey that would show available shortcuts, for example.
The biggest problem with his argument is that, done correctly, a mouse augments the keyboard and does not replace it. If you've ever seen a good print production artist, that person can absolutely FLY around a very visual program like InDesign... without leaving the keyboard.
I second this. Once one learns the keyboard shortcuts and builds the habit of learning shortcuts in general, they won't be encumbered by small visual tasks like aiming at lots of boxes in menus and submenus. Those interactions micro-dent the flow. When I use the keyboard to navigate, I'm not even consciously thinking about which shortcuts to press; it happens naturally. I also noticed that I don't consciously know the shortcuts and have to think a bit before telling someone which shortcut I use - my fingers remember them.
All this said, the software has to have a good shortcut flow in mind. Not all software offers this benefit.
Web apps need to stop implementing some shortcuts, if not most. It annoys me to no end when a site hijacks Ctrl+F and presents its own crappy search rather than just letting me search the page with Firefox's built-in search, which actually works.
Really, because two co-workers have told me they spent weekends trying to configure a printer and downgrading a graphics driver to get their graphics working again. Yes, they are engineers (mechanical, not software), but I have had my share of issues like this, and I started programming in 1978 on a Commodore PET. I do prefer lightweight interfaces, and my wife can't get my Maps app working while I am driving because she has a Samsung and I have a Blackberrry (KeyOne). Things look prettier, and there are a lot of advertisements, but I relate to a lot of these "cherry picked" rants.
About printers, it depends on the manufacturer. Some printers are easy to set up. At the places I used to work, whenever I went to printer setup and searched the network, I would find the printer very easily.
They say the fastest code is the code not written. I say the best printing is not needing it. I don't have a printer since I only need to print things every once in a while (a few times a year). Whenever I need to print, I grab my USB stick, put what I need on it, and drive off to FedEx or UPS and print out what I need. Point and click, very simple. The cost of a printer hasn't been justified for me, since my printing comes to a couple of cents per year.
Agreed. Comparing UI responsiveness to yesteryear can be an interesting case study - examining precisely why things feel slower, to see whether they can be made better - but more often than not the answer is, "because we traded a bit of efficiency for an enormous wealth of usability, and iterability, and compatibility, that people never could have dreamed of back then".
Rants like this post are willfully ignorant of the benefits that a technological shift can have for millions of actual humans, simply because it offends the author's personal sensibilities.
The endless configuration pain you mention pretty perfectly describes anytime I try to dip into a modern framework and find myself spending hours and hours updating ruby or python or some random node module until all the dependencies finally align and I can write a single productive line of code.
Or when I spend 10 minutes fighting various websites just to get a dumb TV streaming site to login with the app it’s connected to.
I have to agree with the general premise: despite the phenomenal power of computers today, they generally feel just as irritating as they always have.
For a better comparison, I ran Windows XP in VirtualBox on a Macbook Air last year to interface with some old hardware.
I was surprised that it ran in something like 256 MB of RAM. And everything was super fast and responsive out of the box.
I consider OS X less janky than Windows or Linux these days, but Windows XP blew it out of the water.
Try it -- it's eye opening. The UI does basically 100% of what you can do on OS X, yet it's considerably more responsive on the same hardware, with the handicap of being virtualized, and having fewer resources.
> to a much wider audience and are much more usable now.
Doctors hate their life because of computers: https://www.newyorker.com/magazine/2018/11/12/why-doctors-ha..., but maybe you'll think the article is too old to be true or maybe you will quibble about the headline. Why not believe something true instead?
I was about to say the same thing. I remember I had to recover some files off a diskette, and oh boy it was frustrating. That feeling when you start copying a megabyte-sized file and just sit there staring at the screen for five minutes - we forgot how bad it used to be.
> These posts fondly remember just the speed, but always seem to forget the frustrations, or re-imagine them to be something we treasured.
More to the point, I grew up with early Windows XP computers and early 90s Internet, and I don't remember the speed. Maybe I'm just too young and things were magically faster in the 80s? Maybe I was born just slightly before everything took a giant nosedive and became crap?
There are lots of things that annoy me about modern computers, but none of them take upwards of 2-3 minutes to boot any more. I remember loading screens in front of pretty much every app on my computer. A lot of my modern apps I use don't even have loading screens at all. I remember clicking buttons on apps and just kind of waiting, while the entire computer froze, for an operation to complete. Sometimes I'd start to do something complicated and just walk away and grab a snack because I literally couldn't use my computer while it ran.
There were entire joke websites like Zombocom set up to riff on how bad the loading screens were on the web back then. I would wait literally 10-15 minutes for Java apps like Runescape to load on a dial-up connection, despite the fact that the actual game itself played fine over that connection, and the delay was just due to dropping a giant binary that took no intelligent advantage of caching or asset splitting.
I can't imagine waiting 10-15 minutes for anything today.
I got a low-key allowance out of going to other people's houses and defragging their computers. Do you remember when Windows would just get slower over time because there was an arcane instruction you had to run every year or so to tell it to maintain itself?
> On the library computer in 1998 I could retry searches over and over and over until I found what I was looking for because it was quick
> Now I have to wait for a huge page to load, wait while the page elements shift all over, GOD FORBID i click on anything while its loading
What library were you going to in 1998? I also did library searches, and they were insanely slow, and prone to the exact same "don't click while it loads" behavior that the author is decrying here. And that's if I was lucky, sometimes the entire search engine would just be a random Java app that completely froze while it was loading results. And forget about giving me the ability to run multiple searches in parallel across multiple tabs. Whatever #!$@X cookie setup or native app they were wired into could never handle that.
The modern database search interfaces I have today are amazing in comparison. I have annoyances, but you couldn't pay me to go back in time. A lot of those interfaces were actively garbage.
Again, maybe I'm just too young and everything took a nosedive before I was born. But even if that's the case, it seems to me that interfaces are rapidly improving from that nosedive, not continuing to slide downwards. The computing world I grew up in was slow.
>I also did library searches, and they were insanely slow, and prone to the exact same "don't click while it loads"
Not the person you asked, but around 2002 at my university library the computers were running a DOS-like application: the commands were printed on the top part of the screen, and you could type your search and hit enter. After a few years it was replaced with a Win98-like GUI; you had to open the app if someone else had closed it, then use the mouse to find the right dropdowns and select the options, then input the search term, then click Search. Before, you would type something like "a=John Smith", hit enter, and it would show all the books by that author.
The problem with us developers is that most of the time we are not heavy users of the applications we create; we create small test projects and run simple tests to check the application, but our users might use the application many hours a day, and all the little issues add up.
> More to the point, I grew up with early Windows XP computers and early 90s Internet
He is talking about running local software before mainstream internet was a thing.
That is, locally installed software without a single line of network code in them.
MS Word started faster on my dad's Motorola-based Mac with a 7 MHz CPU and super-slow spinning rust drives than it does on my current PC with a 4 GHz CPU and more L3 cache than the old Mac had RAM altogether.
If you had used a Mac or an Amiga, you wouldn't have those bad memories. Instead, you would fondly remember an era when software was pristine.
There also was a time when people just started to use C++ and software was much buggier.
I wouldn't even say software today feels much slower across the board, but it is definitely far more wasteful, given the resources.
You can see a lot of software that essentially does the same things as it did 20 years ago, but (relatively) much slower. Try using some older versions of e.g. Adobe software, you'll see how snappy it feels.
> almost everything on computers is perceptually slower than it was in 1983 .... amber-screen library computer in 1998: type in two words and hit F3. search results appear instantly. .... now: type in two words, wait for an AJAX popup. get a throbber for five seconds. oops you pressed a key, your results are erased
So we start with something 15 years AFTER the title as evidence of the "good" times, then make a vague anecdotal reference to something modern. I've seen POOR performances in places, but the majority of experiences I see now are faster than in 1998 and definitely than 1983. Faster AND more convenient.
> And it's worth noting that HYPERTEXT, specifically, is best with a mouse in a lot of cases. Wikipedia would suck on keyboard.
Um....no. It's less convenient than a mouse, but way better than the function key-based commands that the author lists.
I think there is a lot of room available to complain about terrible interfaces today, and in particular how choked everything is by the need to track and add advertising, but there's no actual evidence in this article, and it comes across as a rant and selective nostalgia.
> Most of the 1980-1995 cases could fit the entire datasets in CPU cache and be insanely fast.
They couldn't then. They had to fit it in RAM.
> Most things I query these days are in the gigabytes to terabytes range.
That still is in "fits in RAM on a typical PC" to "fits in SSD on a PC, fits in RAM on a server" range.
There's little excuse for the slowness of the current searching interfaces, even if your data is in gigabytes-to-terabytes. That's where the whole "a bunch of Unix tools on a single server will be an order of magnitude more efficient than your Hadoop cluster" articles came from.
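That pattern is easy to sketch. Assuming GNU xargs and a pile of line-oriented CSV logs (the file names and the field being filtered are made up for illustration):

    # Hypothetical sketch of the "Unix tools instead of a Hadoop cluster" idea:
    # stream a few GB of CSV through awk, one file per core, then sum the results.
    find /data/logs -name '*.csv' -print0 \
      | xargs -0 -P "$(nproc)" -n 1 \
          awk -F',' '$5 == "ERROR" { n++ } END { print n+0 }' \
      | awk '{ total += $1 } END { print total }'

Everything stays on one box, and the disks are the bottleneck, which is the whole point of those articles.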
>Um....no. [Wikipedia is] less convenient than a mouse, but way better than the function key-based commands that the author lists.
Okay, but slap on an extension, and Wikipedia is much more usable without a mouse! For example, VimFX/Vimium allow you to click links (and do a bunch of other stuff) from the keyboard, and that makes it tremendously more usable.
Of course, that’s only true because Wikipedia is standards compliant and doesn’t break extensions’ ability to see links, which is not universally true of websites.
How so? The article is based on an assertion that directly counteracts my memories. I remember using computerized library card catalogs in the 80s. I remember using AltaVista in the 90s. I remember using the Up/Down/Left/Right navigation (and full-screen re-renders) in Mapquest pre-google maps. I remember using the cgi-bin version of imdb.com. I can see how promptly things work now. I can see the additional benefits I get now.
If the author makes an assertion that contradicts my experiences, they better provide SOME evidence, or be dismissed (which is presumably NOT their intent).
When an area of engineering is new we make products better than the market will eventually pay for. Over time, we learn how to make them just good enough to reach an equilibrium in a market. This is an idea that is explored in _The World Without Us_ by Alan Wiesman. The Roman Colosseum still stands, but buildings today don't need to be that durable. It would cost significantly more to make them that way, so we don't. The same could be said about supersonic travel. We could do it again, but enough of us are happy to not pay the premium, so that product has disappeared.
The problem is, the market equilibrium is usually the worst possible garbage that's still fit enough for purpose that it won't break (on average) before the warranty period ends. Some of the money you save on this isn't really saved, you just pay for it in having to replace the product sooner, or being constantly stressed and irritated with the quality and fragility of your tooling (it's essentially death by a thousand cuts for your psyche).
So there is a reason to add some extra pressures to the market to raise the floor.
Cars today are pretty reliable. Random other stuff seems pretty decent as long as you don't buy the absolute cheapest stuff around. I'm working on a solid wood desk with no nails or screws (it's called a "puzzle" desk because it fits together in a stable configuration without any connectors). I'm looking at a backpack, which I assume is nylon, and it seems pretty tough and has lasted through a few adventures. I'm looking at my metal-framed bookcase, which has fake-wood shelves but it still seems pretty sturdy.
There are certainly some things that seem worse. Furniture is probably worse overall, but that's because good wood is so much more scarce now. If you buy metal/glass furniture then it's fine. There are also some worrisome practices that are pretty widespread, like using plastic-threaded screw holes rather than metal, or just using really cheap metal that can be stripped or cross-threaded easily.
On balance, where are we? I have a feeling that we're basically better off. We remember the old stuff that lasts, and forget about the stuff that breaks (unless it's still celebrated/historical, like the Liberty Bell).
> The same could be said about supersonic travel. We could do it again, but enough of us are happy to not pay the premium, so that product has disappeared.
There are indeed a lot of curious examples of computers seeming to go slower today than 20, 30, 40 years ago. But on the whole, I completely disagree with the summary. My laptop today is faster than anything I've ever used before. It boots faster, it loads files faster, it responds faster, it does more simultaneous things, it crashes far less often.
My first computer was an IBM PCjr, and using it was an exercise in patience at all times. Maybe the author never played Ultima III on an IBM PC. It was sooooo slow. The exact same was true of the computers my friends had: C-64, Atari 800, IBM PC. It probably took a full minute to boot. Maybe the author never saw the memory test PCs used to do before even starting to boot, or how slow floppy drives were. My first modem was 110 baud... that's a whopping 15 7-bit bytes per second. Downloading a 40x40 character image (a picture made out of keyboard characters) took a minute and a half. Downloading games routinely took hours. The PCjr hard crashed and needed reboots all the time. Even my later 486 would do the same thing. Rebooting was something you just did constantly, multiple times per hour. Today, almost never.
One thing this article completely avoids acknowledging is the general difference in functionality we have today compared with computers in 1983. The database lookups and maps that were faster back then were faster precisely because the data was seven orders of magnitude smaller. The article is comparing DBs and maps that fit in the main memory of a computer at the time to databases and maps that are out on the internet and don't fit on any single computer. It's amazing that we get the results as fast as we do.
"one of the things that makes me steaming mad is how the entire field of web apps ignores 100% of learned lessons from desktop apps"
Perhaps not 100%, but sure, in the modern age of web apps for everything I miss having proper desktop apps.
Even more so, programming web apps is nowhere near as convenient as developing desktop apps was in the late 90s with tools like Delphi.
I remember that. It was very easy. I sort of have that same feeling with .NET, but I wonder if it's still just a layer on top of the Windows C system functions, which might just be wrappers for system calls, which require an expensive context switch.
One big difference between Win95 and now is the security aspect. Another big difference is the tons of services running on Windows 10 that are always there (such as YourPhone). On Android, some vendors force Facebook on you and it can't easily be removed.
https://news.ycombinator.com/item?id=15643663
And so on.
Consider a VT100. It could only go up to 19200 baud, and couldn't actually handle constant streams at that speed.
However I do agree with the general message of the article. Current UX is a mess; websites are bloated with tons of JavaScript. There's no reason there should be loading spinners in this day and age with the power we have at our fingertips.
And while my connection is fast, it's a lot less reliable than the one I had 20 years ago, with about three 15-minute disconnects per day on average (although that has more to do with the general shit-quality of the Comcast monopoly than anything else).
I only reboot my machine for security updates now.
Of course, your point about connection speeds is entirely right - but I don't think anybody would honestly advocate just entirely going back to 56k. Rather, we really should be trying to answer the question of why using a computer feels much more sluggish (probably a better word to use than slow) than it did back in the day.
Yes, some of it is rose-tinted glasses and nostalgia, but most of what people discuss is worth taking into consideration. Nobody would seriously go and use a 20 year old computer unless they were making a statement of some sort - and that statement is not "20 year old technology is entirely better than what we have today" but rather "why can't modern technology do things this 20 year old device can?"
Even then, once the thing did boot up, applications were rather responsive.
> ...you can easily setup a command line interface or Linux system and move blazing fast.
No, you can't. The majority of people use software that is imposed upon them. The majority of work cannot efficiently be expressed in CLIs.
It's the responsibility of software developers to write responsive software and in many cases, they have failed. Of course, with the inefficient web platform, their job is made difficult from the start.
We get to keep our shell utilities and CLIs, but they were robbed of tiny single-purpose GUI tools.
We get to keep our shell pipelines and perl scripts, but they're getting locked out of their Excel spreadsheets and VBA scripts.
Hell, with how things are going even automation tools like AutoHotKey are going to perish.
"People don't know how to code" or "users aren't tech-savvy enough for data processing and programming" are bullshit statements. They just happen to not code the way Silicon Valley does. And now these users are getting corralled into caged web apps and experiences that give them no flexibility at all and just treat them like children, all the way from the feature set to the presentation.
That's been the issue with every Windows OS. The cheap consumer hardware available at the time of the release was not enough to provide a snappy OS experience.
Unfortunately, hardware frequencies have sort of stalled, so now we're back to square one again, where W10 doesn't run snappily on median consumer hardware, but the hardware isn't getting faster at the same rate it was during the first decade of the 2000s.
I have a Windows 7 machine for some legacy games and software, and it feels much faster on a 6-year-old CPU (Haswell) than W10 does on a 1-year-old CPU.
Maybe the HN crowd can, but not your average or median computer user.
> I remember turning on the computer and waiting for the motherboard company logo to finish flashing
I still have computers that take some time to boot. The issue is, I just rarely power off/on. Instead the computers go to sleep, which is much faster to come out of.
Of all the tasks this is an uncommon one.
The median computer user and use cases from 1983 to 2020 are very different.
- linearity: logical steps are simple (boot, OS, login, byte fetching)
- predictability: it can be non-instant but I know what's coming
I had an old pentest distro with Linux 2.4 on a USB key and a few recent ones using systemd and a tailored DE. Whenever I used the old 2.4-era distro, maybe I had fewer capabilities overall, maybe a few glitches here and there, but somehow I felt in contact with the machine. Everything reacted directly, throughput felt large (console output, file IO). Even the old sequential boot had a 'charm' to it. Even though systemd is so vastly superior in all aspects... there was something that appealed to me. Weird.
Can disable the low-battery beeping; cannot disable the POST beep unless you switch to vendor logo mode.
(It's your standard old-style beep - just rendered through stereo speakers that have uncannily decent audio reproduction at that particular frequency for some reason. In a serene, quiet room.)
404 sense not found; please check the garbled and try again
I hypothesize that this is because programmers and other various Computer People consider "being on the computer" to be a satisfying goal in itself that sometimes has positive side effects in the real world, while everyone else simply has stockholm syndrome and has long since given up on even imagining a better experience.
It seems like a lot of this boils down to people getting stuck in a local maximum. If someone isn't used to zooming out to the big picture, are they more likely to solve problems in isolation and end up with layers of dependencies, npm installing all the things, resulting in an unstable stack and a less predictable ecosystem at large?
The other personality deals with abstractions, analogies and top-down thinking. When faced with an issue they'll start by defining goals and values. And it's easy to be blind to the details when you're thinking of the big picture and ask for things counter to the laws of physics.
Even if you're a two-minded kind of person and can deal with both generalities and specifics, it's incredibly hard to deal with both at once and quite a task to switch from the one mindset to the other.
Studying the history of CS and following it to its roots has been simultaneously invigorating because of the elegance of the fundamental ideas, but also disappointing, because of the line I can trace from the hopes and dreams of those pioneers to the current state of computing.
I'm hopeful that a new wave of software is just over the horizon. Perhaps I am naive, but I'm holding out against pessimism about the future for as long as I can.
FWIW, I don't care about the details; there are plenty of individual examples of bad design and slower software today. I disagree completely with your larger thesis that things have gotten worse, and with the title in particular as a summary of your point. Why do you feel it's completely valid, and what does that mean exactly if you agree that many of the supporting examples are wrong?
And I also disagree with characterizing people who disagree with you as "Computer Person Thinking" and "Stockholm Syndrome". That charge seems doubly ironic, quite hypocritical, given the content of your tweet-storm, but I understand if you're feeling a bit attacked or defensive about this discussion. It's okay that it wasn't all true, and if this causes discussion that seems annoying, at least it's a discussion. We can continue to improve software and hardware and UX, and it's good to discuss it. It's just not true that it's gotten worse, and you don't really need to crap on today's software or the people who disagree with you to prove the point that we still have room for improvement. There's always room for improvement.
Anyway, computing is objectively faster today than in 1983 (I was there), and not by a little, by orders of magnitude, especially if you are fair and compare functionality apples to apples, but even if you only tally wait times for activities that seem similar on the surface but are completely different today (such as in-memory queries vs internet queries).
I don't see much objective data or measured results in your UX argument on the whole. Arguing that function keys are intuitive and that the mouse is useless is pure opinion based on what you like and being used to something, not something that has been shown by any large-scale user study to date, and history has already somewhat demonstrated the opposite: that many people prefer mouse navigation to keyboards, and that function-key workflows are for experts, not Twitter users, by and large.
"This text searching program that searches text against text is way faster and simpler than a fully rendered, interactive map of the entire world with accurate roads and precision at the foot / meter level."
No. Shit, really? Get out of town.
Yes, some very popular apps have bad UX. But some apps have incredible UX. Cherry picking the bad ones while ignoring the good ones to prove a point is boring and not reflective of the actual industry.
These posts fondly remember just the speed, but always seem to forget the frustrations, or re-imagine them to be something we treasured.
Remember autoexec.bat files? Remember endless configuration to get one program working? Remember the computer just throwing its hands up and giving up when you gave it input that wasn't exactly what it expected? Remember hardware compatibility issues and how badly it affected system stability? Remember when building a computer required work and research and took hours? I do, and it wasn't fun then, it was a detriment to doing what you wanted. It wasn't a rite of passage, it was a huge pain in the ass.
So yeah, things are slower now, and I wanna go fast. But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either. I don't need to teach her arcane commands to get basic functionality out of her machine. Plug and play and usability happened, and while things feel slower, computers are now open to a much wider audience and are much more usable now.
These posts always have a slight stench of elitism disguised as disappointment.
It also doesn't change the fact that Google Maps is very bad at being a map. Its entire UI flow is oriented for giving turn-by-turn directions for people who know where they are and where they are going; it gives almost no affordances for exploration and cross-referencing.
> Remember when building a computer required work and research and took hours.
As someone who builds their own PC every couple years: it still does. It's actually worse now, due to the amount of products on the market and price segregation involved. Two PCs ago, I didn't have to use two parts compatibility tools, several benchmarking sites and a couple of friends, and didn't have to write CSS hacks for electronics stores, just to be able to assemble a cost-effective PC.
> But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either.
You don't? Printer drivers are only slightly less garbage than they were, but now there are also fewer knobs to turn if things go wrong. When my Windows 10 doesn't want to talk to a printer or a Bluetooth headset, all I get to see is a stuck progress bar.
Bottom line: I agree 100% with the author that one of the primary functions of a computer is enabling easy cross-referencing of information. This ability has been degrading over the past decades (arguably for business reasons: the easier it is for people to make sense of information, the harder it is for your sales tactics to work).
> These posts always have a slight stench of elitism disguised as disappointment.
That I don't get. Is it "elitist" now to point out that the (tech) "elite" can actually handle all this bullshit, but it's the regular Joes and Janes that get the short end of the stick?
As it turns out, that is probably the most popular use case for maps in the world.
Note also that for most smartphone users of Google Maps the use case is actually much broader than that. The UI flow also totally accounts for users who only know where they are going; thanks to GPS, Google Maps already knows where you are, so that part often isn't necessary.
I'm confused by the complaint that the "Maps" app only caters to the 90th-percentile use case for maps but doesn't cover the other use cases well.
> I agree 100% with the author that one of the primary functions of a computer is enabling easy cross-referencing of information. This ability has been degrading over the past decades
I just find this not the case at all. For expert-users the tools that existed decades ago are still there and still usable. Or you can craft your own!
For non-expert users the information in the world is orders of magnitude more accessible than it used to be.
I know of an ERP system that somehow manages to take about 15 seconds to search an inventory of ~100k items. If you export all those items to CSV, with all their attributes (most of which are not searched), the resulting file is about 15 MB.
It is mind-boggling how they managed to implement search this slowly (in a C++ application using MS SQL as the backend). 3.5 GHz computers, performing plain-text string search at about 1 MB/s.
It is even more surprising that users don't consider this speed completely unreasonable.
(They managed to pull this stunt off by completely not using the SQL database in the intended way, i.e. all tables are essentially (id, blob) tuples, where the blob is a zlib compressed piece of custom TLV encoded data. All data access goes through a bunch of stored procedures, which return data in accordance to a sort of "data extraction string". Search works by re-implementing inverted indices in tables of (word, offset, blob), where blob contains a zlib compressed list of matching IDs; again processed by stored procedures. The client then is wisely implemented using MS SQL's flavour of LIMIT queries which effectively cause a classic quadratic slowdown because the database engine literally has no way to fetch result rows n...m except by constructing the entire result set up to m.
Unsurprisingly the developers of this abomination claim to be competent. They also invented a funny data exchange format involving fixed field lengths and ASCII separators - some time in the 2010s.)
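For anyone who hasn't run into this before, here's a toy sketch (Python, purely illustrative, not based on that ERP's actual code) of why "fetch rows n...m by rebuilding everything up to m" goes quadratic, compared with simply resuming from where the previous page ended:

    import time

    ROWS = list(range(100_000))   # stand-in for a materialized result set
    PAGE = 100

    def paginate_by_offset(rows, page_size):
        # Fetch page after page by re-walking rows 0..m each time
        # (quadratic total work, like OFFSET-style paging with no seek).
        touched = 0
        for offset in range(0, len(rows), page_size):
            prefix = rows[:offset + page_size]   # "rebuild the result set up to m"
            touched += len(prefix)
        return touched

    def paginate_by_cursor(rows, page_size):
        # Fetch each page by resuming where the last one ended (linear total work).
        touched = 0
        pos = 0
        while pos < len(rows):
            touched += len(rows[pos:pos + page_size])
            pos += page_size
        return touched

    for fn in (paginate_by_offset, paginate_by_cursor):
        t0 = time.perf_counter()
        touched = fn(ROWS, PAGE)
        print(f"{fn.__name__}: touched {touched:,} rows in {time.perf_counter() - t0:.3f}s")

The touched-row counts tell the whole story: the offset version walks on the order of half the table once per page, while the cursor version touches each row exactly once. Keyset ("seek") pagination, remembering the last key instead of an offset, is the usual fix.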
And if I had had a map application (I'm sure they existed) it would have taken several minutes to load from cassette tape. That's not faster than Google Maps, either perceptually or objectively.
What are you talking about? How in the world is a DOS that can only run a single app at a time better than a system that can run dozens of apps at once? How is non-multitasking a better experience? I remember DOS pretty well. I remember trying to configure Expanded Memory vs Extended Memory. Having to wait for dial-up to literally dial up the target machine.
Edit: I didn't realize the poster was talking about Point-of-Sale devices. So the above rant is aimed incorrectly.
> It also doesn't change the fact that Google Maps is very bad at being a map. Its entire UI flow is oriented for giving turn-by-turn directions for people who know where they are and where they are going;
That's exactly what it's made for.
> As someone who builds their own PC every couple years: it still does. It's actually worse now,
No way. Things just work nowadays. Operating systems have generic drivers that work well. It's so much easier now to build a machine than it was years ago. I remember taking days to get something up and running, but now it's minutes. Maybe an hour?
I really hate these "good old days" posts, no matter what the subject. The days of the past weren't better, there were just fewer choices.
Command-line systems simply do not map well to a lot of users' mental models. In particular, they suffer from very poor discoverability.
It is true that if you are trained up on a command-line system you can often do specific actions and commands quickly but without training and documentation, it is often quite hard to know what to do and why. Command-line systems also provide feedback that is hard for many people to understand.
Here are some guidelines for thoughtful product designs. DOS-era systems often violate quite a few of these: https://uxdesign.cc/guidelines-for-thoughtful-product-design...
Yes, it is true that many current-era systems violate these guidelines as well. This is because UX has become too visual design focused in recent years, but that's a symptom of execs not understanding the true value of design.
These aren't orthogonal in any way. A significant chunk of modern performance hit relative to older Von Neumann architectures is the cross-validation, security models, encapsulation, abstraction, and standardization that make BSODs an extremely unexpected event indicative of your hardware physically malfunctioning that they have become (as opposed to "Oops, Adobe reverse-engineered a standard API call and botched forcing bytes directly into it when bypassing the call library's sanity-checking; time to hard-lock the whole machine because that's the only way we can deal with common errors in this shared-memory, shared-resource computing architecture!").
As someone who helped with the transition from crappy text-only DOS interfaces on POSes to graphical interfaces on POSes, I have to disagree. Learning those interfaces was terrible. I don't know where the author gets the idea that even a beginner had no trouble learning them. I worked at NCR during the switchover from the old system to the new one that used graphical representations of things and the new system was way easier to use and for beginners to learn. And that's not just something we felt, we measured it. (Interestingly, it was still DOS-based in its first incarnation, but it was entirely graphical.)
There is too much stuff in the world to present on a map without absolutely overwhelming people, in the same way that the internet is too vast to "explore". You can't explore the internet from Google search; you've got to have some vague starting point as a minimum.
And they also did orders of magnitude less. If you've ever done UX, you know that it gets exponentially harder as you add more features. People love to complain, but not a single person would realistically give up all the new features they've got for the slightly better UX in certain edge cases.
could you explain that?
It's easy to critique and complain, while at the same time doing it puts someone in a position of superiority. Saying "all the effort and knowledge out there is bullshit because my 1983 terminal searched text faster", in a way says that the person writing this knows better, and thus is elite. OP says it's disguised as disappointment, which I agree, because the author describes (and disguises) the situation from a frustration point-of-view.
But I also think that elitism can be traded with snobbery in this case.
You still have to waste a lot of time doing this crap. Last time I set up a printer for my mom, it was a PITA: I had to download drivers, install some giant pile of crap software, go through a bunch of screens of garbage, etc. Needless to say, her computer runs Windows 10.
By contrast, when I wanted to print to that same printer with my own laptop (running Linux Mint), it was simple: there was absolutely no extra software to install, and all I had to do was go to my Settings->Printers, search for printers on the network (took a few seconds), notice that it found the (WiFi) printer, verify it was an HP and whatever model, and then it was done.
Things could be much faster and simpler than they are now. They aren't, because it's not profitable for various actors for it to be, and because most users accept this. Most users are happy to have gigabytes of malware and spyware on their systems, wasting CPU cycles reporting their behavior back to Microsoft or their printer vendor or whoever, and that's a big reason things are so much slower than in 1983.
So if I'm being "elitist" to point out that spyware is a bad thing, then I'll happily accept that moniker.
Weird. Printer drivers usually auto-install for me on Windows 10.
The article isn't talking about spyware. It's a chuntering rant about "Kids these days" with no analysis of why things are the way they are. Spyware doesn't fit, because it's doing exactly what it's meant to do, and the question of whether it pisses you off is secondary at best as long as you're convinced there's no alternative to the kinds of software which bundles spyware as part of you getting what you pay for.
Are these modern Electron apps chugging along using multiple GB of RAM, with perceptible user-input latency, because they're auto-configuring my printer in the background? Can twitter.com justify making me wait several seconds while it loads literally megabytes of assets to display a ~100-byte text message because it's auto-configuring my graphics card IRQ for me?
No, these apps are bloated because they're easier to program that way. Justifying that with user convenience is a dishonest cop-out. If you rewrote these apps with native, moderately optimized code there's nothing that would make them less user-friendly. On the other hand they'd use an order of magnitude less RAM and would probably be a few orders of magnitude faster.
It's fine to say that developer time is precious and it's fine to do it that way but justifying it citing printer drivers of all things is just absurd.
Literally all of these are still a thing...
PS: the usability problems you mention were specifically problems of the PC platform in the mid- to late 90s (e.g. the hoops one had to jump through when starting a game on a PC were always extremely bizarre from the perspective of an Amiga or Atari ST user, or even a C64 user).
But in the old days, I hardly ever remember having more than 1 or 2 apps open at a time. Maybe 3 if I was feeling crazy.
...Right now, I've got Spotify, Word, Excel, Exchange, Emacs, Slack, Mattermost, PyCharm, MATLAB, Acrobat, and Firefox (20+ tabs)... all open at once.
Maybe my memory is bad but I seem to have the same issues today, just in different shapes.
The operating system chooses to update itself in the middle of me typing a document, and then 15 minutes later, while in a conference meeting, the app decides that a new version is immediately required. Then I open up a prompt and the text is barely readable because the screen DPI is somehow out of sync with something else and now the anti-aliasing (?) renders stuff completely broken. According to Google there are 31 ways to solve this, but none of them works on my PC. Then all the Visual Studio add-ins decide to use a bit too much memory, so in the middle of the debug process the debugger goes on a coffee break and then just disappears. In the afternoon I need to update 32 NPM dependencies due to vulnerabilities, and 3 have simply disappeared. Weird.
Can you give us examples of the "good ones" for the "bad" examples he cited?
> These posts fondly remember just the speed, but always seem to forget the frustrations, or re-imagine them to be something we treasured.
No, he is saying that certain paradigms made computers fast in the past and that instead of adopting them and working progressively on them, we have totally abandoned them. He is not advocating embracing the past, warts and all, but only what was good. He is not asking us to ditch Windows or Macs or the web and go back to our DOS / Unix era.
With the speed of the M.2 SSDs we have today and everything else I really wonder why it got like this.
Maybe it's the transition to protected mode that did all this? Now everything has to be sanitized and copied between address spaces. But then again, Win95 was also protected mode. I don't know... :)
The slowness is orthogonal to all those things that have improved. We can put the fast back just fine while keeping those things.
Abstractions in general buy you faster development and more reliable/stable software. UI abstractions buy you intuitive and easy to use UX. All of these benefits define the modern era of software that has spawned countless valuable software programs, and it's naive to think you can skip the bill.
OP gave examples. Could you provide a couple of examples as well?
Macros/VBA is still awful, but overall, despite the horrifying computational abuse I put the office suite through, it's actually very stable!
And, it's fast enough once it loads. It's still pretty slow to load though.
[0] https://en.wikipedia.org/wiki/Autoconfig
There is clearly going to be added complexity to address what we expect of modern computers. It is going to have a negative performance impact. If it isn't carefully planned, it is going to have a negative impact on UX.
Yet there is also a reasonable expectations for things to improve. The extra overhead in hardware and software should, in most cases, be more than compensated by the increase in performance of hardware. The UX should improve as we learn from the past and adapt for the future.
In many cases, we do see tremendous improvements. When a limitation is treated as a problem that needs to be addressed, we have seen vast improvements in hardware and software performance. Arguably, the same applies to UX. (Arguably, because measuring UX is much more opinionated than quantitative.)
Yet so much of what we get is driven by other motivations. If inadequate consideration is given to performance or UX, then the experience will be degraded.
I don't know if autoexec.bat was the most annoying thing from the 90s. (Although it was certainly annoying...)
My example of choice would be ISA, specifically configuring IRQs.
That's why UARTs back in the day had lower latency than USB: your CPU would INSTANTLY take a hardware interrupt as soon as that ISA line changed. Today, USB 2.0 is poll-only, so the CPU effectively only asks the USB device once per millisecond whether any data has arrived. (I think USB 3.0 has some bidirectional signalling, but I bet most mice are still USB 2.0.)
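Rough numbers, just to put the two regimes side by side (the figures below are assumed ballpark values for illustration, not measurements of any particular device):

    # Ballpark latency comparison: interrupt-driven serial vs. polled USB.
    # All figures are assumed typical values, for illustration only.
    interrupt_service_s = 10e-6     # ~10 microseconds to take and service an IRQ
    usb_poll_interval_s = 1e-3      # full-speed USB interrupt endpoint: 1 ms polling

    avg_poll_wait = usb_poll_interval_s / 2   # an event lands mid-interval on average
    worst_poll_wait = usb_poll_interval_s

    print(f"interrupt-driven:    ~{interrupt_service_s * 1e6:.0f} us")
    print(f"polled, average:     ~{avg_poll_wait * 1e6:.0f} us")
    print(f"polled, worst case:  ~{worst_poll_wait * 1e6:.0f} us")
    print(f"one frame at 60 Hz:  {1000 / 60:.1f} ms")   # why ~1 ms still goes unnoticed

Half a millisecond of average added latency is real, but it's a small slice of a single 60 Hz display frame, which is the point made in item 1 below.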
--------
For the most part, today's systems have far worse latency than the systems of the 80s. But two points:
1. Turns out that the absolute lowest latency wasn't very useful in all circumstances. The mouse updated far quicker in the 80s through the serial port than it does today... but the difference between a microsecond delay from an 80s-style serial port and a modern-style USB millisecond update routine (traversing an incredibly complicated, multilayered protocol) is still imperceptible.
2. The benefits of the USB stack cannot be overstated. Back in ISA days, you'd have to move jumper pins to configure IRQs. Put two different pieces of hardware on IRQ 5 and your computer WILL NOT BOOT. Today, USB auto-negotiates all those details, so the user only has to plug the damn thing in, and everything autoworks magically. No jumper pins, no IRQ charts, no nothing. It just works. Heck, you don't even have to turn off your computer to add hardware anymore, thanks to the magic of modern USB.
-------
Adding hardware, like soundcards, new Serial Ports (to support mice/gamepads), parallel ports (printers), etc. etc. before plug-and-play was a nightmare. PCI and USB made life exponentially easier and opened up the world of computers to far more people.
Every now and then when I visit my parents... I find my dad's old drawer of serial-port gender-changers, null modems, baud-rate charts, 7n1 / 8n1 configuration details and say... "thank goodness computers aren't like that anymore".
Some aspects of computers got better, sure. But the web is in a super shitty state. I mean, we're here posting on a website that has the bare minimum of design and features. Why is that? Twitter and Reddit are absolutely horrible in terms of usability and performance. Knowing that they could do better makes them even worse. It is an attack on humanity. You can try to find excuses in unrelated things like the ones noted above, but that won't change anything.
The biggest problem with his argument is that, done correctly, a mouse augments the keyboard and does not replace it. If you've ever seen a good print production artist, that person can absolutely FLY around a very visual program like InDesign... without leaving the keyboard.
They say the fastest code is the code not written. I say the best printing is the printing not needed. I don't have a printer since I only need to print things once in a while (a few times a year). Whenever I need to print, I grab my USB stick, put what I need on it, and drive off to FedEx or UPS to print it out. Point and click, very simple. The cost of a printer hasn't been justified for me, since my printing amounts to a couple of cents per year.
Rants like this post are willfully ignorant of the benefits that a technological shift can have for millions of actual humans, simply because it offends the author's personal sensibilities.
Or when I spend 10 minutes fighting various websites just to get a dumb TV streaming site to login with the app it’s connected to.
I have to agree with the general premise: despite the phenomenal power of computers today, they generally feel just as irritating as they always have.
I was surprised that it ran in something like 256 MB of RAM. And everything was super fast and responsive out of the box.
I consider OS X less janky than Windows or Linux these days, but Windows XP blew it out of the water.
Try it -- it's eye opening. The UI does basically 100% of what you can do on OS X, yet it's considerably more responsive on the same hardware, with the handicap of being virtualized, and having fewer resources.
Deleted Comment
Doctors hate their lives because of computers: https://www.newyorker.com/magazine/2018/11/12/why-doctors-ha..., but maybe you'll think the article is too old to be true, or maybe you will quibble about the headline. Why not believe something true instead?
Name 10. Some variety (Android, Windows, Linux, GUI, CLI) preferred.
More to the point, I grew up with early Windows XP computers and early 90s Internet, and I don't remember the speed. Maybe I'm just too young and things were magically faster in the 80s? Maybe I was born just slightly before everything took a giant nosedive and became crap?
There are lots of things that annoy me about modern computers, but none of them take upwards of 2-3 minutes to boot any more. I remember loading screens in front of pretty much every app on my computer. A lot of my modern apps I use don't even have loading screens at all. I remember clicking buttons on apps and just kind of waiting, while the entire computer froze, for an operation to complete. Sometimes I'd start to do something complicated and just walk away and grab a snack because I literally couldn't use my computer while it ran.
There were entire joke websites like Zombocom set up to riff on how bad the loading screens were on the web back then. I would wait literally 10-15 minutes for Java apps like Runescape to load on a dial-up connection, despite the fact that the actual game itself played fine over that connection, and the delay was just due to dropping a giant binary that took no intelligent advantage of caching or asset splitting.
I can't imagine waiting 10-15 minutes for anything today.
I got a low-key allowance out of going to other people's houses and defragging their computers. Do you remember when Windows would just get slower over time because there was an arcane instruction you had to run every year or so to tell it to maintain itself?
> On the library computer in 1998 I could retry searches over and over and over until I found what I was looking for, because it was quick. Now I have to wait for a huge page to load, wait while the page elements shift all over, GOD FORBID I click on anything while it's loading
What library were you going to in 1998? I also did library searches, and they were insanely slow, and prone to the exact same "don't click while it loads" behavior that the author is decrying here. And that's if I was lucky, sometimes the entire search engine would just be a random Java app that completely froze while it was loading results. And forget about giving me the ability to run multiple searches in parallel across multiple tabs. Whatever #!$@X cookie setup or native app they were wired into could never handle that.
The modern database search interfaces I have today are amazing in comparison. I have annoyances, but you couldn't pay me to go back in time. A lot of those interfaces were actively garbage.
Again, maybe I'm just too young and everything took a nosedive before I was born. But even if that's the case, it seems to me that interfaces are rapidly improving from that nosedive, not continuing to slide downwards. The computing world I grew up in was slow.
Not the person you asked, but around 2002 the computers at my university library were running a DOS-like application: the available commands were printed at the top of the screen, you typed your search and hit Enter. After a few years it was replaced with a Win98-like GUI: you had to open the app if someone else had closed it, then use the mouse to find the right dropdowns, select the options, input the search term, then click Search. Before, you would type something like "a=John Smith", hit Enter, and it would show all the books by that author.
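For what it's worth, that "a=John Smith" style of interface is almost trivial to build. Here is a toy Python sketch of a prompt-driven catalogue lookup; the records and the a=/t= field prefixes are invented purely for illustration, not taken from the actual library system:

    # Toy prompt-driven catalogue lookup in the spirit of "a=John Smith".
    # The records and the a=/t= prefixes are made up for illustration.
    BOOKS = [
        {"author": "John Smith", "title": "Gardening Basics"},
        {"author": "Jane Doe", "title": "A Field Guide to Terminals"},
        {"author": "John Smith", "title": "Advanced Gardening"},
    ]
    FIELDS = {"a": "author", "t": "title"}

    def search(query: str):
        prefix, _, term = query.partition("=")
        key = FIELDS.get(prefix.strip().lower())
        term = term.strip().lower()
        if not key or not term:
            return []
        return [b for b in BOOKS if term in b[key].lower()]

    while True:
        q = input("search> ")          # e.g. "a=john smith" or "t=terminals"
        if not q:
            break
        for hit in search(q):
            print(f'{hit["author"]}: {hit["title"]}')

Everything lives in memory, so even a plain linear scan feels instant at this scale; a real system would put an index behind it, but the keystroke-driven interaction model is the cheap part.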
The problem with us developers is that most of the time we are not heavy users of the applications we create; we make some test projects and simple tests to check the application, but our users might use the application many hours a day, and all the little issues add up.
He is talking about running local software before mainstream internet was a thing.
That is, locally installed software without a single line of network code in them.
MS Word started faster on my dad's Motorola-based Mac with a 7 MHz CPU and super-slow spinning-rust drives than it does on my current PC with a 4 GHz CPU and more L3 cache than the old Mac had RAM altogether.
Modern software really is immensely slow.
There also was a time when people just started to use C++ and software was much buggier.
I wouldn't even say software today feels much slower across the board, but it is definitely far more wasteful, given the resources.
You can see a lot of software that essentially does the same things as it did 20 years ago, but (relatively) much slower. Try using some older versions of e.g. Adobe software, you'll see how snappy it feels.
So we start with something 15 years AFTER the title as evidence of the "good" times, then make a vague anecdotal reference to something modern. I've seen POOR performance in places, but the majority of experiences I see now are faster than in 1998 and definitely faster than in 1983. Faster AND more convenient.
> And it's worth noting that HYPERTEXT, specifically, is best with a mouse in a lot of cases. Wikipedia would suck on keyboard.
Um....no. It's less convenient than a mouse, but way better than the function key-based commands that the author lists.
I think there is a lot of room available to complain about terrible interfaces today, and in particular how choked everything is by the need to track and add advertising, but there's no actual evidence in this article, and it comes across as a rant and selective nostalgia.
Most of the 1980-1995 cases could fit their entire datasets in CPU cache and be insanely fast.
Most things I query these days are in the gigabytes to terabytes range.
Lastly, we have to make them secure, especially against malformed data attempting to attack the app, which eats a lot of CPU cycles.
They couldn't then. They had to fit it in RAM.
> Most things I query these days are in the gigabytes to terabytes range.
That is still in the "fits in RAM on a typical PC" to "fits on an SSD in a PC, fits in RAM on a server" range.
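As a rough sanity check on what "fits in RAM" buys you, here's a small sketch that times a plain substring scan over a buffer held in memory. Exact numbers vary by machine, but on most current hardware this lands somewhere between hundreds of MB/s and a few GB/s, a useful baseline next to the 1 MB/s ERP search mentioned elsewhere in the thread:

    import os
    import time

    # Time a plain substring scan over ~200 MB held in RAM.
    # Throughput is machine-dependent; this is only a baseline sketch.
    SIZE = 200 * 1024 * 1024
    data = os.urandom(SIZE)                  # random bytes: no early match, so a full scan
    needle = b"this-needle-will-not-occur"

    t0 = time.perf_counter()
    index = data.find(needle)                # expected -1: scans the whole buffer
    elapsed = time.perf_counter() - t0

    print(f"scanned {SIZE / 1e6:.0f} MB in {elapsed:.3f}s "
          f"({SIZE / elapsed / 1e6:.0f} MB/s), match index: {index}")

If the working set really is terabytes, none of this applies directly, but gigabytes of text can be brute-force scanned in seconds on a single machine, and far faster with an index.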
How big do you think CPU caches were at the time? CPUs towards the start of the era didn't even have caches.
You're saying that as if somehow that was a problem, but it's not.
That and sonic booms.
You nailed it.