For most people, technology is a haunted house riddled with unpleasant surprises. With no agency, they are at the mercy of other people's bad ideas that keep changing. Everything needs to be updated because everything else needs to be updated, because everything needs to be updated. Duh!
Software updates! Guess what! Here's a new UI for ya. We moved all the stuff! It's like someone threw you a surprise birthday party, but not on your birthday, on their birthday, and their idea of the best gift evar is to hire an interior designer (for free! lucky you!) who completely rearranges your house inside and out and springs it on you after you return from the grocery store. And there's no going back.
At first it was exciting--when I was 15--then slightly bothersome, then downright annoying, then infuriating, then just tiring. Your brain learns that there is no point in learning anything anymore because they're just going to scramble it again anyway. Learned helplessness. People age out, ending up feeling old and useless and jaded because their skillset becomes completely inapplicable after just a few years.
I logged into Mailchimp yesterday and found that they moved the header navigation to the left side.
Instead of the previous menu labels like Campaigns or Audience, there were icons signifying each that I had to hover over to figure out what they might mean. Then when I went to my Reports, the CSS breakpoints seemed to be wonky, making that screen hard to read and use.
Half-jokingly, it almost feels like constantly confusing people is a trick to temporarily boost engagement while people are forced to figure things out.
To add to that, now that I'm a retired lifelong techie I realize why "old folks" back in the day would hesitate to give up the old, outdated software that they knew how to use.
E.g., I'd prod older friends and family to give up WordPerfect - which they knew and loved - in order to progress to the feature-rich new MS Word.
Now I'm a Linux advocate, archaic terminal commands and all, and I can empathize with anyone who wants their laptop, phone, TV, microwave, etc. to stop evolving!!
Water under the bridge now, but I bet you did some of them a real disservice.
WordPerfect had "reveal codes", so when the WYSIWYG gave you something you didn't want, you could pop open the actual document representation and wrangle the tags until What You Want Is What You See.
MS Word has no such function, so when it screws you, and it does, you're good and screwed.
Linux is also far from stable. There is the mess of Linux desktops like Gnome 2, Gnome 3, Unity (okay, this was only an Ubuntu escapade). The init system changed, and the result is that you have to think about things you usually don't want to. There are things like Snap and Flatpak, which pretend to make things easier but ultimately lead to more complexity...
> I’ve always described it as “the design team justifying their own existence after the job is done.”
I actually think that's really what is going on. Wish I had first hand evidence though.
I do know of a tangential phenomenon at a friend's work place. Her org has a dedicated build tools team. So every 6 months every project's build infrastructure needs to change to something entirely new, because the build tools team keeps having to justify its existence.
I don't know why a company would let this sort of thing happen. It's a massive waste of time for every team.
(Late to the party but) Yes, this, absolutely this. It's almost a rule now that, above some very low threshold, the more expertise and hours you throw at UX, the worse the UX is.
Some of the most annoying UX I've had is on Quora, Facebook, and the Reddit redesign, which all spend a veritable fortune on it, while the best I've seen are things a non-specialist slapped together with Bootstrap.
The thing is, I do not really hate tech, if UNIX, and UNIX alone (no GUI), is considered "tech". Most of the programs in the freely available open-source UNIX OS I use do NOT need to be updated. They just keep working and are quite reliable (at least compared to their GUI alternatives).
I do sometimes wish that there could be alternative (not "replacement") ways to do things we use "tech" to do today, where the alternatives only required UNIX (no GUI). This way if we get frustrated with a graphical UI, and myriad "updates", we can just do these things the "old-fashioned way", with comparatively smaller, simpler, command line UNIX programs.
To me, the people who would be very opposed to this idea are not users, they are developers. Having been raised on computers in the 1980s, I can attest that computer users never cared about "UI" or "UX"; they just did what they needed to do to use the computer. It is developers, especially contemporary ones, who actually care about "UI" and "UX", not computer users. In fact, some of them are passionate about these aspects of using a computer.
Adam Savage was talking about a scribing tool for machining which was very expensive, but which he likes very much [0].
Before recommending it, however, he felt it important to mention that for people who don't machine very much, far cheaper scribes work well, because unless it's your job, your tooling is less likely to be the bottleneck and you have fewer resources. When you machine professionally, your tooling is likely your bottleneck and you have more resources.
I think this holds for tech and software. Think of resources here as "time spent learning APIs, bash, and remembering tar mnemonics".
At first, dragging and dropping folders isn't going to be your bottleneck. Need to move thousands of folders scattered across the hard drive? If you're not using a terminal, you'll be in trouble.
Everyone cares about UX, it's their experience when using tech.
It's just that GUIs are better for some contexts than others.
> To me, the people who would be very opposed to this idea are not users, they are developers. Having been raised on computers in the 1980s, I can attest that computer users never cared about "UI" or "UX"; they just did what they needed to do to use the computer. It is developers, especially contemporary ones, who actually care about "UI" and "UX", not computer users.
... what? Are you suggesting computer users in 2020 - which includes everyone from your nana on her iPhone to a toddler watching YouTube on a tablet - want to use CLIs, and are being forced by baddie developers into using apps?
Remember that "alternative" is not the same as "replacement". This is similar to the idea of "more than one way to do it" in computer languages. Users have freedom to choose. Here, one of the ways is without GUI, using UNIX. Only applies where the task does not inherently require graphics.
> For most people, technology is a haunted house riddled with unpleasant surprises.
I'd change that to: "For most people, corporate neoliberal technology is a haunted house riddled with unpleasant surprises."
Writing that recognizes that we live with the most un-free market of all time:
"We are in the middle of a global transformation. What that means is, we're seeing the painful construction of a global market economy. And over the past 30 years neoliberalism has fashioned this system. Markets have been opened, and yet intellectual property rights have ensured that a tiny minority of people are receiving most of the income." [1]
And:
"How can politicians look into TV cameras and say we have a free market system when patents guarantee monopoly incomes for twenty years, preventing anyone from competing? How can they claim there are free markets when copyright rules give a guaranteed income for seventy years after a person’s death? How can they claim free markets exist when one person or company is given a subsidy and not others, or when they sell off the commons that belong to all of us, at a discount, to a favoured individual or company, or when Uber, TaskRabbit and their ilk act as unregulated labour brokers, profiting from the labour of others?" [2]
I have worked for software companies for over 25 years, mostly on teams building software, and I hate software. I find bugs in every piece of software I use (my freaking microwave oven control panel!). In addition to questionable quality, software is often downright hostile (lose all the data you typed into a web form if you accidentally backspace while not in a text field, because it navigates off the page). Ironically, software engineering tools (build systems, etc.) are some of the worst. I don't know what has to happen for people to stop tolerating software as it is.
My gas car stinks, destroys the planet, needs yearly maintenance, crashes into everything the second I stop paying attention.
My house decays day after day. Floors need constant cleaning, walls have holes from small impacts, paint contains inedible fragments and gives off noxious gas.
Bees are building nests on my balcony and it’s definitely not what it was built for, nor where they should be.
I live in an old house, and routinely discover ugly hacks that were done by the previous owner, presumably due to laziness, cost or just lack of skill. For example, they buried tons of stuff (toys, furniture, water heater etc) in the backyard and built a terrace on top of the pile to cover it up, apparently because they were too lazy to take it to the dump. The terrace decayed, so I had to tear it down, but in doing so I had to clean up their mess so I could actually use the garden. I'm not annoyed at the planks for decaying, as that is to be expected, just like you are expected to e.g. keep up with third party dependencies that you have chosen to include. Discovering a mess like the one I found in my garden, however, evoked the same feelings in me as when I look at a badly written code base and just wonder how anyone could ship something of such low quality to paying customers.
I guess my point is that there is a difference between things sucking because of the laws of nature, and things sucking because of incompetence, laziness or indifference.
When is the last time leaving your keys in the car caused your house to suddenly slide 10 feet southwest?
When is the last time you flipped a light switch, and suddenly your pool disappeared?
Have you ever had French doors appear in your dining room because of a "Windows Update" on Wednesday morning?
Have you ever had to wait for half an hour for your house to boot later on that same Wednesday?
When is the last time you closed a door, and were killed by a hailstorm of bowling balls?
At least with a light switch, you know it's very unlikely to cause structural issues, or plumbing issues, or drain your bank account. Computers are singularly horrible in the ways things can fail.
I think the difference is that the entire software/hardware stack is a world created entirely by humans, untouched by "reality" for the purposes of all these annoyances, so it feels like we should be able to wrangle it better after so many decades. It's entirely our own creation, and we decide every iota of it, and yet it bites us (justifiably or not - turns out thousands of people each creating different layers of a gigantic state machine is hard to perfectly coordinate, but we may have been able to do better by now if we had been more thoughtful and patient throughout).
No. A hardware product like a car has predictable wear and tear governed mainly by the laws of physics. The fact that I can no longer use my smart speaker because the manufacturer decided to stop supporting it, went out of business, or got bought is not at all the same. My car will still work through all of those things in the exact same way. It also doesn't throw up random dialogs (or whatever a physical equivalent would be) that stop the product from working until I interact with it. Not the same at all.
Sure there are things that don’t work, but it’s not nearly comparable. In my life I’ve never had a problem with a refrigerator, had maybe two problems with washer/dryer, my closet Just Works, books Just Work (with an ink smudge maybe every 50-100 reading hours) etc. I can expect each of those things to last decades. Looking at the goods I’ve bought on Amazon recently, digital electronics/software as a category doesn’t hold a candle to everything else I buy in terms of reliability.
To me, software is as if when I open a book to read it, then, the book suddenly snaps itself shut, hurting my fingers.
Thereafter, the book gets wings, tries to fly away, but bumps into my coffee mug on my desk, so coffee spills on the floor. Then the book does fly out through the closed window — smashing the glass into pieces — and gets larger and larger wings, flying higher and higher until it disappears into the blue sky.
It's as if software was alive, and does random things I don't always agree with.
But actually — the bees building nests on the balcony: That feels pretty close to misbehaving software. Or the cat, bringing in a snake from outdoors. Or a squirrel chewing through a power cable, shutting down a city.
There is a difference between design trade-offs and flawed design or deviations from the design. Your car does what it's supposed to do, within the predictable limits imposed by the fact that it's a gas-powered car. Since macOS 10.15.4 or .5, my 16" MacBook Pro crashes waking up from sleep due to some glitch in waking up the GPU.
Of course, people perceive that software sucks because it's more complicated than they realize. I forget what book said it, but an operating system has more separate components than an aircraft carrier and they're more tightly coupled. (I'm not sure that's true, but it conveys the idea.)
Houses, cars, etc are far more reliable and well designed than software. Think about all the extreme conditions cars continue to function in. How many people don't even follow basic maintenance schedules?
Another key difference is that in maintenance of your home, you have complete control. It's extremely easy to understand and act to improve or maintain it. When large software systems (like the IRS login) have problems, you are totally helpless.
Cars vary widely in their product quality. Houses vary widely in their product quality. Some things in life are inevitable facts of nature, but product quality is not. Quality is to a large extent determined by the time and care taken by the manufacturer.
>My gas car stinks, destroys the planet, needs yearly maintenance, crashes in everything the second I stop paying attention.
That's not a good example, nor is it parallel to the dynamic the article describes.
Your car stinks a lot less than cars did 10/30/50 years ago (emits less in the way of pollutants or CO2 per mile driven), is less likely to kill you in a crash involving the same size cars/velocities (despite weighing less!), needs less maintenance, lasts longer, and can notify you of potential collisions and sometimes avoid them.
It's probably only worse in terms of education needed to perform maintenance or nominal sticker price.
Not the same; we expect the problems you mention. There are just some laws of nature that we get used to dealing with. Tech has the tendency to produce random problems. The one we have all dealt with is: everything was working fine and then suddenly stopped. You call tech support, and after an hour of troubleshooting with them you get the "We've never seen this before. It must be caused by one of your other SUCKEE tech toys." Ahhhhhhhhhh...
I know a product that can do more damage with an accidental backspace: the iMessage app for MacOS (Messages).
If you're any sort of power user, you likely know that you can backspace by the word instead of by the character, using Ctrl + BS on Linux or Cmd + BS on Mac.
In the Messages app, the shortcut to delete your _entire chat history_ is also Cmd + BS, and it works even if your caret is in the text box. So if you type five words and then Cmd + BS six times, you will be prompted to delete your entire chat history.
I do this almost every day. So far I've never compulsively hit return but I am dreading the day it happens.
Gnome Notes has similar behavior: whenever you use Ctrl + BS, the note you are currently writing gets put into the trash, even though you just wanted to delete a word quickly. You can recover the note so it's not the worst possible behavior, but it still sucks.
> lose all the data you typed into a web form if you accidentally backspace while not in a text field, because it navigates off the page
Nowadays, I would consider this a problem with the browser. How often does one navigate backwards with the backspace key?
Recently, I had some doubts over whether or not I should clobber the native browser behaviour for "ctrl-s", but then I realized that nobody anywhere EVER saves a web page to disk... and if they really needed to, the browser toolbar is right there.
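For what it's worth, the clobbering itself is only a few lines; here is a minimal sketch using the standard DOM keydown event, where `saveDocument` stands in for a hypothetical app-specific save function:

```typescript
// Sketch: intercept Ctrl-S / Cmd-S and run an in-app save instead of the
// browser's "save page to disk" dialog. `saveDocument` is a hypothetical stub.
function saveDocument(): void {
  console.log("saved"); // app-specific save logic would go here
}

window.addEventListener("keydown", (e: KeyboardEvent) => {
  if ((e.ctrlKey || e.metaKey) && e.key.toLowerCase() === "s") {
    e.preventDefault();  // suppress the native save-page behaviour
    saveDocument();      // run the app's own save logic instead
  }
});
```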
I for one fully expect the backspace button to work if I do not currently have a text field focused. once you learn keybindings for an application, there's usually no way to perform that function as quickly/efficiently with a mouse. please do not break the conventional ones.
ctrl-s is probably fine to break though. even when it does "save" the page, it rarely does so in a useful way.
Edge, or at least the Chromium version of it, has changed the navigate back key to be ALT+Left Arrow instead of the backspace key. It was annoying at first because I have over two decades of muscle memory for hitting backspace to go back. After a couple of days I got used to it and now am happy I can backspace without accidentally navigating away because focus wasn't where I thought it was.
> but then I realized that nobody anywhere EVER saves a web page to disk...
Some people do it all the time. I was emailed a saved page the other day.
I was responsible for a single page web app, and the error detection code was stored in a <script> tag within the page, so I got plenty of “errors” logged for people trying to access saved pages.
> I find bugs in every software I use (my freaking microwave oven control panel!).
My dishwasher, which has only buttons to select what to do during the next wash cycle, has a firmware bug.
Sometimes when the door is closed, it will start one of the pumps. If I cycle "heated drying" on then off again, the pump will stop. I figured this out because, well, I've worked on firmware and I understand how software can be stupid.
After I learned to recognize watchdog resets, I started seeing them more and more often, and became even more terrified of how bad software is.
> After I learned to recognize watchdog resets, I started seeing them more and more often
Yup, sounds like my TV. It's not even one of the smart ones, I was careful to avoid those. But once every few days, it stops responding to the remote control when performing some action (opening the EPG, switching channels). I then have to wait about ten seconds for the display to go dark and the TV to "reboot" itself, so I can continue channel surfing.
I prefer open-source tools because I know where the agency for pain lies: myself. With modern Rust and Go tooling you can download an application's source code, modify it, and compile it painlessly in half an hour.
So why do I tolerate bugs in software like that? Because I know I can fix them. And I also know I won't always. Small gods have handed me tools to remake the world as I would see fit and I do not use them. Are they at fault for not having made the world as I would prefer? Or am I at fault for not using the tools?
In any case, I've noticed a sort of dichotomy among users in their reaction to tools that fail. There are those who go "this tool sucks, how can I do my work" and there are those who go "my work is what I want to do, which tool can I use instead". The latter set get a lot more done. Since observing this, I have attempted to modify my behaviour to be like the latter's, and have effectively become better.
> Small gods have handed me tools to remake the world as I would see fit and I do not use them.
But they didn't give you the only tool you really needed: Time.
Having meaningful access to the source is very important, but its value is limited because even small improvements often take a large amount of time especially to code you're unfamiliar with. Once you've made that improvement, maintaining it (or up-streaming it so someone else might maintain it) can take a tremendous amount of time.
>I don’t know what has to happen for people to stop tolerating software as it is.
What we've got here is a question of cost and choice. If my choices are all equally bad, i.e., vendor one is not any worse than vendor two, then inertia or cost become determining factors. In terms of consumer software, consumers have been conditioned to have low expectations, and these costs are further reduced because prices are so often free or very low. In regards to commercial-focused packages, again, so often we put up with it because the systems we're using are so complicated and specialized that the pool of options is limited and/or the domain is so complicated that problems are inevitable.
So long as this is the landscape, few software producers have incentives to do the things necessary to improve, and/or believe they can spread the cost of improvement over a long period, i.e., don't make the investment until the pain is too great.
> I don’t know what has to happen for people to stop tolerating software as it is.
Deaths, a lot of deaths.
Software Engineering needs a PE type licensure and a union. We need a way to stand together to advocate for better working conditions, practices, and tools.
What is the evidence that professional licensing would help? Even the most skilled programmers produce bugs. Professional licensing would only raise the barrier to entry for a well-paying job.
I completely agree. But I fear that if this happens then the companies will look for ways to produce software with less friction compared to the now-stricter software development practice in general. And it would be "let's hire Indians for peanuts" all over again.
Really not sure what's the way out of this corner that we've all collectively painted ourselves into.
Discipline and professional responsibility on the behalf of programmers, coupled with patience.
Everyone is in such an irrational hurry, it's been built into the "culture" such that rushing and making messes is acceptable. And by extension, customers expect things to be shit so you don't get in much trouble for doing it.
It's a feedback loop that only stops if companies (and individual programmers) start taking pride in craft > careless speed and money like they used to do back in the 50s/60s.
> It's a feedback loop that only stops if companies (and individual programmers) start taking pride in craft
Most programmers, and many companies, want to produce something of quality, well crafted.
The drive for low-quality kibble comes directly from consumers, and from the inability (or cost) of judging value.
A consumer can’t be expected to be a UI expert, and a slightly better UI might not drive sales because other factors are more important. I try to buy hardware with good UI, but I often make compromises for other factors.
> accidentally backspace while not in a text field
Thankfully that can be disabled, but I find it to be one of the most infuriating 'features' of Firefox. If it weren't something that could be turned off, it would be a deal killer all by itself.
Also, in most normal web pages—at least those not using hideously over-complicated scripting—just pressing your "forward" button (Alt+Right, for example) will navigate forward and restore the edits you had in the form.
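Pages can also guard against that loss on their own end; a minimal sketch using the standard `beforeunload` event, where the `formIsDirty` flag is hypothetical app state rather than a browser API:

```typescript
// Sketch: warn before navigating away if a form has unsaved edits.
let formIsDirty = false;

document.querySelector("form")?.addEventListener("input", () => {
  formIsDirty = true;           // any edit marks the form as having unsaved data
});

window.addEventListener("beforeunload", (e: BeforeUnloadEvent) => {
  if (formIsDirty) {
    e.preventDefault();         // ask the browser to show a "leave page?" prompt
    e.returnValue = "";         // some browsers still require this to show the prompt
  }
});
```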
My FreeNAS, Arch Linux and my Android phone as well.
I think we get paid because we are building new stuff and have to maintain shitty stuff. If my job were literally just designing it at a high level, clicking it together, and then it works, no one would need me.
Yes, it's frustrating sometimes.
I would like to cure cancer instead of debugging why this update broke our system.
My microwave will run for a split second if you hit the start button. I suspect it's running the software to operate with the time variable set to 0 seconds, and there's no guard against 0, so it just runs until the countdown part of the code kicks in.
I’ve microwaved a burrito by mashing the start button hundreds of times.
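If that guess is right, the suspected logic might look something like this minimal sketch (illustrative TypeScript, not real microwave firmware; the heater function is a hypothetical stand-in):

```typescript
// Illustrative sketch: "start" switches the heater on unconditionally, and only
// the countdown loop ever switches it off, so a 0-second start still heats
// briefly before the counter check runs.
function setHeater(on: boolean): void {
  console.log(on ? "heater ON" : "heater OFF"); // stand-in for real hardware control
}

function startCooking(seconds: number): void {
  setHeater(true);                 // note: no `if (seconds <= 0) return;` guard here
  let remaining = seconds;
  const timer = setInterval(() => {
    if (remaining <= 0) {          // zero is only caught once the countdown runs
      setHeater(false);
      clearInterval(timer);
      return;
    }
    remaining -= 1;
  }, 1000);
}

startCooking(0); // heater flicks on, then off at the first tick: the "split second" run
```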
Software is indeed downright hostile sometimes... it's gotten better with autosave... but your IDE, MS Word, or Photoshop crashing would sometimes cost you half a day of work if not more. It was infuriating and even discouraging sometimes until you learned the subconscious behavior of pressing ctrl-s all the time.
As an experiment, I logged every bug I encountered for a few days. I averaged 5 bugs a day, not counting dark patterns or bad design.
Everything is broken, and nobody is upset. [1]
Some of the software I use is so unreliable that I expect it to fail. I expect the Vodafone login page not to work properly. I expect one of my airpods not to connect on the first try. I expect my banking app to show random error messages, even though it works just fine. Most online stock brokers have issues at the worst possible times. My bookkeeping app is frequently wrong, per my tax advisor. Since everything is broken, the best I can do is to mentally assign all those apps a trustworthiness score, and avoid betting too much on them.
The worst part is that support for all that software has been largely automated. If you have a problem that can't be fixed by a chatbot or a crowdsourced support community, you are largely helpless. Google can wipe everything you love, and there's no one to punch in the face (to borrow from Grapes of Wrath).
So far, my only solution to this is to be a late adopter, and to favour simplicity over sophistication. I was recently considering going from paper notebooks to a tablet. That initiative stopped at the electronics store. The Surface Go wanted me to go through a setup wizard (after dismissing a few notifications). Two of the 4 iPads had working pencils. The ReMarkable reviews mention a host of issues. I never encountered any bugs with my Moleskine. It pairs flawlessly with any pencil I want, including older models.
Thank you for mentioning AirPods! I was so excited to buy them after a ton of reviews saying they just worked. They said the days of fighting with Bluetooth audio were gone. I believed them, then reality showed up.
My Mac sometimes unpairs them and, worse, it doesn't find them. Sometimes while my baby is sleeping I put my AirPods in and play a loud video, just to realize they were not connected. My wife's right-side AirPod just stopped working after one year of use...
If apple is considered top tier in reliability, then technology in general really just sucks!
To provide an anecdotal counter-example, AirPods have worked seamlessly for me so far. Much quicker to connect and more reliable than Bluetooth. So their marketing isn't completely off-base :)
My manager had to buy new headphones: when WFH she used AirPods and an iPhone to connect to calls, and the AirPods constantly cycled, leaving garbage audio whenever she tried to speak.
I'm using a BT Jabra headset with noise cancelling that I got for about the same cost: 16+ hours of battery, easy pairing, a super useful phone app, great ANC, and solid audio quality, at least to a non-audiophile. My biggest complaint is that the closed-back design leaves my ears a bit irritated after 4+ hours of use. Not an AirPods competitor, but for the cost I am way happier.
I use my AirPods with multiple devices, and that process is also fraught with problems. Switching to another device takes an absurdly long time. About once per month, switching will make bluetoothaudiod peg a core and ultimately hard crash the entire computer. Yes, the kernel panics if it doesn't hear from this userspace process frequently enough.
Surprisingly, iCloud syncing works fine. If I pair my AirPods with one device, it always pairs with all of them.
They work well enough, but they don't work magically well.
The main issue is with the right pod not always turning on when I take it out of the case. The solution is to put it back in the case for 5 seconds and to try again.
The second most important issue is the airpods falling out of sync with each other. It seems like the signal from my Samsung S9 in my pocket is choppy. Looking left or right for too long will make the signal drop. Putting my hands in my pockets also will. If I put the phone in my backpack, it's okay.
This is still more pleasant than wired headphones, but it's far from a magical experience.
Personally, I hate earbuds and, as such, never bought any. Rather, I spent ~$20 on SoundBot Bluetooth headphones starting some five years ago (long before AirPods, methinks) and haven't had problems with them at all.
I also have a seven year-old phone (HTC OneMax) running custom (unofficial/ported by a random hacker) Android[0], and it pretty much works.
Sure the battery life has degraded since 2014, but that's to be expected, no? I wish I could replace the battery (as I did with my 15+ year-old Panasonic cordless phones), but there really aren't too many mainstream mobile devices that allow that any more.
As for poor quality software/hardware, if you don't like it, vote with your feet and/or wallet.
If stuff doesn't work, why use it? Even more, if stuff doesn't work and you can't/won't fix it yourself, then don't use it.
Software devs and hardware manufacturers don't care about whiny blog posts or complaints on HN, they care about the bottom line. Impact the bottom line and you may have a chance at improvement.
Stuff that actually addresses the issue is useful. A great example is the lack of Android support after 4.4/KitKat on the HTC OneMax mentioned above, and its abandonment by Cyanogenmod/LineageOS in 2017, where those impacted (myself included, although I'd never hacked on Android ever -- and failed miserably -- thankfully someone else did not) took action to provide the latest Android on an old, unsupported, discontinued device.
If you're not taking positive action toward making things better (whether that's fixing the problems or voting with your feet/wallet), then you're not going to have any impact.
While whinging about it on your blog may be a way to relieve the stress you feel about whatever issue(s) you may have, it's not constructive or useful.
That is unless your goal is to get lots of comments on HN where the Apple Fanbois sagely agree, and lament there's nothing to be done about it because Apple is the pinnacle of tech and since no one could possibly do anything better than Apple (or the apps that run on their gear) therefore all technology sucks.
And that's objectively false. There's lots of tech out there that's quite good. I suggest using that, and shunning the stuff that sucks rather than using it and then whinging about it.
> I was recently considering going from paper notebooks to a tablet. That initiative stopped at the electronics store.
Good, you dodged a bullet there.
I mean, I love my 2-in-1 Dell (a slightly cheaper but still high-end Surface-like device). The pen, as much as it's useful (I'm not even considering buying a touchscreen-enabled device without solid support of a pen anymore; it's so much better UX than fingers), still has lots of subtle and annoying bugs. Maybe in 20 years people will work out the kinks. More likely, the concept will be abandoned in favor of some new modality that will also never be perfected.
Most software is still net positive in productivity. We tend to place more emphasis on failures as users.
Remember you're running millions of lines of code that talks to other computers running millions of lines of code that communicates over a network running millions of lines of code to deliver some information on the order of seconds to minutes -- and then something responds to that information and everything happens all over again.
All day, every day, trillions of packets of information get delivered just fine. Try doing that as a human, delivering letters. You probably won't even approach a million packets delivered in your life time. And people have the audacity to say, "oh my, some things didn't work, this is completely broken"
In only a single generation, we went from voice communicators to super computers in our pockets. The utility vastly, vastly, vastly overshadows the glitches that come with frenetic advancement. How long did it take humans to invent basic numbers?
I researched the ReMarkable 1 and 2. I ended up just getting a Rocketbook. It's a very simple concept: "paper" in the form of hard plastic. The pen it uses is the Pilot Frixion, which is an erasable pen. Hence you have a notebook to record notes for a while. If there's anything important, I'll manually transfer it to my OneNote (I don't use Rocketbook's picture-taking app).
Most notes I take only need to exist for a few weeks and then I erase...so transferring it to "long term storage" is rare.
I do have an iPad and note taking apps like Notability if I know something will need to go to "long term storage" but I find I use the Rocketbook more.
I was looking to replace the A6 sketchbook I carry everywhere, and the A5 notebook that always sits in front of my keyboard.
I thought it would be nice to access my notes when I don't have my notebook on me, and to have layers, zooming, undos etc. However, the more I look into it, the more absurd it seems.
I'm replacing a 15€ notebook and a 2€ mechanical pencil with a 400€ gadget that doesn't quite work. Why? So that I can spend my time organising notes in a digital space. Why? I don't really know.
It would be cool to have layers, zooming and an undo button. It would also be cool to have access to my notes even when I don't have my notebook. However, it would just be cool. It doesn't actually solve a serious problem.
> So far, my only solution to this is to be a late adopter, and to favour simplicity over sophistication. I was recently considering going from paper notebooks to a tablet. That initiative stopped at the electronics store. The Surface Go wanted me to go through a setup wizard (after dismissing a few notifications). Two of the 4 iPads had working pencils. The ReMarkable reviews mention a host of issues. I never encountered any bugs with my Moleskine. It pairs flawlessly with any pencil I want, including older models.
All of those were on my list of cons, particularly the lack of distractions. I avoided the Surface completely because Windows is anything but quiet and maintenance-free.
The iPad seemed pretty solid, but I'd have to turn it on and unlock it to see my notes, unlike a notebook.
The ReMarkable seemed nice, but there are lots of complaints about it that paper doesn't have.
The Supernote A6X was the most promising, but it was hard to get in Germany.
The problem is with the idea of "continuous delivery". Many people fail to understand that technological advances only increase productivity if the innovations, the great leaps forward, are relatively rare, with long periods of stability and refinement in between.
There's always an adjustment period, where people have to spend time learning a new technology, and any issues with the new technology need to be resolved. The gains in productivity happen mainly after the adjustment period. But we've eliminated the periods of stability and are constantly pushing for more "innovation", which means we're in constant periods of adjustment and resolving problems, where the promise of increased productivity is never fully met.
The worst idea ever in technology is regularly scheduled updates. Innovation has never and will never happen on a schedule. This is simply greed-driven, promotion-driven, pointy-haired-boss-driven development.
Produce something new and great... but then let us all enjoy the new thing for a while. Novelty for its own sake is not productive.
> The worst idea ever in technology is regularly scheduled updates. Innovation has never and will never happen on a schedule. This is simply greed-driven, promotion-driven, pointy-haired-boss-driven development.
this is sort of uncharitable. the development/maintenance cycle for software is incompatible with the traditional way of monetizing a product (ie, design it up front, manufacture at scale, and then the buyer gets what they get, barring severe safety defects). buyers of software expect the product to at least mostly work in the first place, but they also expect bugs to continue to be fixed after the sale, even if the bugs are introduced through unforeseeable interactions with other software.
imo, subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance. but buyers tend to consider this a ripoff if they don't actually see a stream of new features in development. while it introduces some unfortunate constraints in the dev cycle, bundling up features in a scheduled update is a good way to make it visible to users that their subscription dollars aren't just falling into a black hole. trickling out new features "when they're done" earns the respect of engineers, but results in the average user simply not noticing that progress is being made.
> buyers of software expect the product to at least mostly work in the first place, but they also expect bugs to continue to be fixed after the sale,
Contrast that to traditional physical goods, where buyers expect the product to work as advertised, right out of the box, or their money back. Software in the Internet era has it easy, because it gets to release shitty half-finished versions, and then keep charging money while never quite finishing the software.
> subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance. but buyers tend to consider this a ripoff if they don't actually see a stream of new features in development
Because software does not decay on its own (despite the misleading term "bitrot" being popular in tech circles). That's literally why digitizing data has taken the world by storm: digital data does not decay; as long as the physical medium is readable, you can make a perfect copy of the data it contains. As a buyer, I don't expect my software to need maintenance. I expect it to work out of the box (just like I expect every physical product to work out of the box), and once I find software that fulfills my needs, I expect it to work all the way until computing technology moves forward so much that it's no longer possible to run the software. Which, in the era of virtual machines, may take decades.
So yeah, there's a need to clearly justify to the customers why you're charging subscription, because software in its natural state does not need maintenance.
> they also expect bugs to continue to be fixed after the sale
I wasn't disagreeing with that. I mentioned "long periods of stability and refinement" — refinement including bug fixes — and "any issues with the new technology need to be resolved". But again, bug fixes don't magically happen on a schedule either. Maybe fixes are easy, maybe they're hard, you never know in advance.
> bundling up features in a scheduled update is a good way to make it visible to users that their subscription dollars aren't just falling into a black hole
This is exactly why it's not true that "subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance". Instead of maintenance, subscriptions incentivize continuous delivery of new features, and consequently continuous delivery of new bugs.
Did consumers demand subscription services? Or did vendors (led by Adobe) decide to change to subscriptions to get uniform cash flow?
At the agencies I have worked at, all the creatives I worked with would prefer to spend $200-400 and have a permanent software license. Perhaps this isn't a representative group.
No, subscriptions introduce the perverse incentive to release unfinished products and slowly drip-feed fixes. Having to test extensively before a one-and-only release may not drive nearly as much revenue, nor provide a running deliverable stream for a given engineer/product manager's CV, but it is clearly the better user experience.
Developers get paid every couple of weeks, so an agile business tries to track the value gained from that expense on a similar cadence. The process has become more about corporate governance than delivering customer value.
> Innovation has never and will never happen on a schedule. This is simply greed-driven, promotion-driven, pointy-haired-boss-driven development.
For consumer goods, it is the hype cycle.
New updates and releases get press. They also restore consumer confidence.
If Samsung announced that next year they weren't releasing a new Galaxy phone, the entire industry would freak out. Consumers would lose confidence in buying Samsung phones, journalists would write articles questioning if Samsung was pulling out of the market, a lot of bad things would happen.
Give it 18 months without a release and people would start to think of Samsung as "that company that used to make phones."
They would have to fight like heck to restore their image.
Software is the same way. In the Vista/7/8 era, Microsoft looked like they were falling behind because their competitors started releasing yearly, or even bi-yearly, feature updates.
Sure, every Android version up until 7 was kinda-sorta-terrible, but it kept Android in the news. Likewise, Apple got huge free press every time they announced a new revision of OS X (now macOS), and every time they came out and announced a new version of iOS.
The result? "The desktop is dying, phones are where the real innovation is at!" articles being published even faster than those software updates came out.
You can of course release too fast; rarely do Chrome's or Firefox's releases get any press (unless there is a controversial UI change), but in general frequent updates are free advertising.
Tesla is also great at this, I'm nowhere near being in the market for a Tesla, but at least a couple times a year I still end up hearing about software features they are rolling out!
> If Samsung announced that next year they weren't releasing a new Galaxy phone, the entire industry would freak out.
The smartphone industry has only themselves to blame for setting up this expectation. But it is possible to get off the train. I remember when Apple announced they were dropping out of the annual MacWorld San Francisco conference, because they didn't want to constrain their product release cycle. Apple survived that just fine.
> In the Vista/7/8 era, Microsoft looked like they were falling behind because their competitors started releasing yearly, or even bi-yearly, feature updates.
Competitors? Windows and Mac had near 100% market share on desktop. There was only 1 competitor. Vista was released in January 2007, but Mac OS X 10.5 Leopard was infamously delayed until October 2007 because of iPhone, so this telling of history doesn't seem entirely accurate. Moreover, Mac OS X releases were already slowing. Here's a list of months since the previous major .0 release:
10.1.0 6
10.2.0 11
10.3.0 14
10.4.0 18
10.5.0 18
10.6.0 22
10.7.0 23
10.8.0 12
10.9.0 15
10.10.0 12
10.11.0 11
10.12.0 12
10.13.0 12
10.14.0 12
10.15.0 12
Thus, major Mac OS releases were slowing down year by year — which is totally sensible — but then Steve Jobs died after 10.7 was released in 2011, and only then did they switch to a yearly schedule.
This criticism is based on the assumption that all updates are created equal. But they aren't.
One week, the update could be a relatively minor bug fix. The next week, a major feature upgrade that's been in the pipeline for months.
You also remove the ambiguity of "Is this worth pushing out? When should I push this out? Should I do some more fixes or push this one out first?". You got fixes, push them out in the next update.
Your criticism also assumes a small team. If you have a large enough team where you can split them into new feature development and current bug fixing, they're going to work at different rates and be ready at different times. If instead your entire team just works on "the product", then there is no effective difference between fixing issues and creating functionality.
I feel like the slow decline of software quality has been in lockstep with the gradual transition from (expensive and non-measurable) manual software/hardware testing and QA to automated frameworks and rollout-based quality assurance.
I constantly encounter broken functionality, buggy or unpleasant UIs, just as the author has. It feels like many of these problems could be avoided if you just had one person whose job it was to sit there and look for broken stuff. (I'm sure I'm biased as someone whose first job out of college was to sit there and look for broken stuff.)
I would tell a slightly different version of this story, focusing in on "rollout-based quality assurance".
I would say that effortless, automatic updates are to blame.
When you can always just push an update, the impact of a given bug goes way down. It's no longer mission-critical to exterminate flaws before shipping; a totally broken feature becomes a mere annoyance. So project prioritization shifts from polishing an artifact to outweighing the (presumed inevitable) constant stream of little annoyances with fixes and features. I think the shift towards automated testing is just a symptom; an attempt to bridge the gap in this brave new world.
For a clear-cut example of this phenomenon, look to the video game industry. Until around 2007, games received no updates. Ever. Once a game shipped, it was shipped. There wasn't even a mechanism for installing an update from physical media.
Right around that time, "glitches" went from very rare unicorns that people would spend lots of time actually seeking out, to nearly everyday occurrences. As long as it doesn't corrupt someone's save file, they mostly laugh it off and upload a clip to YouTube to show their friends. This is just how things are now.
(Edit: I should have scoped this to "console games")
> Until around 2007, games received no updates. Ever. Once a game shipped, it was shipped. There wasn't even a mechanism for installing an update from physical media.
Sure, but they still (sometimes) released (a few) extra revisions of a game. They were just targeted at people who bought physical copies after the revision date, rather than at existing customers.
Or said updates came on the 1.0 version of the game as shipped in markets that got the game later than others. (Just imagine — per-market release versioning. Every market effectively got its own fork of the codebase!)
Or said updates came in the form of a re-release port. There are patches made to the emulated downloadable app-store re-releases of some games, that never made it into any physical edition of the game.
Also, before home consoles, arcade game machines did receive bug-fix updates regularly. Arcade machines were essentially provided “as a service” from their manufacturers, with support contracts et al. Sort of like vending machines are today. If you reported a bug, they’d fix it and send you a new EEPROM chip to swap out in your cabinet. If there was a critical bug that affected all units, they’d send techs out to everybody’s machines to swap out the ROM for the newest revision. (For this reason, it’s actually kind of hard to do art-conservation / archiving of arcade games. The cabinets almost never have fully “original-release” components inside.)
I think you've both got a piece of it. I've programmed PC software, embedded software, and mobile software, and my gut feeling (without data) is that the shipped software quality is inversely proportional to the update frequency and ease of updates. Had nothing to do with how smart or skilled the developers and testers were. Had nothing to do with management's priorities. Update frequency and ease changed immensely once we could feasibly deliver patches over the Internet. Before easy updates, you'd actually quality check every corner of the application, you'd actually fix those P2s and P3s. You'd do exploratory testing off the test plan rails to find things. There was even a concept of "done" in software, as in, you eventually stop constantly jamming features in and tweaking the UI in maddening ways.
Now, it's just "LOL just ship it, users will just deal with it until the next release!" Now, it's "Do experiments on N% in prod and use end users for A/B testing. If something's broken we'll update!"
In several industries, it's actually totally expected that v1.0 of the application simply won't work at all. It's more important for these companies to ship non-working software than to miss the deadline and ship something that works! Because who cares? Users will bear the cost and update.
I agree that games used to have far fewer bugs when shipped, but it's not true that games never received updates back in the day. I distinctly remember queuing on file sharing sites as a kid in the early 2000s to download half-life updates and updates for other games.
absolutely this.
Back in the early 2000s quality assurance was insane. A release I was working on (AAA title from a major studio) was blocked because players' eyes were rendered incorrectly and you had to really zoom in to even see the glitch. And of course you had to find and fix all bugs before October, or else you wouldn't be able to hit Christmas sales.
Once internet updates became the norm, it all became pretty much like the rest of software industry. (At least game companies still have QA departments, a lot of mainstream web companies have dispensed with those as well.)
I have witnessed this first hand comparing two systems.
One system has no patching, and updates incur some non-trivial amount of effort on the part of the installer. Releases are a few times a year, at most.
The other system has patching, updates are lighter weight, and as a result, the system has THOUSANDS of patches released over the last decade, north of 2 per work day.
Software has always sucked and had these issues. It has nothing to do with automated QA. The reason you see more issues is that 'way back in the day' your software had a very limited number of things that it did, and in general it did not involve accessing a network or chugging down massive volumes of data from untrusted sources.
I work for a company that has a lot of individuals who test for QA issues; they have lists miles long of things to check and write reports on.
The problem is more of "It's much easier to write mountains of code than it is to ensure that it works in all cases"
I worked for almost 15 years in embedded and have 25YOE, and no, software has not always sucked as much as it does now.
I agree with your last point a lot, though I would modify it slightly: it's much easier to write mountains of code now than it was, and it's now common and much easier to import external dependencies (especially at system level) than ever, and those dependencies have tens of millions of lines of mediocre code all by themselves.
The one thing that has changed is that updates can be delivered easily. This takes some of the pressure off in terms of QA because rolling out a fix to a centralised service delivered through the browser is quick and painless. The cost of pressing millions of CDs kept developers in check in the past.
And, as you’ve alluded to, the scope of what software does for us on a daily basis has expanded by several orders of magnitude. Not only have a number of devices that used to be purely electro-mechanical been reworked to use microcontrollers (cars, everything in the kitchen), but the scope of activities that have migrated onto the web or our phones is truly massive.
30 years ago, software bugs might interfere with you professionally, but they wouldn’t stop your ability to get money from the bank, cook food, or do any other day to day tasks.
You also need management willing to prioritize engineering time to fixing that broken stuff, and engineers who actually know how to make non-broken stuff. My experience is that having all three of these prerequisites is pretty rare.
I'm going to restate something that I said a few days ago:
There are a number of hats developers are expected to wear today:
1. Developer of new features
2. Sustainer of prior code and features
3. Tester of all of this
4. Constant student (outside work because who'd pay their employees to learn?)
The priority for the business is (1), so 2-4 get neglected. This compounds over time to mean that old code isn't properly refactored or rewritten when it should be, and none of the code is tested as thoroughly as it should be, and none but the smartest or most dedicated are really going to be perpetual students (or they'll choose to study things that interest them but don't help at work, like me).
When the old code and poor tests create sufficient problems, you get a business failure or a total rewrite. Which strips out half (or more) of the features and the whole process gets restarted.
Great comment. People also used to use only a few pieces of software in any given day (or week, even!) e.g. email, web browser, and word processor.
Now, our computer ("phone") is with us everywhere we go and we'll use dozens of complex applications per day, connected by dozens of APIs, networks, protocols, and hardware features. It's a miracle any of it works sometimes! Thank you to everyone for making this stuff seem like magic; my twelve-year-old self would be amazed at how well it works.
I do feel like there are more UI bugs as we optimize for certain metrics over others. Timing updates has become far more complicated, so we get weird UI refreshes as new data comes in, stale caches, missed notifications, etc. Turning it off and on again often works, surprisingly (probably because devs start with clean environments often, so that's the functional baseline).
Lastly, it is probably far more lucrative for a technology based business to use their most valuable minds for the Next Thing, rather than iterating on the current thing. Incremental revenue improvements just don't cut it in a capital-driven world; everyone is trying to escape the local maxima to find billion/trillion dollar businesses.
Yeah, but then you'd have to raise the price to pay for the testing and the jokers a block down the street who YOLO'd their competing product into the marketplace without testing would grab all your sales. You'd be out of business and your competitors would be laughing their way to the bank while the customers still suffered constantly from a broken product.
If it were only for novel products, this would be a reasonable argument. See Netscape's rush to release their browser as an example of what you're talking about.
But the worst part is that this is an issue with established products that have secured their market, and will even receive money every year from their customers. They have both the money and the time to pace themselves and test things properly, but they don't.
I agree with this sentiment in general. Of course, we have a ton more features than we used to have, but I think given the newness of software in years past, we were OK with bugs and issues because of the novelty of it all.
Now we are in 2020 and iPhone updates STILL cause battery issues. The iPhone has been out for 14 years...
Our expectations have changed. Tools like Excel should just work - and yet, when I try to save a file, sometimes it freezes and crashes. How is that acceptable now?
This, but also remember that we're living through a transition from software-as-a-product to SaaS, where the "service" is actually data extraction for advertisers. For that, software must be minimally useful to a user, but the true aim of UX is not user satisfaction but data extraction or subscription lock-in.
I work somewhere that primarily relies on manual testing rather than automated testing. It definitely does not make the software more reliable here, at the least. :-)
1. Manual testing that should be manual (exploratory).
2. Manual tests that are new and haven't been automated yet (but will be).
3. Manual tests that should be automated.
(3) is the one many people see and suffer through (I know I have). They need to be automated to free up time for (1), which is where many issues are actually discovered. But if (3) dominates your time, you can never get to (1) and you'll constantly ship broken things (or more broken than they should be).
I am a lifelong Linux user and programmer, well accustomed to searching for support and fixing issues with software.
Recently I got an RSI from programming and could not use a keyboard for 10 months. I could only use my Android phone. There were so many problems and I had so little insight. I _did_ search for answers and found contradictory or incorrect instructions. Often I had to resort to uninstalling apps and taking my chances on their competitors. This strategy had a very low rate of success. Many things I just gave up on. I essentially lost 10 months of productivity and became despondent.
One specific problem I had was that I was unable to control which PDF viewer would open automatically when I downloaded a PDF. For not particularly interesting reasons regarding my workflow, it was important to me that Adobe open instead of the viewer built in to the file navigation app. I followed the instructions online to rectify this. However when I navigated the menu to the appropriate place, the option simply didn't exist. I don't fully understand, I'm not and will never be an Android dev, but I believe it had something to do with the built in viewer not being a first class app but some sort of "applet".
Deep in this frustration it clicked with me; this is how most people see computers. Black boxes that work against you. Unfathomable and unserviceable.
PSA: Stretch your hands. Take care with the ergonomics of your keyboard, mouse, and desk. If typing hurts - stop.
Advancement in tech is nowadays driven only by profit, which in turn is driven by sales, which prioritise what people are likely to pay for. People expect technology to be affordable and they really don't care what the effect of this is. Put competition in the mix and you get a lot of stuff that sucks, because the field of battle is not quality, it's pricing and marketing.
And then stuff dies after a short time. And/or is just left forgotten because it's more broken/an annoyance than it is useful. Or, and this is key, you are stuck with it because for a mix of reasons there is no viable alternative.
One would expect that in the long run the ultimate victims of this (the users of technology) would realise that, rather than the ten thousand loads of tech crap they buy, they would be much better off with a bunch of stuff that actually works without annoyances, for a better quality of life. But people just don't. Because again, pricing and marketing.
It used to be _create value and make a profit from it_, now it's the other way around: _aim for profit and consider creating value as a cost of doing so_ (and therefore should be minimised, who gives a frack if the user is annoyed or the thing is broken as long as there is profit).
People have even been trained for decades by major products in the industry to accept that tech doesn't work/is broken/sucks. The good stuff we have exists because at some point, for a short time, quality became a competitive advantage; then on top of that, other cycles of craploads piled on. We did this.
Expecting that billions of people will simultaneously act against their incentives is madness. You can't assign collective fault to "we" who did this by making bad choices individually. It's a system that has optimized itself to this end goal.
I feel pretty good about my choices in comparison. I use an older, stock Android device with simple wired headphones, so that particular issue with AirPods has never occurred to me. My laptop is a bulletproof old Thinkpad running Ubuntu LTS, which is likewise pretty reliable. Software that I have to use for work but that I expect to be crappy goes in a VM. I use this crappy software to the best of my ability to create other software and hardware that I expect to work reliably, and to do so for decades.
But none of those signals and "good choices" ever feed back into better choices by the titans of our industry. When an iPhone or AirPods device uses up all the recharges available in its battery, that shows up as profit in the numbers Apple uses to make decisions.
Don't blame the victims for the situation they're prey to.
I don't see your point, it's obvious that among billions _some_ made better choices, but collectively it's still our choices that led to this.
And I think it's pointless to blame "titans of the industry" while almost literally everyone else buys into the same pattern too. Or do you mean to tell me that there is a _significant_ number of companies in the tech industry that ship with a "quality first" mindset? I do not mean to be sarcastic; if that's your point, please make a list, I would really appreciate that.
And yes, I blame the victims for their/our short-sightedness, it's not like anyone forced people to buy and re-buy broken products for decades.
> When an iPhone or AirPods device uses up all the recharges available in its battery, that shows up as profit in the numbers Apple uses to make decisions.
You make it sound like only the batteries in Apple devices have finite charging cycles, which is obviously not true.
Although I agree quite wholeheartedly with your comment and the article, I still feel it's possible to find good or at least better software out there. I recently switched to Android because it is far less buggy than iOS. I honestly think iOS is the buggiest software I've ever used. I also similarly switched from Spotify to Amazon Music Unlimited for the same reasons. I have yet to find even a single bug in AMU. It's honestly quite impressive in today's software world.
I really hope more companies start to see quality and reliability as a competitive advantage. I know I'm an outlier, but I gladly pay more or switch when I find a product that is more reliable.
It’s interesting, there are a lot of little bugs in software today, but I think it’s easy to forget that common bugs used to be a lot worse.
In the 90s it was pretty common to have a kernel panic and hard crash, blue screen of death, etc. Data loss was common.
Folks who lived through that have such a strong compulsion to hit CMD-S every few minutes, that a lot of cloud software today offers a “save button” that actually doesn’t do anything because the document auto saves as you type anyway.
To some degree I think this may be related. Bugs that crash the computer are a little bit easier to detect (if only because the users come screaming at you) compared to bugs where the app scrolls incorrectly once every million scroll events.
Overall I would say that software has become much more complex and much more reliable at the expense of being more inscrutable. People are able to operate in a degraded condition now because software is capable of working that way, whereas in the past it would just fail completely.
Is that better? Maybe not by much, but I think most normal users would say software is easier to use and more reliable today compared to decades past.
> Folks who lived through that have such a strong compulsion to hit CMD-S every few minutes, that a lot of cloud software today offers a “save button” that actually doesn’t do anything because the document auto saves as you type anyway.
This describes me.
And I often wonder if the presence of the "save button" is what causes the compulsion: in web apps that auto-save, I trust/assume they're auto-saving, and feel more free.
In web apps that have the save button, it's always occupying some part of my mental capacity to keep hitting it after significant edits.
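For what it's worth, the "save button that doesn't do anything" usually sits on top of a debounced autosave: every keystroke schedules a save, and the button (or Cmd-S) just flushes it early, which is often a no-op. A rough sketch of that pattern; the persist callback and the timings are made up:

    // Rough sketch of debounced autosave; persist and the delay are made up.
    class AutoSaver {
      private timer: ReturnType<typeof setTimeout> | null = null;
      private dirty = false;

      constructor(
        private persist: (text: string) => Promise<void>,
        private delayMs = 1500,
      ) {}

      // Called on every edit: mark dirty and restart the countdown.
      onEdit(text: string): void {
        this.dirty = true;
        if (this.timer) clearTimeout(this.timer);
        this.timer = setTimeout(() => void this.flush(text), this.delayMs);
      }

      // The "save button" (or Cmd-S) just calls this; if autosave already
      // ran, it's a no-op -- which is why the button feels like it does nothing.
      async flush(text: string): Promise<void> {
        if (!this.dirty) return;
        this.dirty = false;
        await this.persist(text);
      }
    }

    // Usage: saver.onEdit(currentText) on keystrokes, saver.flush(currentText) on Cmd-S.
    const saver = new AutoSaver(async (text) => console.log(`saved ${text.length} chars`));
    saver.onEdit("hello world");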
Yeah, I can understand why people hate tech.
I’ve always described it as “the design team justifying their own existence after the job is done.”
Let software get stable and boring.
Some of the most annoying UX I've had is on Quora, Facebook, and the reddit redesign, which all spend a veritable fortune on it, while the best ones I've seen are something a non-specialist slapped together with bootstrap.
I do sometimes wish that there could be alternative (not "replacement") ways to do things we use "tech" to do today, where the alternatives only required UNIX (no GUI). This way if we get frustrated with a graphical UI, and myriad "updates", we can just do these things the "old-fashioned way", with comparatively smaller, simpler, command line UNIX programs.
To me, the people who would be very opposed to this idea are not users, they are developers. Having been raised on computers in the 1980s, I can attest that computer users never cared about "UI" or "UX"; they just did what they needed to do to use the computer. It is developers, especially contemporary ones, who actually care about "UI" and "UX", not computer users. In fact, some of them are passionate about these aspects of using a computer.
Before recommending it, however, he felt it important to mention that for people who don't machine very much, far cheaper scribes work well, because unless it's your job, your tooling is less likely to be the bottleneck and you have fewer resources. When you machine professionally, your tooling is likely your bottleneck and you have more resources.
I think this holds for tech and software. Think of resources here as "time spent learning APIs, bash, and remembering tar mnemonics".
At first, dragging and dropping folders isn't going to be your bottleneck. Need to move 1000s of folders scattered on the hard-drive? If you're not using a terminal, you'll be in trouble.
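To be concrete about what the terminal (or a script) buys you: moving every folder that matches some pattern is a few lines of code, versus an afternoon of dragging. A sketch using Node's fs module; the paths and the 2019- pattern are invented for illustration:

    // Sketch: move every folder whose name matches a pattern into an archive dir.
    // The paths and the 2019- pattern are invented for illustration.
    import { readdirSync, renameSync, mkdirSync } from "node:fs";
    import { join } from "node:path";

    const src = "/data/projects";
    const dest = "/data/archive";
    mkdirSync(dest, { recursive: true });

    for (const entry of readdirSync(src, { withFileTypes: true })) {
      if (entry.isDirectory() && /^2019-/.test(entry.name)) {
        renameSync(join(src, entry.name), join(dest, entry.name));
      }
    }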
Everyone cares about UX, it's their experience when using tech. It's just that GUIs are better for some contexts than others.
[0] https://youtu.be/n5laGi3GO7M?t=356
... what? Are you suggesting computer users in 2020 - which includes everyone from your nana on her iPhone to a toddler watching YouTube on a tablet - want to use CLIs, and are being forced by baddie developers into using apps?
I'd change that to: "For most people, corporate neoliberal technology is a haunted house riddled with unpleasant surprises."
Writing that recognizes that we live with the most un-free market of all time:
"We are in the middle of a global transformation. What that means is, we're seeing the painful construction of a global market economy. And over the past 30 years neoliberalism has fashioned this system. Markets have been opened, and yet intellectual property rights have ensured that a tiny minority of people are receiving most of the income." [1]
And:
"How can politicians look into TV cameras and say we have a free market system when patents guarantee monopoly incomes for twenty years, preventing anyone from competing? How can they claim there are free markets when copyright rules give a guaranteed income for seventy years after a person’s death? How can they claim free markets exist when one person or company is given a subsidy and not others, or when they sell off the commons that belong to all of us, at a discount, to a favoured individual or company, or when Uber, TaskRabbit and their ilk act as unregulated labour brokers, profiting from the labour of others?" [2]
[1] https://www.youtube.com/watch?v=nnYhZCUYOxs
[2] https://www.resilience.org/stories/2017-08-03/book-day-corru...
My gas car stinks, destroys the planet, needs yearly maintenance, and crashes into everything the second I stop paying attention.
My house decays day after day. Floors need constant cleaning, walls have holes from small impacts, paint contains inedible fragments and gives off noxious gases.
Bees are building nests on my balcony and it’s definitely not what it was built for, nor where they should be.
How can we tolerate such a life?
I guess my point is that there is a difference between things sucking because of the laws of nature, and things sucking because of incompetence, laziness or indifference.
When is the last time you flipped a light switch, and suddenly your pool disappeared?
Have you ever had French doors appear in your dining room because of a "Windows Update" on Wednesday morning?
Have you ever had to wait for half an hour for your house to boot later on that same Wednesday?
When is the last time you closed a door, and were killed by a hailstorm of bowling balls?
At least with a light switch, you know it's very unlikely to cause structural issues, or plumbing issues, or drain your bank account. Computers are singularly horrible in the ways things can fail.
No. A hardware product like a car has predictable wear and tear governed mainly by the laws of physics. The fact that I can no longer use my smart speaker because the manufacturer decided to stop supporting it, went out of business, or got bought is not at all the same. My car will still work through all of those things in the exact same way. It also doesn't throw up random dialogs (or whatever a physical equivalent would be) that stop the product from working until I interact with it. Not the same at all.
To me, software is as if when I open a book to read it, then, the book suddenly snaps itself shut, hurting my fingers.
Thereafter, the book gets wings, tries to fly away, but bumps into my coffee mug on my desk, so coffee spills on the floor. Then the book does fly out through the closed window — smashing the glass into pieces — and gets larger and larger wings, flying higher and higher until it disappears into the blue sky.
It's as if software was alive, and does random things I don't always agree with.
But actually — the bees building nests on the balcony: that feels pretty close to misbehaving software. Or the cat bringing in a snake from outdoors. Or a squirrel chewing through a power cable, shutting down a city.
Of course, people perceive that software sucks because it's more complicated than they realize. I forget what book said it, but an operating system has more separate components than an aircraft carrier, and they're more tightly coupled. (I'm not sure that's true, but it conveys the idea.)
Another key difference is that in maintenance of your home, you have complete control. It's extremely easy to understand and act to improve or maintain it. When large software systems (like the IRS login) have problems, you are totally helpless.
Cars vary widely in their product quality. Houses vary widely in their product quality. Some things in life are inevitable facts of nature, but product quality is not. Quality is to a large extent determined by the time and care taken by the manufacturer.
That's not a good example, nor is it parallel to the dynamic the article describes.
Your car stinks a lot less than cars did 10/30/50 years ago (emits less in the way of pollutants or CO2 per mile driven), is less likely to kill you in a crash involving the same size cars/velocities (despite weighing less!), needs less maintenance, lasts longer, and can notify you of potential collisions and sometimes avoid them.
It's probably only worse in terms of education needed to perform maintenance or nominal sticker price.
If you're any sort of power user, you likely know that you can backspace by the word instead of by the character, using Ctrl + BS on Linux or Cmd + BS on Mac.
In the Messages app, the shortcut to delete your _entire chat history_ is also Cmd + BS, and it works even if your caret is in the text box. So if you type five words and then hit Cmd + BS six times, you will be prompted to delete your entire chat history.
I do this almost every day. So far I've never compulsively hit return but I am dreading the day it happens.
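The underlying design mistake is letting a destructive, app-level shortcut fire while focus is in a text field. Something like this guard is what I'd hope for; a browser-flavoured sketch, and deleteConversation is a hypothetical handler, not the real Messages code:

    // Sketch: don't let a destructive app-level shortcut fire while the user
    // is editing text. deleteConversation is a hypothetical handler.
    function isEditingText(target: EventTarget | null): boolean {
      if (!(target instanceof HTMLElement)) return false;
      return (
        target.isContentEditable ||
        target instanceof HTMLInputElement ||
        target instanceof HTMLTextAreaElement
      );
    }

    function deleteConversation(): void {
      console.log("would prompt before deleting the chat history");
    }

    document.addEventListener("keydown", (e) => {
      if (!(e.metaKey && e.key === "Backspace")) return;
      // With the caret in the text box, Cmd+Backspace should stay a
      // text-editing command, not a "wipe the whole conversation" command.
      if (isEditingText(e.target)) return;
      e.preventDefault();
      deleteConversation();
    });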
Nowadays, I would consider this a problem with the browser. How often does one navigate backwards with the backspace key?
Recently, I had some doubts over whether or not I should clobber the native browser behaviour for "ctrl-s", but then I realized that nobody anywhere EVER saves a web page to disk... and if they really needed to, the browser toolbar is right there.
ctrl-s is probably fine to break though. even when it does "save" the page, it rarely does so in a useful way.
Some people do it all the time. I was emailed a saved page the other day.
I was responsible for a single page web app, and the error detection code was stored in a <script> tag within the page, so I got plenty of “errors” logged for people trying to access saved pages.
My dishwasher, which has only buttons to select what to do during the next wash cycle, has a firmware bug.
Sometimes when the door is closed, it will start one of the pumps. If I cycle "heated drying" on then off again, the pump will stop. I figured this out because, well, I've worked on firmware and I understand the how of how software can be stupid.
After I learned to recognize watchdog resets, I started seeing them more and more often, and became even more terrified of how bad software is.
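For anyone who hasn't met one: a watchdog is just a timer that resets the device if the main loop stops checking in, which is why firmware bugs tend to show up as silent reboots rather than error messages. A toy sketch of the idea (obviously not the dishwasher's actual firmware):

    // Toy sketch of a watchdog timer. A real MCU would hard-reset the device;
    // here we just log, to show why a hung task appears as a mysterious reboot.
    let lastKick = Date.now();

    function kickWatchdog(): void {
      lastKick = Date.now();
    }

    // Supervisor: if the main loop hasn't "kicked" within 2 seconds, reset.
    setInterval(() => {
      if (Date.now() - lastKick > 2000) {
        console.log("watchdog expired -> reset (the device silently reboots)");
        lastKick = Date.now();
      }
    }, 500);

    // Main control loop: read sensors, drive pumps, kick the watchdog.
    // If a bug makes this loop hang, the supervisor above fires instead of
    // any visible error -- which is what a watchdog reset looks like.
    setInterval(() => {
      kickWatchdog();
    }, 250);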
Yup, sounds like my TV. It's not even one of the smart ones, I was careful to avoid those. But once every few days, it stops responding to the remote control when performing some action (opening the EPG, switching channels). I then have to wait about ten seconds for the display to go dark and the TV to "reboot" itself, so I can continue channel surfing.
So why do I tolerate bugs in software like that? Because I know I can fix them. And I also know I won't always. Small gods have handed me tools to remake the world as I would see fit and I do not use them. Are they at fault for not having made the world as I would prefer? Or am I at fault for not using the tools?
In any case, I've noticed a sort of dichotomy among users in their reaction to tools that fail. There are those who go "this tool sucks how can I do my work" and there are those who go "my work is what I want to do which tool can I use instead". The latter set get a lot more done. Once observing this I have attempted to modify my behaviour to be like the latter and have effectively become better.
But they didn't give you the only tool you really needed: Time.
Having meaningful access to the source is very important, but its value is limited because even small improvements often take a large amount of time especially to code you're unfamiliar with. Once you've made that improvement, maintaining it (or up-streaming it so someone else might maintain it) can take a tremendous amount of time.
I can't wrap my head around this, could you explain further?
What we've got here is a question of cost and choice. If my choices are all equally bad, i.e. vendor one is not any worse than vendor two, then inertia or cost become the determining factors. In terms of consumer software, consumers have been conditioned to have low expectations, and these costs are further reduced because products are so often free or very low-cost. In regards to commercial-focused packages, again, so often we put up with it because the systems we're using are so complicated and specialized that the pool of options is limited and/or the domain is so complicated that problems are inevitable.
So long as this is the landscape, few software producers have incentives to do the things necessary to improve, and/or they believe they can spread the cost of improvement over a long period, i.e. don't make the investment until the pain is too great.
Deaths, a lot of deaths.
Software Engineering needs a PE type licensure and a union. We need a way to stand together to advocate for better working conditions, practices, and tools.
Really not sure what's the way out of this corner that we've all collectively painted ourselves into.
I coined it watching the robotic soda vending machine crash and reboot frequently.
Everyone is in such an irrational hurry, it's been built into the "culture" such that rushing and making messes is acceptable. And by extension, customers expect things to be shit so you don't get in much trouble for doing it.
It's a feedback loop that only stops if companies (and individual programmers) start taking pride in craft > careless speed and money like they used to do back in the 50s/60s.
Most programmers, and many companies, want to produce something of quality, well crafted.
The drive for low quality kibble comes directly from consumers, and the inability/cost of judging value.
A consumer can’t be expected to be a UI expert, and a slightly better UI might not drive sales because other factors are more important. I try to buy hardware with good UI, but I often make compromises for other factors.
Thankfully that can be disabled, but I find it to be one of the most infuriating 'features' of Firefox. If it weren't something that could be turned off, it would be a deal killer all by itself.
http://kb.mozillazine.org/Browser.backspace_action
Maybe it was copied from IE to keep "closer to the platform"?
My FreeNAS, Arch Linux and my Android phone as well.
I think we get paid because we are building new stuff and have to maintain shitty stuff. If my job were literally just designing it at a high level, clicking it together, and then it works, no one would need me.
Yes, it's frustrating sometimes.
I would like to cure cancer instead of debugging why this update broke our system.
I’ve microwaved a burrito by mashing the start button hundreds of times.
If I design a bracket for a TV mount, do you blame the bracket when someone hangs a 3000 kg bookcase on it?
You expect software to be perfect, yet ignore the massive limits everything in the world has.
Not saying there aren’t quality issues with software. I’m saying software development is really difficult.
You mean where 99 is greater than 100?
Everything is broken, and nobody is upset. [1]
Some of the software I use is so unreliable that I expect it to fail. I expect the Vodafone login page not to work properly. I expect one of my airpods not to connect on the first try. I expect my banking app to show random error messages, even though it works just fine. Most online stock brokers have issues at the worst possible times. My bookkeeping app is frequently wrong, per my tax advisor. Since everything is broken, the best I can do is to mentally assign all those apps a trustworthiness score, and avoid betting too much on them.
The worst part is that support for all that software has been largely automated. If you have a problem that can't be fixed by a chatbot or a crowdsourced support community, you are largely helpless. Google can wipe everything you love, and there's no one to punch in the face (to borrow from Grapes of Wrath).
So far, my only solution to this is to be a late adopter, and to favour simplicity over sophistication. I was recently considering going from paper notebooks to a tablet. That initiative stopped at the electronics store. The Surface Go wanted me to go through a setup wizard (after dismissing a few notifications). Two of the 4 iPads had working pencils. The ReMarkable reviews mention a host of issues. I never encountered any bugs with my Moleskine. It pairs flawlessly with any pencil I want, including older models.
[1] https://www.hanselman.com/blog/EverythingsBrokenAndNobodysUp...
My Mac sometimes unpairs them and, worse, it doesn't find them. Sometimes while my baby is sleeping I put my AirPods in and play a loud video, just to realize they were not connected. My wife's right-side AirPod just stopped working after one year of use...
If apple is considered top tier in reliability, then technology in general really just sucks!
I'm using a BT Jabra headset with noise cancelling that I got for about the same cost: 16+ hours of battery, easy pairing, a super useful phone app, great ANC, and solid audio quality, at least to a non-audiophile. My biggest complaint is that the closed-back design leaves my ears a bit irritated after 4+ hours of use. Not an AirPods competitor, but for the cost I am way happier.
Surprisingly, iCloud syncing works fine. If I pair my AirPods with one device, it always pairs with all of them.
The main issue is with the right pod not always turning on when I take it out of the case. The solution is to put it back in the case for 5 seconds and to try again.
The second most important issue is the airpods falling out of sync with each other. It seems like the signal from my Samsung S9 in my pocket is choppy. Looking left or right for too long will make the signal drop. Putting my hands in my pockets also will. If I put the phone in my backpack, it's okay.
This is still more pleasant than wired headphones, but it's far from a magical experience.
Personally, I hate ear buds and, as such, never bought ear buds. Rather, I spent ~$20 on SoundBot bluetooth headphones starting some five years ago (long before air pods, methinks) and haven't had problems with them at all.
I also have a seven year-old phone (HTC OneMax) running custom (unofficial/ported by a random hacker) Android[0], and it pretty much works.
Sure the battery life has degraded since 2014, but that's to be expected, no? I wish I could replace the battery (as I did with my 15+ year-old Panasonic cordless phones), but there really aren't too many mainstream mobile devices that allow that any more.
As for poor quality software/hardware, if you don't like it, vote with your feet and/or wallet.
If stuff doesn't work, why use it? Even more, if stuff doesn't work and you can't/won't fix it yourself, then don't use it.
Software devs and hardware manufacturers don't care about whiny blog posts or complaints on HN, they care about the bottom line. Impact the bottom line and you may have a chance at improvement.
Stuff that actually addresses the issue is useful. A great example is the lack of Android support after 4.4/KitKat on the HTC OneMax mentioned above and the abandonment of it on Cyanogenmod/LineageOS in 2017, where those (myself included although I'd never hacked on Android ever -- and failed miserably -- thankfully someone else did not) impacted by this took action to provide the latest Android on an old, unsupported, discontinued device.
If you're not taking positive action toward making things better (whether that's fixing the problems or voting with your feet/wallet), then you're not going to have any impact.
While whinging about it on your blog may be a way to relieve the stress you feel about whatever issue(s) you may have, it's not constructive or useful.
That is unless your goal is to get lots of comments on HN where the Apple Fanbois sagely agree, and lament there's nothing to be done about it because Apple is the pinnacle of tech and since no one could possibly do anything better than Apple (or the apps that run on their gear) therefore all technology sucks.
And that's objectively false. There's lots of tech out there that's quite good. I suggest using that and shunning rather than using, then whinging about the stuff that sucks.
[0] https://forum.xda-developers.com/htc-one-max/rom-lineageos-1...
Edit: Fixed typos/formatting issues.
Good, you dodged a bullet there.
I mean, I love my 2-in-1 Dell (a slightly cheaper but still high-end Surface-like device). The pen, as much as it's useful (I'm not even considering buying a touchscreen-enabled device without solid support of a pen anymore; it's so much better UX than fingers), still has lots of subtle and annoying bugs. Maybe in 20 years people will work out the kinks. More likely, the concept will be abandoned in favor of some new modality that will also never be perfected.
Most software is still net positive in productivity. We tend to place more emphasis on failures as users.
Remember you're running millions of lines of code that talks to other computers running millions of lines of code that communicates over a network running millions of lines of code to deliver some information on the order of seconds to minutes -- and then something responds to that information and everything happens all over again.
All day, every day, trillions of packets of information get delivered just fine. Try doing that as a human, delivering letters. You probably won't even approach a million packets delivered in your lifetime. And people have the audacity to say, "oh my, some things didn't work, this is completely broken"
In only a single generation, we went from voice communicators to super computers in our pockets. The utility vastly, vastly, vastly overshadows the glitches that come with frenetic advancement. How long did it take humans to invent basic numbers?
Most notes I take only need to exist for a few weeks and then I erase them... so transferring them to "long term storage" is rare.
I do have an iPad and note taking apps like Notability if I know something will need to go to "long term storage" but I find I use the Rocketbook more.
I thought it would be nice to access my notes when I don't have my notebook on me, and to have layers, zooming, undos etc. However, the more I look into it, the more absurd it seems.
I'm replacing a 15€ notebook and a 2€ mechanical pencil by a 400€ gadget that doesn't quite work. Why? So that I can spend my time organising notes in a digital space. Why? I don't really know.
It would be cool to have layers, zooming and an undo button. It would also be cool to have access to my notes even when I don't have my notebook. However, it would just be cool. It doesn't actually solve a serious problem.
Not to mention that it:
- doesn't need charging
- never freezes or crashes
- is much cheaper than a laptop or tablet
- distraction-free (no Internet, no apps, etc.)
The iPad seemed pretty solid, but I'd have to turn it on and unlock it to see my notes, unlike a notebook.
The reMarkable seemed nice, but there are lots of complaints about it that paper doesn't have.
The Supernote A6X was the most promising, but it was hard to get in Germany.
There's always an adjustment period, where people have to spend time learning a new technology, and any issues with the new technology need to be resolved. The gains in productivity happen mainly after the adjustment period. But we've eliminated the periods of stability and are constantly pushing for more "innovation", which means we're in constant periods of adjustment and resolving problems, where the promise of increased productivity is never fully met.
The worst idea ever in technology is regularly scheduled updates. Innovation has never and will never happen on a schedule. This is simply greed-driven, promotion-driven, pointy-haired-boss-driven development.
Produce something new and great... but then let us all enjoy the new thing for a while. Novelty for its own sake is not productive.
this is sort of uncharitable. the development/maintenance cycle for software is incompatible with the traditional way of monetizing a product (ie, design it up front, manufacture at scale, and then the buyer gets what they get, barring severe safety defects). buyers of software expect the product to at least mostly work in the first place, but they also expect bugs to continue to be fixed after the sale, even if the bugs are introduced through unforeseeable interactions with other software.
imo, subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance. but buyers tend to consider this a ripoff if they don't actually see a stream of new features in development. while it introduces some unfortunate constraints in the dev cycle, bundling up features in a scheduled update is a good way to make it visible to users that their subscription dollars aren't just falling into a black hole. trickling out new features "when they're done" earns the respect of engineers, but results in the average user simply not noticing that progress is being made.
Contrast that to traditional physical goods, where buyers expect the product to work as advertised, right out of the box, or their money back. Software in the Internet era has it easy, because it gets to release shitty half-finished versions, and then keep charging money while never quite finishing the software.
> subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance. but buyers tend to consider this a ripoff if they don't actually see a stream of new features in development
Because software does not decay on its own (despite the misleading term "bitrot" being popular in tech circles). That's literally why digitizing data has taken the world by storm: digital data does not decay; as long as the physical medium is readable, you can make a perfect copy of the data it contains. As a buyer, I don't expect my software to need maintenance. I expect it to work out of the box (just like I expect every physical product to work out of the box), and once I find software that fulfills my needs, I expect it to work all the way until computing technology moves forward so much that it's no longer possible to run the software. Which, in the era of virtual machines, may take decades.
So yeah, there's a need to clearly justify to the customers why you're charging subscription, because software in its natural state does not need maintenance.
I wasn't disagreeing with that. I mentioned "long periods of stability and refinement" — refinement including bug fixes — and "any issues with the new technology need to be resolved". But again, bug fixes don't magically happen on a schedule either. Maybe fixes are easy, maybe they're hard, you never know in advance.
> bundling up features in a scheduled update is a good way to make it visible to users that their subscription dollars aren't just falling into a black hole
This is exactly why it's not true that "subscriptions are actually the ideal way of aligning incentives for products that involve ongoing maintenance". Instead of maintenance, subscriptions incentivize continuous delivery of new features, and consequently continuous delivery of new bugs.
Did consumers demand subscription services? Or did vendors (led by Adobe) decide to change to subscriptions to get uniform cash flow?
At the agencies I have worked at, all the creatives I worked with would prefer to spend $200-400 and have a permanent software license. Perhaps this isn't a representative group.
For consumer goods, it is the hype cycle.
New updates and releases get press. They also restore consumer confidence.
If Samsung announced that next year they weren't releasing a new Galaxy phone, the entire industry would freak out. Consumers would lose confidence in buying Samsung phones, journalists would write articles questioning if Samsung was pulling out of the market, a lot of bad things would happen.
Give it 18 months without a release and people would start to think of Samsung as "that company that used to make phones."
They would have to fight like heck to restore their image.
Software is the same way. In the Vista/7/8 era, Microsoft looked like they were falling behind because their competitors started releasing yearly, or even bi-yearly, feature updates.
Sure, every Android version up until 7 was kinda-sorta-terrible, but it kept Android in the news. Likewise, Apple got huge free press every time they announced a new revision of OS X (now macOS), and every time they came out and announced a new version of iOS.
The result? "The desktop is dying, phones are where the real innovation is at!" articles being published even faster than those software updates came out.
You can of course release too fast, rarely do Chrome or Firefox's releases get any press (unless there is a controversial UI change), but in general frequent updates are free advertising.
Tesla is also great at this, I'm nowhere near being in the market for a Tesla, but at least a couple times a year I still end up hearing about software features they are rolling out!
The smartphone industry has only themselves to blame for setting up this expectation. But it is possible to get off the train. I remember when Apple announced they were dropping out of the annual MacWorld San Francisco conference, because they didn't want to constrain their product release cycle. Apple survived that just fine.
> In the Vista/7/8 era, Microsoft looked like they were falling behind because their competitors started releasing yearly, or even bi-yearly, feature updates.
Competitors? Windows and Mac had near 100% market share on desktop. There was only 1 competitor. Vista was released in January 2007, but Mac OS X 10.5 Leopard was infamously delayed until October 2007 because of iPhone, so this telling of history doesn't seem entirely accurate. Moreover, Mac OS X releases were already slowing. Here's a list of months since the previous major .0 release:
10.1.0 6
10.2.0 11
10.3.0 14
10.4.0 18
10.5.0 18
10.6.0 22
10.7.0 23
10.8.0 12
10.9.0 15
10.10.0 12
10.11.0 11
10.12.0 12
10.13.0 12
10.14.0 12
10.15.0 12
Thus, major Mac OS releases were slowing down year by year — which is totally sensible — but then Steve Jobs died after 10.7 was released in 2011, and only then did they switch to a yearly schedule.
One week, the update could be a relatively minor bug fix. The next week, a major feature upgrade that's been in the pipeline for months.
You also remove the ambiguity of "Is this worth pushing out? When should I push this out? Should I do some more fixes or push this one out first?". You got fixes, push them out in the next update.
Your criticism also assumes a small team. If you have a large enough team where you can split them into new feature development and current bug fixing, they're going to work at different rates and be ready at different times. If instead your entire team just works on "the product", then there is no effective difference between fixing issues and creating functionality.
I constantly encounter broken functionality, buggy or unpleasant UIs, just as the author has. It feels like many of these problems could be avoided if you just had one person whose job it was to sit there and look for broken stuff. (I'm sure I'm biased as someone whose first job out of college was to sit there and look for broken stuff.)
I would say that effortless, automatic updates are to blame.
When you can always just push an update, the impact of a given bug goes way down. It's no longer mission-critical to exterminate flaws before shipping; a totally broken feature becomes a mere annoyance. So project prioritization shifts from polishing an artifact to outweighing the (presumed inevitable) constant stream of little annoyances with fixes and features. I think the shift towards automated testing is just a symptom; an attempt to bridge the gap in this brave new world.
For a clear-cut example of this phenomenon, look to the video game industry. Until around 2007, games received no updates. Ever. Once a game shipped, it was shipped. There wasn't even a mechanism for installing an update from physical media.
Right around that time, "glitches" went from very rare unicorns that people would spend lots of time actually seeking out, to nearly everyday occurrences. As long as it doesn't corrupt someone's save file, they mostly laugh it off and upload a clip to YouTube to show their friends. This is just how things are now.
(Edit: I should have scoped this to "console games")
Sure, but they still (sometimes) released (a few) extra revisions of a game. They were just targeted at people who bought physical copies after the revision date, rather than at existing customers.
Or said updates came on the 1.0 version of the game as shipped in markets that got the game later than others. (Just imagine — per-market release versioning. Every market effectively got its own fork of the codebase!)
Or said updates came in the form of a re-release port. There are patches made to the emulated downloadable app-store re-releases of some games, that never made it into any physical edition of the game.
Also, before home consoles, arcade game machines did receive bug-fix updates regularly. Arcade machines were essentially provided “as a service” from their manufacturers, with support contracts et al. Sort of like vending machines are today. If you reported a bug, they’d fix it and send you a new EEPROM chip to swap out in your cabinet. If there was a critical bug that affected all units, they’d send techs out to everybody’s machines to swap out the ROM for the newest revision. (For this reason, it’s actually kind of hard to do art-conservation / archiving of arcade games. The cabinets almost never have fully “original-release” components inside.)
Now, it's just "LOL just ship it, users will just deal with it until the next release!" Now, it's "Do experiments on N% in prod and use end users for A/B testing. If something's broken we'll update!"
In several industries, it's actually totally expected that v1.0 of the application simply won't work at all. It's more important for these companies to ship non-working software than to miss the deadline and ship something that works! Because who cares? Users will bear the cost and update.
Once internet updates became the norm, it all became pretty much like the rest of software industry. (At least game companies still have QA departments, a lot of mainstream web companies have dispensed with those as well.)
One system has no patching, and updates incur some non-trivial amount of effort on the part of the installer. Releases are a few times a year, at most.
The other system has patching, updates are lighter weight, and as a result, the system has THOUSANDS of patches released over the last decade, north of 2 per work day.
Guess which system is higher quality? The former.
Much, much higher quality.
Warcraft II, from 1995 received multiple patches. So did many other games from that era.
Do you perhaps mean console games when you say "games"?
Software has always sucked and had these issues. It has nothing to do with automated QA. The reason you see more issues is that 'way back in the day' your software had a very limited number of things that it did, and in general it did not involve accessing a network or chugging down massive volumes of data from untrusted sources.
I work for a company that has a lot of individuals who test for QA issues; they have lists miles long of things to check and write reports on.
The problem is more of "It's much easier to write mountains of code than it is to ensure that it works in all cases"
I agree with your last point a lot, though I would modify it slightly: it's much easier to write mountains of code now than it was, and it's now easier and more common than ever to import external dependencies (especially at the system level), and those dependencies have tens of millions of lines of mediocre code all by themselves.
30 years ago, software bugs might interfere with you professionally, but they wouldn’t stop your ability to get money from the bank, cook food, or do any other day to day tasks.
I disagree, subjectively it had its ups and downs and we are in a down phase right now. YMMV.
And despite the issues lamented in this think piece, it is _Apple_ that the author should blame for setting technology expectations impossibly high.
No organization has been remotely as successful in understanding and releasing tech products that were truly great.
Everything since is just a comparison to expectations Apple set. Even when Apple fails, it is in comparison with an Apple that does not.
There are a number of hats developers are expected to wear today:
1. Developer of new features
2. Sustainer of prior code and features
3. Tester of all of this
4. Constant student (outside work because who'd pay their employees to learn?)
The priority for the business is (1), so 2-4 get neglected. This compounds over time to mean that old code isn't properly refactored or rewritten when it should be, and none of the code is tested as thoroughly as it should be, and none but the smartest or most dedicated are really going to be perpetual students (or they'll choose to study things that interest them but don't help at work, like me).
When the old code and poor tests create sufficient problems, you get a business failure or a total rewrite. Which strips out half (or more) of the features and the whole process gets restarted.
The level of complexity in modern day software is orders of magnitude greater than that of even a decade ago.
What has changed is our reliance on that software. We are now so deeply embedded into our software existence we see these flaws up close.