If there's one thing that I think was revolutionary about Jobs, it was his obsession with quality and user experience. You simply don't find that quality in a lot of tech CEOs. Jobs was willing to burn a load of developer time doing performance tuning. Most other CEOs then and today had an attitude that was more along the lines of "We'll just buy more/faster hardware. It's a waste of time to make things faster".
A lot of the reason people are hating on Windows nowadays is because "fast enough" has become the name of the game for UX. Unacceptable lags in working with a computer have just become accepted.
He was like that not just for performance, but user experience across the board. “Good enough”, aka mediocrity, didn’t cut it and he didn’t care if he had to spend extra resources or even burn bridges to raise the bar to where he thought it needed to be.
It’s a stark contrast to current industry norms, where anything that won’t keep the engagement and MRR bar charts on a steep incline gets vetoed. It’s more likely that memory consumption will be tripled and UI will be modified to harass users into compliance with whatever hare-brained thing product managers are pushing than it is for the software to become more efficient, pleasant, and useful.
Jobs was one of the original product managers. He brought the customer perspective right into engineering.
Unlike a lot of CEOs, he was willing to do what most product managers aren’t: make hard trade off decisions.
He cut losing product lines, made big bets (killing floppy disks) and was deeply technical… I wish my CEO had the guts to make these calls. (More importantly, when he does, I want him to be right!)
There was a story in one of his biographies where he spent a lot of time making sure the machines that made one of his products actually looked good too!
One of his biographers gave an example of how some cabinet makers only use good wood on the front and side, but Jobs would want good quality wood on the back as well.
There is also the story about Steve throwing a MacBook Air on a conference room table and asking why the iPad wakes from sleep so much faster. He then told them to fix it and make Mac laptops sleep/wake just as well as iOS.
Sleep/Wake is one area where MacOS absolutely destroys Windows.
How can MacOS possibly sleep/wake any faster than Windows? My Lenovo X1C wakes up so quickly that the limiting factor is how fast I can enter my PIN on the keyboard. Well below 1 second, maybe 0.5 seconds. Going to sleep is the same, I'm not going to measure it but it feels like it's about 0.3 seconds.
Maybe versus Windows, I haven't used it in a long time. But I have noticed my son's MacBook Pro (used to be my work laptop) only pretends to be available after "waking". It repeatedly fails to actually take input in the user login password field. It does so silently, leading to missing characters in the password, so it takes several attempts to fill it out fully. I don't know what it's doing in this time, but not showing the "busy beachball" is a lie.
> There is also the story about Steve throwing a MacBook Air on a conference room table and asking why does the iPad wake from sleep so much faster?
As someone who has owned two Apple laptops before the iPad was introduced (my first was a PowerBook G4 in 2005), I've always just closed the lid of my laptop instead of shutting them down. They've always resumed quickly.
If this story was true, it probably wasn't an iPad.
I think .NET is one of the few projects Microsoft maintains that I admire; it feels like they care a lot about quality, and you can tell the people working on it are focused on performance and making sure it's really well rounded. I would argue that .NET is Microsoft's greatest achievement / work of all time.
MS has had very little to do with the execution, but it has their logo on it, so I think you can count most Microsoft hardware (pre-Copilot button) as quality.
Hm. The one-button mouse? That was part of the design impact - for user experience, it wasn't much of a win.
Likewise the faulty power cords and noisy power supplies (no choke on the power cable, because it looks ugly!)
How about the soldered-down components and device cases with special screws to keep users from ever opening them? That was not 'for the user', that was more 'walled garden'.
In fact, I'm not sure where this myth of 'quality and user experience' came from. It was all about selling, baby.
These critiques are so tiresome. Like he forced people to buy macs or something. You're not the audience. For the average consumer the fact they don't even have to think about unscrewing something is a major part of the appeal. The walled garden is a plus for them not a negative.
And then ending with the sanctimonious line about selling. Like you eat off of selling nothing. Go screw in whatever you like just understand your critique comes across as little more than entitled griping against a majority. You're the people he fought against the entire time, people obsessed with their own personal agenda/minutia with no understanding of the overarching mission or who the customer is. This video comes to mind https://youtu.be/oeqPrUmVz-o
Design without an audience in mind is not design. Don't dismiss the work simply because you're not the audience.
The main problem with the "a little lag is fine" mentality is that, in most organizations, it becomes cancerous. It spreads and infiltrates every team, every corner.
When you have hundreds of teams and they're all doing suboptimal things for shits and giggles, that extra 500 milliseconds is now a minute.
And, the real kicker is that usually the slow stuff isn't even simpler or better. It's just naive and poorly thought out. Usually it's super simple stuff like use a hash map instead of iterating through an array a bajillion times over. Or, do this network request asynchronously instead of just blocking for no reason. Or, in the case of some suspicious Microsoft GitHub code, just use sleep() instead of spin locking.
These things aren't harder, they're just different. So it's not even laziness really, it's something else. Apathy, maybe?
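To make the "different, not harder" point concrete, here's a minimal sketch of the hash-map example mentioned above. The data and function names are hypothetical, purely for illustration: both versions do the same job, but one scans the whole list once per lookup while the other builds an index first.

```python
# Hypothetical example: matching orders to customer records.

# The "naive" version scans the customer list once per order: O(n * m).
def match_naive(orders, customers):
    return [
        next(c for c in customers if c["id"] == o["customer_id"])
        for o in orders
    ]

# The "different, not harder" version builds a hash map first: O(n + m).
# It's no more code and no more effort, just a different habit.
def match_indexed(orders, customers):
    by_id = {c["id"]: c for c in customers}
    return [by_id[o["customer_id"]] for o in orders]
```

Both return identical results; only the second stays fast when the lists grow.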
> So it's not even laziness really, it's something else. Apathy, maybe?
My hypothesis is that it comes from a bygone era of tech.
Consider the lyrics from Weird Al's "It's All About the Pentiums" (1999):
My new computer's got the clocks, it rocks
But it was obsolete before I opened the box
You say you've had your desktop for over a week?
Throw that junk away, man, it's an antique!
Your laptop is a month old? Well, that's great
If you could use a nice, heavy paperweight
From around the 80s all the way up until ~2010, one of the most reliable ways to make software run faster was to wait a year. You could get 50 or even 100% faster CPUs in a matter of a year or two.
Tech CEOs weren't blind to this fact. I have a lot of old software dev coworkers that lamented that era because they never had to think about performance problems. It was always "this won't be an issue if we wait a year".
I think that era baked a sloppy attitude into the industry as a whole.
For a platform vendor, it also infects everyone who develops for their platform.
Writing iOS apps, I've been in meetings where the discussion has basically made Apple's equivalent a benchmark. You can't make your feature slower/worse/buggier than theirs, but making it faster/better is optional.
Absolutely, and Cook-era Macs remind me of that frequently.
For example, my last Mac was a Cook-era machine with two third-party displays. Its normal boot process is a visual atrocity: the screens repeatedly blank off and on, the progress bar jumps arbitrarily to new positions and dimensions on the screen, the log-in window animation has drawing quirks...
...when I watch this orgy of complacent design, I often dream of what would happen had the Apple DRI presented it to Steve Jobs.
And you can tell the difference with how long he has been gone. MacOS is terrible now. So many weird bugs and performance issues. Not that Jobs was perfect, of course. But he cared in a way others didn't, like you said. Cook clearly doesn't.
> one thing that I think was revolutionary about Jobs...
No. Absolutely, unequivocally, no. You're talking about the difference between then and now in the way software was/is built, not the difference between Jobs and everyone else! The deification of Jobs is bad enough without the constant historical revisionism.
Back then, generally tuning/maximising performance and quality was a top priority for the majority of people in the industry, software engineers and senior staff alike. "Faster hardware" just didn't affordably exist for them back then. Many who were there in those days now bemoan the way their modern equivalents no longer prioritise efficiency, which leads to the awful slow UX you're referring to that really shouldn't be seen as acceptable, but somehow is.
Even if we see Jobs as being at the extreme - more focused on these things than most top-level execs of his day - then to treat the entire rest of the industry together as though they were at the opposite extreme (i.e. at today's level of not caring) and call him "revolutionary" in his day as a response to this, would very much be fallacious.
> "Faster hardware" just didn't affordably exist for them back then.
Not what I said. And I think you are the one that's doing historical revisionism now.
Even in this email from 1983, it starts off with
> since its 68000 microprocessor was effectively 10 times faster than an Apple II
From the 80s through the 00s (which I was alive through and very aware of), computer hardware was frequently doubling in performance. The common wisdom then was to make things fast enough. Anything more was a waste of time because in a year or two hardware would be twice as fast.
The wastefulness of today came directly from that past wisdom. I can guarantee you that ever since I've been conscious around discussions about software, there have been people bemoaning how sloppy and wasteful software has become. People complained about how bloated Windows XP was vs 98.
Ruby, Python, Perl, Java. All these bloated and slow programming languages got their starts in the 80s and 90s. Exactly because of the wisdom that "it's slow today but hardware tomorrow will make it fast". Heck, even C and Lisp are manifestations of this. Consider that people weren't writing all software in assembly during the time period in question, even though there were clear performance benefits to doing so, as compilers at the time were particularly bad.
I've worked with a lot of older devs and they all hold the attitude that performance optimization is a complete waste of time. They've been the hardest ones to break of that notion. Younger devs tend to more intuitively know that performance optimizations are important. That's because over the last decade, hardware performance improvements have stagnated.
So yes, absolutely yes. In the past if you could make writing software more ergonomic by sacrificing some memory or performance, that's a tradeoff most of the industry would gladly take. They wrote for today's hardware and sometimes tomorrow's.
But does it matter? Eventually a bean counter will be in charge of the legacy you built up with this painstakingly acquired good UX and high quality, and it will take less than a decade to make most of what you spent your life fighting against the new reality.
"Reasonable people adapt themselves to the world. Unreasonable people attempt to adapt the world to themselves. All progress, therefore, depends on unreasonable people." — George Bernard Shaw
It also didn't always work. At no point did the MacBook boot nearly as instantly as an iPad. That said, Jobs' obsession with UX was a powerful driving force and your point stands.
That's actually the standard model for evaluating transport projects: aggregating small time savings across millions of people.
You basically take those millions of saved hours and multiply them by a government-standard 'value of time' (roughly £15/hr in the UK). That usually makes up the bulk of the benefits, though they also price in things like safety (a prevented death is worth ~£2m), carbon, noise, etc.
IIRC, if you hit a Benefit-Cost Ratio of 2.0 or higher, the project is considered 'high value' and has a good shot at getting executed.
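A toy sketch of that appraisal arithmetic, using the thread's rough figures (£15/hr value of time, ~£2m per prevented death, BCR threshold of 2.0). These are illustrative numbers, not official guidance:

```python
# Toy benefit-cost appraisal, loosely following the scheme described above.
def benefit_cost_ratio(hours_saved, deaths_prevented, other_benefits, cost):
    VALUE_OF_TIME = 15          # GBP per hour, rough UK figure
    VALUE_OF_LIFE = 2_000_000   # GBP per prevented death, rough figure
    benefits = (hours_saved * VALUE_OF_TIME
                + deaths_prevented * VALUE_OF_LIFE
                + other_benefits)
    return benefits / cost

# A hypothetical project saving 5m hours and preventing 3 deaths, costing GBP 50m:
bcr = benefit_cost_ratio(5_000_000, 3, 0, 50_000_000)
# 75m + 6m = 81m in benefits over 50m in cost: BCR 1.62, below the 2.0 bar
```

The striking part is how thoroughly the time-savings term dominates: millions of small individual savings outweigh even the safety valuations.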
This reminds me of a story I heard about a bus driver who would always pull away from the stop right on schedule even if a regular rider was running up. His calculation was the 30 seconds spent waiting for one rider was an aggregate of many minutes lost by the riders who were on time for their stops. What looked cruel to one was a kindness to many.
A bus can easily carry 50 passengers. 30 seconds times that many is 25 minutes. That's a lot of aggregate time wasted indeed.
Also assuming this 30 seconds delay is not compensated later, it can influence significantly more people than the bus capacity. And if someone misses a connection because of it that's even more time wasted.
Economic BCAs are typically handled by large eng firms like Arup, Jacobs, and WSP. However, the tricky task of modeling time savings (given that transport systems are complex) is often subcontracted to more specialized firms such as Steer.
Deloitte, KPMG, etc are usually more involved in writing the financial case (how to fund the project).
I like this thinking about other people's time as opportunity cost. I do that a lot and always encourage others to keep it in mind, too.
An example: a few years ago, there was a recurring unnecessary traffic congestion on my commute because of a malfunctioning traffic light. On the third day, I did some numbers while waiting and came to the conclusion that over hundreds of people, this was quickly adding up to months of lifetime wasted in total.
I then called the responsible municipality right on the spot to notify them there's a problem. They thanked me and had it fixed the next day.
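The mental arithmetic from that wait goes roughly like this. The numbers below are illustrative assumptions, not the original commenter's figures:

```python
# A small per-car delay, multiplied across everyone hitting the broken
# light, compounds quickly into collective lifetime lost.
delay_minutes = 3        # assumed extra wait per car at the broken light
cars_per_day = 2000      # assumed traffic through the intersection
days = 3                 # how long the fault went unreported

wasted_hours = delay_minutes * cars_per_day * days / 60
# 3 * 2000 * 3 / 60 = 300 hours of collective lifetime in just three days
```

One phone call versus hundreds of hours of other people's time is about as lopsided as opportunity-cost trades get.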
Hertzfeld dismisses the idea, but I think it’s something more devs should take to heart.
Could someone build a tea timer app in React and save some development time? Sure. But how much does it cost humanity in the GBs of RAM and untold CPU cycles the app now requires that could be put to use elsewhere, or in systems landfilled due to inefficiency?
I had a phone with GBs of RAM and a multicore processor that could barely run a single current app. I can buy a new phone, but what about the billions of people that don't have that option?
Absolutely. There’s very little mass-market software that has any reasonable justification for being unusable on any machine that’s roughly Core 2 Duo era or newer. It’s perfectly possible to work within those constraints and not even really that difficult, it just requires a modicum of care and understanding of what’s happening under the hood.
Slightly off topic, but this reminds me of how a crash would bring the whole machine down. I was studying graphics, and woah could we gobble up that memory. You learned to save and version-name very quickly.
Pretty sure Steve Jobs was known for yelling at, belittling and bullying people, throwing tantrums and making threats/ultimatums.
Dude had anger / "I'm the hero" issues... his biography notably leaves this stuff out, and Woz's only covers a few incidents (because he still considers him a friend), though I'm sure there were more. Like when Woz invented a universal remote and sent a prototype to Jobs, and Jobs smashed it against the wall in a fit of anger.
I will never claim Jobs was a good neighbor or a Mr. Rogers type. Or even a fun person to work for.
But I don’t look up to him for that. Same way I don’t look up to Tiger Woods for who he is as a husband, or Picasso for… well, also poor behavior with women.
I want to play for Michael Jordan to be with the best and to be challenged to be my best.
Sometimes the thing that makes people excellent in one facet of their life makes them impossible pricks in others.
Extreme excellence in one facet of life is what I admire people like that for.
There are a lot of stories about Jobs acting in completely unhinged and highly toxic ways. I agree that the particular situation you're describing is a good one, though.
SQL Server is of equally high quality.
We just have postgres in the open source world (which is truly exceptional) so our expectations are higher.
I am the first to hate on Microsoft, their OS is a dumpster fire that I feel is forced on me. But sometimes they knock it out of the park.
--
After the original iPad was released, Steve Jobs held a meeting with the MacBook engineering team and demonstrated the difference in wake speed.
He woke up a current MacBook (with an Intel chip), which took a few seconds.
He then instantly woke up the iPad (with an Apple A-series chip) by pressing the home/power button on and off rapidly.
Jobs told the team, "I want you to make this" (pointing to the MacBook) "like this" (pointing to the iPad), and then walked out of the room.
---
This no longer exists at Apple.
IIRC the value of a prevented death is over $10M in the US currently, but only about $6M in most of the EU.
These days Slack is occupying 4-8GB of RAM and is less snappy than a native app.
Yeah, inlining Giphy images is kinda fun. But 1000X the memory consumption seems like a horrible tradeoff.
A modern desktop PC would have been a damn supercomputer not long ago. Today it’s kinda adequate.
- Big boss doesn't just yell at the product manager, who then yells at the team leads, who then call "all hands" and unload their stress on the team
- Instead, big boss explains his line of thinking and adds some back-of-the-napkin projections of why this improvement actually matters.
You might get a chuckle out of the "life saved" point, but it's easy to understand that this is meaningful productivity over a big number of users.