bschne · 5 years ago
I've found "efficiency as the opposite of stability" a very powerful concept to think about - even though it's fairly simple, it seems to be almost a fundamental law.

Whether it's about the economy at large, your own household, a supply chain, what have you - as soon as you optimize for efficiency by removing friction, you take all the slack/damping out of the system and become instantly more liable to catastrophic failure if some of your basic conditions change. Efficiency gives you a speed bonus, at the cost of increased risk / less resilience to unforeseen events.

Stewart Brand's concept of "Pace Layering" comes to mind for how to deal with this at a systemic level - https://jods.mitpress.mit.edu/pub/issue3-brand/release/2

civilized · 5 years ago
> efficiency as the opposite of stability

In statistics, there is a slight variant of this thesis that is true in a precise formal sense: the tradeoff between efficiency and "robustness" (stability in a non-ideal situation).

For example, if you have a population sample, the most efficient way to estimate the population mean from your sample is the sample mean. But if some of the data are corrupted, you're better off with a robust estimator - in this case, a trimmed mean, where the extreme N% of high and low values are discarded.

The trimmed mean is less efficient in the sense that, if none of the data are corrupted, it discards information and is less accurate than the full mean. But it's more robust in the sense that it remains accurate even when a small-to-moderate % of the data are corrupted.
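A toy Python sketch of the tradeoff described above (the sample size, corruption level, and 10% trim are illustrative choices, not from the comment):

```python
import random
import statistics

def trimmed_mean(data, trim_frac=0.1):
    """Discard the lowest and highest trim_frac of values, then average the rest."""
    data = sorted(data)
    k = int(len(data) * trim_frac)
    kept = data[k:len(data) - k] if k else data
    return sum(kept) / len(kept)

random.seed(0)
sample = [random.gauss(100, 10) for _ in range(1000)]  # true mean is 100

# Corrupt 5% of the data with wild values.
corrupted = sample[:]
for i in range(50):
    corrupted[i] = 10_000

# The full mean is dragged hundreds of units away from 100;
# the trimmed mean discards the outliers and stays close.
print(statistics.mean(corrupted))
print(trimmed_mean(corrupted))
```

On clean data the full mean would be the more accurate (efficient) estimator; the trimming only pays for itself once the corruption shows up.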

clairity · 5 years ago
i stumbled on "stability" too, because it's a static quality.

rather than robustness, i prefer to use the term resilience, a dynamic quality, since efficiency is also a dynamic quality. you can trade efficiency for resilience and vice versa (as the parent poster switched to later).

edit:

i should add that i don't entirely agree with the thesis of the article, which exhorts us to slow down, thereby trading efficiency away for resilience. there are a number of ways to add resilience (and trade away efficiency); in some cases, slowing down might be the best of them, but it's certainly not the only option, nor the best one in most cases.

for housing, an example used in the article, we could add more housing to create resilience, which requires reducing friction, like increasing the throughput of permitting/inspections while generally reducing zoning/regulations.

mrtnmcc · 5 years ago
Another example would be forward error correction (adding parity bits to improve robustness at the expense of efficiency).

But inefficiency isn't necessarily more robust unless the extra bits serve some purpose.
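A minimal sketch of the parity-bit idea, detection only (real FEC codes such as Hamming or Reed-Solomon add enough structured redundancy to also correct errors, not just detect them):

```python
def add_parity(bits):
    # Append an even-parity bit so the total count of 1s is even.
    return bits + [sum(bits) % 2]

def looks_valid(bits_with_parity):
    # Any single flipped bit makes the count of 1s odd, which is detectable.
    return sum(bits_with_parity) % 2 == 0

word = [1, 0, 1, 1, 0, 0, 1, 0]
coded = add_parity(word)     # one extra bit: less efficient, more robust
assert looks_valid(coded)

garbled = coded[:]
garbled[3] ^= 1              # flip one bit "in transit"
assert not looks_valid(garbled)
```

The extra bit buys robustness precisely because it serves a purpose; nine random bits would be just as inefficient and not robust at all.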

goodsector · 5 years ago
I may be wrong, but it seems to me that 20th century (theoretical) statistics research overemphasized efficiency at the expense of robustness. My guess is that this has to do with the (over-)mathematization of statistics in the past century, as opposed to a more empirical/engineering viewpoint. Efficiency typically only holds under extremely narrow (and often impossible to check) assumptions, which is great for mathematicians proving theorems and creating theories of efficiency. On the other hand, robustness is ideally about unknown unknowns and weak assumptions, which is hard to deal with mathematically.

It seems already the 21st century is seeing a more balanced emphasis on theory vs. real world applications though.

zzz95 · 5 years ago
This also exists in control theory as the tradeoff between performance and robustness!
javert · 5 years ago
Wouldn't this be better described as a tradeoff between accuracy and robustness?

Interesting concept.

cmonnow · 5 years ago
it's called bias vs. variance tradeoff, or over-fitting, in stats/machine learning lingo.
yt-sdb · 5 years ago
That's a nice way to think about it, and it reminds me of Nassim Taleb's "antifragile" thesis [1]. Basically, the world is more random than you think, and to operate rationally under uncertainty, you need to be open-minded about opportunities and risks with huge asymmetries. Fragile systems are often very successful for a long time because they ignore hidden risks and then collapse due to the unexpected.

[1] https://en.wikipedia.org/wiki/Antifragility

roughly · 5 years ago
> Fragile systems are often very successful for a long time because they ignore hidden risks and then collapse due to the unexpected.

There's another interesting aspect to this in that things that are failures from some perspectives may not be from others.

If stripping resiliency out of a company nets enough savings in the short term, it may still be profitable to the owners even if it's long-term fatal.

As a hypothetical example, let's say you take a company making $1M a year and trim $19M a year of costs out of it. The company lasts another 10 years and then collapses. You've netted an extra $190M out of that company, or nearly 200 years at their previous rate.

In that case, it's in your local interest to strip the company bare, even if it's not necessarily optimal for your partners, workers, society, or any other stakeholder in this wonderful interconnected world of ours. The benefits are concentrated, the costs are distributed, and there's no mechanism for connecting the two.

tjpnz · 5 years ago
One of my main takeaways from Antifragile was that the people often involved with making processes efficient have no business being in that position. He was right to label management science as quackery and practitioners as charlatans.
ceilingcorner · 5 years ago
Reminds me of Machiavelli’s comments on the French state (many small warlords) vs. the Ottoman state (single supreme leader.) The French state was less efficient, but more resilient and difficult to conquer, while the Ottoman state had more efficiency but was highly fragile.

In some cases the old king of the conquered kingdom depended on his lords. 16th century France, or in other words France as it was at the time of writing of The Prince, is given by Machiavelli as an example of such a kingdom. These are easy to enter but difficult to hold.

When the kingdom revolves around the king, with everyone else his servant, then it is difficult to enter but easy to hold. The solution is to eliminate the old bloodline of the prince. Machiavelli used the Persian empire of Darius III, conquered by Alexander the Great, to illustrate this point and then noted that the Medici, if they think about it, will find this historical example similar to the "kingdom of the Turk" (Ottoman Empire) in their time – making this a potentially easier conquest to hold than France would be.

marcusverus · 5 years ago
> The French state was less efficient, but more resilient and difficult to conquer, while the Ottoman state had more efficiency but was highly fragile.

Machiavelli is saying the opposite of what you think he's saying.

He's saying that France's governmental structure makes it (relatively) easy to conquer. Because there are many quasi-independent, competing fiefs, an invader is not necessarily facing a unified front, and may in fact be able to recruit dissatisfied lords to their cause. But that doesn't make it the kind of place you'd want to rule, because once you've conquered it, it's (relatively) easy for another invader to conquer you for the same reasons.

In contrast, there were no fiefdoms in Persia. Unlike the lords in France, the regional rulers in Persia were chosen by the state, and picked for their loyalty. When invading Persia, you are far more likely to face a united front, making it (relatively) difficult to conquer. That said, once you've conquered it, it would be (relatively) easy to hold for the same reasons.

jjoonathan · 5 years ago
It is a fundamental law.

Systems theory has the concepts of "gain margin" and "phase margin" -- how much you can amplify feedback or delay feedback, respectively, before your self-adjusting feedback mechanism fails to find equilibrium and turns into an oscillator.

Even though most non-engineering systems don't fit the mathematical theory, the idea that only a finite amount of gain + delay is available, and that the two are somewhat inter-convertible, generalizes astoundingly well.
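A toy discrete feedback loop (my own construction, not a textbook transfer function) illustrates the shared gain/delay budget: a gain that converges fine with immediate feedback turns into an oscillator once the feedback arrives late.

```python
def settles(gain, delay, steps=200, target=1.0):
    """Iterate x[t+1] = x[t] + gain * (target - x[t-delay]); True if it converges."""
    hist = [0.0] * (delay + 1)
    for _ in range(steps):
        x = hist[-1]
        err = target - hist[-1 - delay]   # the controller only sees stale data
        hist.append(x + gain * err)
    return abs(hist[-1] - target) < 1e-3

print(settles(0.5, delay=0))   # converges: gain 0.5 is well inside the margin
print(settles(0.5, delay=5))   # same gain, delayed feedback: oscillates and diverges
```

With no delay this particular loop tolerates any gain below 2; five steps of delay shrinks the tolerable gain to roughly 0.28. That shrinking budget is the inter-convertibility the parent describes.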

bschne · 5 years ago
Do you happen to know a good introduction to that? I‘d love to read up more on it!
themeiguoren · 5 years ago
Yeah, as a controls engineer I’ve worked on systems where the main requirement was not gain or phase margin, but a root sum square of the two. Nichols plots drive home the idea that it’s not really the margin at two discrete points that matter, but the close approach to the critical point.
throwaway815190 · 5 years ago
> It is a fundamental law.

> Systems theory

I'm confused. Is it a law or a theory? And no, it can't be both. Laws are proved. Theories are unproven.

alexpetralia · 5 years ago
I think this is probably better characterized as "efficiency erodes resilience". You can have stability if there are no perturbations. However, if there are, and you have optimized for a regime where there are not, you are very exposed to risk. This is pretty much Table's notion of antifragility as well as the study of resilience engineering.
nitrogen · 5 years ago
There's an engineering version of "stable" that might be useful here to draw lines between the three or so different concepts being discussed. A "stable" system is one that will return to the same resting state when perturbed. One can have "equilibrium" in an unstable system, e.g. balancing a broomstick on one's hand, but the system will not return to that equilibrium if perturbed.
205guy · 5 years ago
> Table's notion of antifragility

sigh, someday we'll have auto-correct that just works. Doesn't even need to be AI, just use words in the current page. Heck just use contextual info such as the capitalization. Somebody, please?

shoo · 5 years ago
"efficiency erodes resilience" is a good line, at a loss of symmetry. resilience also erodes efficiency.
agumonkey · 5 years ago
Yes, it's basically a myopic under-appreciation of the current system. I was guilty of that; many of us are, I believe. We all strive for better, but sometimes our perspective is off. (not to bring him up on every topic, but Alan Kay said that perspective is worth a lot of IQ points)

In the list of anti-perfection patterns there's mechanical jitter... a catastrophe-avoiding relaxation.

kibwen · 5 years ago
Relieved to learn that other people dwell on this as well. My anxiety stems from the idea that modern corporations are incentivized to ruthlessly optimize for efficiency--short-term gains--thereby outcompeting corporations that are structured for longer-term outlooks by engineering redundancy (which you call stability) into their processes. I don't know how to begin to incentivize the idea that efficiency is not the end-all.


georgeecollins · 5 years ago
See also "Slack", a book about this very issue.

https://www.penguinrandomhouse.com/books/39276/slack-by-tom-...

lexicality · 5 years ago
It's funny, I've been sucked into factorio after the 1.0 release announcement and this really rings true.

If you make a production line that has perfect throughput with no buffers then you get fantastic efficiency and productivity right up until a single train hits a biter and is delayed by 30s.

Then you spend 3 hours trying to deal with every machine being out of sync with every other machine with constant start/stops :(
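A toy simulation of that effect (buffer sizes, tick counts, and the supply gap are invented numbers, not actual Factorio mechanics):

```python
def run_line(buffer_cap, supply):
    """One machine consumes 1 item/tick when available; a buffer absorbs supply gaps."""
    buf, produced = 0, 0
    for delivered in supply:
        buf = min(buf + delivered, buffer_cap)
        if buf >= 1:
            buf -= 1
            produced += 1
    return produced

# Over-supply early (which fills a buffer if there is one), then a 30-tick outage.
supply = [2] * 30 + [1] * 20 + [0] * 30 + [1] * 50

print(run_line(buffer_cap=1, supply=supply))    # 100: every outage tick is lost output
print(run_line(buffer_cap=40, supply=supply))   # 130: the buffer rides out the delay
```

The bufferless line is "perfectly efficient" (no items sitting idle) right up until the train is late, exactly as described above.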

euske · 5 years ago
There's a similar relationship between efficiency (convenience) and security (safety). Examples are everywhere. A centralized system (efficient) is a SPOF (not safe). Aggressive caching can be fast but unreliable. Adding a lot of features in a short term makes the code unstable, etc, etc.

People often focus on one thing and overlook the sacrifice they're really making. Everything has a tradeoff.

Nasrudith · 5 years ago
That concept of efficiency as the opposite of stability seems a bit fallacious in the strong case - inefficiency itself can and has caused systems to collapse, which is the exact opposite of stability.

Whether cache/reserve helps or hurts efficiency is itself complicated and situational. Overfitting could make a system fragile, but it also depends upon the relative costs that a blind pursuit of efficiency runs up against. If something is cheap enough, like data storage, there will be plentiful slack, because slimming it down is irrelevant to efficiency - why bother with custom 4KB chips when economy of scale means a 250 MB one is cheaper and better? It just isn't worth trying to cut corners there.

A laggard damped system would take longer to get into a "bad state" assuming the environment doesn't demand rapid changes as the baseline to survive. Bad state is relative as always - one can doom themselves both by leaping onto the "next big thing" which isn't and by sticking to the buggy whip and musket when others have cars and automatic rifles.

kibwen · 5 years ago
I think you're both talking about different ideas here. Efficiency is good. But redundancy is also good (necessary, even, for a resilient system), and the problem is that you can always increase efficiency by removing redundancy, so it does get removed by short-sighted efficiency-optimizers.

Topically, in the past week we've seen two giant companies, Adobe and Canon, lose unimaginable amounts of user data. If they had had backups, which are a form of redundancy, this would not have been a problem. But the backups were too expensive--too inefficient--and so now customer trust in their service is absolutely destroyed.

rightbyte · 5 years ago
If you are strong everywhere, you are weak everywhere. The article is just plain wrong. The problem is the lack of a strategic reserve, not efficiency in itself.
harikb · 5 years ago
Similar to your argument, but another way to think is that efficiency adds dependency on a higher form of skill (ability to multi-task) or a complicated system.

I dread the day when Google Maps, Traffic stats, Uber has actually delivered us the "perfect people transporter system" maximizing the heck out of existing infrastructure (cities, roads) and then the inevitable happens.

Systems become too big to fail.

genghisjahn · 5 years ago
That’s one of the points of “The Goal”. As in you need slack to be more productive over time. https://www.ebay.com/p/117280427

I leave the eBay link because last I checked used copies on amazon were very pricey.

cinntaile · 5 years ago
It's still in print so you can just get a new copy instead.
l_t · 5 years ago
I think of this as an optimization kind of problem. The word "efficiency" itself is only meaningful in context of what's being made more efficient.

A system could be "more efficient at becoming stable," for example.

But if by "efficiency" we limit ourselves to mean "the time-cost of a set of actions," (as in the most efficient path is the one that takes the least time), we quickly encounter problems with maximizing usage of time and how that conflicts with unexpected work, which leads to the anti-stability you mentioned.

The way I think about it is that a 100% time-efficient process has zero time-flexibility. If you want to gain time-flexibility (e.g. the ability to pivot, or to work on different things too, or to introduce error bars to your calculations), you lose time-efficiency.

vonwoodson · 5 years ago
Dr. Richard Hamming said in his lecture series “Life Long Learning” given at the Naval Postgraduate School: a perfectly engineered bridge would collapse if an extra pound was driven across it. I used to think this was a joke, or at least said in jest. Nope.
neogodless · 5 years ago
When I think about personal finance, I often think about efficiency. The obvious examples include buying in bulk, avoiding finance charges to optimize what you get for your money, paying insurance up front when the payment options charge extra, and just plain having fewer subscriptions to keep monthly expenses lower and easier to track.

All of this efficiency increases financial stability. I suppose if we argue that I'm only referring to optimization and not efficiency, then perhaps it's not a great argument.

ghaff · 5 years ago
The two are certainly related, but it feels like optimization is a bit different. Maybe the efficiency equivalent for personal finance would be keeping your bank balance at a minimum and immediately transferring any spare cash to paying down a mortgage or otherwise into a fairly illiquid investment, because that's where the best returns are. But now if you have an unexpected expense, you have to scramble to come up with funds.
mcguire · 5 years ago
Rather than stability, think of it as robustness or flexibility. Past a certain point, efficiency is an enemy of robustness.

Buying in bulk is cool. If you buy a big package of paper towels, it's cheaper and you don't have to worry about running out. The fact that you have a cabinet full of paper towels isn't a big deal. But suppose you find a really great deal on a semi-trailer load of towels and stock up. Now you have your guest bedroom full of paper towels. The next week, your cousin Edgar's house burns down; you'd like to offer him and his wife a room to stay in temporarily, but you have all this paper in the way. You have lost some flexibility.

A bigger problem is, say, corporate supply chains. With just-in-time supply, you don't have to store inputs and can focus on producing outputs; it's very efficient. But then there's a pandemic or a fire in a factory somewhere, and the supply chain falls apart. Now your outputs are perhaps in greater demand, but you can't take advantage because you have no inputs. You're out of business for the duration. You can't flexibly respond.

bschne · 5 years ago
It increases your financial stability under the assumption that your life situation evolves in a predictable way. Say you need to cover unforeseen medical expenses, or you develop an allergy to one of the foods you bought in bulk (sorry, that one was a bit contrived), etc. - well, then you'd suddenly wish you'd held off on buying that pallet of canned soup.

On the other hand, if you only buy your food day to day, that is certainly more like JIT logistics, prevents waste & storage space needs, etc., but it screws you if you can't leave your house and the stores get closed due to some... ahem... what might possibly happen that forces you to stay inside.

So it's always a matter of your frame of reference, I guess.

tonyarkles · 5 years ago
> paying insurance up front when the payment options charge extra

Depending on the cost of the insurance, that sounds to me like a drop in stability: you have infrequent periodic large payments to make instead of frequent smaller payments to make. If you had an unexpected expense arise near the time of the large insurance payment, your financial situation could get temporarily bad; if instead your insurance was small payments on a monthly basis, the unexpected expense would be easier to ride out.

[Note that I pay all of my large expenses in lump sums instead of in small trickles, but that's mostly psychology on my part not efficiency or optimization]

goatinaboat · 5 years ago
Whether it's about the economy at large, your own household, a supply chain, what have you - as soon as you optimize for efficiency by removing friction, you take all the slack/damping out of the system and become instantly more liable to catastrophic failure if some of your basic conditions change

I would hope that the fragility of JIT supply chains was laid bare for everyone in the Covid crisis but I expect that lesson will soon be forgotten.

api · 5 years ago
Reminds me of something six sigma manufacturing types call "building monuments." This refers to over-investing in optimization and automation to the point that there is too much sunk cost. As soon as something changes you're in trouble and all that sunk cost is gone, or even worse you can be stuck and unable to produce until you've retooled significantly.
MadSudaca · 5 years ago
Maybe the problem is that efficiency and robustness are orthogonal?
aaron-santos · 5 years ago
If you think about the two-dimensional efficiency/robustness space you can pretty easily see how it's isomorphic to the return/volatility space by way of simple transformations. If you allow yourself to be convinced that such a bridge exists, you can also bring tools from the return/volatility space back into the efficiency/robustness space by applying the appropriate inverse transformations. Maybe the conclusions are common sense, but I'd still read a blog framing that process by way of analogy.
Double_Cast · 5 years ago
They're orthogonal for non-trivial decisions. Once you hit the efficient frontier of anything, you need to start making trade-offs.
marcosdumay · 5 years ago
That's a much more reasonable statement.

Add to that that if you optimize one from a set of orthogonal values, the other ones tend to decrease. And so you get all the people claiming there's an intrinsic relation between them, in the face of a world of evidence.

maxerickson · 5 years ago
They at least aren't in direct tension.
ErikVandeWater · 5 years ago
I think this requires more thorough definition. Obviously you can be inefficient without being stable. E.g. I could drive my car at 5 miles an hour to work which would neither be efficient nor stable.
Xylakant · 5 years ago
It certainly would be stable. A very moderate increase in speed would allow you to make good on a substantial delay for any kind of reason.
shoo · 5 years ago
agreed. the two objectives of efficiency and robustness are not necessarily in conflict in all situations, but if you start trying to optimise a given system focusing on only one objective, the other one may degrade arbitrarily. better to define what tradeoff would be a good deal: how much efficiency would you be willing to lose to gain 1 unit of robustness, etc. then optimise both objectives taking into account your preferred exchange rate.

there's plenty examples of this kind of thing in engineering design situations. it's cheaper (i.e. more efficient usage of capital, at least in the short run) to not allocate resources for backups or allocate extra capacity in systems that isn't used 95% of the time. it's much more expensive to dig two spatially separated trenches to lay independent paths of fibre optic cable to a given building, but if you cough up the money for that inefficient redundant connection, your internet will have decreased risk of interruption by rogue backhoes. it's cheaper to not hire enough staff and get individuals in a team to over-specialise in their own areas of knowledge rather than having enough spare capacity and knowledge-sharing to be able to cover if people get sick, go on holiday or quit.

ref: https://en.wikipedia.org/wiki/Multi-objective_optimization

meh206 · 5 years ago
Right, there has to be some boundaries defined for what's in "reason".
seppel · 5 years ago
> I've found "efficiency as the opposite of stability" a very powerful concept to think about

I think this concept misses capacity. In my opinion, it is crucial that you always leave some over-capacity to have stability (let's say, you are running at most at 80% capacity). If you then increase your efficiency without sacrificing your buffer capacity, everything is fine. But as soon as you try to run at more than 80% capacity to be more efficient, the slightest problem can have devastating effects.
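The textbook M/M/1 queueing formula makes that 80% intuition concrete: average time in the system grows like 1/(1 - utilization), so the last stretch of "efficiency" is where the blow-up lives. (The specific utilization values below are just for illustration.)

```python
def mm1_time_in_system(utilization, service_time=1.0):
    """Average time in an M/M/1 queue: W = service_time / (1 - utilization)."""
    assert 0 <= utilization < 1
    return service_time / (1.0 - utilization)

for rho in (0.5, 0.8, 0.95, 0.99):
    print(f"{rho:.0%} utilized -> {mm1_time_in_system(rho):.0f}x the bare service time")
```

At 80% utilization work spends about 5x its service time in the system; at 99% it's about 100x. Efficiency gains near full capacity are paid for in fragility to the slightest fluctuation.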

Dumblydorr · 5 years ago
Efficiency is a broad concept. When it comes to energy, I don't consider stability incongruous with efficiency at all: the less you waste over time, the more stable your electricity demand growth, which saves money and improves QoL for citizens.

In general, learning how to do things better can produce efficiency and more stability too, if the way is better all around.

tobmlt · 5 years ago
This is also the primary topic of “blue ocean strategy” — with a difference there that running without slack causes the system to grind to a halt and or run in “emergency rush order mode” much of the time. In such a case efficiency and resilience/stability have some linear dependence. Sometimes these “opposites” are actually working together.
datenwolf · 5 years ago
Apparently also the reason why plants forgo green light (i.e. the most abundant part of the solar spectrum): https://www.quantamagazine.org/why-are-plants-green-to-reduc...
esarbe · 5 years ago
This efficiency vs. resilience trade off seems to be a general pattern, also for organic systems.

https://www.quantamagazine.org/why-are-plants-green-to-reduc...

TimJRobinson · 5 years ago
That makes sense. If efficiency is achieving maximum ROI, then the best ROI activities are generally the highest risk ones, which have the most chance of failing and are thus least stable.
markrages · 5 years ago
The rule holds true for switchmode power supplies.

Sometimes you have to add a resistance (inefficiency!) in series with the output capacitor to achieve stability.

basicplus2 · 5 years ago
But even more effective and efficient to use a transfer resistor
chrisweekly · 5 years ago
"stability" strikes me as the wrong word there; maybe "resilience" or "flexibility"?
xapata · 5 years ago
I'd say resilience instead of stability, but I agree that it's worth thinking about in many contexts.
baxtr · 5 years ago
Very interesting thought! Although I’d probably call it anti-fragility or resilience instead of stability.
sriku · 5 years ago
Another way I like to state this is "bottlenecks are beautiful design devices".
aabbcc1241 · 5 years ago
Following this mindset, blockchain can be described as low efficiency and high stability?
nnain · 5 years ago
There's always someone making weird correlations between Blockchain/Cryptocurrency and some random idea.
xwdv · 5 years ago
This isn’t really correct. A lot of things are neither stable nor efficient.
Florin_Andrei · 5 years ago
> A lot of things are neither stable nor efficient.

Yes, and those are the shitty things.

But when you want to improve them, make them non-shitty, you're facing a choice: stability, or high performance - pick one.

m463 · 5 years ago
I can't help but think of government (or management).
paulsutter · 5 years ago
The best efficiency arises from simplicity. Yes, haste makes waste, and a system that's constantly changing will be unstable.

But efficiency itself is neither haste nor churn; in fact, it's the opposite.

N1H1L · 5 years ago
But isn't this a subtle diss against capitalism or a purely market based system? A free market results in _efficient_ allocation of capital, resulting in a _fragile_ allocation.


mc32 · 5 years ago
Also in government. It’s really good that they are slow and inefficient (although it would be nicer if they were less wasteful).

There is little worse than a very efficient government.

mistermann · 5 years ago
A very efficient singular planetary government (or, excessive international homogenization of approaches) seems like one plausible example.

To me, this very conversation is the approach we should be taking to the major problems du jour on the planet (treating it as an incredibly complex system with an infinite number of interacting variables, many of which we do not even know exist). But it seems as if once a system reaches a certain level of complexity, we lose the ability to even realize that it is in fact a complex system, and insist upon discussing it only in simplistic terms. Or maybe it's the fact that we are embedded within the system that makes it impossible to see.

Taek · 5 years ago
This comment thread is making me realize we don't have a good word to distinguish between efficiency as in "we only have 7 hospital beds because that's all we need on 99% of each day" and efficiency as in "we replaced steps X,Y,Z with just step X', because we found that X' could accomplish everything that XYZ could accomplish but it's faster, more accurate, and cheaper".

One makes a tradeoff by reducing overheads and buffers, and the other doesn't have any tradeoffs, it's just a better way of doing things based on novel techniques.

war1025 · 5 years ago
This seems like it sort of ties in with NN Taleb's idea of "Anti-fragile" [1].

Perhaps also Chesterton's fence [2].

Maybe also the whole premature-optimization thing [3].

And of course the too-clever-by-half coyotes [4].

Really maybe it just comes down to "be wary of making changes that reduce resiliency."

I was hoping to come up with something cohesive with this comment, but really I guess I just agree with what you say.

And I think there are a bunch of people sort of circling around the same idea, which I don't think we've really quite landed on a precise definition of, just as you say.

[1] https://en.wikipedia.org/wiki/Antifragility

[2] https://en.wikipedia.org/wiki/G._K._Chesterton#Chesterton's_...

[3] https://en.wikipedia.org/wiki/Program_optimization#When_to_o...

[4] https://www.epsilontheory.com/too-clever-by-half/

adwn · 5 years ago
> [4] https://www.epsilontheory.com/too-clever-by-half/

Thank you for this link. I'm halfway through that article and will probably read every single one on that website.

O_H_E · 5 years ago
I love this comment on so many levels ;D
plutonorm · 5 years ago
epsilontheory.com I love every article I have read so far. It's really expanding me. Thanks for sharing.
bobwernstein · 5 years ago
the way you wrote this comment, with the sources fully visible at the bottom - amazing. I will start doing this myself.
mjohn · 5 years ago
In economics we would call the latter a Pareto improvement - ie an efficiency improvement without any trade off
milesskorpen · 5 years ago
I don’t know if those two things are as different as you suggest. Multiple steps add redundancy, if you fail one step you can get much of the value from other steps. More steps are frequently added in response to edge cases.

Could be a Chesterton’s Fence scenario

Instead of bemoaning efficiency, it’d be interesting to reward/value redundancy and antifragility, at least at the system level.

I think this could mean trust busting, regulation, and general cultural shifts.

Jtsummers · 5 years ago
Multiple steps don't always add redundancy, sometimes they're just noise or artifacts. I had a colleague who, when opening a file, would first click on the Windows desktop, then "My Computer", then navigate to the directory, open the file, and close the Explorer window.

There was zero redundancy versus leaving the directory open so they could open the next file (or using the application's "Open File" dialog).

That is a perfect example of wasteful motion (in their case due to a poor mental model of how computers worked, as I learned through later discussions) that could be simplified significantly without loss of quality or redundancy in the system.

Contrast this with: The surgical office called me this morning and stated, "The surgery is for a ganglion cyst on your left wrist." Which I confirmed. When I go in on Tuesday for the surgery this will be repeated, and a mark will be made on the area to be cut open (though in this case it'd be really hard to screw up and open the right wrist, since the cyst on the left is quite visible and there's nothing on the right). That is useful redundancy of the sort you describe. Remove any step (the initial visit a week ago, the call today, the check when I arrive, the mark on the wrist) and you increase the risk of error.

pchristensen · 5 years ago
My favorite story about unintended consequences of process improvement is the Vienna Sausage Factory: https://medium.com/dangerous-kitchen/vienna-sausages-a-guy-n...

Punch line - Sausages coming from a new modern factory didn't taste the same. The new, more efficient building removed a long transportation step where the partially finished sausages picked up flavors and scents coming from different parts of the factory. They had to create a new process to manually add those flavors that they were accidentally getting for free from the old factory layout.

r00fus · 5 years ago
As long as those steps Y & Z are actually adding redundancy/antifragility, then yes. If the purpose is some deprecated feature or dependency that simply serves no purpose or is now an anti-feature, it would be more efficient to remove.

It would be good to have proof of a Chesterton's Fence analysis to say "yes, we know Y & Z's purposes and have analyzed the cost/benefit of removing them and the populations/systems impacted" - would this be an impact analysis?

dllthomas · 5 years ago
In some scenarios, the process fails if all of the steps fail. In that case, redundancy is more stable, and you have a tension between stability and efficiency.

In other scenarios, the process fails if any of the steps fail. In that case, redundancy is less stable, and you can improve both stability and efficiency by eliminating unnecessary steps.

In either case, there may be other considerations involved as well (flexibility, visibility, recoverability...) but sometimes we just didn't see a better way to do something.
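A minimal probability sketch of those two scenarios, assuming independent steps that each fail with the same probability p (illustrative numbers only, not from the comment):

```python
# Parallel redundancy: the process fails only if ALL steps fail.
def parallel_fail(p: float, n: int) -> float:
    return p ** n

# Serial dependency: the process fails if ANY step fails.
def serial_fail(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

p = 0.1  # each step fails 10% of the time, independently
# Adding redundant steps makes the system more stable:
print(parallel_fail(p, 1), parallel_fail(p, 3))  # ~0.1 vs ~0.001
# Adding serial steps makes it less stable:
print(serial_fail(p, 1), serial_fail(p, 3))      # ~0.1 vs ~0.271
```

Same word, "more steps," but the failure probability moves in opposite directions depending on how the steps compose.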

Taek · 5 years ago
There are obvious cases where you are improving efficiency without eliminating slack. I'm not making my application more fragile by rewriting an n^2 operation over a large dataset to an n*log(n) operation that provides the same output. It's a win without tradeoffs.

This type of example exists in all industries. For example, finding a new alloy that has strictly superior properties across all dimensions for a specific use case. Or upgrading mail delivery routes using better pathfinding algorithms. Etc.
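A toy version of that kind of strict win (my own sketch, not the parent's application): two functions with identical output, where the second is simply cheaper on large inputs.

```python
def has_duplicates_quadratic(xs):
    # O(n^2): compare every pair of elements
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                return True
    return False

def has_duplicates_sorted(xs):
    # O(n log n): sort once, then scan adjacent elements
    s = sorted(xs)
    return any(a == b for a, b in zip(s, s[1:]))

# Same output for every input; nothing about the system got more fragile.
for data in ([3, 1, 4, 1, 5], [2, 7, 1, 8], []):
    assert has_duplicates_quadratic(data) == has_duplicates_sorted(data)
```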

brundolf · 5 years ago
> the other doesn't have any tradeoffs, it's just a better way of doing things based on novel techniques

At least in the case of code, this isn't true. The variability comes in terms of change to the system, rather than the running of the system. i.e., if I simplify a process to be less modular and more monolithic, making it more efficient, that also makes it more purpose-built and less flexible. The "risk" increases of running up against a change that needs to be made but is intractably onerous. There's always a tradeoff.

Jtsummers · 5 years ago
I posted this in a similar thread yesterday:

> Fisher's Fundamental Theorem: The better adapted a system is to a particular environment, the less adaptable it is to new environments. -- Gerald Weinberg, "The Psychology of Computer Programming"

It's something everyone should consider in making critical design decisions. Your adaptable, modular system has some risks (particularly in terms of meeting performance targets, increased cost due to increased complexity), but the monolithic system has its own risks (less adaptable to changing requirements, potentially more fragile against attack or damage). Which you choose depends on many variables including your risk profile and anticipated need for change in the future.

dllthomas · 5 years ago
IMO, it seems like there's always a tradeoff once we've restricted ourselves to interesting options, because those options are interesting because they're not obviously dominated by other options.

If we consider any possible solution, we can obviously imagine adding a completely spurious detail.

Smaug123 · 5 years ago
The former is the removal of slack to make a process more efficient: essentially trading slack for efficiency (making a sacrifice to Moloch!). The latter is more what I would term the removal of inefficiencies.
kazinator · 5 years ago
There are always tradeoffs. The discovery that X, Y, Z could be replaced by X' required investigation, which is a cost, and then the changes have to be deployed, which causes disruptions that also have a cost. If the deployment of the optimization doesn't recover the costs, then it ends up not worth it. Usually the cost is recouped because the optimization is discovered once and then deployed on a large scale; but that is not always the case.
dllthomas · 5 years ago
I think considering the cost of change a cost of the optimized version is muddled thinking. It's a cost of deploying the optimized version, but that is only sometimes relevant.

A pure trade-off between efficiency and stability would imply that, were I already running the efficient version, we could buy stability by switching to the less efficient code.

Double_Cast · 5 years ago
* It's instructive to track the underlying resource. If technology_A saves you time, it's more time-efficient (aka quick). If technology_B saves you money, it's more cost-efficient (aka cheap).

* In game theory, "7 hospital beds" weakly dominates "8 hospital beds". But (x') strictly dominates (x, y, z). This is exactly what Pareto Optimality is about. Though perhaps a more colloquial term would be useful here.

didibus · 5 years ago
We do have words for this: efficiency and productivity.

Efficiency produces the same output with less input.

Productivity produces more output with the same input.

So efficiency is a measure of input to target output, and productivity is a measure of output to target input.

To make a process more efficient, you figure out how to get to some X output while using as little input as possible.

To make a process more productive, you figure out how given some Y input, you can maximize your output.
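A numeric sketch of the distinction, with made-up widget numbers: both moves raise output-per-input, but they hold different sides of the ratio fixed.

```python
def ratio(output: float, input_: float) -> float:
    """Units of output per unit of input."""
    return output / input_

base = ratio(100, 50)        # baseline: 100 widgets from 50 hours -> 2.0
efficient = ratio(100, 40)   # efficiency: same output, less input -> 2.5
productive = ratio(125, 50)  # productivity: more output, same input -> 2.5

print(base, efficient, productive)
```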

baddox · 5 years ago
The first example about hospital capacity really just involves tradeoffs around your specific goals, whereas it only makes sense to talk rigorously about efficiency in the context of some specific output goal.

So if you fix some goal, say, "we want to be over capacity 1% of the time," then the most efficient way of doing that is probably to have the minimum number of beds that you need according to your predictions about utilization. But you can't really talk about efficiency when you're deciding what your goal is, e.g. whether you're okay being over capacity 10% of the time versus 1% of the time.
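As an illustrative sketch, if daily demand were Poisson with a known mean (a simplifying assumption for the example, not a claim about real hospitals), the "minimum number of beds" for a fixed overflow target falls out directly:

```python
import math

def poisson_cdf(k: int, lam: float) -> float:
    # P(demand <= k) when demand ~ Poisson(lam)
    return sum(math.exp(-lam) * lam**i / math.factorial(i)
               for i in range(k + 1))

def min_beds(lam: float, overflow_target: float = 0.01) -> int:
    # Smallest bed count with P(demand > beds) <= overflow_target
    beds = 0
    while 1 - poisson_cdf(beds, lam) > overflow_target:
        beds += 1
    return beds

# With mean daily demand of 5 patients:
print(min_beds(5.0, 0.01))  # beds needed to overflow at most 1% of days
print(min_beds(5.0, 0.10))  # a looser goal needs fewer beds
```

Efficiency is well defined only once the overflow target is chosen; the choice of target itself is the tradeoff.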

hinkley · 5 years ago
There's also efficiency like "I picked up some extra screws at Home Depot because I wasn't sure if I had any at home".

Having ways to avoid an unanticipated repetition of a process, which would result in bunging up the works for dependent parts of the system, can make the entire flow more efficient. See also 'drum buffer rope' from constraint theory.

throwaway_pdp09 · 5 years ago
> ...and efficiency as in "we replaced steps X,Y,Z with just step X', because we found that X' could accomplish everything that XYZ could accomplish but it's faster, more accurate, and cheaper".

optimisation.

jcims · 5 years ago
I agree it can get ambiguous, but I think most of the time it's just framed differently and (to your point) focused on different dimensions of efficiency.

For example, if your first statement was 'We only have seven beds because we tightened up our discharge workflow and that's all we need 99% of the time' and your second statement was 'We only have seven admins because we replaced steps X,Y,Z with just step X and that's all we need 99% of the time' they start to line up.

rticesterp · 5 years ago
It's also not as simple as adding 7 more beds so we have all we need on 99.999% of days. Sure, you have 14 beds, but you also need more doctors, nurses, surgeons, OTs, etc. to support those beds. Those health care providers won't get the patient contact they need to maintain their skills as competent providers. I know as a part-time EMT that if I don't go on a certain number of calls a month, I have a noticeable decline in my skills.
Jtsummers · 5 years ago
At some point you must choose your bottlenecks. Excess physical capacity doesn't have to be used all the time (that is, there's no "currency" for a bed, and other equipment can be rotated through use and maintenance cycles). If you choose to constrain yourself with the number of beds needed for your 99% situation, then you can't expand beyond that without great difficulty. If you have physical capacity for your 99% situation x 2 or even just one or two extra beds, then you don't have to maintain full-time staff for it. You could extend staff hours, or bring part-time staffers to full-time hours, or bring in (with supervision) students from a nearby medical or nursing school to handle what they can and offload the burden for that 1% situation.
wisty · 5 years ago
A hospital can call a part-time EMT and ask them to work a few extra shifts due to some disaster. They can't ask them to bring a bed.
ozim · 5 years ago
That is why kaizen is so nice because those are small improvements that you notice in daily operations.

The other kind, the bad efficiency, I would just call "cost cutting measures", not efficiency improvements.

With kaizen you try to accommodate what you have. So if Bob is slow, the cost cutting measure would be to fire him. The efficiency improvement way would be observing Bob to see what can be changed so you can get more value without messing him up.

crdrost · 5 years ago
It gets worse because even if one had precisely such a word, say the difference between ‘efficiency’ and ‘superlativity’ respectively, it would not necessarily be as simple as to say “efficiency bad, superlativity good.” They morph into each other in complex systems.

Consider the paradox of finding that a factory crew has no inputs—they are playing cards waiting for an order to come in—and yelling at them to go do other things around the shop like clean and assist other operations, rather than loafing. Or, for another solution to the problem, you might pre-order all the stuff and make sure that the team is always 100% loaded and never has the free capacity to play cards.

At first blush these improve superlativity, no? We are accomplishing everything that card-playing does but we are “faster, more accurate, and cheaper” if we are measuring, say, labor cost per part and the technician time averaged over the parts they worked on. Have we not just found a “novel technique” which is “just a better way of doing things?”

But staring at it for longer you may find yourself less sure. That’s what I mean by complex systems they morph into each other. There are more subtle tradeoffs here. For example when people feel free to loaf when they have no work, you can walk into the shop and ask who’s loafing and why and how you can improve their situation so that they again have proper work to do. There is an increase in latency when that shipment finally comes in and all the workers need to be summoned from across the floor to handle it again. There may be mental fatigue from having to context-switch too much or from having to constantly work on just one thing with no breaks. Or maybe the teams that need whatever they are producing cannot finish their work fast enough, so all of the inventory produced by this team slowly grows until it fills 50% of your factory floor, until you only have a certain amount of space because that’s all you need on 99% of each day.

The point is that the greedy algorithm may fail. In a linear circuit, you short out some resistor with some wire, you know that current is going to move faster afterwards. But in a nonlinear circuit, you no longer know this. In the absolute simplest case, the increase in current rapidly breaks a fuse and everything grinds to a halt. In more complicated cases you have a feedback loop and the increased voltage from the short-circuit feeds back to the earlier stages to throttle the current coming through.

Same with weight loss. People think that they will eat fewer calories and they will therefore lose such-and-so amount of weight. Well, probably. But this is a complex system we are talking about. One of the first things that happens when you start burning the fat is that your body burns your muscle too. This is the same reason that you can't burn fat on your stomach by doing crunches, your system is sending the call out to your entire body that it needs to digest surplus material.

The loss in muscle mass appears to be the primary culprit which kicks down your basal metabolic rate and you hit what weight-loss folks call a “wall” where you are literally cold all the time and wearing sweaters and feeling too cranky to exercise and all that, feedback mechanisms which will mean that if you keep eating that restricted amount of calories you won’t be losing any more weight unless you can “break through” it by keeping warm through exercising and thereby increasing your muscle mass back up to where it needs to be and so forth. It’s just that it’s a complex system and the greedy algorithm does not always work for such systems.

wayoutthere · 5 years ago
"Efficiency" can only be defined in terms of something else -- you can be optimizing throughput or optimizing costs and you'll end up at very different solutions. One is efficient with respect to waiting times, the other is efficient with respect to costs.

In your latter example, it could very well be the case that steps Y and Z had purposes you didn't take into account that makes the new process less efficient in some cases with respect to the target metric.

Either way, overoptimization and focus on specific metrics to the exclusion of others is a real problem. Circumstances change over time and high levels of optimization make processes more brittle and likely to fail when circumstances change.

austinl · 5 years ago
One of my favorite quotes on this topic comes from Aurelius' description of his adopted father in Meditations.

> [Y]ou would never say of him that he "broke out a sweat": but everything was allotted its own time and thought, as by a man of leisure - his way was unhurried, organized, vigorous, consistent in all.

I feel like I spend a lot of time rushing from one thing to the next, constantly questioning whether I'm spending time wisely. And then I end up accomplishing less because I lack focus in one area. I've instead been trying to relax, slow down, and take tasks one at a time until completion. I'd also recommend Cal Newport's book, Deep Work, on this.

vishnugupta · 5 years ago
> but everything was allotted its own time and thought, as by a man of leisure

As an extreme example: watching Schumacher at his peak perform during a qualifying lap or a race, in treacherous rainy conditions, while everyone else was absolutely struggling and he was out front, half a lap ahead of everyone, was like watching poetry in motion. You could tell he was very relaxed just by the way his hands operated the steering wheel, hitting the apex every time in a single motion, no twitching or tossing the car around.

It seemed he just had more time, as in the time had just slowed down for him compared to everyone else.

Edit: Typo; damn you Mac OS auto-correct!

thrav · 5 years ago
Time dilation is a real thing. You’re likely describing his experience accurately.

We see it in obviously exaggerated forms in film, like the Matrix, but that’s based on real shit. The best baseball hitters describe seeing the pitch the same way.

austinl · 5 years ago
"as by a man of leisure" is my favorite phrase in the passage. I think it's the key to the whole thing. It implies what kind of mindset you should try to adopt: relax, enjoy what you're doing, and appreciate the moment (if you're able to).

Aurelius' adopted father was a consul three times, which is certainly not a stress-free job! But he apparently was able to keep cool by the way he approached his work.

https://en.wikipedia.org/wiki/Roman_consul

agumonkey · 5 years ago
I'm still stumped over the fact that humans are behind Antique Roma existentialism on average.
Wohlf · 5 years ago
Slow is smooth, smooth is fast.
kazinator · 5 years ago
Same-day delivery isn't efficiency; it's made possible by inefficiency, like using more delivery people and vehicles who cover inefficient routes, and using more warehouses closer to delivery areas.

Actually we can't discuss efficiency without making it clear what parameter we are optimizing, at the cost of what (if any) other parameters.

However, usually, if we reduce the time something takes not by cleverly eliminating, rearranging or otherwise streamlining the steps, but rather by some brute force method that requires more resources (more people, more equipment, more energy), it is hard to frame that as efficiency.

iak8god · 5 years ago
Right. Same-day delivery is fast, not usually efficient.

Nor is multitasking on three different devices at the same time efficient. Again, in this example the author seems to be confusing rush with efficiency. I didn't make it past those first few incoherent lines, so I don't know whether this confusion persists into the rest of the article.

kovac · 5 years ago
As the customer, it would be more efficient for me to receive what I need today than after one month. I think same day delivery is more about that than optimising a company's operations. They are selling efficiency to their customers.
semaj111 · 5 years ago
I agree. For me, the article absolutely did not deliver on what was said in the title. They declare efficiency as itself dangerous, where it is the narrow application of efficiency that really results in problems.
alexchamberlain · 5 years ago
I feel like a lot of conversations and articles recently speak to an inadequate understanding of risk, and planning for risk.

Risk is made up of at least 2 or 3 components: what is the probability something will happen? And, if it does happen, what is the impact and how will you mitigate that impact?

For example, you may believe that a change to a website you are deploying has a low probability of taking the website offline. If it is taken offline, it may cost £X per hour in lost revenue, but you’ll leave the old version running on a standby server, so it only takes a few minutes to switch back. That’s a much more thorough understanding to 1 aspect of risk than “this rollout is low risk”. Once you have that understanding, it’s reasonable to discuss how to reduce the probability of an outage (better testing?), as well as how to reduce the impact (staged rollout?) or to speed up the fix if it were to happen (practise?).
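That framing can be made concrete with hypothetical numbers (mine, purely for illustration): expected cost is roughly probability times impact per hour times hours to recover, which lets you compare mitigations directly.

```python
def expected_cost(p_outage: float, cost_per_hour: float,
                  hours_to_recover: float) -> float:
    return p_outage * cost_per_hour * hours_to_recover

# Hypothetical deployment: 5% chance of an outage, 1000/hour lost revenue.
baseline     = expected_cost(0.05, 1000, 4.0)  # 4h manual rollback
with_standby = expected_cost(0.05, 1000, 0.1)  # standby server, ~6 min
with_testing = expected_cost(0.01, 1000, 4.0)  # better testing lowers p

print(baseline, with_standby, with_testing)
```

The point isn't the specific numbers; it's that "reduce the probability" and "reduce the impact" are separate levers, and the numbers tell you which lever is cheaper to pull.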

In COVID terms, we should be discussing the impact of decisions in the light of future pandemics. Could we invest now in reusable PPE, so that next time we don’t have a global rush on the disposable stuff? Do we need to educate the public more readily about reducing disease transmission to reduce the likelihood of a pandemic in the first place? I’m not a doctor, so I have no idea on the specifics, but the likelihood of any given pandemic will always be low, so what is the impact of the decision if there is one and does that impact need to be mitigated? (even if it is less efficient to do so...)

hahajk · 5 years ago
You say risk has 2 or 3 components. The two you list, likelihood and severity, are the two I usually hear. What is the third?
eointierney · 5 years ago
Oh dear, I want to rant.

Efficiency is a dimensionless ratio of energy in to energy out. Economics, as formulated, has little to say about efficiency and lots to say about preference relations on utility functions (the texts do tend to hand-wavingly waffle about how markets attain "efficiency" through hypothetically rational actors maximizing utility functions etc., if you want a giggle check out the "fundamental theorems of welfare").

I actually read the fine article, and didn't give up after the first couple of paragraphs.

And in paragraph the third is introduced friction. I dunno, maybe it's because I actually study science, but friction is a well defined thing and the coefficient of same is another _dimensionless_ variable. It seems every time economists want to incorporate a notion from science proper they go for the dimensionless stuff because that way they don't have to go through the whole tiresome rigmarole of ... dimensional analysis. It makes me feel like Tantacrul criticising UIs (check him out on youtube, he's both funny and informative).

Anyway, efficiency is not dangerous; efficiency will actually allow the survivors of the Anthropocene Disaster to make it through the coming disaster. Slowing down is not a bad notion because it means most humans can spend more time thinking (quite efficient actually) and less time haring around the planet distracting themselves from the vapidity of their vanity. However life is not better because it is slower, it's better because humans appreciate what an extraordinary (literally and figuratively) opportunity it is to be alive.

As insurance policy against disaster stop listening to economists because they observably don't have a clue (they can't make accurate and meaningful predictions), instead study science, especially physics, because these meanings are measured against reality and the resulting predictions are highly reliable (TANSTAAFL < 2TD).

/rant

eafkuor · 5 years ago
> We worship efficiency. Use less to get more. Same-day delivery. Multitask; text on one device while emailing on a second, and perhaps conversing on a third. Efficiency is seen as good. Inefficiency as wasteful.

Is this a US thing? I've lived in three different European countries and nobody thinks this way. Efficiency and productivity are things I mostly just read about on HN.

Tade0 · 5 years ago
It spills over to countries (like mine) enamoured of the American way of life, but in a twisted manner.

I've worked for companies that saw overburdening people with responsibilities as "efficient". On paper it was.

rusty-rust · 5 years ago
I agree with your observation. From what I've seen, Dutch startups live by the motto "steady wins the race", and to a larger extent I think it speaks to the risk aversion ingrained in their culture - both in their startup and VC mentalities.

They would rather bootstrap and grow with stability than pull a SoftBank and use/give massive capital injections in the hopes of achieving efficient market domination.

not2b · 5 years ago
The article repeats a commonly believed myth about money being invented as a replacement for barter. Adam Smith thought so, as do many economists, but they forgot to ask anthropologists. See https://www.theatlantic.com/business/archive/2016/02/barter-...
roughly · 5 years ago
Strongly recommend "Debt: The First 5000 Years" by David Graeber - https://www.indiebound.org/book/9781612194196 - solid treatment of economic history from an anthropological viewpoint.
Nasrudith · 5 years ago
I thought there were some hints of barter of sorts in palace economies from shipwrecks - no currency, but incoming exotic commodities they couldn't produce and outgoing ships of commodities well beyond any personal use imply some trade relationships.

The fact Palace Economies all died out doesn't mean a barter economy never existed, just that it wasn't stable enough to leave any isolated "time capsule cultures". Granted palace economy commerce isn't on a personal level unless you count the ruler who allocates everything.