> What about event planners, nurses, military officers?
As a Dutch ex-Navy officer, I can say we just called this "friction", since everyone had read von Clausewitz during officer training and was familiar with the nuances of the term. Militaries overwhelmingly address this problem by increasing redundancy, so that there are as few single points of failure as possible. It is very rare to encounter a role that can only be filled by a single person; a well-designed military organization will always have a plan for replacing any single individual should they unexpectedly die.
Also, w.r.t. solutions in the military setting: a strong NCO corps, competent and empowered to make on-the-spot decisions.
But it is not a case of "oh, we have solved friction". It trades the "combat" friction of having to wait for orders (possibly compounded by the weather, comms jamming, your courier stepping on a mine, etc.) for the "strategy" friction of subordinates taking initiatives they shouldn't have taken. But I'd argue (as modern armies do) that the tradeoff is worth it, since the strategic level has more resources to overcome its friction than combat troops engaged in a fight. It wasn't always seen that way, though (cue the famous tale of the Roman consul Manlius [0], who executed his own son for disobeying orders, even though he was victorious).
> But I'd argue (like modern armies do) the tradeoff is worth it, and the strategic level has more resources to overcome their friction than combat troops engaged in a fight.
I think the tradeoff is practically mandatory for modern armies. The high mobility they require just to avoid artillery strikes and engagements with armor makes top-down command impossible to implement in a symmetric conflict.
> It's noticeable how few computer wargames simulate any of this, instead allowing for frictionless high speed micromanagement
In military Command and Staff Training (e.g. for training large HQs), the solution to this is that the trainees don't use the simulations themselves. Instead they issue commands using emulated C2 systems to role players ('lower controllers') pretending to be subordinate forces, who then execute the orders using the sim and report back what has happened, as far as they can tell. This generates quite a lot of useful friction. Another group of role players ('higher controllers') represents the HQ superior to the trainees' HQ and in turn issues them orders. The role players and opposing force are also following direction from exercise control (EXCON) and can easily be used to dial up the pressure on the trainees. There is a small industry (e.g. [0]) supporting the exercise management systems that keep track of the various 'injects' that are fed to the trainees via the role players, or by simulated ISR assets, etc.
> It's noticeable how few computer wargames simulate any of this, instead allowing for frictionless high speed micromanagement.
Friction is simulated in many computer games; the problem is that taking it too far would make them unenjoyable or too niche. Remember that they are games first and simulations second (with exceptions; precisely the ones that are too niche).
Friction in computer games is simulated in multiple ways:
- The most obvious one: randomized results. Your units do not deal a fixed amount of damage, nor do they always succeed; instead the PRNG plays a role (e.g. most combat rolls in most wargames, but also whether a missile launched within parameters tracks its target or fails to in DCS).
- Fog of war: many wargames do not show areas you haven't explored or where you do not have scout units.
- Morale: many wargames simulate morale; units may break if sufficiently scared (e.g. the Total War games do this), and some may even rush to charge without your command, jeopardizing your plans (e.g. Total War, Warhammer: Dark Omen). In the Close Combat series, your soldiers may become demoralized or even refuse to follow orders if you order them to walk through enemy fire or they take too many casualties.
- Some have external unpredictable hazards jeopardizing your unit (e.g. sandworms in Dune II).
And many others. So wargames do attempt to model friction; the problem is that if you make this too extreme, the game stops being fun for the majority of players. The illusion of control is an important aspect of gameplay.
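The randomized-results and morale mechanics described above can be sketched in a few lines. This is an illustrative toy, not the rules of any particular game; `attack_power` and `morale` are made-up parameters:

```python
import random

# Toy friction model: the same order produces different outcomes,
# because damage is randomized and a morale check may make the unit
# stop following orders entirely.

def resolve_attack(attack_power, morale, rng):
    """Return (damage_dealt, still_obeying_orders)."""
    # Randomized result: damage varies around the nominal attack power.
    damage = max(0, round(rng.gauss(attack_power, attack_power * 0.3)))
    # Morale check: a shaken unit may break and ignore further orders.
    obeys = rng.random() < morale
    return damage, obeys

rng = random.Random(42)
for turn in range(3):
    dmg, obeys = resolve_attack(attack_power=10, morale=0.8, rng=rng)
    print(f"turn {turn}: dealt {dmg} damage, follows orders: {obeys}")
```

Even this tiny sketch shows why designers tune the variance carefully: widen the damage spread or lower the morale threshold and the player's orders stop mattering, which is realistic but rarely fun.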
That first quote is normally attributed to Charles de Gaulle. [0] I wonder if it would have been in character for Napoleon to reflect on the indispensability of anyone but himself.
There are tabletop wargames for the consumer/hobby market that do try to include various kinds of friction in the gameplay. Both the classic Memoir 44 [1] and the Undaunted series [2] have you issue orders from a hand of cards drawn from a deck.
Memoir 44 divides the board into three segments (a center and two flanks), and the cards you use to issue orders always apply to a specific segment (e.g. right flank). Lacking the cards in your hand to issue the orders you might want simulates those orders not making it to the front lines.
Undaunted explicitly has Fog of War cards which you can't do anything with. They gum up your deck and simulate that same friction of imperfect comms.
Atlantic Chase [3], a more complex game, uses a system of trajectories to obscure the exact position of ships and force you to reason about where they might be on any given turn. The Hunt [4] is a more accessible take on the same scenario (the Hunt for the Graf Spee) that uses hidden movement for its friction.
I don't know how many of these ideas leap across to computer games, but designing friction into the experience has been a part of tabletop wargames for a long time.
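The card-driven friction described above translates to code quite directly. This is a rough sketch with rules drastically simplified and the deck composition invented for illustration; it is not the actual Memoir 44 or Undaunted rule set:

```python
import random

# Sketch of card-driven command friction: orders only reach a board
# segment if you happen to hold a matching card, and dead "fog of war"
# cards dilute the hand.

SEGMENTS = ["left flank", "center", "right flank"]

def build_deck(orders_per_segment=10, fog_cards=8):
    deck = [seg for seg in SEGMENTS for _ in range(orders_per_segment)]
    deck += ["fog of war"] * fog_cards  # unusable cards gumming up the deck
    return deck

def playable_segments(hand):
    """Segments you can actually issue orders to this turn."""
    return sorted({card for card in hand if card in SEGMENTS})

rng = random.Random(1)
deck = build_deck()
rng.shuffle(deck)
hand = deck[:5]
print("hand:", hand)
print("can order:", playable_segments(hand))
```

The interesting design lever is the ratio of fog cards to order cards: the deck composition, not a dice roll, controls how often your plan is frustrated, which makes the friction feel fair rather than arbitrary.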
> It's noticeable how few computer wargames simulate any of this, instead allowing for frictionless high speed micromanagement.
Games are entertainment, and as with a novel or film, the authors pick and choose to get the level of verisimilitude they think the player/reader/viewer might want. Who wants to take an in-game break to go to the bathroom? When you pick something up (extra ammo) it's instantaneous -- and how come there are so many prefilled magazines lying around anyway? And when you get shot your shoulder doesn't hurt and you don't spend any time in the hospital.
Wargames tend to try to be fun, as opposed to being a realistic simulation of war. Imagine you are playing Napoleon at Ligny: how much fun is it that your reserves receive conflicting orders all day from a field marshal fighting on a different nearby battlefield, and that there are similarly named towns on the map, leading to your troops arriving late and in a useless location?
You shouldn't even be able to watch the action in detail, Total War style, as you might have only a hill, some messengers, and low-power binoculars. Games have attempted to model this, but it's a curiosity, not something that drives sales.
I think quite a few wargames, both computer-based and pre-computer, simulate friction at some level.
The original Prussian Kriegsspiel involved opposing players sitting in different rooms, with information conveyed to them by an umpire (it must have been a lot of work for the umpire).
The wargames used at Western Approaches to train WWII convoy commanders made players look through slots to restrict what they could see.
Computer wargames like 'Pike and Shot' often won't show you units unless they are visible to your troops. Your control over units is also limited (cavalry will often pursue defeated enemies of their own accord).
In the novel Ender's Game, the Command School training takes an interesting approach.
Ender is able to see the full battlefield (modulo fog of war) because of ubiquitous faster-than-light sensor technology. But he doesn't control any ships directly. Instead, he issues orders to his subordinates who "directly" control squads of ships.
I've always wondered if anyone's ever made something like this. A co-op war simulation game with instant visibility but divided, frictioned actions. Nothing about it would be technically difficult. It would probably be socially difficult to find enough players.
There was a sci-fi story decades ago (probably in Analog) on this theme. A very realistic war game was set up, which two real-world opposing nations decided to use in lieu of losing real men. The friction caught them off guard. The one incident I recall was when one side deployed a biowarfare agent, but the wind changed and they ended up infecting their own troops. There were other incidents of friction.
Your best bet is probably actually shooters. Several games integrate RTS elements on top of an FPS, like Planetside 2, Natural Selection 2, Hell Let Loose, Squad, etc. In all of these the individual soldiers are individual players, so you can hardly micromanage them even if you wanted to.
I am surprised by the discussion so far, which at the moment appears to be people poking holes in the shortcomings of friction as a model, plus a few talking about the unreasonable effectiveness of some processes.
My surprise is that neither discussion really leans into the metaphor. Friction, as a metaphor, is really strong because the way you deal with it changes vastly the more mature a technology is. Consider how much more lubrication an early combustion engine needs compared to a modern electric motor.
More, since you cannot always hope to eliminate the external cause of friction, you can design around it in different ways: either by controlling which parts of a system are replaced more frequently, or by eliminating the need for them entirely, if and when you can. You can trace the different parts of a modern car to see how this has progressed.
Indeed, the modern car is a wonderful example, as some technologies came and went entirely to deal with operator involvement. Manual transmissions were largely replaced by automatics for a variety of reasons, not least wear and tear on the clutch. And the transmission seems to be going away entirely, since electric motors are so different from combustion engines.
Just FYI, in Europe we mostly use manual transmissions, and the clutch is robust enough that something else usually breaks well before it does.
Also, a lot of automatic transmission designs use a clutch behind the scenes, at least in older models. But I am nitpicking here at the analogy being applied to the clutch system.
I fully agree otherwise that friction is the best term to describe what is happening across the system and within social interactions.
Right, this is part of my assertion about the metaphor. There is not necessarily a single solution that is obviously superior to the others. You can get lucky and find one, of course, but often it is largely driven by what tradeoffs can be made.
An electric motor is very different from an internal combustion engine. I'm not sure where you were going with the analogy, but an electric motor does eliminate much of the lubrication an internal combustion engine needs (no valves, crankshaft, pistons, transmission, etc.). It's somewhat analogous to a bad software architecture needing constant care versus a better one that just works and eliminates a lot of that extra care.
I've learned about this term from the economics side rather than the military side. It's all the hidden factors that make things more expensive. Transaction costs. I do think this is a good analogy for "drag" in software development, something along the lines of "technical debt".
Right, that was my point. You can find ways of eliminating some sources of friction in a system, as was done with electric motors. Before you get to the electric motor, though, you will almost certainly have to deal with friction some other way. Path dependence is hard to ignore in the history of how cars have dealt with friction.
My assertion doesn't lean on "bad architecture," as I feel there are just different choices and tradeoffs. I do think you should often look for improvements in the tech you are working with instead of replacements. Replacing tech can, of course, be the correct choice, but it is often a death knell for the people being replaced. We solve that at the societal level in ways that are different from how you solve it at the level of a local technical choice.
The article seems insightful on the surface but falls apart very quickly when you take the time to analyze what the author is actually saying in each sentence. Pretty much every statement is logically false, a bad argument, or at the least requires a lot of supporting material to be convincing.
Take the following sentences for example.
> If people have enough autonomy to make locally smart decisions, then they can recover from friction more easily.
Having autonomy has no necessary relationship to recovering from friction more easily. And why would autonomy cause one to make locally smart decisions? The person with the autonomy might be the one causing the friction in the first place, and might also be the one making bad decisions.
> Friction compounds with itself: two setbacks are more than twice as bad as one setback. This is because most systems are at least somewhat resilient and can adjust itself around some problem, but that makes the next issue harder to deal with.
Why would being resilient to one type of problem reduce resilience to another type of problem? And why would this cause friction to compound?
Incidentally, ChatGPT produces an equally (if not more) plausible article when I ask it to write one on software friction.
Going up the chain for orders is itself a source of friction: communicating the situation on the ground, dealing with transmission issues like staticky radios, waiting for command to have time for your issue (they may be dealing with other units having similar issues, especially in a command structure that doesn't delegate), etc. It's uncommon for higher levels of leadership to have a better understanding of what a lower-level unit is dealing with than the unit itself.
In my experience, the tooling that causes the most friction when it doesn't work is also the most likely to be abandoned, community-supported, or only supported by a team in India (requiring an overnight wait for each round-trip communication). Directors and VPs talk a big game about prioritizing developer productivity, but when it comes time to staff a support channel, prioritize a bug fix, or choose whom to lay off, it always turns out that they were lying.
Thriving as a SWE in a medium-to-big company is not about algorithms and data structures, it is about coping with and recovering from environment breakages, and having the skills to diagnose and fix the environments that you were forced to adopt last quarter and by this quarter are deprecated.
Historically, 90% of my Indian coworkers have had a much higher pain threshold than 90% of my domestic teammates. I can think of two individuals with a low tolerance for bullshit and I always wonder how they fit in socially over there.
I have to dig a lot, or try to bring a problem into N.A. office hours, before I see how much rote work is required to do a task, and it's almost always shockingly high. We write software for a living; nobody should be running a static run book. We should be converting it to code.
Keep in mind how immense income disparities are there. For someone living in India, getting a job that exists anywhere at all on the US payscale pretty much guarantees living comfortably and being able to save tons of money on top of that.
The problem applies to any pair of sites with a 12-hour timezone offset, regardless of culture. PDT<->IST happens to be the one that practically occurs for Bay Area tech companies.
This is nothing but the second law of thermodynamics.
Viewing friction as the principle of increasing entropy helps.
You can think of a graph whose nodes are the states of various systems (humans, software services, databases, etc.) and whose edges are dependencies between them. Reducing the states directly reduces the entropy; reducing the dependencies reduces the rate at which entropy increases in the system.
This directly leads to various software principles: separation of concerns, parse-don't-validate, denormalisation, state reduction, narrow interfaces with deep implementations, KISS, readability, etc. All of these reduce friction.
As such I find the "Addressing friction" section in the article lacking, but it does highlight some useful points.
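The graph picture above can be made concrete with a toy metric. This is my own illustrative formalization, not a standard measure, and the component names and state counts are invented:

```python
import math

# Toy version of the states-and-dependencies graph: each component has
# some number of possible states; assuming independence, the joint state
# space is the product of those counts, so its log2 (bits) adds up.
# Every dependency edge is a channel through which one component's state
# changes can disturb another.

components = {          # component -> number of distinct states it can be in
    "human-oncall": 4,
    "service": 8,
    "database": 16,
}
dependencies = [        # (dependent, depended-on)
    ("human-oncall", "service"),
    ("service", "database"),
]

def entropy_bits(components):
    # log2 of the size of the joint state space
    return sum(math.log2(n) for n in components.values())

print(f"{entropy_bits(components):.1f} bits of state")  # fewer states -> fewer bits
print(f"{len(dependencies)} coupling edges")            # fewer edges -> slower spread
```

Under this reading, "state reduction" shrinks the first number and "separation of concerns" or narrower interfaces shrink the second, which is one way to see why those principles all pull in the same direction.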
Once you're familiar with friction, you start seeing it everywhere. And then you hate how much of it there is. I'm sure there's a philosophical lesson in there somewhere.
As far as battling it goes, my experience is that you can get a lot of mileage by just spending an extra minute or three making something a little cleaner, more readable, less prone to failure, etc.
The follow-up comment about the Marines' "hot washes" retrospective meetings is interesting. I would love to browse through the Marine Corps Center for Lessons Learned library that's referenced.
The example about software updates resonates with me. My policy for the team is usually: if you can upgrade your dependencies, upgrade now. I have seen so many companies take the short-term view again and again, only to realize: oh, now updating anything is too big a step, so let's... wait? Friction is much easier to absorb amortized over a longer time, so you have to bake it into the everyday processes. An update? We are not in a bind, just upgrade! It is related to tech debt, basically: avoid accumulating it, because it compounds very badly.
We had a hell of a time getting a fix for a CERT advisory deployed because we were several versions behind and there were breaking changes in the way. The idea of rushing a change to make a system more robust is absurd, because all of that rushing is your surface area for regressions.
Our solution was that at least once a month we had a story to upgrade deps. But as each new person got the assignment they would immediately ask the question, “but upgrade what?” I didn’t have enough information at that point to care, so I told them to just pick something. Our dep pool wasn’t that big and any forward progress was reducing the total backlog so I figured they would pick something they cared about, and with enough eyeballs we would collectively care about quite a bit of the project.
Now, part of the reason this ranked as a story is that we were concerned about supply-chain attacks on this project, so it wasn't just a matter of downloading a new binary and testing the code. You also had to verify the signature of the library and update a document, and that was a process only a couple of us had previously used.
[0] https://www.rijksmuseum.nl/en/collection/SK-A-613
https://www.heritage-history.com/index.php?c=read&author=haa...
"I can make a brigadier general in five minutes, but it is not easy to replace a hundred and ten horses" -- attr. Lincoln (exact words vary by source)
[0] https://www.4cstrategies.com/exonaut/
[0] https://quoteinvestigator.com/2011/11/21/graveyards-full/?am...
[1]: https://boardgamegeek.com/boardgame/10630/memoir-44
[2]: https://boardgamegeek.com/boardgame/268864/undaunted-normand...
[3]: https://boardgamegeek.com/boardgame/251747/atlantic-chase-th...
[4]: https://boardgamegeek.com/boardgame/376223/the-hunt
Does it have:
- Testable hypotheses?
- Reproducible experiments?
- Data to support its arguments?
I’ve often suspected the rote work is a job security thing.