thomasahle · 8 years ago
I once studied the expected remaining length of a game of chess, as a function of moves played: https://chess.stackexchange.com/questions/2506/what-is-the-a... What was interesting is that for the first 20 moves (40 half-moves) the expected remaining length decreases at a near-linear rate, but then it levels off at about 25 remaining moves, and after move 45 every move played _increases_ the expected remaining length.

At the time it surprised me, but of course it is natural to expect long games to be long.

sevenfive · 8 years ago
The threshold at 45 probably corresponds to endgames where the kings have to walk around the board to take care of pawns.
stouset · 8 years ago
Not only that, but endgames frequently involve long periods of positional maneuvering that can take dozens of moves before one side realizes an edge, or before it becomes clear that it's heading toward a draw.
mannykannot · 8 years ago
> At the time it surprised me, but of course it is natural to expect long games to be long.

Indeed, which is why I am skeptical of purely anecdotal claims of the Lindy effect, as they may be skewed by survivorship bias. In this case, however, you have the numbers to back up the observation.

debacle · 8 years ago
The Lindy effect and survivorship bias are one and the same. That's kind of the idea.

Take programming languages for example. If I asked you to bet on 30 different programming languages which would still be in use in 10 years, and all you knew about them was how long they had already been in use, you'd probably correlate your bets to some degree with their age.

xerophyte12932 · 8 years ago
How did you control for player level? I imagine novice games show a different pattern than grand master games
grkvlt · 8 years ago
Looks like he chose only games where the players had an Elo rating > 2000


connoredel · 8 years ago
The key insight is that you are unlikely to be experiencing the thing at a special time in its life. This is the Copernican principle (which J. Richard Gott uses in his version of this that Wikipedia mentions), which was basically "we (on Earth) are unlikely to occupy a special place in the solar system -- it's much more likely that some other object is the center."

Gott says you can be 95% confident that you're experiencing the thing in the middle 95% of its life. Let's say x is its life so far. If x is 2.5% of its eventual life (one extreme of the middle 95%), then the thing still has 39x to go. If you're at 97.5% (the other extreme), then the thing only has x/39 left. So the 95% confidence interval is between x/39 and 39x.
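That arithmetic is easy to sketch in code. This is just the interval computation with the confidence level as a parameter (a toy illustration of the calculation, not anything from Gott's papers):

```python
def gott_interval(age, confidence=0.95):
    """Given a thing's age so far, return (low, high) bounds on its
    remaining lifetime under Gott's middle-confidence assumption."""
    tail = (1 - confidence) / 2        # mass in each tail: 0.025 for 95%
    low = age * tail / (1 - tail)      # at the 97.5% point, age/39 remains
    high = age * (1 - tail) / tail     # at the 2.5% point, 39*age remains
    return low, high

# Something that has existed for 40 years: between ~1 and 1560 years to go.
low, high = gott_interval(40)
```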

Of course, 5% of the time you actually are experiencing something at the very beginning or very end of its life (outside the middle 95%), which is a unique thing. But that's why it's a confidence interval < 100% :)

I prefer this form of the principle a lot more than "the expected life is equal to 2X, always."

Side note: I took J. Richard Gott's class in college called The Universe. Maybe not the best use of a credit in hindsight, but we studied some really interesting things like this.

tgb · 8 years ago
And the real fun is when you apply this to humanity itself: https://en.m.wikipedia.org/wiki/Doomsday_argument
simonh · 8 years ago
Lots of interesting stuff in there. The problem I have with naive versions of this is that they assume that, as random people, we don't live in a special time in human history. But if you look at human history so far, the current era is both extremely short and spectacularly atypical in almost every conceivable way. It is also a period of still very rapid change. It's hard to get my head around what that means for estimating future trends or outcomes.
dmurray · 8 years ago
There's a similar effect when waiting for a bus or other public transport. At first, the expected time you'll have to wait decreases as time goes by: if there's a bus every 10 minutes, after waiting 8 minutes you expect one to arrive in 1 minute, compared to 5 when you started waiting. Stand there longer without a bus arriving, however, and the Lindy effect starts to apply. After 15 minutes without a bus, most likely the bus broke down, but you can expect another within 5 minutes. After 30 minutes, well, maybe the drivers are on strike today, or this bus route got cancelled, or you misremembered the frequency of the bus - either way, expect to keep waiting.

Anyone know of a term for this kind of behaviour? I've never seen it named, though I do recall an article that made the HN front page that demonstrated this effect with the New York subway.
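The turnaround described above can be reproduced with a toy simulation. The service/breakdown/strike mix below is entirely made up, just to give the gap distribution a heavy tail:

```python
import random

random.seed(1)

def gap():
    """One inter-bus gap, in minutes (hypothetical model)."""
    r = random.random()
    if r < 0.90:
        return random.uniform(8, 12)     # normal service, ~10 min cycle
    elif r < 0.99:
        return random.uniform(15, 25)    # breakdown, replacement bus sent
    else:
        return random.uniform(60, 240)   # strike / cancelled route

gaps = [gap() for _ in range(200_000)]

def expected_remaining(t):
    """Mean remaining wait, given you have already waited t minutes."""
    left = [g - t for g in gaps if g > t]
    return sum(left) / len(left)

for t in (0, 8, 15, 30):
    print(t, round(expected_remaining(t), 1))
# The remaining wait first shrinks as t grows, then climbs once the
# elapsed wait has ruled out all the short gaps.
```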

todd8 · 8 years ago
If buses arrive uniformly every 10 minutes and you arrive at a random time, your wait on average will be 5 minutes.

However, if the buses arrive independently and by a purely random process averaging one every 10 minutes (i.e. a Poisson process), your wait will average 10 minutes.

I always remember this result from probability when waiting for something. It helps me feel less unlucky: of course, I tell myself, I'm more likely to land in the interval between widely spaced buses, because those intervals take up more space on the timeline.
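This inspection-paradox arithmetic can be checked with a quick simulation. The 10-minute figures are from the comment; the rest is a sketch:

```python
import random

random.seed(42)
N = 500_000

def mean_wait(gaps):
    # A rider arriving at a uniformly random instant lands in a gap with
    # probability proportional to its length, then waits, on average, half
    # of that gap's length: E[wait] = E[g^2] / (2 * E[g]).
    return sum(g * g for g in gaps) / (2 * sum(gaps))

uniform_gaps = [10.0] * N                                      # a bus exactly every 10 min
poisson_gaps = [random.expovariate(1 / 10) for _ in range(N)]  # Poisson arrivals, mean gap 10 min

print(mean_wait(uniform_gaps))   # 5.0
print(mean_wait(poisson_gaps))   # close to 10.0
```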

gwern · 8 years ago
dmurray · 8 years ago
Yes, this is it exactly! Even the framing of it as waiting for a bus is given as one of the examples, "Problem 2, The Standard Wait Problem" but the other examples also have similar behaviour.
gerdesj · 8 years ago
You seem to be assuming that, given you have already waited eight minutes on a 10-minute schedule, the expected time until the bus is half of the remaining period. That seems like a fair heuristic, but I'd personally read the timetable. For a more exact answer you also need to know the distribution of actual bus arrival times: if they turn up at your stop every 10 minutes on average but with a standard deviation of, say, two minutes, then the heuristic is not the best.

Far more interesting is that two of the buggers will always turn up when you don't actually need one. I can prove that by assertion now and fairly confidently be able to appeal to around 60M Britons for testimony as required.

This is not related to "virality" which is my newly made up term for the Lindy Effect.

Feuilles_Mortes · 8 years ago
You might be interested in queuing theory: https://en.wikipedia.org/wiki/Queueing_theory
sdrothrock · 8 years ago
One type of cause for this behavior is the sunk cost fallacy, or more broadly, escalation of commitment: https://en.wikipedia.org/wiki/Escalation_of_commitment
throwawaybbq1 · 8 years ago
Well .. if you take hope and mistakes out of the equation, I think this is the memoryless property?
QAPereo · 8 years ago
Heuristics? It seems like a rough heuristic way to find a likely path.
franciscop · 8 years ago
I was thinking about what this had to do with HN and then it hit me: JavaScript is going to live forever, and C will outlast it.

We can also see why it's really difficult to compete with early [surviving] frameworks, since they will last for a really long time.

PKop · 8 years ago
Bitcoin.

The longer it continues increasing adoption and users (like any network effect) the more useful it will become to more people. Also, the higher the market cap, the more people will be invested in its success. The longer it continues to work as designed, the more people will trust it.

simonh · 8 years ago
Apple, Microsoft and Intel have dominated the personal computer industry for almost all of its existence, and their relative positions within that market have been remarkably stable. Therefore as every year passes with that still true, the expected future lifetime of that triopoly increases.
zhte415 · 8 years ago
Blockchain, perhaps. Bitcoin itself seems to not have gathered sufficient momentum or acceptance to gain any use outside of the use it currently has. Bitcoin is not dying, but is not thriving.
greenyoda · 8 years ago
Interesting that you mention C, but not Unix. Unix, an operating system invented in the 1970s, is still one of the most used server operating systems today (in its Linux, BSD, Solaris, etc. variations).

In the corporate IT world, there are IBM operating systems, such as z/OS, that are direct descendants of 1960s operating systems.

Also, instruction sets for IBM mainframes and Intel x86 machines have been around for a very long time, and are not likely to go away any time soon.

Not to mention standards like 120v/60Hz AC and the shapes of electrical connectors, which have been around even longer and will thus probably survive for a very, very long time.

And going back to software, editors like Emacs and vi have been around so long (both come from the 1970s, and were invented for CRT terminals) that they're likely to keep on being used for a long time to come.

pletnes · 8 years ago
Fortran is forever.
ido · 8 years ago
The canonical examples are COBOL and Java ("Java is the new COBOL"). Microsoft has managed to cram C# into the race by sheer effort and expense.

JS got there by happenstance of history.

KingMob · 8 years ago
Lisp. Unless you don't count Clojure, Lisp could be expected to persist another 50 years.

srean · 8 years ago
Given Wikipedia's standards I am a little surprised that the article is light on the math. One tool to measure this with is the hazard rate.

Say at age x my probability density of dying at that instant is f(x). Now condition on the obvious fact that I must have lived at least x before I die (counting from 0). The distribution function F(x) is the probability that I die before x, so the conditional probability density of dying at this instant is

    h(x) = f(x) / (1 - F(x))
If this quantity is constant (in other words, independent of x) then I am Peter Pan: I don't age, and I will die by some random accident that has no preference over time.

If h(x) is an increasing function of x then I am more human: I age.

If it's a decreasing function of x then I am probably the Joker ("... makes me stronger"). Whenever h(x) is a decreasing function of x, one encounters the Lindy effect.

The Pareto distribution is called out in the article, but anything with a fatter tail than the exponential distribution will suffice. The lifetimes of database queries, search-engine requests, and the like often fall into this category. In such cases it is on us engineers to try to make those latencies have an increasing hazard function.
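As a quick sketch of the definition above (the distributions and parameters here are chosen arbitrarily): the exponential has a constant hazard, while the Pareto's hazard falls off as alpha/x, which is the Lindy regime:

```python
import math

def hazard(f, F, x):
    # h(x) = f(x) / (1 - F(x))
    return f(x) / (1.0 - F(x))

# Exponential(lam): memoryless, h(x) = lam for every x ("Peter Pan").
lam = 0.5
exp_f = lambda x: lam * math.exp(-lam * x)
exp_F = lambda x: 1.0 - math.exp(-lam * x)

# Pareto(xm, alpha): h(x) = alpha / x for x >= xm, decreasing (Lindy).
xm, alpha = 1.0, 2.0
par_f = lambda x: alpha * xm**alpha / x**(alpha + 1)
par_F = lambda x: 1.0 - (xm / x)**alpha

for x in (1.0, 2.0, 4.0):
    print(x, hazard(exp_f, exp_F, x), hazard(par_f, par_F, x))
```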

silverdrake11 · 8 years ago
Don't forget you can edit Wikipedia!
firebones · 8 years ago
I teach this as part of an internal developer class; one important thing to note is that it applies to certain classes of non-perishable items - not the books themselves, but the ideas the books contain. We talk about how things like the presentation framework du jour (e.g., many JavaScript frameworks) change rapidly, while tech deeper in the stack turns over less frequently (middleware, operating systems, etc.), and we ask why some tech survives.

The lesson here is that things that last have developed certain adaptations to make them last. It's always worth studying why some oft-repudiated or outdated tech won't die; it is almost always because it possesses some key attribute that is essential. If you're proposing a new framework, or promoting a new idea, it is essential that you understand why these crufty old incumbents are still around, and check whether your new framework or idea embodies those old adaptations.

I've learned a lot about how flashy surface features (which compete well against new tech at the surface level) can be inferior to tech that embodies what the incumbents did well.

sna1l · 8 years ago
40acres · 8 years ago
What's that term for when you learn a new concept or word and then immediately see it referenced soon after? While rereading Zero to One by Peter Thiel I came across the Lindy effect, and now I've come across the wiki page on HN.
alister · 8 years ago
> you learn of a new concept or word and then immediately see it referenced soon after

It happens to me all the time. For anyone who hasn't experienced this, I offer this experiment. Pick one or more words from the list below that you don't already know. Repeat it to yourself a few times, read the definition, and invent a sentence using the word. Remember, choose a word or words you don't know.

frisson -- shudder of emotion with goose bumps when deeply affected by music

comity -- courtesy between nations for the laws of another

blithe -- happy and not worried; not realizing how bad a situation is

voxel -- 3D analogue to 2D pixel; a portmanteau of "volumetric" and "pixel"

pratfall -- comical fall landing on the buttocks

deus ex machina -- a god introduced into Greek/Roman play to resolve plot; for example, in Raiders of the Lost Ark, the hero's problem is solved for him rather than having him solve the problem himself; the film would end the same way even if Indiana Jones didn't exist: Nazis open the ark and kill themselves; pronounced day-us-eks-mah-kah-nah

marmite -- large cooking pot having legs and a cover; also a British sandwich spread made from yeast extract

Maybe you can come back to this comment 48 hours from now and tell us if you've heard your chosen word in real life soon after learning it.

fiddlerwoaroof · 8 years ago
Synchronicity