Pentation? How quaint. For other Large Number fans, David Metzler has a wonderful playlist that goes way down the rabbit hole of the fast growing hierarchy:

https://www.youtube.com/playlist?list=PL3A50BB9C34AB36B3

Highly recommended.
Almost all of these mind-bogglingly large numbers are built around recursion, like the Ackermann function, which effectively has an argument for the number of Knuth up arrows to use. Then you can start thinking about feeding the Ackermann function into that slot and enjoy the sense of vertigo at how insane that becomes.
I find it fascinating how quickly the machinery of specifying large numbers probes the limits of what's definable within the mathematical systems we use.
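For concreteness, here's a minimal, purely illustrative Python sketch of the up-arrow operation (the names are just mine); the arrows argument is exactly the slot you can feed another monstrous number into. It only terminates for tiny inputs, of course:

    def up(a, b, arrows):
        # One arrow is ordinary exponentiation: a ^ b.
        if arrows == 1:
            return a ** b
        # With k arrows, a (k arrows) b folds b copies of a together
        # using the (k-1)-arrow operation, right to left.
        result = a
        for _ in range(b - 1):
            result = up(a, result, arrows - 1)
        return result

    print(up(2, 4, 1))  # 16 (2^4)
    print(up(2, 3, 2))  # 16 (tetration: 2^(2^2))
    print(up(2, 3, 3))  # 65536 (pentation: 2^^(2^^2) = 2^^4)

Replacing that third argument with a call to up itself is where the vertigo really kicks in.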
It's funny how, for a certain subset of math, a researcher's life can be condensed to "arguing about which number is the biggest (preschool)" -> "learning about math" -> "arguing about which number is the biggest (academia)"
You might already know this, but the busy beaver function grows faster than any computable function. So although the best-known lower bound for BB(6) can be expressed with just pentation, the BB function in general is certainly beyond any standard arithmetic operation in terms of growth rate.
The growth of BB is certainly mind-boggling; however, I personally find its growth rate so untouchable as to obviate any attempt at understanding. There's nothing to gain purchase on.
The fast growing hierarchy, on the other hand, provides oodles of structure for us to explore, and we can construct numbers that are vastly larger than anything BB(6) is likely to hit. In fact, this is why we use the fast growing hierarchy to approximate really big numbers all the time!
When we take something like f_Γ_0 and try to unpack even just the barest surface of its size, I get a feeling of vastness similar to those videos of diving endlessly into fractals.
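To make the bottom of the hierarchy concrete, here's a toy Python sketch of its finite levels (illustrative only; the ordinal-indexed levels like f_Γ_0 additionally require fundamental sequences, which is where the real depth lives):

    def f(k, n):
        # f_0 is the successor function.
        if k == 0:
            return n + 1
        # f_(k+1)(n) applies f_k to n, n times over.
        for _ in range(n):
            n = f(k - 1, n)
        return n

    print(f(1, 3))  # 6   (f_1(n) = 2n)
    print(f(2, 3))  # 24  (f_2(n) = n * 2^n)
    # f(3, 3) is already far too large to ever compute.

Each level diagonalizes over the one below, which is why even small indices explode so violently.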
There is something awesome about incredibly large finite numbers. People gush about infinity, but I find it to be a pretty boring concept compared to finite numbers too large to even be written in this universe.
Infinity is aspirational. Infinity is a concept, simple and self-evident, yet paradoxical and oxymoronic.
I get kind of intrigued by these large number things at first but ultimately it feels like kind of a less elegant version of the same thing? It's all this mucking about with multiples and powers of multiples and powers when it's like...we've already taken this to the limit, we can just stand back and marvel at that, what are we looking for? We already know you can always add another digit, why invent more and more complicated ways to do that?
This isn't meant to be contradictory to what you're saying or anything, just interesting to explore these different perspectives on what mathematical concepts capture the imagination.
Like, there is a perfectly finite number, but it is so large that there simply isn't enough information in the universe to encode it in any format. How cool is that to just think about for a while?
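To put a rough bound on that (using the commonly quoted holographic estimate of about 10^122 bits of information capacity for the observable universe -- treat the exponent as illustrative): writing down an integer N takes about log2(N) bits, so anything much beyond

    2^(10^122)

cannot be encoded in any format on any physical medium. And even that cutoff is laughably tiny next to the numbers in this thread.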
If you have a child who likes math I highly recommend "Really Big Numbers" by Richard Evan Schwartz. Tons of nice illustrations on how to "take bigger and bigger steps".
I really like the Busy Beaver stuff. I wish I had been exposed to it (at least enough to play with it some) in high school. It reminds me a bit of Jorge Luis Borges' story "The Library of Babel".
Does anybody know of other interesting problems in the Busy Beaver space?
Fictionally, maybe the Mandelbrot Maze mentioned in Arthur C. Clarke’s 3001:
> Their approach was more subtle; they persuaded their host machine to initiate a program which could not be completed before the end of the universe, or which - the Mandelbrot Maze was the deadliest example - involved a literally infinite series of steps.

https://archive.org/stream/SpaceOdyssey_819/3001_The_Final_O...
I loved that part of the book. I thought it was so cool how they thoroughly scrubbed the Internet of these dangerous programs, and then stored the very last copy on a disk in an ancient lava tube on the moon, just in case.
No. The busy beaver function grows faster than any computable function.
What's more, for any N > 549, you cannot prove (in the usual universe of mathematics) that any computable number is equal to or larger than BB(N). (At least that's my understanding of the independence results: https://wiki.bbchallenge.org/wiki/Independence_from_ZFC)
The algorithm for BB(N) is (with some hand waving) “for each N-state Turing machine, check if it halts on an initially empty tape. If it does not, skip it. If it does, run it until it halts and count the number of steps, k. BB(N) is the maximum value of k you find.”
The problem is that the naive algorithm above requires an oracle for the Halting Problem, which every CS student learns is impossible in general for a computational model that isn’t more powerful than a Turing Machine.
So if a function can be expressed in terms of a Turing machine, Busy Beaver has to grow faster, because for some value of N you can just encode the candidate function as a Turing machine and have Busy Beaver simulate it.
Busy Beaver hunters on the low end (trying to find the next number) have to exhaustively prove, for every Turing machine with that number of states, whether it eventually halts, so that they can evaluate the longest-running one with a finite execution.
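To make the hand-waving concrete, here's a toy Python version of that naive enumeration (all names mine, and nothing like how the bbchallenge folks actually work). Since the halting oracle can't exist, a step budget stands in for it below, which demotes the result from the true BB(n) to a mere lower bound:

    from itertools import product

    def all_machines(n):
        # A transition is (symbol to write, head move, next state),
        # where next state n is treated as HALT.
        transitions = list(product((0, 1), (-1, 1), range(n + 1)))
        # A machine assigns one transition to each (state, symbol) pair.
        for rules in product(transitions, repeat=2 * n):
            yield {(s, sym): rules[2 * s + sym]
                   for s in range(n) for sym in (0, 1)}

    def run(machine, n, budget):
        # Simulate on a blank two-way tape; return the step count if the
        # machine halts within the budget, else None.
        tape, pos, state, steps = {}, 0, 0, 0
        while state != n and steps < budget:
            write, move, nxt = machine[(state, tape.get(pos, 0))]
            tape[pos] = write
            pos += move
            state = nxt
            steps += 1
        return steps if state == n else None

    def bb_lower_bound(n, budget=100):
        # The real algorithm needs "run only the halters, as long as it
        # takes"; a finite budget can only under-shoot BB(n).
        best = 0
        for machine in all_machines(n):
            steps = run(machine, n, budget)
            if steps is not None:
                best = max(best, steps)
        return best

    print(bb_lower_bound(1))  # 1, the true BB(1)
    print(bb_lower_bound(2))  # 6, the true BB(2)

Even at n = 5 this brute force is hopeless without the machine-classification tricks discussed elsewhere in the thread.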
Busy Beaver is non-computable and grows much faster than the various subcubic graph numbers (you could easily encode a subcubic graph number computation in a relatively small Turing machine).
TREE(3) is so crazy to me. You would expect it to either be a reasonable number, or to be infinite (i.e. there's an infinite sequence of such trees).
But no. It's just an _impossibly unfathomably large number_. Eventually there are no more valid trees with fewer than i vertices, but only at such a stupidly, incredibly large i that it makes other "googology" numbers vanish by comparison.
> it makes other "googology" numbers vanish by comparison.
I consider TREE to be medium-sized in the googology universe. It obliterates Graham's Number of course, as well as the Goodstein sequence and simple hydras. But it's easily bested by an ingenious matrix generalization of hydras known as the Bashicu Matrix System, even when fixed at 2 rows. 3-row BMS goes far beyond that, let alone n-row. BMS is so simple that it can be encoded in under 50 bytes [1], where TREE would take significantly more.

Beyond that there are vastly greater numbers still, such as Loader's Number. I shouldn't mention Rayo's because that's no longer computable in the usual sense.

[1] https://codegolf.stackexchange.com/questions/18028/largest-n...
There is a downvoted comment that reads "ah yes the totally new math of exponentiation". The snark is uncalled for, but that's actually the essence of this article: it talks about repeated exponentiation as if it were some profound mathematical discovery.
It isn't. The article neglects to explain what makes busy beaver numbers interesting in the first place. And I think it's symptomatic of the Quanta Magazine articles that feature on HN several times a week: a profound-sounding title and pleasant writing, but not much essence beyond that.
Indeed, I would have at least liked to get a rough understanding of the tricks they use to classify and discard Turing machines, and how they construct these enormous runtime calculations. They are clearly computing something but they are obviously not actually running the machines normally for those numbers of steps.
You gotta pay for that. Or dig through the academic publisher archives, which you also gotta pay for unless you believe in digital piracy and evil copyright infringement which may or may not fund terrorism.
like they used to say: information wants to be expensive so pay to be free
I usually get drawn into their posts also and didn’t realize they were low value; my IQ must not be high enough to differentiate. What would you recommend as alternative sources?
I don't think there's anything that would put some new, esoteric math concept in your mailbox every week, although there's plenty of books that cover recreational mathematics in an accessible way (Martin Gardner, Ian Stewart, etc). And for Quanta articles, I recommend searching the web - you can often find better explanations on some 1999-style blog somewhere.
The problem with this particular article is simple: busy beaver numbers aren't interesting because they're big. They don't break mathematics because of that; you can always say "+1" to get a larger number. There's also nothing particularly notable about Knuth's up-arrow notation, which is essentially a novelty that you're never gonna use. Instead, the numbers are interesting because they have fairly mind-blowing interactions with the theory of computability.
Don't let specialists detract from your enjoyment of Quanta-level articles. This one is well-written, makes no egregious errors, and only omits one important fact. And that fact is ...
All this talk about exponentiation, tetration and pentation has its roots in Peano arithmetic, which, for reasons of logical clarity, defines precisely one function -- the "successor function":
s(n) = n+1
Using just this function, Peano defines addition, multiplication and exponentiation with great clarity. Since Peano's time, people have been making adjustments, like including zero among the counting numbers, and extending exponentiation into tetration, pentation and a few more exotic operations fully discussed in the linked article and elsewhere.
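In toy Python form (purely illustrative), each rung of that ladder is just "iterate the previous rung", all the way down to the successor:

    def s(n):               # Peano's lone primitive: the successor
        return n + 1

    def add(a, b):          # a + b: apply successor b times
        return a if b == 0 else s(add(a, b - 1))

    def mul(a, b):          # a * b: add a, b times
        return 0 if b == 0 else add(mul(a, b - 1), a)

    def power(a, b):        # a ^ b: multiply by a, b times
        return 1 if b == 0 else mul(power(a, b - 1), a)

    def tetrate(a, b):      # a ^^ b: raise a to itself, b times
        return 1 if b == 0 else power(a, tetrate(a, b - 1))

    print(add(2, 3), mul(2, 3), power(2, 3), tetrate(2, 3))  # 5 6 8 16

Pentation and friends continue the exact same pattern: nothing but successor, recursion, and patience.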
I personally would have liked to see a more complete exposition of the logical roots of the Busy Beaver challenge, and I think the missing parts would have made the article a better read, even for non-specialists. Maybe especially for those readers.
But Quanta articles are perfectly suited to their audience, people who may be inspired to look for more depth elsewhere.
There isn't really anything better. The Notices of the AMS has one feature article each issue, but those are sometimes too technical.
Anyway, this article seems fine to me. The "exponentiation" comment seems like a bizarre misreading. The article is just trying to explain how big BB(6) is. Before that, it explains what BB(n) is. To think it's solely about exponentiation you have to skip that entire section.
It's crazy to me that we're now writing articles about the fact that a large number is large.
Hey guess what, I can imagine a number even larger. It's BB(6) + 1, isn't that amazing and interesting? Wow imagine BB(6) multiplied by a googolplex, amazing. Wow, such number.
What's the point? Numbers are infinite, what else is new?
What's the point of your comment? To suck the joy out of everything?
Turing machines are a fundamental object of study within the theory of computation. The complexity and wild behavior that arises from even the simplest machines is a cool discovery. BB(6) was thought to be a lot smaller, but it turns out to be really huge. The Busy Beaver game is also interesting to those who work on combinatorics and theorem provers. And of course many many people in the space of recreational math & volunteer computing love this challenge.

You don't like it? Ok, then. You don't have to.
Hmm, what a shame that the LaTeX content wasn't properly rendered in the linked article. When I noticed this I assumed someone had forgotten to include the MathJax JavaScript library, which, when present, converts LaTeX notation into properly typeset math client-side.
As it turns out, the MathJax library is called from the article's HTML (in line 1765), but for some reason it's not working.
There's a long-running argument that LaTeX rendering should be a native browser feature, so far without success. In the meantime MathJax works pretty well unless someone disables JavaScript client-side.
Reasonable people may differ, but not being able to render LaTeX is egregious and wrong.
https://youtube.com/playlist?list=PLt5AfwLFPxWJ_FwchNAjB3xa8...
The tools required to probe and compare these monsters are so interesting.
https://www.mrob.com/pub/math/largenum.html
BB(n) surpasses Tree(n) by n = 2645 at the very latest. And likely already shortly after BB(100).
Now consider the following definition of a function that grows vastly faster still:
HyperTree(n) is the Tree function nested x times, where x = Tree(n):

HyperTree(3) = Tree(Tree(...Tree(3)...)), with the nesting Tree(3) levels deep.
BB(x) would (should) still outpace HyperTree(x) at some value of x. I don't even know where to begin giving an upper or lower bound on the x where BB(x) overtakes HyperTree(x).
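In code, the construction is nothing more than function nesting. A sketch with a toy stand-in (the real Tree would never return, and the name hyper is mine):

    def hyper(f, n):
        # Apply f to n, x times over, where x = f(n).
        x = f(n)
        for _ in range(x):
            n = f(n)
        return n

    print(hyper(lambda n: n + 2, 3))  # 13: f(3) = 5, so five nestings of n+2

Since hyper(Tree, n) is still computable, some fixed-size Turing machine computes it, which settles the "whether" of BB overtaking it, though not the "where".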
> we've already taken this to the limit, we can just stand back and marvel at that, what are we looking for?

You need to look at this:
https://neugierde.github.io/cantors-attic/
"Infinity is farther away than you thought."
> Does anybody know of other interesting problems in the Busy Beaver space?
This paper contains many conjectures around BB that could be interesting to some.
Do we know if they grow faster than busy beavers?
https://youtu.be/4-eXjTH6Mq4
https://bbchallenge.org/1RB1LC_1RC1RB_1RD0LE_1LA1LD_1RZ0LA?w...
https://wiki.bbchallenge.org/wiki/5-state_busy_beaver_winner