Rochus · 5 years ago
> von Neumann proposed a description for a computer architecture now known as the von Neumann architecture

Well, he actually described an already existing design by Eckert and Mauchly under his own name, and that paper was disclosed illegally. For those interested in this and other fascinating stories about ENIAC, here is a good book: https://www.amazon.com/Eniac-Triumphs-Tragedies-Worlds-Compu... Also worth reading is the review by Jean Bartik (here: https://www.amazon.com/gp/customer-reviews/R3K2DSB6UE1X7H/re...), who was there and witnessed everything firsthand.

jshier · 5 years ago
This isn't really true. There was no existing architecture and even if there was, Eckert and Mauchly weren't the only ones involved. ENIAC itself encapsulated almost none of the Von Neumann architecture principles. While it's true Von Neumann's paper started as a summary of ongoing discussions on computer architecture for the EDVAC with a group that included Eckert, Mauchly, Arthur Burks, and others, Von Neumann's proofs of the architecture's viability within the paper were entirely original. Hence the name.

Jean Bartik was a hugely biased source who used personal recollection to quarrel with anyone who offered any view of computer history that didn't put ENIAC front and center. She seems to have spent the latter part of her life writing strident negative Amazon reviews of any book which didn't place Eckert and Mauchly and the ENIAC as the primary source of all computer innovation. Her reviews [1] of Who Invented the Computer? The Legal Battle That Changed Computing History in particular were extremely negative personal attacks on the author and her husband (Alice and Arthur Burks) for having written books which dared question ENIAC's supremacy.

[1] https://www.amazon.com/gp/customer-reviews/R2MERA4EUZ8M17

Rochus · 5 years ago
Jean Bartik is not the only source; there is a lot of recorded history, including court cases that left many files and affidavits. Bartik's commentary suggested itself because it accompanies the book I referenced. Burks' case is also amply documented. He wanted to be registered as an inventor on the patent after the fact, which was denied to him, even by a court. In his bitterness, he then made many claims. The comment by Bartik that you reference should be seen in this context.
suifbwish · 5 years ago
Fascinating when individuals of such advanced intelligence are not obsessed with immortality. Either they believe it is not possible, or they are secretly a god trying to escape the boredom of having long since achieved it.
justicezyx · 5 years ago
Now the water is really muddy...

Is there a definitive book or wiki page on the whole thing, in particular regarding von Neumann's actual contribution to the so-called von Neumann architecture?

Deleted Comment

m_mueller · 5 years ago
How does Turing’s work factor into this? Does Bartik also discredit his influence on computer architecture?
masswerk · 5 years ago
According to J. Presper Eckert in his oral history interview [1], the Moore School was actually collecting patent applications when von Neumann went public with the report, which is why the principal computer architecture is in the public domain. And, according to Eckert, this was done deliberately and in von Neumann's self-interest. He also accused von Neumann of having done similarly at the IAS. (According to von Neumann, it was merely an oversight that his name was the only one on the report, and it had been just an internal proposal for a draft of the final paper.)

Eckert sums this up condemningly on p. 35: "You know, we finally regarded von Neuman as a huckster of other people's ideas with Goldstine as his principle mission salesman. Now, if you don't believe this, talk to Julian Biglow at the Institute for Advanced Study (…) Von Neumann was stealing ideas and trying to pretend work done at the Moore School was work he had done."

[1] Oral history interview with J. Presper Eckert conducted by Nancy Stern, 1977: https://conservancy.umn.edu/handle/11299/107275

PDF: https://conservancy.umn.edu/bitstream/handle/11299/107275/oh...

devindotcom · 5 years ago
Interesting if true, but it seems odd to think that von Neumann, of all people in the 20th century, would need to steal someone's ideas. From what I've read of him, he was more likely to improve on them before the ideator had finished speaking.

But certainly those were days (if indeed we have left them) when 'great men' like him could and would take credit for something like that for reasons of pride and posterity.

TedDoesntTalk · 5 years ago
What a crime that such 22-year-old memories are locked away in an Amazon comment, to be lost whenever Amazon deems it necessary to cull comments older than a certain age.
dolmen · 5 years ago
It appears someone (some machine?) asked the Web Archive to save it before I did.

https://web.archive.org/web/20210407045457/https://www.amazo...

echelon · 5 years ago
The whole internet is like this. Everything is atrophying.

Remember personal websites and blogs? Barely anybody runs them anymore.

Archive Team is doing amazing work, and we need to support them.

globular-toast · 5 years ago
It's not locked away. Copy the text and save it on your disk if you think it's important to you. Publish it on your own blog if you think it's important to everyone.

People seem to forget about the "save" functionality included in every web browser for some reason. You should save anything you deem important enough if you fear it might be gone one day. This is no different from acquiring a copy of an important book for your own shelf.

Where would you prefer this comment to reside? Where would it have lived before the web?

croes · 5 years ago
ENIAC, the world's first computer? What about the Z3 by Konrad Zuse? https://en.m.wikipedia.org/wiki/Z3_(computer)
Hydraulix989 · 5 years ago
The winners write history.

It's interesting that the Z3 was denied funding by the German government for not being war-important. It turns out they were ahead of the computing game and squandered the lead, while the US and GB relied on computers for code breaking and for the calculations behind the atom bomb.

It's not a blunder to the tune of invading Russia, but it's still interesting.

aap_ · 5 years ago
There was no first computer. The further you go back the less it becomes a computer, the further you go forward the less it becomes a first.

I can really recommend the book "ENIAC in Action" on this matter and of course on the ENIAC generally.

kyberias · 5 years ago
It can be argued that Z3 was not as general purpose (Turing complete) as ENIAC.

"The Z3 was demonstrated in 1998 to be, in principle, Turing-complete. However, because it lacked conditional branching, the Z3 only meets this definition by speculatively computing all possible outcomes of a calculation."

Rochus · 5 years ago
The claim was the first programmable, general-purpose "electronic computer". Zuse's machine was electromechanical (i.e. based on relays instead of tubes).
djmips · 5 years ago
True but on the other hand, he kind of made computers 'open source'
Rochus · 5 years ago
> he kind of made computers 'open source'

The only "positive" (depending on perspective) effect of his illegal and morally questionable action was that the patent was declared null and void, due to the novelty-damaging effect of his report among other issues supporting this outcome. The computers, however, did not become more "open source" because of this, just cheaper.

tpetricek · 5 years ago
I was looking through the comments to see if someone mentioned Jean Bartik already! Just to add to this comment, she also wrote an autobiographical book, which is worth reading: https://www.amazon.co.uk/Pioneer-Programmer-Jennings-Compute...

It is clearly a very personal and biased account, but I think there are good reasons to believe many of the things she is saying, even when they go against the commonly accepted history "written by the winners".

smaddox · 5 years ago
Wow... How is this not more well known? It's not like von Neumann doesn't have plenty of other important contributions to his name.
gpanders · 5 years ago
It is far from settled whether the GP’s interpretation is in fact what actually happened. I highly recommend The Innovators by Walter Isaacson. Among other things, it covers this “controversy” in great detail.
masswerk · 5 years ago
Have a look at the oral history interview with J. Presper Eckert; it's a long-lasting controversy: https://conservancy.umn.edu/handle/11299/107275

PDF: https://conservancy.umn.edu/bitstream/handle/11299/107275/oh...

kalium-xyz · 5 years ago
Most people who have a lot of things named after them wanted to have a lot of things named after them. Consider: why is more named after von Neumann than Erdős?
ducktective · 5 years ago
How come it's known as "von Neumann" architecture?
adrian_b · 5 years ago
The name "von Neumann" is rightly applied here, because von Neumann was the first to write a complete and consistent description of this architecture, such that anybody could understand it and build computers based on it.

The Eckert group had some rather vague ideas about the architecture of their future computer, the successor of ENIAC, and those ideas certainly included some kind of stored program (not necessarily stored in the same memory as the data; stored programs were not new, as they were common in mechanical or electromechanical computers).

However, the Eckert group was secretive, because they hoped to start their own company after the war to make computers and become rich, so it is not known for certain how much von Neumann learned from them and how much of what he wrote were his own ideas.

In any case, I am more grateful to von Neumann than to the Eckert group, because all the other more important computer projects started from the von Neumann report, while the work of Eckert et al. had rather little influence.

The ENIAC computer was important mainly as a demonstration that it is possible to make very fast electronic computers; otherwise it had much less influence on the structure of computers than the many other, slower computers made around that time.

madhadron · 5 years ago
So a little bit longer version of the story: Eckert and Mauchly had designed the architecture. Herman Goldstine, the military liaison of the ENIAC project, and something of a social climber, sought out von Neumann to see if he was interested. von Neumann was, and came to a lot of design meetings and made contributions about what the instruction set should be. Then when he was heading out to Los Alamos on a trip, he had some downtime during travel and wrote up the design for the group. Goldstine typed it up and circulated it with just von Neumann's name on it.

It is a point of discredit to von Neumann that he did not disavow the naming of the architecture.

Funnily, Goldstine's wife was one of the early programmers and well respected in that area. Goldstine himself made no significant contributions to computing.

jonnybgood · 5 years ago
Stigler's law of eponymy?
tdfirth · 5 years ago
Here is a fairly old documentary about 'Johnny' von Neumann:

https://www.youtube.com/watch?v=Y2jiQXI6nrE

He was a myth even in his own lifetime but parts of that film hint at the man. It features interviews with many of his old colleagues, discussion of what it was like to work with him, etc.

If you just watch one bit, personally I love the anecdote that begins around here:

https://youtu.be/Y2jiQXI6nrE?t=2604

It is a story of his incredible facility for mental arithmetic that I hadn't heard before. It is arguably one of his least impressive skills but for me it illustrates so clearly the gap between him and his peers.

Deleted Comment

1cvmask · 5 years ago
The coining of "entropy" is just one example of the creative genius of von Neumann:

When Shannon first derived his famous formula for information, he asked von Neumann what he should call it, and von Neumann replied: "You should call it entropy, for two reasons: first because that is what the formula is in statistical mechanics, but second and more important, as nobody knows what entropy is, whenever you use the term you will always be at an advantage!"

http://www.spatialcomplexity.info/what-von-neumann-said-to-s...

https://en.wikipedia.org/wiki/Entropy_(information_theory)
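For reference, the "famous formula" in the anecdote is H = -Σ pᵢ log₂ pᵢ, which is indeed formally identical (up to a constant) to the entropy of statistical mechanics. A minimal Python sketch (the function name is mine):

```python
import math

# Shannon entropy of a discrete distribution, in bits.
# Terms with p = 0 are skipped, using the convention 0 * log(0) = 0.
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

assert entropy([1.0]) == 0          # a certain outcome carries no information
assert entropy([0.5, 0.5]) == 1.0   # one fair coin flip = 1 bit
```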

toomanybeersies · 5 years ago
Funny you mention that story, I came across it for the first time about 3 days ago in a book I was reading -- Grammatical Man. It's one of the best reads I've found on the history of information theory. Funnily enough the friend that lent it to me isn't even a tech person, he's an electrician who's more interested in occultism and esotericism than mathematics and technology.

Every time I read about the 20th century history of computing, it sounds like it was a wild time. Every advance or discovery seems to have some humorous anecdote attached to it.

1cvmask · 5 years ago
I hope you like this biography of Paul Erdős. Great read. If you like it, recommend it to others. I have gifted it to many friends, even to those who are not mathematicians, STEMmy, or mathematically inclined. Hope you get a chance to read it.

https://en.wikipedia.org/wiki/The_Man_Who_Loved_Only_Numbers

hintymad · 5 years ago
I often use von Neumann's quote, "Young man, in mathematics you don't understand things. You just get used to them.", to console myself when I have to read a math book multiple times to really understand something.

The quote also has truth in it. I had no problem accepting that 0! = 1 only because I learnt that fact early in school. However, I struggled quite a bit to accept that span({}) = {0}, even though it is not that different from 0!=1 and I knew multiple explanations. It seems the later one learns a new concept, the longer it takes to accept it.

pointedset · 5 years ago
Yes, those two facts about zero/empty cases (and so many more) are definitely related, and this class of facts is one of my favourites! Usually, if you're dealing with something algebraic in flavour (which is a very vague concept, sorry), there will be a sensible way to define the zero/empty case. This is often a good test of whether you have a uniform concept that works for all n without corner cases.

It almost irritates me when I read a book or a paper and they say that the zero/empty case is "by convention". I almost want to yell, "no! it's because that's how you make the definition uniform!"

Addition is usually defined as a binary operation, a+b, but really it should be defined as an n-ary operation; associativity tells us that doing "two layers" of addition should boil down to doing a single layer of addition on the concatenated list of operands. That forces 0-ary addition to be zero, which can always be added to the list of operands without affecting the result.

Something similar happens with empty products (which explains the factorial), empty spans, etc. In all cases, the trick is to figure out, what is the equivalent of associativity? What "syntactic" operations on the inputs (for example, concatenating a list of lists of operands) correspond to operations on the outputs (you can get the total sum by first computing partial sums)?
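That associativity argument can be sketched in a few lines of Python (a toy illustration; the fold-based helper names are mine):

```python
from functools import reduce
from operator import add, mul

# n-ary addition/multiplication as folds over a list of operands.
# Associativity says folding a concatenation of lists must equal folding
# the list of partial results, which forces the empty fold to be the identity.
def n_ary_sum(xs):
    return reduce(add, xs, 0)    # empty sum = 0, the additive identity

def n_ary_prod(xs):
    return reduce(mul, xs, 1)    # empty product = 1 (this is where 0! = 1 comes from)

assert n_ary_sum([]) == 0
assert n_ary_prod([]) == 1

# The "two layers" law: summing a concatenation = summing the partial sums.
a, b = [1, 2], [3]
assert n_ary_sum(a + b) == n_ary_sum([n_ary_sum(a), n_ary_sum(b)])
```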

A fun puzzle, if you enjoy this kind of thing: what's the determinant of the 0x0 matrix (over your favourite field or ring)? For all (square) sizes, the determinant of the zero matrix is zero, but the determinant of the identity matrix is one, and the 0x0 matrix is kind of both. So which pattern should win? Which one is stronger? I know my own answer ;)

hintymad · 5 years ago
I was also puzzled by det(0x0) being 1, because I had built an intuition that determinant of a matrix was the volume of the parallelepiped represented by the matrix. I made my peace by accepting that my intuition on volume implies that volume is defined in a space that has positive dimensions, and by treating zero space as an algebraic construct.
jonahx · 5 years ago
> what's the determinant of the 0x0 matrix (over your favourite field or ring)?

1, because 0x0 seems like a more elegant base case for the recursive det formula than 1x1.
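A quick sketch of that point: with det of the 0x0 matrix defined as 1, the usual cofactor expansion needs no special 1x1 base case (toy Python, matrices as lists of rows):

```python
# Cofactor expansion along the first row, bottoming out at the 0x0 matrix.
# With det([]) = 1, the 1x1 case det([[a]]) = a falls out of the general
# formula instead of needing its own branch.
def det(m):
    n = len(m)
    if n == 0:
        return 1  # the empty case: an empty product over an empty permutation
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

assert det([]) == 1
assert det([[5]]) == 5
assert det([[1, 2], [3, 4]]) == -2
```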

Groxx · 5 years ago
Yeah, it took a while to sink into my head that many of these "wait, why is span({}) = {0}?" kinds of cases have answers that sum up as "because anything else means other rules are inconsistent, and the whole thing is either less useful or useless". It's "arbitrary", but it's either the only useful option, or sometimes a simple(st) one of many.

Even just one number theory course helped a lot, since it brought that kind of consistency into its own concept, where [this set of rules] forms a ring, and [this set] forms a field, etc.

OscarCunningham · 5 years ago
Define the determinant to live in the ring quotiented by the annihilator of the module. Then the determinant of the 0 by 0 matrix is both 0 and 1.
adverbly · 5 years ago
Couple more fun examples:

all([]) == True

any([]) == False

jhgb · 5 years ago
> there will be a sensible way to define the zero/empty case. This is often a good test of whether you have a uniform concept that works for all n without corner cases.

And so, begun the array indexing war has.

sellyme · 5 years ago
> Young man, in mathematics you don't understand things. You just get used to them.

Not quite in the same weight class as von Neumann, but Matt Parker's "There's a trick for dealing with that in mathematics, called 'not really worrying about it'" when discussing results that don't mesh with our intuitive understanding is another nice one.

Koshkin · 5 years ago
I’d say, rather, try and build your intuition in accordance to what mathematics, in fact, tells you (which is achieved by doing exercises).
smnplk · 5 years ago
0! = 1 is not hard to accept, since it follows a rule. You just need to look at it backwards: to get the previous factorial, (N-1)!, you divide N! by N.

4! = 24 ;; 24 / 4 = 6

3! = 6 ;; 6 / 3 = 2

2! = 2 ;; 2 / 2 = 1

1! = 1 ;; 1 / 1 = 1

0! = 1
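The same table, restated as a tiny Python loop running the division rule downward from 4!:

```python
# (n-1)! = n! / n, applied from 4! down to 0!.
fact = {4: 24}
for n in range(4, 0, -1):
    fact[n - 1] = fact[n] // n

assert fact == {4: 24, 3: 6, 2: 2, 1: 1, 0: 1}
```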

gcanyon · 5 years ago
I was going to joke, "So you're saying -1! = 1/0" but then I thought I'd check Wikipedia first and that's exactly the reason they give for factorials of negatives being undefined, which spoils the joke.
Koshkin · 5 years ago
This looks even more natural from the programmer’s standpoint: reduction of a set of numbers by summation starts with 0, and reduction by multiplication, with 1; so, if the set is empty (has zero elements) the result is simply the starting number (0 or 1, respectively).
zests · 5 years ago
Even funnier is how this roughly works for -1!
yosito · 5 years ago
For the last two years I've been learning Neumann's native language, Hungarian. It's a very difficult language for an English speaker to acquire (and vice-versa), but this quote could be adapted perfectly to how I feel learning Hungarian, "Young man, in Hungarian you don't understand things. You just get used to them.". I wonder if Neumann had a similar experience early on while learning English, and it shaped his view of mathematics as well.
kaba0 · 5 years ago
It’s quite rare for someone to learn Hungarian, what a surprise! (I’m a native speaker)

Feel free to ask me anything you have trouble with. In the reverse direction, the double negative is something that I sometimes have to pay attention to, and I don't know of another language that employs it.

gnramires · 5 years ago
There's one of his maxims I like better: You don't know something until you can prove it 3 different ways.

In mathematics, it raises the question of what "understanding" really means. To understand an object doesn't necessarily mean divining an unquestionable structure by chance. What usually happens is that you're investigating some kind of problem (usually with real-world applications, if distant), and then you find you need a certain tool or a certain theory to simplify your problem, to make it more tractable or more abstract. For example, you could be studying permutations, and from there the binomial comes naturally, as well as the factorial function. From this theory come several definitions. The definitions are chosen to further your goal: they are the ones that make your tool easier to use, simpler, more "streamlined", more suitable to approach your application with minimal special cases. This is how something like '0!' is defined, and how most theories are discovered.

The thing about understanding is that it's a bit much to require "understanding" of something (even from yourself).

How can one know when he's reached "understanding"? Being used to it should be good enough for most purposes. If you know the rules, and you know how to apply them, that's mathematics.

Perhaps another road to understanding is seeing a thing through a variety of lenses (connecting it to different fields), expanding your ability to apply a tool, seeing more broadly. That's when you generalize, and you're able to see what you had as a special case (another definition of understanding): from addition to algebra, to rings, to abstract algebra. From numbers to equations to functions, with each step you perhaps "understand" the fundamentals better by having a broader perspective on generalization. But of course that's only useful if your generalization is useful at all.

ZephyrBlu · 5 years ago
I've never seen that quote before, but it feels very human.

I feel like there's an intuition which comes along with learning things. Or maybe learning is simply developing intuition.

At some point things somehow just make sense in your mind because you've built up an intuition of how it works.

hintymad · 5 years ago
I learned it the hard way, hitting a wall when taking a course on abstract algebra: the only path to truly understanding college-level math is building solid intuition. Otherwise, the sheer number of definitions and theorems will just be overwhelming. Of course, intuition is just as critical for pre-college math. It's just that we somehow get that intuition naturally, probably because we experience all kinds of examples on a daily basis that hone our intuition unconsciously.
djmips · 5 years ago
I agree, negative numbers learned early are used with aplomb but imaginary numbers learned later are initially met with suspicion.
adamnemecek · 5 years ago
I wonder if e.g. using constructive math would make things easier since every step of a constructive proof is understandable.
pointedset · 5 years ago
Unfortunately, constructive vs classical (vs linear, etc.) applies to proofs, but this is really about definitions. Proofs can be correct or incorrect pretty straightforwardly, but definitions being correct or not is really a matter of taste. (And as someone who's been formalising some mathematics in Lean recently, definitions are so much trickier to get right than proofs!)
Koshkin · 5 years ago
My understanding is that since any mathematical proof can (in principle) be reduced to manipulation with formal objects (symbols), it’s always “constructive.”
MrDresden · 5 years ago
He was beyond brilliant, of that there is no doubt. And without in any way trying to take away from that fact, reading the following excerpt from the article...

"[..] von Neumann both worked alongside and collaborated with some of the foremost figures of twentieth century science. He went to high school with Eugene Wigner, collaborated with Hermann Weyl at ETH, attended lectures by Albert Einstein in Berlin, worked under David Hilbert at Göttingen, with Alan Turing and Oskar Morgenstern in Princeton, with Niels Bohr in Copenhagen and was close with both Richard Feynman and J. Robert Oppenheimer at Los Alamos."

...makes me wonder how many of his accomplishments were due to his own brilliance alone, and how many to both it and the stimulating environments he kept finding himself in.

For my own part, I see a big difference in my own output, and in the quality of that output, when I get the opportunity to interact with high-caliber people (knowledgeable, emotionally intelligent, inquisitive, and open) on a regular basis.

Tuna-Fish · 5 years ago
A lot of those particularly accomplished people he interacted with specifically pointed him out as exceptional.

For example, from this article about his eidetic memory:

"One of his remarkable abilities was his power of absolute recall. As far as I could tell, von Neumann was able on once reading a book or article to quote it back verbatim; moreover, he could do it years later without hesitation. He could also translate it at no diminution in speed from its original language into English. On one occasion I tested his ability by asking him to tell me how A Tale of Two Cities started. Whereupon, without any pause, he immediately began to recite the first chapter and continued until asked to stop after about ten or fifteen minutes."

read_if_gay_ · 5 years ago
Another:

> “There was a seminar for advanced students in Zürich that I was teaching and von Neumann was in the class. I came to a certain theorem, and I said it is not proved and it may be difficult. von Neumann didn’t say anything but after five minutes he raised his hand. When I called on him he went to the blackboard and proceeded to write down the proof. After that I was afraid of von Neumann” — George Pólya

The nature vs. nurture argument has its merits in general, but sometimes nature just produces a complete freak. Being surrounded by geniuses probably isn't that valuable when you're head and shoulders above even them.

metalliqaz · 5 years ago
My favorite, from Edward Teller:

“von Neumann would carry on a conversation with my 3-year-old son, and the two of them would talk as equals, and I sometimes wondered if he used the same principle when he talked to the rest of us.”

bobcostas55 · 5 years ago
Fermi to one of his PhD students:

"You know how much faster I am in thinking than you are? That's how much faster von Neumann is compared to me."

jonplackett · 5 years ago
These things are a virtuous circle. The more brilliant you are, the more brilliant people want to hang around, and the more brilliant you become. And loop.
pjmorris · 5 years ago
"Work with the smartest people who will work with you" has become one of the pieces of career advice I give myself.
clairity · 5 years ago
it's all mythmaking, a custom as old as time and the ancestor of our modern obsession with promotion. exceptional people aren't born so much as constructed to tickle our imagination and ego.

every culture makes myths but obsession is where it becomes problematic. we must have idols, so we will make them opportunistically rather than leave them to organic discovery. there's enough randomness and manipulation in the people we remember historically to not put too much faith in the objectiveness of any of these constructions.

that's not to say that no credit is due them, but that it's overwhelmingly likely overstated, per this obsession.

also note that any story title with a name prominently featured is simply going to be subjective (practically by definition).

Deleted Comment

adverbly · 5 years ago
Is it just me, or were there a lot of celebrity-level STEM heroes coming out of the early 20th century? These folks all seemed to work together closely.

Maybe these things take time, but it just doesn't seem like there is anything remotely like this going on today, despite far greater access to education and increases in global population.

Is it perhaps that the limelight is less on mathematicians than it was during the early 20th century?

ykl · 5 years ago
They all worked closely together because they all worked on the Manhattan Project. When there’s a single huge government project requiring the nation’s smartest people and on which the survival of the nation depends, it shouldn’t be too surprising that all of the top scientists and mathematicians and engineers all end up working together. :)

The fact that so many of them specifically grew up in Hungary, went to school in Germany, and came to the United States is interesting to note though!

kaba0 · 5 years ago
There is interesting blog post about it here:

https://www.lesswrong.com/posts/xPJKZyPCvap4Fven8/the-atomic...

Deleted Comment

Sharlin · 5 years ago
They immigrated because they could see that another war was brewing in Europe. So basically the US can thank the Nazis for that. They all had Jewish heritage (which is a very interesting fact in itself!) so it was sort of personal to them as well.
acidburnNSA · 5 years ago
It's not just you. This whole group of (Hungarian) scientists is known as "The Martians" [1]

[1] https://en.wikipedia.org/wiki/The_Martians_(scientists)

TheTrotters · 5 years ago
Scientists become famous partly due to their accomplishments and partly due to their life story.

Would John Nash be famous if it weren’t for his tragic struggles with mental health? Would I know about Feynman if it weren’t for “Surely You’re Joking Mr. Feynman”, his safecracking, and the topless bars? Would Stephen Hawking be a household name if he weren’t disabled?

WW2 gave a fascinating context to many scientific projects and made lives of these scientists more colorful than they would have been otherwise. Ditto for the Cold War.

It’s up for debate if there’s a slowdown in science nowadays. But it’s certain that lives of those who do science are much more boring to the outsiders. Papers, conferences, labs, petty academic squabbles, fights over funding, perhaps some consulting for the government or the private sector. It’s much harder to create a mythology out of that.

When I think of famous 21st century mathematicians I think of Perelman (a recluse who proved a famous conjecture and refused the Fields Medal) or Zhang (worked as an accountant and at a Subway, then as a lecturer at UNH before making a breakthrough on the twin primes conjecture). Ordinary geniuses who went from top graduate programs to top professorships are comparatively anonymous to the general public.

throw99909 · 5 years ago
I would offer Terry Tao as a counterexample. The only thing strange about him is how, well, not-strange he is!
chrisbrandow · 5 years ago
It was a relatively small community, even before Manhattan project they all knew each other. Academic science is still very much like this, but the difference now is that there are so many more scientists, very few of their results are known to a wide audience.
blueblisters · 5 years ago
I think research has become a lot bigger, more accessible, more collaborative, and more derivative since the early 20th century. There are probably tons of graduate students with the intellect of an Einstein or a von Neumann who will likely never get the same recognition, because they will be competing with (or working with) others for the same breakthroughs.

We had a lot of early "heroes" in Artificial Intelligence, but their accomplishments were eclipsed by newer breakthroughs in a short amount of time by equally smart people. Overall, this is great for humanity since we are pushing boundaries much quicker than was possible earlier.

anonytrary · 5 years ago
> Is it perhaps that the limelight is less on mathematicians than it was during the early 20th century?

Everything makes sense looking backwards. It is quite possible there are people alive today in various fields (think AI, crypto, space, vehicles) who will go down in history the same way. Kids will look back in 50 years citing their names, just as we cite Dirac, Einstein, Feynman, Heisenberg, etc. To us, these people are just "scientists" and "engineers" who happen to be alive at the same time we are. The reward is reaped years later.

Quantum Mechanics' reward was computers and health tech for the most part, but most of that truly started popping off long after the Manhattan project celebs died.

beaconstudios · 5 years ago
How is computing downwind from QM? I would trace computing's lineage back to the invention of the transistor or the earlier vacuum tube (advancements in electrical engineering, and the beginning of electronics), and logically back to Boole.

I don't think QM has had its payoff yet, but certainly general and special relativity have paid off in things like GPS, and probably more examples that I don't know about.

allemagne · 5 years ago
Matching the impact von Neumann made by himself in the 20th century might take dozens of von Neumanns today.
yosito · 5 years ago
Or one Katalin Karikó.
bick_nyers · 5 years ago
I think mathematicians only become famous when they are being driven by (or are driving) an industrial revolution. Just look at the digital revolution. I had never heard of George Boole until a quick Google search, but he laid down the framework for Boolean algebra in 1850. Claude Shannon is a much more recognized name, even though the impact of information theory is smaller than that of Boolean algebra (considering that BA is a prerequisite). Is it possible that George Boole was as popular in the 1930s as Shannon is now? Maybe. Perhaps the fame of mathematicians is staggered, as the applications often drive their fame, and the applications often come ~50 years later.
toomanybeersies · 5 years ago
It was a much smaller field back then. Reading about computing in the 70's (especially ARPANET), the same handful of names keep popping up.

Also, they may be celebrities to us. But ask the average person who John von Neumann, Claude Shannon, Edsger Dijkstra, or Paul Erdős was, and they'll probably have no idea.

yosito · 5 years ago
> it just doesn't seem like there is anything remotely like this going on today

It seems to me like the STEM heroes these days are working together on vaccine technologies. If you follow developments in mRNA technology you'll see there's a lot of international cooperation. It seems to me that current collaboration is on an even greater scale than the STEM heroes of the early 20th century, but it probably won't become clear to many until we look on it in retrospect 50 years from now.

peter303 · 5 years ago
Two words: SILICON VALLEY. Plus a lot of incest with Stanford grad students and young Stanford professors.
astrange · 5 years ago
All of these Hungarian STEM heroes (https://en.wikipedia.org/wiki/The_Martians_(scientists)) were pre-Silicon Valley people. It was still Valley of Heart's Delight back then.

Not sure who their successors were. I guess there's Knuth.

DonHopkins · 5 years ago
Which episode are you referring to?
deviation · 5 years ago
It staggers me that in an age of incredible innovation and technology, we still have no idea what biologically separates a man like this from the rest of us.

I'd like to take the idyllic position of (somewhat irresponsibly) hoping that some level of genius can be fostered and created in anyone from a young age.. But when I read articles like this I feel the compulsive need to sit back in my chair and ponder just how much more primitive my mind is in comparison to a man like John's.

geenew · 5 years ago
To repeat a quote that probably appears in every von Neumann post:

Teller said: "Von Neumann would carry on a conversation with my 3-year-old son, and the two of them would talk as equals, and I sometimes wondered if he used the same principle when he talked to the rest of us."

vecinu · 5 years ago
Obligatory Edward Teller talking about Von Neumann => https://youtu.be/ra4K6WPUkpk
iandanforth · 5 years ago
László Polgár believed this and turned his daughters into chess prodigies.

https://en.wikipedia.org/wiki/Judit_Polg%C3%A1r

Sharlin · 5 years ago
It’s pretty clear that the Polgár daughters had a very favorable genetic disposition on their side as well.
ZephyrBlu · 5 years ago
Do we actually have no idea what the biological/genetic differences between geniuses and average people are? That's interesting.

> I'd like to take the idyllic position of (somewhat irresponsibly) hoping that some level of genius can be fostered and created in anyone from a young age

So you believe that genius is environmental and not genetic? Or maybe we are thinking about different definitions of genius.

tenaciousDaniel · 5 years ago
My personal experience is that it's a bit of both.

I am not a genius, but I can tell that I'm likely above average. I was raised in an environment that did not value intellect or education, and I always felt out of place. It definitely felt innate, and there are lots of stories from people in similar circumstances.

On the other hand, I never did well in school, especially math. One of my teachers literally called me stupid. So I had this self-perception of just not being very good at STEM. Had I not fallen into programming by accident, I would never have changed that perception of myself. I imagine there are very many people who had similar experiences, but were not so lucky as to learn their actual potential.

So, it's almost certainly a combination of factors.

smnplk · 5 years ago
I believe the foundations have to be genetic. There are a multitude of factors involved, one of which is having a strong memory. But there also needs to be a drive and thirst for knowledge.
stephc_int13 · 5 years ago
From a genetic standpoint, all humans are 99.9% identical beasts.

There are, of course, some genetic components to potential abilities.

But I strongly believe that most of the differences we are seeing are built on compounded interest of environmental factors, starting from a very young age.

yosito · 5 years ago
> we still have no idea what biologically separates a man like this from the rest of us

I don't think it was a biological separation. Neumann, and his brilliant peers, all grew up in Hungary in the midst of political and ethnic oppression with significant economic and language barriers to overcome. Books have been written about it, and I can't accurately recount the facts. But it seems to me that what separates him and his peers is their perseverance in the face of adversity. Having overcome the very difficult circumstances in which they were born prepared them for great success later in life. It's inspiring, but in some ways it's also survivorship bias, since there were uncountable people who were born in the same circumstances and did not overcome. I think similar things are happening with refugees today. Those who overcome their circumstances often go on to be outstanding examples of genius and accomplishment in their fields.

bidirectional · 5 years ago
I think you are confusing von Neumann with someone else, because that is not the case, at all. He grew up in a wealthy family which was ennobled by the emperor when he was 10 (hence the von), and spoke Hungarian natively. In childhood he was taught by governesses in numerous major European languages before going to a Christian school for the Hungarian elite, while receiving private tutoring from a renowned mathematician. Then he went to prestigious universities in Germany and Switzerland, before becoming the youngest Privatdozent in University of Berlin history.

The situation obviously rapidly changed in Europe, but Von Neumann left for Princeton before the 30s began. He grew up in a life of great privilege.

wombatmobile · 5 years ago
"Anybody who looks at living organisms knows perfectly well that they can produce other organisms like themselves. This is their normal function, they wouldn’t exist if they didn’t do this, and it’s not plausible that this is the reason why they abound in the world. In other words, living organisms are very complicated aggregations of elementary parts, and by any reasonable theory of probability or thermodynamics highly improbable. That they should occur in the world at all is a miracle of the first magnitude; the only thing which removes, or mitigates, this miracle is that they reproduce themselves. Therefore, if by any peculiar accident there should ever be one of them, from there on the rules of probability do not apply, and there will be many of them, at least if the milieu is reasonable. But a reasonable milieu is already a thermodynamically much less improbable thing. So, the operations of probability somehow leave a loophole at this point, and it is by the process of self-reproduction that they are pierced."

— John von Neumann

iandanforth · 5 years ago
In this I disagree with von Neumann. The relevant phrase to understand my point of departure is "That which persists exists." There is a continuum of strategies for persistence, and reproduction is only one of them. While thermodynamics over a large enough timescale dictates eventual disorder, over shorter timescales and in specific locales it allows for a temporary increase in order. This is no more or less probable than any other arrangement. What's fundamentally fascinating is that there are forces which resist change. It is that resistance to change on the small scale that allows for the persistence of patterns on the large scale. Atoms resist change, persist, and thus continue to exist. Molecules resist change. Crystal structures. Arrangements of molecules into cells. The patterns of behavior in individuals. There are layers upon layers of this same phenomenon: the ability to resist change via different strategies and inherent properties. Locally this means that there is an anti-entropic tendency built into the fabric of the universe, and into all its subsequent layers of organization. It's bloody wonderful, but hardly improbable.
kaba0 · 5 years ago
I am by no means knowledgeable enough on this topic, but what we perceive as increased order may not always correspond to greater physical order.

E.g. in your atom/molecule example, it may only be more orderly to a human brain, when in actuality the added entropy of the tight packing is overcome by the favorable energy state at the temperatures we are used to. So there may be a difference between order maintained by energy expenditure, like a cell (Neumann's point), and a seemingly ordered state that in physical actuality requires no additional energy, like a salt crystal (your point). With this (maybe faulty) distinction, Neumann's point still stands in my reading.

If we were to have an immortal supercell that didn't reproduce, it would likely cease to exist over an "infinite" timeline due to some rare external event, even if it were superior in every way to a simple cell that only reproduces. On the other hand, a "single" crystal will similarly not survive a long timeline, but since its structure follows directly from the laws of nature, it will form again in the same way.

ekianjo · 5 years ago
> Therefore, if by any peculiar accident there should ever be one of them, from there on the rules of probability do not apply, and there will be many of them, at least if the milieu is reasonable.

This is the line of argument in favor of the "simulation theory": "If in a distant future a civilization ever manages to create a simulation, who says we are not already in one of them?"