chriskanan · a year ago
Here is the reasoning: https://www.nobelprize.org/uploads/2024/09/advanced-physicsp...

I'm surprised Terry Sejnowski isn't included, considering it seems to be for Hopfield nets and Boltzmann machines, and Terry played a large role in the latter.

dang · a year ago
I guess it makes sense to use that link above since it goes into much more detail. Changed from https://www.nobelprize.org/prizes/physics/2024/summary/. Thanks!
thrdbndndn · a year ago
I know it's one day late but I personally don't agree with this change.

Using the main entry webpage makes it much easier for people to find relevant info, including but not limited to various articles about this specific award, other info about this year's Nobel, and info about the Nobel in general.

This is exactly where the Internet (or WWW) is better than traditional print media. Using a PDF as the link (which itself can easily be found from the entry page) defeats that.

whizzter · a year ago
He was probably considered, since he is mentioned in the reasoning paper. Still, it could be one of those unfortunate omissions in Nobel history, since those deciding the prize might have a hard time measuring impact.
gtirloni · a year ago
Then they shouldn't be trusted to give awards in an area they are not experts in.
hxnamer · a year ago
Is this a widely accepted version of neural network history? I recognize Rosenblatt, the Perceptron, etc., but I have never heard that Hopfield nets or Boltzmann machines were given any major weight in the history.

The descriptions I have read were all mathematical, focusing on the computational graph with the magical backpropagation (which frankly is just memoizing intermediate computations). The descriptions also went out of their way to discourage terms like "synapses" in favor of "units".

legel · a year ago
Restricted Boltzmann Machines were a huge revolution in the field, warranting a publication in Science in 2006. If you want to know what the field looked like back then, here it is: https://www.cs.toronto.edu/~hinton/absps/science.pdf

I remember in 2012 for my MS thesis on Deep Neural Networks spending several pages on Boltzmann Machines and the physics-inspired theories of Geoffrey Hinton.

My undergraduate degree was in physics.

So, yes, I think this is an absolutely stunning award. The connections to statistical entropy (inspired by thermodynamics) and, of course, to the biophysics of human neural networks should not be lost here.

Anyways, congratulations to Geoffrey Hinton. And also, since physics is the language of physical systems, why not expand the definition of the field to include the "physics of intelligence"?

From their official explanation page (https://www.nobelprize.org/uploads/2024/09/advanced-physicsp...): "With ANNs the boundaries of physics are extended to host phenomena of life as well as computation."

godelski · a year ago

  > I have never heard that Hopfield nets or Bolzmann machines were given any major weight in the history.
This is mostly because people don't realize what these are at more abstract levels (it's okay; ironically, ML people frequently don't abstract). But Hopfield networks and Boltzmann machines have been pretty influential in the history of ML. I think you can draw a pretty good line from Hopfield networks to LSTMs to transformers. You can also think of a typical artificial neural network as a special case of a Boltzmann machine (easiest if you look at linear layers; compare feed-forward networks to Restricted Boltzmann Machines and I think it'll click).

Either way, these had a lot of influence on the early work, which does permeate into the modern stuff. There's this belief that all the old stuff is useless and I just think that's wrong. There's a lot of hand engineered stuff that we don't need anymore, but a lot of the theory and underlying principles are still important.
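As a rough illustration of the first point (the patterns and sizes here are made up, not from anything above), a classic Hopfield network fits in a few lines of NumPy: store ±1 patterns with a Hebbian rule, then recall one from a corrupted cue.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian weights: W = (1/n) * sum_p x_p x_p^T, with zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=10):
    """Synchronous sign updates until we (hopefully) hit a stored fixed point."""
    for _ in range(steps):
        nxt = np.sign(W @ state)
        nxt[nxt == 0] = 1          # break ties toward +1
        if np.array_equal(nxt, state):
            break
        state = nxt
    return state

# two toy memories over 6 units
patterns = np.array([[ 1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1, -1, -1, -1]], dtype=float)
W = train_hopfield(patterns)

cue = patterns[0].copy()
cue[0] = -cue[0]                   # corrupt one bit
restored = recall(W, cue)
print(np.array_equal(restored, patterns[0]))  # the stored memory is recovered
```

The energy-minimizing dynamics here (each update can only lower a quadratic energy) is exactly the spin-glass connection the physicists care about.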

dongecko · a year ago
Boltzmann machines were there in the very early days of deep learning. They were a clever hack to train deep nets layer-wise and work with limited resources.

Each layer was trained similarly to the encoder part of an autoencoder. This way the layer-wise transformations were not random, but roughly kept some of the original data's properties. Up to this point, training was done without the use of labelled data. After this training stage was done, you had a very nice initialization for your network and could train it end to end according to your task and target labels.

If I recall correctly, the neural layers' output was probabilistic. Because of that you couldn't simply use backpropagation to learn the weights. Maybe this is the connection to John Hopfield's work. But here my memory is a bit fuzzy.
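A rough sketch of that greedy layer-wise idea, using plain linear autoencoders with tied weights instead of RBMs (all sizes, data, and hyperparameters here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_autoencoder_layer(X, n_hidden, epochs=300, lr=0.002):
    """Train one tied-weight linear autoencoder (X ~ X W W^T); return encoder W."""
    W = rng.normal(scale=0.1, size=(X.shape[1], n_hidden))
    for _ in range(epochs):
        E = X @ W @ W.T - X                      # reconstruction error
        grad = 2 * (X.T @ E + E.T @ X) @ W / len(X)
        W -= lr * grad
    return W

def greedy_pretrain(X, layer_sizes):
    """Train each layer to reconstruct the previous layer's output, then stack."""
    weights, H = [], X
    for size in layer_sizes:
        W = train_autoencoder_layer(H, size)
        weights.append(W)
        H = H @ W                                # feed codes to the next layer
    return weights                               # a non-random initialization

# toy data that really lives on a 3-dimensional subspace of R^8
X = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 8))
weights = greedy_pretrain(X, [5, 3])
```

The stacked `weights` would then initialize a deep net that gets fine-tuned end to end with labels, which is the two-stage recipe described above.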

Onavo · a year ago
The deep learning variety of neural networks are heavily simplified, mostly linear versions of biological neurons. They don't resemble anything between your ears. Real-life neurons are generally modeled by differential equations (in layman's terms, they have many levels of feedback loops tied to time), not the simplified ones used in dense-layer activation functions.

Here are some examples

https://snntorch.readthedocs.io/en/latest/tutorials/tutorial...
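For a flavor of what an ODE-based neuron looks like, here is a minimal leaky integrate-and-fire (LIF) model integrated with Euler's method. The parameter values are illustrative only, not taken from snntorch:

```python
import numpy as np

def simulate_lif(current, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """Integrate dV/dt = (-(V - v_rest) + R_m * I) / tau; spike and reset at threshold."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(current):
        dv = (-(v - v_rest) + r_m * i_in) / tau
        v += dv * dt                 # Euler step
        if v >= v_thresh:            # threshold crossing -> spike
            spikes.append(step)
            v = v_reset              # membrane potential resets
        trace.append(v)
    return np.array(trace), spikes

# constant 2 nA input for 100 ms of simulated time
trace, spikes = simulate_lif(np.full(1000, 2e-9))
print(f"{len(spikes)} spikes")
```

Even this simplest spiking model has state that evolves in time and a hard nonlinearity (the reset), neither of which exists in a standard dense layer.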

chriskanan · a year ago
Of the people who are still alive, Hopfield and Hinton make sense.

Hopfield networks led to Boltzmann machines. Deep learning started with showing that deep neural networks were viable in Hinton's 2006 Science paper, where he showed that pre-training with Restricted Boltzmann Machines (essentially stacked self-supervised auto-encoders) as a form of weight initialization made it possible to effectively train neural networks with more than 2 layers. Prior to that finding, people found it very hard to get backprop to work with more than 2 layers due to the activation functions people were using and problematic weight initialization procedures.

So long story short, while neither technique is in widespread use today, they demonstrated that neural networks were a viable technology and provided the FIRST strategy for successfully training deep neural networks. A few years later, people figured out ways to do this without the self-supervised pre-training phase by using activation functions with better gradient flow properties (ReLUs), better weight initialization procedures, and training on large datasets using GPUs. So without the proof of concept enabled by Restricted Boltzmann Machines, deep learning may not have become a thing, since prior to that almost all of the AI community (which was quite small) was opposed to neural networks, except for a handful of evangelists (Geoff Hinton, Yoshua Bengio, Yann LeCun, Terry Sejnowski, Gary Cottrell, and a few other folks).
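For the curious, the core training loop of a Bernoulli RBM with one-step contrastive divergence (CD-1) is short. This is a toy sketch with made-up data and layer sizes, not the actual 2006 pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli RBM trained with one step of contrastive divergence (CD-1)."""
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(scale=0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)
        self.b_h = np.zeros(n_hidden)

    def cd1_step(self, v0, lr=0.1):
        # positive phase: hidden activations driven by the data
        p_h0 = sigmoid(v0 @ self.W + self.b_h)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        # negative phase: one Gibbs step back down and up again
        p_v1 = sigmoid(h0 @ self.W.T + self.b_v)
        p_h1 = sigmoid(p_v1 @ self.W + self.b_h)
        # CD-1 gradient approximation: <v h>_data - <v h>_recon
        m = len(v0)
        self.W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / m
        self.b_v += lr * (v0 - p_v1).mean(axis=0)
        self.b_h += lr * (p_h0 - p_h1).mean(axis=0)

# toy 4-bit patterns where the first two bits mirror the last two
data = np.array([[1, 1, 1, 1], [0, 0, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1]], float)
rbm = RBM(n_visible=4, n_hidden=8)
for _ in range(2000):
    rbm.cd1_step(data)
```

The learned hidden activations (here, `sigmoid(data @ rbm.W + rbm.b_h)`) are what got fed into the next RBM when stacking them for deep pre-training.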

seydor · a year ago
Second time he gets overlooked (after the Turing Award)
mistercheph · a year ago
Whether or not these fields are meaningfully distinct is a matter of taste, despite the fashion being to imagine a plurality of interconnected but autonomous domains.
soheil · a year ago
Are you surprised every year that he's not included?
KingFelix · a year ago
Yeah, Terry is a rockstar, pumping out tons of papers. Maybe they Google Scholar'ed it and listed by citations?
an_cap · a year ago
An excellent career retrospective by John Hopfield - https://pni.princeton.edu/sites/g/files/toruqf321/files/docu...

"As an Academy member I could publish such a paper without any review (this is no longer true, a sad commentary on aspects of science publishing and the promotion of originality)."

kkylin · a year ago
National Academy members still get to pick the reviewers (if they choose to go that route rather than regular submission), and the review is not blind. The reviews themselves are not public, but the identities of the reviewers are made public once the paper is out. So members can't just say whatever sh*t they want (and you can imagine some do), but it's still a highly unusual process.
kkylin · a year ago
Too late now to edit my original comment, but I should have added that I was talking very specifically about the Proceedings of the National Academy of Sciences (as was Hopfield).
DrillShopper · a year ago
Yeah, fuck peer review!/s
ajkjk · a year ago
Everyone's first thought when they read something is whatever the social norms say you're supposed to think (peer review = good, publishing without peer review = not science somehow?), but shouldn't you stop and wonder why the esteemed scientist wrote that line instead of just dismissing it? Otherwise you are only chiming in to enforce a norm that everyone already knows about, which is pointless.

One of the really refreshing things about reading older research is how there used to be all these papers which are just stray thoughts that this or that scientist had, sometimes just a few paragraphs of response to some other paper, or a random mathematical observation that might mean nothing. It feels very healthy. Of course there were far fewer scientists then; if this was allowed today it might be just too crowded to be useful; back then everyone mostly knew about everyone else and it was more based on reputation. But dang, it must have been nice to have such an unrestricted flow of ideas.

Today the notion of a paper is that it is at least ostensibly "correct" and able to be used as a source of truth: cited in other papers, maybe referred to in policy or legal settings, etc. But it seems like this wasn't always the case, at least in physics and math which are the fields I've spent a lot of time on. From reading old papers you get the impression that they really used to be more about just sharing ideas, and that people wouldn't publish a bad paper because it would be embarrassing to do so, rather than because it was double- and triple-checked by reviewers.

naasking · a year ago
Yes, but non-sarcastically.
Ma8ee · a year ago
I think this is the Royal Swedish Academy of Sciences' way to admit that physics as a research subject has ground to a halt. String theory suffocated theoretical high-energy physics for nearly half a century with nothing to show for it, and a lot of other areas of fundamental physics are kind of done.
eigenket · a year ago
I think this is (very) inaccurate. It feels more like them trying to jump on a "hot topic" bandwagon (machine learning/AI hype is huge).

Physics as a discipline hasn't really stalled at all. Fundamental physics arguably has, because no one really has any idea how to get close to making experimental tests that would distinguish the competing ideas. But even in fundamental physics there are cool developments like the stuff from Jonathan Oppenheim and collaborators in the last couple of years.

That said "physics" != "fundamental physics" and physics of composite systems ranging from correlated electron systems, and condensed matter through to galaxies and cosmology is very far from dead.

620gelato · a year ago
> trying to jump on a "hot topic" bandwagon

I don't know exactly what they hope to gain by jumping on that bandwagon, though; neither the physicists nor the computer scientists are going to value this at all. And dare I say, the general populace associated with the two fields isn't going to either; case in point, this HN post.

If there weren't any Nobel-worthy nominations for physics, maybe skip it? (Although that hasn't happened since 1972 in any field.)

Ma8ee · a year ago
I just briefly looked into what Jonathan Oppenheim is working on, and I’d say he’s part of the problem. More speculative work that might or might not be testable in a distant future.
mppm · a year ago
It really has not, though. There is more to physics than high-energy and cosmology, and there is no shortage of deserving contributions of smaller scope. It really is bizarre that deep learning would make it to the top of the list.
Ma8ee · a year ago
Could you give me some examples of areas of fundamental physics that are vital and have done some significant discoveries lately? I genuinely would like to know, because I really can't think of any.
api · a year ago
My sense is that we might have reached the limits of what we can do in high-energy or fundamental physics without access to energy levels or other extreme states that are currently beyond our capacity to generate.

From what I've read (not a professional physicist) string theory is not testable unless we can either examine a black hole or create particle accelerators the size of the Moon's orbit (at least). Many other proposed theories are similar.

There is some speculation that the hypothetical planet nine -- a 1-5 Earth mass planet predicted in the far outer solar system on the basis of the orbits of comets and Kuiper Belt / TNO objects -- could be a primordial black hole captured by the solar system. A black hole of that mass would be about the size of a marble to a golf ball, but would have 1-5g gravity at the distance of Earth's radius.

If such an object did exist it would be within space probe range, which would mean we could examine a black hole. That might get us un-stuck.
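The figures quoted above check out against the standard formulas, the Schwarzschild radius r_s = 2GM/c^2 and Newtonian gravity g = GM/r^2:

```python
# Back-of-envelope check of the quoted sizes for a 1-5 Earth-mass black hole.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_earth = 5.972e24   # Earth mass, kg
R_earth = 6.371e6    # Earth radius, m

for m in (1, 5):
    M = m * M_earth
    r_s = 2 * G * M / c**2          # event-horizon radius
    g = G * M / R_earth**2          # gravity at one Earth-radius distance
    print(f"{m} M_earth: r_s = {r_s * 100:.1f} cm, g = {g:.1f} m/s^2")
```

One Earth mass gives a horizon under a centimeter across and five Earth masses about 4.4 cm, i.e. roughly marble- to golf-ball-sized, with 1-5g at Earth-radius distance, as stated.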

If we can't do something like that, maybe we should instead focus on other areas of physics that we can access and that have immense practical applications: superconductivity, condensed matter physics, plasmas / fusion, etc.

sampo · a year ago
> My sense is that we might have reached the limits of what we can do in high-energy or fundamental physics without accessing energy levels or other extreme states that we currently can't access

How can we know? For the past decades, theoretical high-energy physics has studied made-up mathematical universes that don't tell us much about our real universe. We haven't really given it that much of a try yet.

slashdave · a year ago
Although rare, there are cosmic rays that do reach very high energies. You can access these from, for example, atmospheric showers.
aghilmort · a year ago
interesting - source(s) on Planet 9 black hole theory?
amai · a year ago
> Physics as a research subject has ground to a halt

Max Planck was told by his professor to not go into Physics because "almost everything is already discovered". Planck said he didn't want to discover anything, just learn the fundamentals.

Ma8ee · a year ago
First, I didn't say that I thought everything already was discovered, but that the fundamental physics community doesn't discover new things. That is due to how physics research is practiced today and has nothing to do with how much is left to discover.

Second, even if it obviously wasn't true when Planck was told that almost everything is discovered, it doesn't say anything about the state today.

bmitc · a year ago
Are people working on fundamentals these days? It seems to be a forgotten art, where everyone is working at the edge of something.
killerstorm · a year ago
What if the next breakthrough is complex and not directly accessible from our current state of math/physics thought?

I see no reasons to expect steady progress. Nobody knows how long it would take to prove Riemann hypothesis, for example.

noobermin · a year ago
String theory and the foundations are not the only area of physics. It would be nice for theorists to remember that.
bdjsiqoocwk · a year ago
May I ask, very humbly, what are your credentials when making that assessment?
EVa5I7bHFq9mnYK · a year ago
Upon reading the Hopfield paper back in 1982, I concluded that it's not worth it to pursue a physics career, and more efficient to put the effort into AI research, as at some point the AI will solve all the remaining science problems in a couple of milliseconds. I might have erred by a few decades, but overall seems like we are on track.
mikaeluman · a year ago
Partially agree. It also seems like a desperate way to connect themselves to "AI" and the hype.
alwinaugustin · a year ago
Even Sheldon Cooper stopped researching string theory at one point.
openrisk · a year ago
This does indeed smell of desperation. Which is really, really sad. Advances in _real_ physics are central to the absolutely needed sustainability transition. In a sane society that values its self-preservation you would not need to grasp at second-order straws to justify the need for all sorts of both fundamental and applied physics research.

We need to think seriously whether our collective hallucinations (pun) have got us to some sort of tipping point, undermining our very ability to act according to our best long-term interests.

ps. not to imply anything negative about the worthiness of the awardees in general

soheil · a year ago
Totally agree; for too many years some of the smartest people have been dedicating their lives to the development of FarmVille and Angry Birds.

https://en.wikipedia.org/wiki/Perverse_incentive

ecosystem · a year ago
Hopfield made substantial contributions (Nobel-contention work) in multiple fields, which is truly astonishing: Kinetic proofreading (biochemistry/biophysics), HopNets (ML), long distance electron transfer (physics), and much more.

Welcome news that he finally got there.

mishaevtikhiev · a year ago
My perspective as a PhD in theoretical physics who's been doing deep learning for the last 4 years:

1. The prize itself makes zero sense as a prize in _physics_. Even the official announcement by the Nobel Prize Committee, taken at face value, reads as a huge stretch in trying to link neural networks to physics. When one starts asking questions about the real impact on physics and whether the most important works of Hinton and Hopfield were really informed by physics (which is a dubious link to a Nobel prize anyway), the argument stops holding water at all.

2. Some of the comments mention that giving the prize for works in AI may make sense because physics is currently stalled. This is wrong for several reasons:

2.1. While one can argue that string theory (which is, anyway, only a part of high-energy theoretical physics) is having its "AI winter" moment, there are many other areas of physics which are developing really fast and bringing exciting results.

2.2. The Nobel Prize is often awarded with quite some delay, so there are many very impactful works from the 80s which haven't been awarded a Nobel Prize (atomic force microscopy is a nice example).

2.3. It is wrong to look at the recent results in some sub-field and say "okay, there was nothing of value in this field". For example, even if one completely discards string theory as bogus, there were many important results in theoretical physics, such as the creation of conformal field theory, which was never recognized with a Nobel Prize (which is OK if the Nobel Prize is given to other important physics work, but is quite strange in the light of today's announcement).

To finish on a lighter mood, I'll quote a joke from my friend, who stayed in physics: "Apparently the committee has looked at all the physicists who left academia and decided that anything they do is fair game. We should maybe expect they will give a prize for crypto or high-frequency trading some time later".

fooker · a year ago
> because physics is currently stalled.

Even if it's not completely true, maybe some introspection is required?

I understand developing new theories is important and rewarding, but most physics for the last three decades seems to fall within two buckets. (1) Smash particles and analyze the data. (2) Mathematical models that are not falsifiable.

We can be pretty sure that the next 'new physics' discovery that gives us better chips, rocket propulsion, etc etc is going to get a nobel prize pretty quickly similar to mRNA.

bunderbunder · a year ago
Those two buckets only contain the work in physics that has a sustained presence in popular media. But take gravitational-wave astronomy as a counterexample: it doesn't make it into the news much, but I'm pretty sure the entire field is less than ten years old.
soheil · a year ago
Just because there happens to be economic viability for a field currently doesn't mean that field needs less introspection. Exactly what research contributions are the people who are throwing hundreds of millions of dollars' worth of GPUs at the next random "research" problem at the top of the queue at Microsoft or Google making to deserve a Nobel?

Too often there is near-zero intuition for why research in AI yields such incredible results. The models are mainly black boxes that happen to work extremely well with no explanation, and someone at a prestigious institution just happened to be there to stamp their name on top of the publication.

Big difference between research in AI and any non-computational/statistical/luck-based science.

codethief · a year ago
> most physics

That's an interesting definition of "most physics". I mean, I find high-energy physics as fascinating as the next guy but there are other fields, too, you know, like astrophysics & cosmology, condensed-matter physics, (quantum) optics, environmental physics, biophysics, medical physics, …

fat_cantor · a year ago
A Hopfield network is a lot more like physics than biology, but agreed that the conformal field theorists should have been recognized before Hopfield and/or Hinton. Jim Simons would have been deserving too, IMHO, far more for his work at RenTec than for Chern-Simons theory
EVa5I7bHFq9mnYK · a year ago
The Hopfield paper was published in the Biophysics section of Proc. Natl. Acad. Sci. and was followed by a flood of spin-glass papers in Phys. Rev. A and similar journals. So there is some connection to physics.
tonetegeatinst · a year ago
Wdym physics is stalled? I was told we just need to build larger colliders.
bmitc · a year ago
By the same people who guaranteed they'd see certain things at the existing energy levels but now all of a sudden need higher energy levels after they didn't find what they were looking for.

ChrisArchitect · a year ago
Was just listening to a live radio interview with Hinton, who was in a small hotel room in California somewhere, quite flabbergasted at the news. The interviewer was all happy for him, etc., but when they delved more into what the prize was for, he started to go off on AI concerns and the interview didn't last much longer.

Acceptance speech might be something.

scarmig · a year ago
Hoping he speaks a lot of truth to power.
hackernewds · a year ago
The obsession could be taking over Hinton. Or perhaps guilt in Oppenheimer fashion
pvitz · a year ago
Next year, the creators of Excel will get the prize, because it is an implementation of the mathematical universe.
passwordoops · a year ago
You know what, I can support this for its predecessor, Lotus123. If ANNs are worthy of the prize in physics, then so is this
jhbadger · a year ago
VisiCalc, surely. Lotus 123, much like Excel, was just following in the footsteps of the original spreadsheet.

https://en.wikipedia.org/wiki/VisiCalc

incognition · a year ago
Vbasic is Turing complete hah